IBM Design - 2015
Watson Hello is an app to showcase the capabilities of IBM Watson's Language Processing APIs.
The proof of concept was developed over the course of three weeks with a multidisciplinary team consisting of a User Researcher (me), three Designers, two Visual Designers, and a Front End Developer. As the User Researcher, I was responsible for coordinating generative interviews, facilitating design thinking, and testing with users, though I also created many of the wireframes.
At the start of the project we were introduced to stakeholders from the Watson organization. They gave us some background on the technology and the goals of the project.
As an analogue to Epics in Agile methodology, IBM Design uses Hills – high-level statements of intent written as meaningful user outcomes. This allows the business, technical, and design teams to align around a high-level why before diving into the how. Stories can then be written that focus on specific parts of the user experience and are sized by design and development.
The Original Hill
A consumer can communicate in-person with non-native speakers with their mobile device at the speed of thought, and give Watson feedback along the way.
By externalizing explicit and implicit assumptions about our process we reveal known unknowns that we can then validate through research.
From those assumptions we could identify the knowledge we needed to build a working understanding of our users, which would serve as the foundation of our design. Post-its allowed us to quickly iterate on both the questions and the methods by which we would collect the data.
The foundation of every successful product is a solid understanding of the user. In the early stages of product development, open-ended generative interviews are great for gathering a multitude of perspectives and potential use cases.
Within a week we interviewed seven people: frequent travellers, Peace Corps volunteers, a Doctors Without Borders worker, a physician whose patients frequently did not speak English, and employees at multinational corporations who had to communicate in a non-native language daily.
After synthesizing the data, we could map the user scenarios to the specific strengths and constraints of our technology and resources. We chose to follow up with three specific user types to create as-is journey maps. By designing for specific extreme examples we could both highlight the unique strengths of Watson’s NLP and make a product generalizable to a majority of users.
Drawn from a mix of observed latent needs and currently successful strategies, we developed three Principles to guide our design.
Reviewing all the interview transcripts, we created Need Statements, each representing a user, their problem, and a clear desired outcome.
e.g. “Marion, the knowledge worker, needs to feel confident in the validity of a translation so that she can keep her focus on the task at hand”
This let us capture all the stories in a standardized way that we could then prioritize against technical scope. Over the course of a couple of days the team separated and reconvened multiple times, diverging to iterate quickly as individuals and converging on interesting ideas. To communicate quickly and broadly we used post-it storyboards, which we then referred to in metaphors like "a Tamagotchi for instant translation" or "a Spotify playlist of vocabulary words."
With this collection of ideas we could then work with our SMEs and stakeholders to map each concept against technical feasibility and user/business impact. In this way we aligned and focused on truly impactful, buildable features.
Real Time Translation
This would allow users to communicate naturally and face-to-face. The automated nature of this feature and the display of both languages keeps the user in context. The ability to go back and re-read a conversation allows the user to review and learn directly from their experiences.
Watson language translation, speech-to-text, text-to-speech and language detection APIs
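The real-time flow above can be thought of as a simple pipeline: capture speech, detect the source language, translate, then display and speak the result in the target language. The sketch below is purely illustrative; the helper functions and the tiny mock dictionary stand in for calls to the Watson speech-to-text, language detection, translation, and text-to-speech services, and do not reflect the actual SDK.

```python
# Conceptual sketch of the real-time translation pipeline.
# Each helper stands in for a Watson API call; the dictionary
# below is a mock, not the real translation service.

MOCK_TRANSLATIONS = {
    ("en", "es"): {"hello": "hola", "thank you": "gracias"},
}

def detect_language(text: str) -> str:
    # Stand-in for the language detection API; assumes English.
    return "en"

def translate(text: str, source: str, target: str) -> str:
    # Stand-in for the translation API; falls back to the
    # original text when the mock has no entry.
    return MOCK_TRANSLATIONS[(source, target)].get(text.lower(), text)

def translate_utterance(transcript: str, target: str) -> dict:
    # Keep both languages so the UI can show the conversation
    # in context and let users re-read and learn from it later.
    source = detect_language(transcript)
    return {
        "original": transcript,
        "translated": translate(transcript, source, target),
        "source": source,
        "target": target,
    }
```

In the app, the transcript would arrive from streaming speech-to-text and the translated string would be fed to text-to-speech; returning both strings is what supports the dual-language display described above.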
This would allow the user to gain a deeper understanding of the language, the culture, and the person they were communicating with. Smart Words would give the user cultural context and could be reviewed at a later time.
Watson language translation, speech-to-text, tradeoff analytics APIs
Test & Iterate
After mocking up the main interactions on paper, we tested with users. Our UX Designer and Front End Developer then built an interactive prototype with Facebook’s Origami.
After receiving positive feedback from users and stakeholders, we packaged the research, designs, and code for handoff. Concerns we had decided were out of scope for the MVP were noted and annotated.
The next feature areas to explore would be:
Privacy settings, especially for doctor-patient confidentiality and HIPAA compliance.
Local storage for places with unreliable cell coverage.
Coordinating with the Watson Core Platform team on collecting and utilizing identity-sanitized data to train the dataset for better accuracy.