
Watson For Drug Discovery


Watson Health – 2016

Watson for Drug Discovery (WDD) leverages machine learning to process Medline, full-text medical journals, and patent databases to understand the relationships between specific genes, diseases, and drugs. Pharmaceutical companies and academic institutions buy licenses for their researchers to find and learn about new potential drug candidates.

In July of 2016, I was brought onto the team as Lead User Researcher to build out the research program for the alpha launch and envision what it might look like after.

 

The Problem

To prioritize and design effectively, the Design Team needed a system for gathering evaluative, generative, and strategic user feedback that would scale with our user base and fit into Continuous Delivery.

 

Evaluative Research

As-Is

Previously, the Design Team would sit in on checkpoint sessions between the clients and the Sales Team to gauge user response. An online wiki was used to host user feedback forms.

  • With no active engagement or incentive, the feedback forms were rarely used.
  • Checkpoint calls could be insightful, but they were generally geared toward solving specific issues the client was having with the tool rather than building a holistic understanding of the user experience.
  • Checkpoint calls were held with a panel of researchers, so depending on individual personalities, certain concerns were magnified while others were sidelined; the calls did not give an accurate picture of user satisfaction and needs.
  • With this method, it was tough to develop high-level insights or a nuanced understanding of the user, their process, or latent needs.

Now

A Sponsor User Program and newly instrumented metrics were built to create a rich, nuanced understanding of how different types of users behave. Thirty-six interviews and months of engagement metrics later, the Design Team has deep insight into the health of the product, which it shares with other teams.

  • I worked with our product owner to construct a Sponsor User program, in which the Design Team introduces itself at every new client kickoff.
  • The interviews are one-on-one open-ended feedback sessions held during the beginning (day 1), learning (day 10), and established use (day 30) periods to create a longitudinal understanding of each user and their journey.
  • The 1-10-30 interview format lends itself to deeper exploration of individual issues and outcomes (one latent need uncovered this way later became the basis for a key new feature, and uncovered outcomes were used to create a new case study for the sales team).
  • Instrumenting our product with metrics gives us granular information about how each individual user interacts with the product, and interviews can be used to uncover why. The data can be used to group users into cohorts (e.g. disease researchers vs. drug researchers) to study different user behaviors; a rough sketch of this kind of cohort grouping follows below.
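
As an illustrative sketch only (not the actual WDD instrumentation or analytics code), grouping usage events into cohorts and summarizing activity by weeks since each user gained access might look like the following. The event fields (user_id, domain, timestamp) and the access_dates lookup are assumptions made for this example.

from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical instrumented events: one record per user interaction with the tool.
events = [
    {"user_id": "u1", "domain": "disease researcher", "timestamp": datetime(2016, 9, 1)},
    {"user_id": "u1", "domain": "disease researcher", "timestamp": datetime(2016, 9, 30)},
    {"user_id": "u2", "domain": "drug researcher", "timestamp": datetime(2016, 9, 2)},
]

# Date each user first gained access, used to align activity to "weeks since access".
access_dates = {"u1": datetime(2016, 8, 29), "u2": datetime(2016, 9, 1)}

def weekly_engagement(events, access_dates):
    """Count interactions per cohort per week since each user's access date."""
    counts = defaultdict(lambda: defaultdict(int))  # cohort -> week index -> event count
    for event in events:
        week = (event["timestamp"] - access_dates[event["user_id"]]) // timedelta(weeks=1)
        counts[event["domain"]][week] += 1
    return counts

for cohort, weeks in weekly_engagement(events, access_dates).items():
    print(cohort, dict(weeks))

A pattern like this is one way engagement curves such as the drop-off chart below could be produced; in practice the cohort label could just as easily come from observed behaviors rather than a declared domain.
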
This chart was presented to the product owner and sales team to get buy-in for the Sponsor User program.


Understanding different levels of user engagement: low-engagement users show a sharp drop-off 4 weeks after gaining access to the tool.


The drug discovery process often takes many years and includes many rounds of exploration. This map shows where current users were finding success or hitting dead ends.


 

Generative Research

As-Is

The Design Team would schedule design review sessions through a user recruitment agency to validate designs.

  • Recruited participants often lacked experience in these hyper-specific domains, or with drug discovery research tools in general, so they could not answer many questions and were not representative of actual users.
  • Time was wasted explaining the aim, interface, and workflow context to each brand-new interviewee.
  • Each interview cost around $500, which made the program expensive to maintain.

Now

The Design Team leverages user relationships from the 1-10-30 interviews to conduct surveys and prototype reviews with actual users who have relevant concerns.

  • Real users speak to exactly how new features would positively or negatively affect their workflow.
  • With the breadth of knowledge gained from in-person interviews, we split users up by behaviors, which we found roughly corresponded to their domain (gene, disease, or drug researcher).

One example of a new feature that came out of an offhand comment is the “Picking up where I left off” feature:

  1. 1-10-30 Day feedback illuminated the need across multiple users. The interviews also provided the context and validation to prioritize it as an urgent feature in the roadmap.
  2. I constructed an As-is and To-be Process Map.
  3. I conducted product research to see how other file management systems like Google Drive, Dropbox, and Box handle similar interactions.
  4. I built and tested a clickable InVision prototype for feedback with users who had mentioned a need.
  5. I gathered critique and validation for design concepts with a follow-up survey.
 

Strategic Research

As-Is

User research was conducted in a waterfall manner – generative research took place only before designing; evaluative research took place only after. After features were built and pushed, there was no insight into how frequently certain features were used.

Now

Ongoing user research ensures that the Design Team and OM are aligned around the most urgent and impactful user needs. Structured user feedback plays a large role in continuous delivery and (re)prioritization of features.

  • Example: feedback about a latent need that arose during a Day 1 interview was validated with other users and prioritized into a feature.
  • Working alongside users builds client trust and satisfaction with the product development process, and allows the Design Team to tap into their expertise in future feature development.
  • Success metrics collected at Day 30 informed a case study that empowered the business and sales teams.

The user stories encompassed in the new feature.


This Process Map shows what the streamlined future process (now implemented) could look like.


Scoping our initial MVP. We would instrument usage of the feature to measure its value to users before adding more robust capabilities.


An example showing how you can use the feature to pick up where you left off. This allowed users to create, edit, and use sets of genes, diseases, and drugs in each of the 6 tools in WDD.


This Case Study came out of insights from the 1-10-30 day interviews and served as an example of how the tool could be used to identify potential drugs for development with only a fraction of the time, effort, and investment. Two research papers on using Watson for Drug Discovery were later presented at medical conferences.
