Sarah Caplan

Google RITE Study
Summary

While at Google, I conducted a 6-round qualitative Rapid Iterative Testing and Evaluation (RITE) study consisting of task-based prototype evaluations and follow-up questions. This evaluative study examined the Support team's newly designed chatbot.

Timeline

  • 3-month project
  • 2-week sprints
  • 6 rounds of testing
  • Jan-Mar 2022

Team

  • 1 UX Researcher (me)
  • 5 UX Designers
  • 2 PMs
  • 1 UX Writer

Context

When Google’s customers need help or have a question about one of Google’s hundreds of products, they use Support's services. Customers can reach Support through web-based and product-based experiences, on desktop computers or mobile phones. With so many ways to get help, customers often get lost in a maze of options.

Research Goals

Google had developed proprietary AI technology to power a chatbot experience. The new chatbot was critical in driving down support costs while helping customers reach the best support outcomes.

My aims as a researcher were to:

  • Uncover risks and usability issues in the chatbot’s design
  • Foster team alignment and decision making throughout design iterations
  • Coordinate an involved testing schedule to help the team meet a Pilot launch date

Research Questions

  • What are users' expectations of the chatbot once they find it?
  • Once they find it, how do they interact with the chatbot when it asks them to enter their problem?
  • In a particular scenario, how do users react when they don’t see what they expect?

Study Details

I chose the RITE method for a few reasons:

  • It could quickly and iteratively cover many task-based permutations, e.g., the most common support scenarios on desktop vs. mobile, or product-based vs. web-based experiences;
  • It could unearth the most common design risks in each testing round;
  • It could create a testing cadence to support the Design team’s tight turnaround on design explorations.


I chose this number of participants for a few reasons:

  • Usability testing best practices tell us that roughly 80% of a design’s issues surface within 5 participants per persona;
  • I originally wanted to recruit based on Support’s two most commonly observed personas (representing >50% of customers in the US), but that would have required 10-12 participants;
  • Because of the iterative nature of the RITE approach, the tight timeline, and concerns about scope, I decided to recruit for a more general “Google customer” rather than any specific persona;
  • I typically assume a 20% no-show rate, so I over-recruited by 1 participant to be safe; with only 2 weeks per testing round, I needed to be sure I collected the requisite data in the time available (the recruitment math is sketched below).
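
For illustration, here is a minimal Python sketch of that recruitment arithmetic. The variable names are mine, and the numbers simply restate the assumptions above (5 participants needed per round, a 20% no-show rate):

    import math

    # Minimal sketch of the recruitment math described above; the names are
    # illustrative, not taken from the study.
    participants_needed = 5    # best practice: ~80% of usability issues surface with 5 users
    no_show_rate = 0.20        # assumed no-show rate per round

    # Expected no-shows, rounded up, become the over-recruitment buffer.
    buffer = math.ceil(participants_needed * no_show_rate)   # 1 extra participant
    recruited_per_round = participants_needed + buffer       # 6 recruited per round

    print(f"Recruit {recruited_per_round} per round ({buffer} extra as a no-show buffer)")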


Data Collection

  • I created a note-taking matrix that mapped our research questions to tasks and interview questions (a sketch of this structure follows below).
  • I then asked designers to join sessions and take notes.
  • I also recorded the interviews.
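
To make that structure concrete, here is a hypothetical sketch of what such a matrix could look like in code. The research questions paraphrase the ones listed above; the tasks, follow-up questions, and notes are placeholders, not the study’s actual materials:

    # Hypothetical note-taking matrix: each row pairs a research question with the
    # tasks and follow-up questions that probe it, plus per-participant notes.
    note_matrix = [
        {
            "research_question": "What do users expect of the chatbot once they find it?",
            "tasks": ["Find help for a common support scenario on desktop"],
            "follow_up_questions": ["What did you expect to happen when the chatbot opened?"],
            "notes": {},  # keyed by participant ID, filled in during each session
        },
        {
            "research_question": "How do users interact when asked to enter their problem?",
            "tasks": ["Describe the same issue to the chatbot on mobile"],
            "follow_up_questions": ["How did you decide what to type?"],
            "notes": {},
        },
    ]

    # Example: a note-taker records an observation during a session (placeholder data).
    note_matrix[0]["notes"]["P3"] = ["Expected a search box rather than a conversation"]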

Analysis

  • Following each interview, I held a “brain dump” session with the designer(s) who had joined to take notes (at least 1 per interview).
  • In that session, the designers and I discussed what we observed in the participant’s behavior, noting anything that surprised us and any top-of-mind takeaways.
  • After completing all of the interviews for a round of testing, I returned to the recordings, primed with the group’s insights, and pulled out clips and quotes to demonstrate what we had uncovered.


Involving the design team in the analysis process helped me further two of the project’s goals:

  • It demonstrated shared ownership of the data between research and design
  • It organized and streamlined data collection, which helped the project stay on schedule

Synthesis

  • For each research question, I coupled the team’s insights with more detail on the participants’ behavior.
  • I drafted a presentation with a section for each research question, answered by the corresponding insights.
  • I met with the Sr UX Designer to iterate on the presentation and incorporate her recommended design changes based on the research findings.
