heard-casestudy-coverimage_.png
 

Finding a therapist can be hard, and many factors come into play when deciding whether they're the right fit. Heard, a mental health start-up, is trying to simplify the process by doing the grunt work for you. The idea is simple: people fill out a short survey with questions related to their mental health, and Heard uses those answers to connect them with therapists who match their preferences. The whole model depends on people completing an onboarding survey. Heard brought me on to improve their onboarding so users would be more likely to complete the process and find the right match.

Project overview

I led the redesign of the onboarding experience while collaborating with the founders and developer to make sure the business goals aligned with the customer goals. Our primary goal was to launch a new survey so users could easily get matched with a therapist. Beyond that, the business had higher-level goals: increase user retention, consolidate onboarding into a streamlined experience, help users choose the correct path, and make it clear what we were asking of them.

 
 

MY ROLE
Lead Product Designer
User Testing
Interaction Design

TEAM
Founders / Developer / Designer

Time Frame
Jan '20 - Feb '20

 
 
 

Kickoff

We began by defining deliverables and discussing the main goals of the project so we could align on expectations and come up with a shared vision.

 
 
heard process.png
 
 

Evaluate

I conducted a heuristic evaluation of Heard’s previous onboarding survey, followed by a competitive analysis of similar survey types on other sites. I also asked participants to complete the survey to see where the pain points were.

The previous survey was 30 screens in total; a few of them are shown below.

 
 
heard previous design.png
 
 
 

 

Research Insights

The results from my analysis showed several issues with the overall experience. We collectively went through the notes to prioritize pain points and form actionable insights.

Establish Trust

We looked at the metrics and saw that some users were dropping off once they were asked to provide their email. At that point in the survey, we were asking for personal info without having earned their trust: users had no idea how long the survey would take, and the pages up to that point provided little information. The redesign needed to put important information up front.

Experience needs to be inclusive

The survey asked personal questions, and we didn't want users to feel unrepresented by the multiple-choice answers we provided. At the same time, we wanted to offer options rather than making users recall everything on their own. Providing an exhaustive list of answers for every multiple-choice question didn't make sense, so we brainstormed different ways to solve this challenge.

Flexibility

There were forms throughout the survey that users were forced to fill out even when they didn't apply to them. Giving users more flexibility and control was crucial to helping them complete the survey.

Streamline the process

The survey as it stood was too long: related questions and forms sat on separate screens. We needed to streamline the process by combining related information and text forms wherever possible.

 
 
 
 

Prototype

Before jumping into design, we discussed the improvements as a team, prioritizing what was critical in order to gain alignment and drive decision-making. We chose our critical items based on the research insights, and I moved forward with the prototype.

 
 
 

 

Visual Design

The Heard team had changed their branding right before I was brought on. The new design moved away from the bright blue toward warm colors that evoked a calmer feeling. The bright blue might resonate with a younger generation, but they were realizing that a large portion of their user base was older. I used the new branding as a visual guideline for the redesign.

visual design.png
 
 
 
 

Framework & Consistency

I came up with a few layout options and chose version three, where the progress bar sits at the bottom of the page. This layout would be used on every page of the survey: I wanted users to focus on the question first and then quickly see where they could choose answers.

layout.png
 
 
 

User Journey Progress

We decided early on that establishing the length of the onboarding process would be a starting point for gaining the user's trust. The original idea was a progress bar showing how much of the survey users had already completed. We ended up moving away from that and opted for a tab design that highlighted the section they were currently on.

 
 
progress bar.png
 
 
 

Provide Helpful Information

At the time, Heard could not match users who wanted to pay for therapy with insurance, and it was only available in California, Oregon, and Washington. The old survey didn't surface these factors until screens six and eight. Because these constraints affect the overall experience, moving them to the beginning of the survey was a priority. The payment information moved to the first screen, along with an option to join a waitlist so users would be notified once insurance payments were available.

We decided to dedicate the third page of the survey to location information. Users would enter their zip code, and if it was outside Heard's service area, a pop-up would appear with the option to join a waitlist. We did this for a few reasons: Heard wanted to capture interested users, and knowing their locations would help decide which states to add next. We also didn't want to overwhelm users with too many what-if scenarios on the first screen, and placing the location step after the email form aligned with business goals.
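The location step boils down to a small branching rule. Here is a minimal sketch of that logic; the function names and the three-digit zip prefix ranges are my own illustrative assumptions, not Heard's actual implementation (a production app would use a full zip database or geocoding API).

```typescript
// Hypothetical sketch of the location step. Heard served CA, OR, and WA,
// so a zip code outside those states should trigger the waitlist pop-up.
const SERVED_STATES = new Set(["CA", "OR", "WA"]);

// Rough prefix-to-state lookup for illustration only.
function stateForZip(zip: string): string | null {
  if (zip.length !== 5) return null;
  const prefix = parseInt(zip.slice(0, 3), 10);
  if (Number.isNaN(prefix)) return null;
  if (prefix >= 900 && prefix <= 961) return "CA";
  if (prefix >= 970 && prefix <= 979) return "OR";
  if (prefix >= 980 && prefix <= 994) return "WA";
  return "OTHER";
}

// Decide which UI state to show after the user enters a zip code.
function locationStep(zip: string): "continue" | "waitlist" | "invalid" {
  const state = stateForZip(zip);
  if (state === null) return "invalid";
  return SERVED_STATES.has(state) ? "continue" : "waitlist";
}
```

The "waitlist" branch is what lets Heard capture interested users outside its launch states, as described above.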

 
 
screen 1-2 of onboarding.png
 
 
 

Flexible & Inclusive

Everyone has different experiences related to mental health, and we needed to take that into account when asking questions and providing multiple-choice answers. Rather than forcing users to fill out forms that weren't applicable, we gave them the option to choose "yes" or "no" when asked whether they wanted to include specifics.

On every page of the survey, users had to choose an answer before moving on. We did this to keep the process as streamlined as possible; since we provided other ways to skip text forms, it made the most sense.

 
 
preferences screen.png
 
 
 

It would be difficult to provide every possible answer for certain questions, but we wanted the answers to fit everyone's needs. We gave users the option to choose "other" on some questions so they weren't forced to pick something that didn't apply to them.

 
 
whats on your mind screen.png
 
 
 

A/B test

We discussed the prototype as a team to see if there were any additional changes to make before user testing. We thought users might want to know the section number they were on, not just the section title, but we couldn't decide whether that would help or make the process more confusing. So I made a second version of the prototype with section numbers next to the header and progress bar; everything else was identical.

For the test, participants would go through both prototypes so I could hear their thoughts about the entire process and ask whether they had any reaction to the section numbers. The main purpose of the usability test was qualitative: asking people what they were thinking and why they were doing certain things. But adding this simple A/B test was a great way to evaluate one specific element at the end.

 
 
section number prototype.png
 
 
 
 

Validating assumptions with users

At this point I had a final prototype and we could test whether our assumptions were correct. After conducting 30-minute usability tests with six users, I observed:

01.
All participants mentioned that the layout was intuitive and easy to follow. The process was clear and they were able to focus their attention on the questions.

02.
Most participants noticed the payment information on the start page, saying it was great to include and that they liked the waitlist option. We were giving them important information at the beginning that might impact whether they could use Heard, which built trust with participants.

03.
Participants mentioned that they enjoyed the conversational aspect of the header copy placed above each question. This made the survey more fun and not just the standard form you would fill out at a doctor’s office.

04.
We tried to give users control and make the available answers inclusive, but we realized that participants were thrown off by choosing "other" and then having no opportunity to fill in what that meant for them. We were trying to represent real experiences through the answers we provided, but we couldn't include every possible option.

05.
Participants had the option to add specifics for certain questions: a text form would pop up if they chose "yes" when asked whether they wanted to include more information. A few participants mentioned they weren't sure what specifics to include, so they hit "no" and proceeded to the next question.

06.
When asked about the section numbers in the second version of the prototype, none of the participants had noticed them. After we pointed them out, they said they understood the progress tabs without the numbers. With those results, we decided not to include them in the final design.

 
 
 

Design Iteration

The feedback reminded me of our original goals: simplify, and allow enough flexibility that everyone can choose answers that apply to them. We discussed the usability feedback as a team and decided to change the way users could provide additional information.

We needed to give users the option to expand on what "other" meant for them. I kept the button on the page but changed it into a text form, so users could click on it and fill in their response rather than choosing the option and wondering why they couldn't expand on their answer. We also changed the copy under the question to let them know they could provide more specifics on upcoming screens as well.
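The revised "other" option can be modeled as a small piece of answer state: a preset choice is complete as soon as it's picked, while "other" is only complete once its text field has been filled in. This is a hedged sketch with illustrative names, not Heard's actual data model.

```typescript
// Hypothetical model of a survey answer after the redesign: "other" is
// itself a text form rather than a dead-end button.
type Answer =
  | { kind: "choice"; value: string }   // a preset multiple-choice option
  | { kind: "other"; text: string };    // "other" with a free-text field

// An answer is complete (and the user may advance) when a preset choice
// is picked, or when the "other" field contains non-whitespace text.
function isComplete(answer: Answer | null): boolean {
  if (answer === null) return false;
  if (answer.kind === "choice") return true;
  return answer.text.trim().length > 0;
}
```

Gating the "next" button on `isComplete` matches the rule above that users had to choose an answer on every page before moving on.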

 
 
Heard prototype changes copy.png
 
 
 

We provided text forms for the questions that offered the option to add more specific information, but participants mentioned they were unsure what they would expand on. I revisited those screens and added text under each question showing an example of the kind of detail they could add, helping users recall specifics they might want to include.

 
 
Heard prototype changes text form.png
 
 

The final design consisted of 19 screens dedicated to the survey, plus a starting screen, which was the landing page of the Heard site. Before I was brought on, the survey was 30 screens total. We condensed the information and made the experience less overwhelming.

 

 
prototype screens _heard.png
 
 
 
 

Interactions and hand-off

The final file for hand-off included all survey screens, the different button states, and the visual style guide. I also created interactions for the button components and progress tabs to show how things would animate when hovering over or clicking an item.

 
 
HEARD_button interaction.gif
HEARD_save button hover.gif
 
 
HEARD_progress bar interaction_.gif
 
 
 

Conclusion

This was a really exciting project to work on, as it provided real value to the company and involved both user testing and interaction design. I took away important lessons about product and business processes. It was a fun opportunity to be so closely involved with the core team, and I hope to continue designing for the mental healthcare sector as it becomes more accessible to everyone.

The new survey is not on the site just yet, but you can view the final prototype here →

 

 👋angelicaguildner@gmail.com