Creation Flow MVP Usability Testing
At the end of 2018 I began a unique two-month project where I was embedded as a UX Researcher on a team at Microsoft. I worked closely with several designers, a product manager, and two program managers, all of whom shared a common mission: to tackle important issues facing two key products in the organization. After many conversations with team members and key stakeholders, and after reviewing previous research to better understand the problem space, I developed and executed two key research studies: one generative and one evaluative.
Creation Flow MVP Usability Testing was the second study I developed and executed.
Telemetry data showed that end users were starting a creation flow within the product but that most were abandoning the process before completion. The goal of this research was to evaluate the usability of a newly designed creation flow, one believed to help end users complete the process. Overall, we wanted to ensure that end users understood the tool they were creating and that they were able to successfully complete the flow and use the newly created tool.
It should be noted that the tool created in this flow has characteristics similar to another tool that can also be created and used within this product. This simpler tool serves a few of the same functions but lacks the robust features found in the tool being tested. At an organizational level, the desire is for end users to adopt the more robust tool so they can take advantage of more of the features available to them.
Usability testing of the new MVP design was conducted alongside the existing creation flow. A total of ten remote participants, recruited through UserTesting, completed two key tasks in both designs. The order in which participants used the prototypes was counterbalanced to help reduce bias.
Primary Research Questions
Are end users able to discover the entry point of the creation flow? Are they able to successfully complete the flow, use the tool they've just created, and re-access the tool later? Lastly, do end users understand exactly what they've created, and what value do they perceive in the tool?
Stakeholders for the product hypothesized that end users were abandoning the creation flow for one of two reasons: (1) end users clicked on the entry point without knowing what it did, or (2) there was confusion at some point while they were working through the flow. One designer developed a reworded entry point and a simplified flow. This would serve as the MVP (minimum viable product) of the new flow while the designer continued to add features aimed at supporting the creation of the tool. The team (a design director, product manager, and the designer) believed these changes would help reduce confusion and flow abandonment.
I developed a study to evaluate the usability of this new MVP flow and to assess whether the messaging presented during the flow was clear and understandable. Using a prototype of the MVP design, participants completed two key tasks: (1) create the tool and use it, and (2) re-access the newly created tool from a location other than where they created it. In addition to testing the MVP, participants completed the same tasks on a prototype version of the existing creation flow. This ensured both designs were shown at the same fidelity.
In order to switch to the new MVP flow, we needed to answer, at a minimum, four basic questions. Can end users find the entry point? Can they create the tool? Do they understand what they created? And, can they re-access it to use again later?
At the conclusion of testing, one finding in particular emerged that contradicted the team's belief that the MVP flow would reduce confusion and abandonment. When completing the first task (create the tool and use it), all ten participants opted to use the simpler tool, explaining that it was the one they most commonly used to accomplish the task. Interestingly, when prompted to go through the MVP flow and the existing flow, all participants were able to successfully complete the flow and use the newly created tool. Therefore, my primary recommendation is to better educate end users on the benefits and advantages of the more robust tool over the simpler one.
Additionally, nearly all participants expressed some level of confusion as to what exactly they'd created when using the MVP flow. Several participants were unclear about what differentiated this tool from the simpler tool they were accustomed to using. The others believed this tool was different but were unable to articulate exactly what made it so.
“Whenever I’m presented with a list of five different ways to [do] the same thing I get a little bit like, ‘which one should I use?’ When there are many options to do the same thing, I guess it’s good because it’s flexible, but it’s also like, ‘which one is the best?’ I want to be told which one to use.” - Participant 8
When asked to re-access the tool they'd created from a different location, all but one participant went directly to the correct location.
In the end, the MVP flow had mixed success. Participants initially looked for the entry point for creating a different tool but, once directed to try another way to accomplish the task, were able to find the entry point and easily complete the creation flow. Despite the ease with which participants completed the flow, there was general confusion as to what exactly they had created and what made it better than the tool they were familiar with. Lastly, participants knew exactly where to find the newly created tool, relieving any concerns about re-access issues.
Aspects of the MVP design showed promising results, but better education and training for end users should be prioritized in any new designs for this product.