Executive Summary

Purpose: The purpose of this study was to identify key usability issues in a researcher’s experience using Feedbackpanel. The majority of our research questions focused on the Notes feature (how easily and efficiently users can create, edit, search, and share notes), as well as on whether Feedbackpanel is a comprehensive and valuable tool to assist a user researcher throughout their research study journey.


Participants: Our study included six participants, all of whom are practicing user researchers and have little to no experience using Feedbackpanel.

Results: Participants were given three scenarios and ten tasks, with post-task and pre/post-session questions. Participants found Feedbackpanel most useful for its search, timestamping, and project storage/organization features. However, participants showed frustration and confusion on tasks involving the Notes feature. In particular, 6 out of 6 participants were unable to successfully create a new note and make comments without assistance.

Findings and Recommendations: Our study discovered several key issues around efficiency, information feedback, and consistency. The following are our primary issues and recommendations:

  • Notes Labeling - The nomenclature of the Notes feature did not match participants’ expectations of how its components worked. Participants were confused about the meaning of “Create a New Note”; they perceived comments as notes.

    • Recommendation: Use alternative labels for notes and comments to accurately match users’ expectations for how these features work.

  • Notes Organization - The organization of the Notes feature created difficulty for participants. They disliked that previous notes were hidden in the system and that they could not access all of their notes in the default state.

    • Recommendation: Change layout and hierarchy of notes so that users can easily organize, view, and access notes in the default state.

  • Notes Layout Consistency - The Notes field changes its location depending on whether a participant is on the uploaded video page or the live video stream page. This lack of consistency created confusion for participants.

    • Recommendation: Maintain a consistent layout for Notes in live stream view and uploaded video view.

Study Objective

This usability study was designed to comprehensively assess Feedbackpanel and determine whether the tool properly aids researchers in completing their goals. Our objectives included:

  • Assess the overall effectiveness and usability of the desktop web application

  • Assess ease of use of the product

  • Identify obstacles to completing key tasks

  • Identify opportunities to improve the overall user experience

Our research questions included:

  • How easily can users upload and label videos?

  • How easily can users create, edit, and share notes?

  • How easily can users discover previously created notes?

  • How easily can users search for previous projects?

  • How easily can users invite clients or colleagues to the study?

  • Do users perceive Feedbackpanel as a valuable tool for usability testing?

Methods

We conducted exploratory tests of Feedbackpanel’s main features, focusing on tasks that support researchers in conducting research, reviewing and gathering data, and sharing insights with project collaborators. We conducted a total of six one-hour sessions over a three-day period.

Recruitment

We recruited a total of six participants using our professional and social networks. We screened all participants using an online questionnaire (see Appendix E). Participants received chocolates, baked goods, and handwritten Thank You notes upon arrival at Blink UX.

Procedure and Testing Logistics

We conducted our study at Blink UX using its usability lab. Testing equipment included two desktop computers, one running software that captured participants’ expressions and body language, as well as their use of Feedbackpanel. While the moderator and participant were located in the lab, the rest of our team sat in an observation room to take notes and collect data.

We executed our study using the following structure:

  • Pre-test open-ended questions to discuss how participants currently manage and share usability video recordings and usability findings

  • Three scenarios and ten tasks conducted using the think-aloud protocol, probing questions, and a post-task single ease-of-use Likert scale

  • Post-session open-ended questions focused on the overall user experience, as well as a likelihood-of-use Likert scale