Reelchat
Needs Assessment & Usability Evaluation for a Media Focus Group Video-Conferencing Platform
Overview
Reelchat is a virtual interactive focus room platform developed in response to the COVID-19 pandemic by Pilot.ly, a market research company focused exclusively on creative content.
​
It serves as a means for the entertainment industry to stream creative content such as trailers, ads, and movies, and to collect detailed data from target audiences through polls, surveys, live reactions, and moderator-led discussions prior to the content's release.
​
This platform converts an in-person experience into the digital space. Much like in an in-person focus room, client representatives can observe conversations between the moderator and participants from a backroom. In addition, an admin representative is on the call to monitor the internet connection, microphones, video, and other technical details for the moderator, client observers, and participants in the session. At the end of each session, all audio recordings and responses are automatically saved, and feedback is delivered to the client representatives.
Role: UX Researcher
Duration: 16 weeks
Skills: Interaction Mapping / Qualitative Interviews / Survey Design / Heuristic Evaluations / Usability Tests
Tools: Figma / Miro / Qualtrics
The Brief
The scope of the project is to assess the user experience of the video platform Reelchat. The platform has four active roles: participants, observers, moderators, and admins. Our team initially decided to prioritize the participant and admin roles. However, due to privacy limitations and the platform's beta stage, we narrowed our scope to the participant experience, as that role is the primary source of the data collected for Pilot.ly's clients.
Participants
Observers
Moderators
Admins
Exploratory Research
Interaction Map
One of our first steps was to explore and gain a general understanding of the platform. By doing so, we were able to observe all possible actions, features, and error and non-error states that might occur on each screen. We developed an interaction map in Miro, which enabled us to effectively evaluate the platform and predict where usability issues might arise. Our team aggregated the screens associated with each role to gain insight into the platform's structure, user journeys, and the points where roles intersect on the platform.
Key Findings
Participants have the most interactive features from the moment they enter the virtual waiting room until the conclusion of the video study sessions. Features include using the doorbell in the waiting room, expressing likes and dislikes, responding to questions both verbally and via chat, utilizing the "tuning out" button to indicate disinterest in content during the video sessions, as well as participating in polls.​
Admins play a vital role in the video sessions, as they are responsible for verifying proper functionality of audio and video components prior to the start of a session as well as monitoring audio and wifi connections during the sessions. Additionally, they must be the first to join a session in order for the different roles to be admitted into the study landing screens from the waiting rooms.
Moderators initiate discussions and polls, manage the timer, and launch video viewing for the participants. They can communicate with both observers and admins.
​
Observers can view chats, poll responses, likes and dislikes, and when the "tuning out" button was pressed during video screenings; they can also engage with other observers within the observer room and communicate with moderators and admins.
Qualitative Research
Interviews
Our team conducted a total of 4 semi-structured interviews: 1 with an admin and 3 with participants. Although we were restricted from interviewing former participants of Reelchat studies due to third-party recruitment policies, we successfully recruited individuals with prior experience in virtual focus group sessions. We asked them about their experiences with the platforms their studies were conducted on and about any issues they or others encountered during the sessions. Similarly, we asked the admin about their role responsibilities, their session experiences, and the impact of troubleshooting.
4 Participants
Age Range: 20-60
Participated in an online focus group in the last 6 months
Experience with online systems
Participants Interview Quotes
"Interface was similar to the Zoom, so I could find the buttons easily. But different software uses different terms to name their functions. I was confused by their names to some extent. "
-Interviewee 2
​
"A live transcription service will provide better service if the network is not so stable"
-Interviewee 2
"Some people didn’t understand how to use it, how to mute and unmute themselves, they should've been muted and they weren’t. Didn’t understand how Zoom worked"
-Interviewee 3
Pain Points
Users are not always familiar with how to locate or use the control panels for muting and video display
​
Users are unaware they are experiencing issues or delays due to internet connection problems
​
A lack of accessibility features, such as the absence of live transcription, can be a barrier
Needs & Opportunities
Better accessibility and clarity of audio and video controls.
​
Alerts of unstable network connections to both participants and moderators.
1 Admin
Participated in a Reelchat study as moderator
Background in tech development
Admin Interview Quotes
"One thing that's missing for me is insight into what's happening in the room...for example, if the moderator is sharing their screen you don't see what they're doing." -Admin Interviewee
​
"The context of the people like the network...maybe a notification about that type of thing, where like the signals dropped off for one of the participants and the video suddenly gets very jittery or there's a delay suddenly in the audio." -Admin Interviewee
Pain Points
Admins have difficulties troubleshooting errors due to an absence of context during the video study.
​
There is a lack of notifications to draw the administrator's attention regarding technical difficulties.
​
Admins can only engage in one-on-one chats with each individual; there is no group chat.
Needs & Opportunities
A designated section in the admin view to monitor the sessions in real-time to understand what they are troubleshooting.
​
A pop-up feature that flags technical difficulties.
​
The ability for the admin to create a group chat with all the roles.
Quantitative Research
Surveys
At this stage of the project, after discussions with our Pilot.ly stakeholder, our team decided to narrow the scope to improving the participant role only and to discontinue research on the admin role due to technical limitations. To delve deeper into the participant role, we conducted a survey to gain insights from a broader user base. Because Reelchat is not available to or in use by the general public, we framed the product in terms of the two categories it spans: video conferencing platforms and focus group platforms.
​
Through the survey, we aimed to address the overarching questions we developed from our interviews to understand common issues and priorities that should be addressed for the various participating roles as our client develops their platform.
Overarching Questions
-
What is important to participants when it comes to video platforms? (Buttons, Chat, etc.)
-
What are the criteria for improving participation online? (Unmuting vs. Raise Hand, etc.)
-
What are expectations and norms for in-person studies vs. online meetings?
Survey Results
I used Qualtrics to create our 22-question survey, targeting individuals who had participated in online and/or in-person studies or used video conferencing platforms. Our team deployed it on various platforms such as Reddit, Discord, Facebook groups, and Slack channels. We received 94 responses within 72 hours, with the average respondent aged between 25 and 35. Of the 94 respondents, 41 had participated in in-person studies.
22 Questions
94 Responses
25-35 Average Age Range
41 Respondents Participated in In-Person Studies
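As a hypothetical sketch of how these headline figures could be tallied from a survey export, the snippet below reads a CSV and computes the counts reported above; the file name and column names are assumptions, not the actual Qualtrics export schema.

```python
# Hypothetical sketch: summarize survey responses from a CSV export.
# File name and column names ("age", "participated_in_person_study") are assumed.
import csv

with open("reelchat_survey_export.csv", newline="") as f:
    responses = list(csv.DictReader(f))

total = len(responses)
in_person = sum(1 for r in responses if r["participated_in_person_study"] == "Yes")
ages = [int(r["age"]) for r in responses if r["age"].isdigit()]
mean_age = sum(ages) / len(ages) if ages else float("nan")

print(f"Responses: {total}")                       # e.g. 94
print(f"In-person study experience: {in_person}")  # e.g. 41
print(f"Mean respondent age: {mean_age:.1f}")      # falls in the 25-35 range
```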
​
Pain Points
Users expected opposite social norms when participating in traditional in-person studies versus online conferencing platforms.
Audio and video quality are the top frustrations of users across all video conferencing platforms.
​
Users ranked the visibility of buttons and chat features as important elements in video conferencing platforms.
Ranking of Preferred Method of Speaking in Online Meetings
Areas of Opportunities
Ensuring both a raise-hand feature and the ability to unmute and speak caters to overall user preferences.
​
Providing alerts of unstable networks to both parties would reduce disruption, confusion, and frustrations of hosts and participants.
​
Providing visual contrast for critical features such as enabling and disabling the camera and microphone.
Ranking of Preferred Method of Speaking In Person Studies
Qualitative Research
Heuristic Evaluation
At this stage of our research, we performed a heuristic evaluation to pinpoint potential usability issues users may encounter. We took the four major screens, Sign-up Page, Waiting Room, Focus Room, and Screening Room, along with the features in the user's control panel, and evaluated them against Nielsen's 10 Usability Heuristics for User Interface Design, combined with Nielsen's video usability heuristics to better fit our criteria.
We assessed both strengths and weaknesses by evaluating the system against each heuristic and assigning each identified issue a severity rating from 1 to 5, where 1 indicated no issue and 5 indicated a critical problem requiring immediate attention. We approached this task with the persona of a new user with average technological ability to best emulate the target audience's interaction with the Reelchat interface.
Heuristics vs Average Severity & Violations
Heuristics Findings
In total, there were 80 violations, with User Control and Freedom ranking highest, followed by Visibility of System Status. We highlighted for the client the key violations that should be prioritized to enhance and streamline the sessions for participants, which would ultimately improve both the quality of the study and the usability of the platform.
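A minimal sketch of the aggregation behind the severity-and-violations chart: each finding is logged as a (heuristic, severity) pair, then counted and averaged per heuristic. The sample records are illustrative only, not the actual 80 logged violations.

```python
# Sketch: rank heuristics by violation count and report average severity (1-5).
# Sample data is illustrative, not the real evaluation log.
from collections import defaultdict

violations = [
    ("User Control and Freedom", 4),
    ("User Control and Freedom", 3),
    ("Visibility of System Status", 4),
    ("Error Prevention", 2),
]

by_heuristic = defaultdict(list)
for heuristic, severity in violations:
    by_heuristic[heuristic].append(severity)

for heuristic, sevs in sorted(by_heuristic.items(), key=lambda kv: len(kv[1]), reverse=True):
    print(f"{heuristic}: {len(sevs)} violations, avg severity {sum(sevs)/len(sevs):.1f}")
```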
Focus Room
-
The system overrides settings the user manually set; for example, muting yourself on one screen is undone when switching back from another screen, such as returning from the screening room.
-
No feature is available for the user to resolve issues or request help.
-
Joining and leaving a meeting is unclear, as the only way to leave is to close the browser tab. If users are kicked out of a session, it is uncertain whether the admin made that decision because the study was full.
-
Accessibility issues hinder participants with disabilities.
-
There are no alerts of unstable network connections to both participants and moderators.
Recommended
-
Keep the user’s preferences for their settings, or warn them that the setting will be undone.
-
Add a help button or display a pop-up FAQ or quick tips for users to resolve their problems.
-
Notify users that they are leaving the meeting when they close the browser tab or are asked to leave the session.
-
Add a live transcript feature with a visible button to turn it on.
Focus Room: chat features are unavailable, Session Tech is just an image, with no button to leave the session.
Screening Room
-
Participants are unaware whether their camera is on during a screening, and are only inconsistently given a warning that the screening is beginning.
-
Participants can return to the intro polls after a screening has started but cannot rejoin to finish the video. The "tuning out" button does not function properly, as users cannot select "Yes" to tune out or leave the screening room. Participants also lack control to record their true reactions of likes and dislikes if they miss hitting the button; they can pause the video but cannot rewind or fast-forward.
-
When the screening is completed, participants can only wait for the moderator to act; all options are gone, with no way to chat, ask a question, or leave.
Recommended
-
Give a clearer indication when participants' cameras and microphones have been turned on by default.
-
Allow participants to have more control over their experience during the video screening.
-
The participants should be taken back to the Focus Room when the screening is completed.
From the observer's and moderator's views, participant cameras are turned on for the entire screening without the participants' knowledge.
Screening room for participants: they can pause but not rewind; the like and dislike buttons are on opposite ends, leaving participants unable to input their true reactions.
Participants get trapped on the tuning out page, unable to select "Yes"
Auxiliary Features
-
The doorbell feature in the waiting room is not intuitive; it is unclear whether it is for asking questions or for indicating that participants are ready, when its actual purpose is neither: it calls the admin.
-
Admin can force video and audio to be turned on for any participants.
-
Chat is confusing and not accessible to all users. It only allows the participant to chat with the moderator. If a participant did not use the chat box with the admin in the waiting room, they will not have a chat box for the entire meeting. Participants cannot ask for help unless they speak up, nor can they answer questions in the chat box for the group to see.
Recommended
-
Have a clear explanation of the doorbell feature upon arriving at the waiting room.
-
Give participants the option to turn their camera and microphone off, and show a warning when the admin requests to turn them on.
-
Keep the chat function available during all aspects of the program for all parties.
Waiting room with doorbell; the admin suddenly appears on screen when it is rung.
Design Research
Usability Evaluation
Our main objective in the next step was to validate our key findings about Reelchat. As Reelchat had never undergone a usability test since it was created, we asked our client to help recruit individuals, through their third-party contact, who aligned with the target demographic of the media content featured in our usability test. The aim was to ensure the client would receive accurate data about their platform from realistic future users. I created 7 tasks for our participants to perform, focusing on common problem areas that arose throughout our research, to validate the areas of focus that would be most advantageous for the client to address promptly.
Overarching Questions for Usability Test
-
Do the users understand the purpose of the Doorbell button without an explanation?
-
Is it essential to have an "entering" and "leaving" button?
-
How do the users troubleshoot?
-
How disruptive do users find not having autonomy over their preferences and having their settings overridden by the admin?
5 Usability Tests
We carried out a total of 5 usability tests over Zoom, with each team member taking a turn as moderator at least once. We provided a link to each participant and asked them to share their screen so we could watch as they performed our tasks. The average participant was middle-aged, with experience in focus groups and some familiarity with video conferencing platforms.
Data Shortcuts
Data Logging Sheet
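Below is a hypothetical sketch of how one row of such a data logging sheet could be represented in code; the field names and example task are assumptions rather than the team's actual template.

```python
# Hypothetical sketch of a usability-test observation record; field names are assumed.
from dataclasses import dataclass, field

@dataclass
class TaskObservation:
    participant_id: int
    task: str                      # e.g. "Ring the doorbell in the waiting room"
    completed: bool
    time_on_task_sec: float
    errors: int = 0
    notes: list[str] = field(default_factory=list)

log = [
    TaskObservation(1, "Ring the doorbell in the waiting room", True, 42.0,
                    notes=["Hesitated; unsure whether the bell called the admin"]),
]
```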
Usability Test Results
The absence of labels in the interface leads to confusion about the purpose of UI Elements
Enhancing labels and employing distinct designs will enable users to understand UI elements and convey the relative importance of an element on a page. For example, using a stand-out color for the doorbell button, rephrasing it as "Ready for Session Tech", and adding subtext when a user hovers over it to indicate its purpose, such as “Click button to notify session tech to grant you access into the study”, would improve discoverability.
The absence of user controls results in frustration for participants when exploring and familiarizing themselves with the platform
Enabling users to join the study without their camera or microphone activated, and requesting that participants manually deactivate their video and microphone before the admin overrides their controls. Messages such as “Session Tech has muted/unmuted you” would keep participants aware of the state of their controls.
Ensuring the chat window is available at all times from the beginning of the session.
​
The interface's lack of notifications and status messages results in poor visibility of the system status.
Incorporating text in the invitation email or on the study registration link's home page to allow users to register prior to the session date.
​
Specifying form fields that must be filled out in the desired format before a user can proceed at sign-up. For example, asking for separate "First Name" and "Last Name" text fields rather than a single "Enter Name" field.
​
Including text in the waiting room telling participants what to anticipate next in the process.
Adding clarity to messages and notifications to explain why a participant was abruptly removed from the meeting and whether they will still be compensated for their time.
Unconventional UI elements cause participants to invest additional time troubleshooting and resolving issues
Using conventional standards, such as the ability to direct messages to "Hosts", or in this case "Session Tech", would match participants' mental models of video conferencing platforms for asking questions or requesting help. Currently, the "Session Tech" icon exists, but it is dormant and provides no functionality.
​
Providing an "Exit Session" button, or confirming with participants that they are leaving, would reduce confusion about how to leave and prevent new users from accidentally dropping out of the session.
​
Participant Feedback
"Doorbell was cool. Being able to alert the moderator of the arrival of being present is something I've never seen before and would help solve problems for online studies to confirm their presence" -Participant 1
​
"Instructions were confusing on the screen. I didn't know where to get help. More visibility would help...It seems more of a professional platform. The polls make it professional. It didn’t seem friendly to use casually. More visibility would make it more friendly to use." -Participant 3
​
" I can see a private movie screening on a platform like this." -Participant 4
​
"Video quality was good but the platform was plain and unintuitive...Ring doorbell was the most memorable feature and hearing the little bell ring... I really like how other platforms are more intuitive. I know how to enter and exit. There are icons there that are easy to see. I know to click on them if I want to. There seems to be much more control through these other platforms." - Participant 5
Takeaway
The client initially came to us wanting to assess the whole Reelchat system; however, we quickly learned to prioritize the scope due to our limited time. Doing so allowed us to ensure priority was given to the users most important to the product's success. Although we were limited in our ability to research current users of the target demographic, we were able to test with a comparable potential demographic. We learned the key pain points users were likely to encounter and validated our findings for Pilot.ly to use as they continue to develop their video conferencing platform. Prioritizing the participants' user controls, troubleshooting, system status, and UI elements would resolve the first layer of pain points for users. The usability test was beneficial in gaining real-time insights into users' mental models, which would further improve this platform. Given additional time, I would have liked to further test the unique features within Reelchat's video study section, such as the likes and dislikes and "tuning out" features.
The Team: Safa Viqar, Sushmita Rao, Ariella Pearlman, Lingyu Xu