Updated on:
16 May 2023
Show Information Pages Research
Testing new designs with appropriate user base before development.
Live link:
Project Status
Completed
Read time
5 mins
Company
Viacom18
Role
UX Researcher
Overview
Voot is an Indian subscription video-on-demand and over-the-top streaming service, owned by Viacom18. Millions of users access content on the app every day for a variety of content - exclusive shows, movies, TV shows, and sports.
To comply with my non-disclosure agreement, I have omitted and obfuscated confidential information in this case study. All information in this case study is my own and does not necessarily reflect the views of Viacom18.
Problem Statement
The sports section of the app had already been designed by the time I joined the team, and I was tasked with understanding how users would respond to the new designs before we rolled them out. This was to sniff out any issues in the design before it went into development, so we could be more efficient with our iterations.
📌 Show info page = the page the user is taken to once they click on a show’s thumbnail.
Users & Audience
Voot has millions of users watching from all over India, but for this particular project, we were looking at people who were already consuming sports content on the app. My goal was to first optimise for existing users, since the designs introduced a new look and many new features, and we needed to be sure they wouldn’t cost us our users’ trust.
Roles & Responsibilities
The UX team at the time had 12 people (including the team lead and myself) and I was the dedicated UX Researcher in the team. For this project, I worked with 2 designers and our team lead, the data science team, the recruitment agency, and the market research team.
I was responsible for conducting the research and then collaborating with the designers to fix any major issues that surfaced.
Scope & Constraints
The benchmarking and competitive analysis had already been done by the designers who created the screens and flows, so it was my responsibility to:
Identify current and potential users
Prepare the questions and tasks for the user test (this would serve as my guide and observation sheet)
Create prototypes and flows according to tasks
Conduct the user tests
Analyse the user tests and present the report
Process
Initially, we decided to test just the football section.
I identified the user base on the platform with help from the data science team. We found that a third of the viewers were women, and most of the viewership came from 18-35-year-olds in metro cities.
Keeping that in mind, I created a small cohort of 6 users. The agency that helped us recruit participants mentioned that it would be difficult to find 2 women who fit the required profile, but thankfully we were able to get 1 female participant by the end. Need me some representation.
We already knew we had to test the new designs, but this was the part where we narrowed down the hypotheses to be tested. I collected the doubts that the designers and our team lead had, and added questions of my own. I categorised these questions according to user goals and created the guide.
In the guide, I kept the questions (the metrics), a description of each metric and what to record, and lastly the participant details.
It was a huge help to have the agency coordinate with participants so I only had to worry about the session, and not the before or after coordination with the participants.
The 6 user tests were conducted virtually over 3 days. I had been looking into the perks of unmoderated user tests, so I tried out a new format for the sessions - tasks first, questions second. (Was it good? Was it bad? Stay tuned till the final “learnings” section to find out.)
After the sessions, I rewatched the recordings and transcribed the important data we needed. I then created a report with data-supported insights and suggestions for how we might solve these problems. Some insights in turn raised further questions, so I flagged them to take up as follow-up research later.
I also coordinated with the data science team to make sure that the behavioural data I collected through the qualitative study was broadly consistent with the behavioural data collected through the platform. That lends the user test data more credibility and tells us that the test was conducted properly.
Outcomes & Lessons Learned
The insights from this sprint, except the football-specific ones, went on to inform design decisions for the whole sports section. The screens were mostly well received, and the issues I flagged were quickly attended to and fixed.
While I was presenting the report, the team pointed out that in the presence of a moderator, users tend to be more on guard, so the original format - questions first, tasks second - would work much better. These users were passionate about football, so they got comfortable with the tasks pretty quickly, but that wouldn’t happen every time. I understood the logic, and that was my major takeaway from this study. I’ve stuck to the original format since, and really, the flow of the session is better.