PROJECT OVERVIEW
Project type: Usability Research for a film festival website
Team: UX Researcher and Web Designer
Date: February/March 2021, 5 weeks, 30 hours
Skills: research planning and facilitation, participant recruitment, data analysis, synthesis and reporting
Methods: usability review, heatmaps and visitor recordings, moderated remote usability testing
Tools: Hotjar, Lookback, Calendly, Google Docs and Sheets, email templates
The Festival of Ocean Films Gallery page before and after the research.
THE CHALLENGE
When I joined the project, the festival website was newly launched and being built out in phases. I needed to consider the web designer’s time and help her to identify the most critical issues to address. I had 25 hours over 4 weeks to accomplish this and no budget for extra analytics tools.
I made a Research Schedule to outline weekly tasks and assignments (this is an excerpt from the full schedule).
Research Methodology
The Georgia Strait Alliance (GSA) originally asked me to conduct a “usability audit” and provide them with recommendations in a report.
Instead, I proposed a more collaborative three-phase research plan that involved talking to users and included check-ins between phases. I took this phased approach because we hadn’t worked together before and I wasn’t sure how agile the process would be given the constraints.
I carried out a mixed methods approach consisting of:
Phase 1: Usability review – to quickly uncover usability, accessibility and visual issues
Phase 2: Activity monitoring with Hotjar – to quickly understand visitor traffic and how visitors navigate and interact with the site
Phase 3: Remote moderated usability evaluations – to understand how users interact with the site, and why, and to explore any unmet needs
Research Goals
The study explored three key research questions:
Collaborative Approach
There’s nothing I love more than advocating for users and making research accessible to teams.
I made the research documents and activities collaborative so that the designer could contribute and see first-hand how people use the website.
The project was completed in three phases over a five-week period, with check-ins in between to discuss next steps.
Phase 1: Usability Review
I conducted a quick website usability review to find the obvious problems and check for accessibility issues.
I borrowed two guiding questions from cognitive walkthroughs to conduct the review:
My review considered:
Important Findings and Recommended Fixes
Deliverables
A list of issues, prioritised fixes and recommended solutions.
A meeting to go over the prioritised issues and discuss ways to address them.
On desktop I found opportunities to address accessibility and navigation issues, and some inconsistencies in the tone of voice (this is an excerpt from the full list).
I showed the Contrast Checker results and a screenshot of the mobile menu annotated with recommended improvements.
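For reference, contrast checkers implement the WCAG 2.x relative-luminance math. Below is a minimal TypeScript sketch of that calculation; the sample colours are hypothetical, not taken from the GSA site.

```typescript
// A minimal sketch of the WCAG 2.x contrast-ratio math that tools like
// WebAIM's Contrast Checker implement; the sample colours are hypothetical.

// Linearise one sRGB channel (0-255) per the WCAG relative-luminance formula.
function linearise(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

type RGB = [number, number, number];

// Relative luminance: each channel weighted by how bright it appears to the eye.
function luminance([r, g, b]: RGB): number {
  return 0.2126 * linearise(r) + 0.7152 * linearise(g) + 0.0722 * linearise(b);
}

// Contrast ratio: (lighter + 0.05) / (darker + 0.05), ranging from 1 to 21.
function contrastRatio(fg: RGB, bg: RGB): number {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// Example: a yellow similar to the flagged link colour (hex value assumed)
// fails the 4.5:1 minimum WCAG AA requires for normal-size text on white.
const ratio = contrastRatio([245, 197, 24], [255, 255, 255]);
console.log(ratio.toFixed(2), ratio >= 4.5 ? "passes AA" : "fails AA");
```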
Phase 2: Activity Monitoring
I set up a free Hotjar account to capture Heatmaps and to record visitor activity.
Hotjar is good for gathering data on visitors’ navigation behaviour. In our case, the data informed both how we could improve the site and our questions for the moderated research sessions.
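For context, installing Hotjar amounts to pasting a small loader script, generated in the Hotjar dashboard, into the site’s head. This is a readable TypeScript sketch of what that snippet does; the site ID is a placeholder, not GSA’s.

```typescript
// A sketch of what Hotjar's tracking snippet does; the real (minified)
// version is generated in the Hotjar dashboard.
const HOTJAR_SITE_ID = 1234567; // placeholder from your Hotjar account
const w = window as any;

// Buffer any hj() calls made before the tracking script finishes loading.
w.hj = w.hj || function (...args: unknown[]) {
  (w.hj.q = w.hj.q || []).push(args);
};
w._hjSettings = { hjid: HOTJAR_SITE_ID, hjsv: 6 };

// Load the script asynchronously so heatmap and recording capture
// doesn't block page rendering.
const script = document.createElement("script");
script.async = true;
script.src = `https://static.hotjar.com/c/hotjar-${HOTJAR_SITE_ID}.js?sv=6`;
document.head.appendChild(script);
```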
An early heatmap of the films gallery showed click activity on static film images in the film cards.
Important Findings and Recommended Fixes
I gave the designer a light 5-pager that identified three top priority issues and two “quick fixes”. I supported the research findings with screenshots and visitor recordings.
Deliverables
Phase 3: Remote Moderated Usability Evaluations
Participant Recruitment
Due to time limitations, we had to start recruiting before finalising the research direction. We felt confident doing this because we had a general idea of what to evaluate, and GSA identified their membership as a convenient cohort to recruit from: members are enthusiastic volunteers, part of the target group, and easy to contact by email.
I wrote the invitation, designed the screener survey, and set up Calendly to schedule sessions. I used an email template to send login instructions and a reminder a few hours before each session.
Criteria
Many of GSA’s members are in mid- to late-life, so I screened for age-related disabilities and for the use of assistive technology such as screen readers and magnifiers.
I also wrote questions to identify people who did or didn’t buy and stream online videos, to give us a sense of respondents’ existing habits.
Recruitment Details
Participants: 5 (+2 recruited to cover dropouts)
Target group: GSA members, local environmentalists, people who watch or stream films online
Sessions: half an hour on desktop or mobile, with three tasks and a feedback interview at the end
View the email Invitation and Recruitment Screener text
Research Guide
The designer and I exchanged ideas about research goals and tasks and I finalized the research guide for the sessions. Based on insights from earlier research and new untested web content, we defined three user behaviours to investigate:
The first task from the research guide helped us understand how users navigate to the films gallery and browse films.
Usability Evaluations
I used a free trial of Lookback, a remote user research platform, for the moderated sessions. The designer observed the sessions I moderated, and afterwards we met for a quick debrief.
Analysis and Synthesis
In total I conducted three half-hour sessions plus two test-runs. Beyond the session debriefs, I replayed and analysed the recordings, made notes, and created video highlights of key findings for the designer to review.
Lookback has an impressive transcription tool, so I also highlighted key quotes from the sessions.
During the live sessions on Lookback, the designer and I commented on some of the surprises we encountered. The test-runs let me check the tech setup and the research guide before the real sessions.
Deliverables
I created a spreadsheet of prioritised key findings for future reference, so the designer would still have documentation after the Lookback trial expired.
Important Findings
A screen recording shows one participant tapping the “Enter the Festival” button to browse films and being surprised by a login pop-up.
The film trailers displayed viewing controls inconsistently, which was beyond the designer’s control, and a couple of films had only posters. I recommended consistent labelling across all trailers and posters to reduce confusion (shown in green).
Impact
The website was new, so we had no prior metrics, such as sales or visits, to compare against. But we did notice two improvements:
Fewer errors in film browsing.
Rethinking the menu and button content improved navigation to the film browsing page.
The desktop homepage before and after: the menu item “Get Your Pass!” confused users, so it was converted to a call-to-action button to differentiate it from film browsing, and grouped with the “Enter Festival” button to guide the user flow. I also flagged the yellow visited-link colour in the menu for accessibility and legibility issues.
My collaborative approach to the research helped build the web designer's capacity to do usability research. The GSA plans to use these new skills in future projects. Here's an excerpt from her recommendation on LinkedIn:
“You helped me develop my understanding of accessibility and anticipate what can cause potential issues, not to mention exposing me to valuable tools to evaluate usability.”
Kirsten McPherson, Website Designer
Appendix
Joys
I was pleasantly surprised by the designer’s buy-in for user research. It made the work fun and collaborative, and led to evidence-based decisions grounded in user feedback.
Seeing the designer get excited about the research and watching the site evolve based on the results was energizing.
Using my recent learning from UX Research (Qual and Quant) and Sociology courses and putting theory into practice was very rewarding.
Challenges
Only two participants used Calendly as prompted, so I had to manually book (and re-book) sessions over email. Recruitment logistics, unsurprisingly, were the most challenging aspect of the study. Next time I’ll give clearer instructions for the Calendly scheduling, or look for an alternative tool.
Learning the Lookback platform took time, and coordinating and rescheduling participants was not as automated as I expected. I also wasn’t sure how all the pieces would come together, so my test-runs included the entire scheduling process to check how the communication and onboarding linked up. Once I’d completed the test-runs, and troubleshot some login glitches along the way, I felt much more confident.
Lessons
We had to shift our schedule by a week due to a lack of participant responses. Being flexible also bought us some time to make crucial changes to the site.
The screening and recruiting process was overdone for such a small sample. Our efforts via the membership email and survey weren’t successful: partly due to an admin error, only two members responded, and only one signed up after being contacted for scheduling. The rest were recruited from friends and family, and we had one no-show from that cohort.
For a small-scale study like this, next time I would use more targeted recruitment, such as word of mouth and approaching potential participants directly.
I feel we only scratched the surface of Hotjar’s capabilities. Next year, GSA could create more impact by monitoring metrics such as click-through and conversion rates to identify problems; user research could then be used both to improve those results and for generative work.
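As a minimal sketch of that arithmetic: click-through rate is clicks divided by the visits that could have produced them, and conversion rate is purchases divided by visits. The TypeScript below uses made-up counts, not GSA data.

```typescript
// Hedged sketch of the funnel arithmetic behind those metrics; the
// interface and the counts are made-up placeholders, not GSA data.
interface FunnelCounts {
  galleryVisits: number; // visitors who reached the films gallery
  trailerClicks: number; // visitors who clicked through to a trailer
  passPurchases: number; // visitors who went on to buy a festival pass
}

// Click-through rate: clicks divided by the visits that could have clicked.
const clickThroughRate = (c: FunnelCounts) => c.trailerClicks / c.galleryVisits;

// Conversion rate: purchases divided by total gallery visits.
const conversionRate = (c: FunnelCounts) => c.passPurchases / c.galleryVisits;

const week: FunnelCounts = { galleryVisits: 800, trailerClicks: 240, passPurchases: 36 };
console.log(`Click-through: ${(clickThroughRate(week) * 100).toFixed(1)}%`); // 30.0%
console.log(`Conversion: ${(conversionRate(week) * 100).toFixed(1)}%`);      // 4.5%
```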
The sample size for the sessions was small, meaning in some cases we had only one example of an error. However, when triangulated with the Hotjar monitoring and the usability review, these examples helped to confirm or reject issues found there.
Say hello!
In need of a UX Researcher? Comments or questions? I'd be happy to hear from you. If you would like to hear more about my experience and discuss how we might work together, please get in touch.