Festival of Ocean Films

My Role: UX Researcher

Georgia Strait Alliance’s 2021 film festival website is a platform for browsing films and buying tickets. GSA hired me to make the site more user-friendly in order to promote community engagement and increase ticket sales.

PROJECT OVERVIEW

Project type: Usability Research for a film festival website

Team: UX Researcher and Web Designer

Date: February/March 2021, 5 weeks, 30 hours

Skills: research planning and facilitation, participant recruitment, data analysis, synthesis and reporting.

Methods: usability review, heatmaps and visitor recordings, moderated remote usability testing

Tools: Hotjar, Lookback, Calendly, Google Docs and Sheets, email templates

FoOF_Films-Page_Before-After

The Festival of Ocean Films Gallery page before and after the research.

THE CHALLENGE

When I joined the project, the festival website was newly launched and being built out in phases. I needed to consider the web designer’s time and help her to identify the most critical issues to address. I had 25 hours over 4 weeks to accomplish this and no budget for extra analytics tools.

An excerpt from the usability audit plan document shows the first phase of tasks.

I made a Research Schedule to outline weekly tasks and assignments (this is an excerpt from the full schedule).

Research Methodology

The Georgia Strait Alliance (GSA) originally asked me to conduct a “usability audit” and provide them with recommendations in a report.

Instead, I proposed a more collaborative three-phase research plan that involved talking to users and included check-ins between phases. I took this phased approach because we hadn’t worked together before and I wasn’t sure how agile the process would be given the constraints.

I carried out a mixed methods approach consisting of:

Phase 1: Usability review – to quickly uncover usability, accessibility and visual issues

Phase 2: Activity monitoring on Hotjar – to understand visitor traffic and how users navigate and interact with the site

Phase 3: Remote moderated usability evaluations – to understand how users interact with the site, and why, and to explore any unmet needs

Research Goals

The study explored three key research questions:

  • What are visitors’ current goals and how do they achieve them on the site?
  • What pain points, frustrations, and barriers to buying a Pass do visitors encounter on the site?
  • How do visitors browse and watch films?

Collaborative Approach

There’s nothing I love more than advocating for users and making research accessible to teams.

I made the research documents and activities collaborative so that the designer could contribute and see first-hand how people use the website.

FoOF-research-phases

The project was completed in three phases over a five week period, with check-ins between to discuss next steps.

Phase 1: Usability Review

I conducted a quick website usability review to find the obvious problems and check for accessibility issues. 

I borrowed two guiding questions from cognitive walkthroughs to conduct the review:

  1. “Will the user know what to do at this step?”
  2. “If the user does the right thing, will they know that they did the right thing, and are making progress toward their goal?”


My review considered:

  • navigation
  • accessibility
  • content consistency
  • mobile responsiveness

Important Findings and Recommended Fixes

  • Several basic accessibility problems, including small text size, low contrast and missing alt tags.
  • Wix’s automatic conversion to the responsive mobile layout introduced major problems, including large gaps in content and undersized button targets.
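As an illustrative sketch (not part of the original deliverables), the contrast checks behind findings like these follow the WCAG 2.1 definitions of relative luminance and contrast ratio. A minimal Python version:

```python
def _linear(channel: int) -> float:
    """Convert an 8-bit sRGB channel to linear light (WCAG 2.1 formula)."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_colour: str) -> float:
    """Relative luminance of a colour given as '#rrggbb'."""
    h = hex_colour.lstrip("#")
    r, g, b = (int(h[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _linear(r) + 0.7152 * _linear(g) + 0.0722 * _linear(b)

def contrast_ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio between two colours, from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Normal-size body text needs at least 4.5:1 for WCAG AA.
print(round(contrast_ratio("#ffffff", "#000000"), 1))  # 21.0
```

Tools like the Contrast Checker used in this review implement the same formula; scripting it is handy when auditing many colour pairs at once.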

Deliverables

  • A list of issues, prioritised fixes and recommended solutions.

  • A meeting to go over the prioritized issues and discuss ways to address them. 

View the Usability Review Findings

FoOF-usability-review-detail

On desktop I found opportunities to address accessibility and navigation issues, and some inconsistencies in the tone of voice (this is an excerpt from the full list).

FoOF-contrast-checker
FoOF-menu-notes

I showed the Contrast Checker results and a screenshot of the mobile menu annotated with recommendations for improvements.

Phase 2: Activity Monitoring

I set up a free Hotjar account to capture Heatmaps and to record visitor activity.

Hotjar is good for gathering data on visitors’ navigation behaviour. In our case, the data informed both how we could improve the site and our questions for the moderated research sessions.

The-Films-Heatmap-Mar15

An early heat map of the films gallery showed click activity on static film images in the film cards.

Important Findings and Recommended Fixes

  • Too much scrolling to reach valuable content. I recommended reprioritizing content and shortening the depth of the cards in the film gallery.
  • On both desktop and mobile, visitors thought the static film images were links. To fix this, the designer linked the images to the film description pop-up. This made more space for film overview text and reduced scrolling.
  • On mobile, visitors in the film description pop-up tapped the browser’s back arrow to return to the film gallery and got lost. I recommended making the collapse button more obvious (larger and a darker colour).
  • Some visitors methodically reviewed each film sequentially but appeared to lose track of the films they’d already viewed. I recommended designing a visual cue for visited films (for example, a check mark or colour change). A ‘favourite’ or ‘save’ button could be a feature for next year’s festival.
FoOF-HotJar-report-1

I gave the designer a light 5-pager that identified three top priority issues and two “quick fixes”. I supported the research findings with screenshots and visitor recordings.

Deliverables

  • A light 5-pager showing the top three issues and two quick fixes, including recommendations for resolving them.
  • A meeting with the designer to go over findings from key recordings and heat maps.

View the Activity Monitoring Report

Phase 3: Remote Moderated Usability Evaluations

Participant Recruitment

Due to time limitations, we had to start recruiting before finalising the research direction. We felt confident doing this because we had a general idea of what to evaluate, and GSA identified their membership as a convenient cohort to recruit from. The members are enthusiastic volunteers; they were also the target group and were easy to contact by email.

I wrote the invitation, designed the screener survey, and set up Calendly to schedule sessions. I used an email template to send login instructions and a reminder a few hours before each session.

Criteria

Many of GSA’s members are in mid- to late-life, so I screened for age-related disabilities and assistive technology such as screen readers and magnifiers.

I also wrote questions to identify people who did or didn’t buy and stream online videos so we had a sense of the respondents’ existing habits.

Recruitment Details

Participant number: 5 participants (+2 to cover dropouts)

Participant target group: GSA membership, local environmentalists, people who watch/stream films online

Availability: desktop or mobile; half-hour session with three tasks plus a feedback interview at the end.

View the email Invitation and Recruitment Screener text

 

Research Guide

The designer and I exchanged ideas about research goals and tasks and I finalized the research guide for the sessions. Based on insights from earlier research and new untested web content, we defined three user behaviours to investigate:

  • Browsing films
  • Buying a festival pass (Eventbrite e-commerce, email confirmation)
  • Signing in, selecting a film and viewing it

View the Research Guide

FoOF-research-guide-excerpt

The first task from the Research guide helped us understand how users navigate to the films gallery and browse films.

Usability Evaluations

I used a free trial of Lookback, a remote user research platform, for the moderated sessions. The designer observed the sessions I moderated, and afterwards we met for a quick debrief.

Analysis and Synthesis

In total I conducted three half-hour sessions plus two test-runs. Beyond the session debriefs, I replayed and analysed the sessions, and made notes and video highlights of key findings for the designer to review.

Lookback has an impressive transcribing tool, so I also highlighted key quotes from the sessions.

FoOF-lookback-comments
FoOF-remote-moderated-session-setup-1

During live sessions on Lookback the designer and I commented on some of the surprises we encountered. I moderated five remote sessions, including test-runs, to check the tech and research guide.

Deliverables

I made a spreadsheet of key issues for future reference.

FoOOF-moderated-session-spreadsheet

I created a spreadsheet of prioritized key findings so that the designer would have documentation after the Lookback trial expired.

Important Findings

  • Recurring navigation issues caused by confusing labels for “The Films” and “Get Your Pass” menu items and the “Enter the Festival” Button.

 

FoOF-2quotes
FoOF-password-confusion-video-poster

A screen recording shows that one participant tapped the “Enter the Festival” button to browse films and was surprised by a login pop-up.

FoOF-quote4-1
  • Getting lost between the film gallery and the film details pop-up window. Participants on mobile devices missed the “Close” button and clicked the browser’s back arrow instead, which rerouted them to the home page. I had identified this earlier in the Hotjar data; seeing participants struggle in the sessions made the issue even more apparent.
  • Inconsistent film trailer viewing in the film description pop-ups. We had no control over how the trailers embedded via YouTube or Vimeo, and two of the films only had poster images. As a quick fix for the films without trailers, the designer added external links to the films’ websites. I recommended consistent labelling across all videos and posters to reduce confusion (shown below in green).
FoOF-quote5
FoOOF-inconsistent-film-popups-vert-1

The different ways the film trailers showed viewing controls were beyond the designer’s control, and a couple of films only had posters. I recommended consistent labelling across all trailers and posters to reduce confusion (shown in green).

Impact

The website was new, so we had no prior metrics, such as sales and visits, to compare against. But we did notice two improvements:

Fewer errors in film browsing.

Rethinking menu and button content led to improved navigation to the film browsing page.

FoOF-header-buttons-before-after

The desktop homepage before and after: the menu item “Get Your Pass!” confused users. It was converted to a call-to-action button to differentiate it from film browsing and grouped with the “Enter Festival” button to guide user flow. I also flagged the yellow visited-link colour in the menu for accessibility and legibility issues.

My collaborative approach to the research helped build the web designer's capacity to do usability research. The GSA plans to use these new skills in future projects. Here's an excerpt from her recommendation on LinkedIn:

 You helped me develop my understanding of accessibility and anticipate what can cause potential issues, not to mention exposing me to valuable tools to evaluate usability.
Kirsten McPherson, Website Designer

Appendix

Joys

I was pleasantly surprised by the designer’s buy-in for user research. This made the work fun and collaborative, and led to evidence-based decisions grounded in user feedback.

Seeing the designer get excited about the research and watching the site evolve based on the results was energizing.

Putting theory from my recent UX Research (Qual and Quant) and Sociology courses into practice was very rewarding.

Challenges

Only two participants used Calendly as prompted, meaning I had to manually book (and re-book) sessions through emails. Recruitment logistics, unsurprisingly, were the most challenging aspect of the study. Next time I’ll give clearer instructions about the Calendly scheduling, or look for an alternative tool.

Learning the Lookback platform and coordinating and rescheduling participants were not as automated as I expected, and I wasn’t sure how all the pieces would come together. To solve this, my test-runs included the entire scheduling process so I could see how the communication and onboarding linked up. Once I’d done my test-runs, including troubleshooting some login glitches, I felt much more confident.

Lessons

We had to move our schedule back a week due to a lack of participant responses. Being flexible also bought us some time to make crucial changes to the site.

The screening and recruiting process was overdone for such a small sample. Our efforts via the membership email and survey weren’t successful: partly due to an admin error, only two members responded, and only one signed up after being contacted for scheduling. The rest were recruited from friends and family, and we had one no-show from that cohort.

For a small-scale study like this, next time I would use more targeted recruitment, such as word-of-mouth and approaching potential participants directly.

I feel we only scratched the surface of Hotjar’s capabilities. Next year, GSA could create more impact by monitoring metrics such as click-through and conversion rates to identify problems. User research could then be used to improve results and for generative research.

The sample size for the sessions was small, meaning that in some cases we had only one example of an error. However, when triangulated, these examples helped confirm or reject issues found in the Hotjar monitoring and website evaluations.
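For context on why a small sample can still be informative, the often-cited Nielsen/Landauer model estimates the share of usability problems uncovered by n users as 1 − (1 − p)^n, where p is the average probability that a single user encounters a given problem (0.31 is the commonly cited average). A quick illustrative sketch:

```python
def problems_found(n_users: int, p: float = 0.31) -> float:
    """Expected share of usability problems uncovered by n users
    under the Nielsen/Landauer model. p is the per-user detection
    probability; 0.31 is the commonly cited average value."""
    return 1 - (1 - p) ** n_users

for n in (1, 3, 5):
    print(n, round(problems_found(n), 2))  # 1 -> 0.31, 3 -> 0.67, 5 -> 0.84
```

Under these assumptions, five participants would be expected to surface roughly 84% of problems, which is why small-sample moderated testing still pairs well with the broader Hotjar data.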

Say hello!
In need of a UX Researcher? Comments or questions? I'd be happy to hear from you. If you would like to hear more about my experience and discuss how we might work together, please get in touch.