Sky - increasing accessibility for Sky Glass with voice control

Making the TV experience better for all, with voice

Subtitles with Sky Glass.

Responsibilities: UX/UI design, UX research, UX writing, Team Lead
Length: 6 weeks
Team: Part of a team of 5 UX/UI designers
Year: 2025

HMW create a seamless experience with content and subtitles?

As part of the King's College London course, I worked in a team to improve the UI and UX of Sky Glass, focusing on voice control, content navigation, and playback.

First, to better understand voice control and its users, we researched the market and Sky's competitors.

Sky made it clear that they wanted to better serve users with accessibility needs. Sky runs a dedicated community forum for those users, and we mined it for insights. We also asked our peers how they used voice control, their TVs, and streaming services, and what issues they ran into.

There were several pain points, but to focus on one, we decided to explore how the subtitle experience could be improved within the TV experience using voice control, reducing reliance on the remote control.

At this point, we created and actioned a research plan to understand issues with subtitles and voice control in more detail.

Feedback ranged from:

“I use voice control when my hands are occupied, e.g., during cooking or moving around the house, by setting timers/adding to lists.”

to

“Mixing both the remote with voice control is frustrating, especially if the remote buttons are small or too sensitive.”

We developed an empathy map, based on Sky's user personas and our research with real people.

A storyboard added to our understanding of users' emotional responses to their environment.

From the key takeaways, we developed HMW questions to explore further through ideation.

HMW allow the user to create a collection place for their favourite content?
HMW allow the user to perform regular commands?
HMW allow for auto-play?
HMW help the user learn what accessibility features are available and which are suitable for them?
HMW design voice control to recognise various accents?
HMW remind the user of how accessibility features work and what's available?
HMW allow for flexibility of lifestyle and adaptability to a routine?
HMW set up a landing page of regularly watched/similar content?
What is specific to the remote control that could be voice controlled?

With these defined, we ideated answers to each HMW question.

This resulted in a defined problem statement:

Users who consistently rely on subtitles - whether due to hearing loss, sensory sensitivity, or language needs - are often frustrated that subtitles don’t stay enabled across apps or sessions. When voice commands like “turn on subtitles” are misinterpreted or unrecognised, they are forced to navigate complex menus to activate a core accessibility feature.

We created a mid-fi prototype in Figma to test further with our research participants. The mid-fi focused on data privacy and security, reducing reliance on the remote control as a fallback for a poor voice control experience, and options within voice control.

Our empathy map, wireframes for our prototype and user flow diagrams are here.

See the mid-fi prototype here: https://www.figma.com/proto/ZZtLXLYw1d0BMybjS0XuB1/Team-4---Employer-Project---User-Flow--Empathy-Map-and-Wireframes?node-id=328-166&t=6iryYD50l0ClMkUw-1

Our design iterations based on user feedback on the prototype, our overall conclusions, and our other design process assets are in the presentation below.
