YOUTUBE VR

I work on several projects involving real-time graphics at YouTube, including YouTube VR. When the project started, there was no dedicated researcher, so the PM and I ran user research sessions ourselves.

Live research session

We used diary studies and remote sessions with users to define critical user journeys and prioritize features for the project roadmap. Working with engineers, we introduced support for playlists, reading comments, better player controls, a unified dark UI theme, search filters, and a way to browse movies.

Browsing movies in YouTube VR

Here's a before-and-after with some of the UI changes:

2019 YouTube VR Interface

2021 YouTube VR Interface

Introducing hand tracking required careful tuning of the visual feedback system, along with tricky decisions about ray angles, when and how to accept user input, and how to make scrolling feel smooth. There are still many ways this can be improved.
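One common way to decide when to accept hand-tracked input is a hysteresis gate: a pinch only engages past a high threshold and only releases below a lower one, so input doesn't flicker when tracked pinch strength hovers near a single cutoff. This is a minimal sketch of that idea; the class name and thresholds are illustrative, not YouTube's actual code.

```python
# Hypothetical hysteresis gate for pinch input from hand tracking.
# Two thresholds prevent flicker when the tracked pinch strength
# hovers near what would otherwise be a single on/off cutoff.

class PinchGate:
    """Accept a pinch only past ENGAGE; release only below RELEASE."""

    ENGAGE = 0.8   # pinch strength needed to start accepting input
    RELEASE = 0.5  # strength below which the pinch is released

    def __init__(self):
        self.active = False

    def update(self, strength: float) -> bool:
        if not self.active and strength >= self.ENGAGE:
            self.active = True
        elif self.active and strength < self.RELEASE:
            self.active = False
        return self.active
```

The gap between the two thresholds is the tolerance for tracking noise: a held pinch survives a momentary dip to 0.6, but a new pinch must clearly exceed 0.8 to register.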

Hand tracking shader construction

Hand tracking in YouTube VR

Passthrough support marks the beginning of mixed-reality use cases on consumer devices.

Passthrough video in YouTube VR

Going forward, there are still ways to improve the quality of environmental immersion.

Improved environmental immersion

I also introduced ways to zoom 360 videos in YouTube's mobile app, with subtle haptic feedback and sticky detents at the midpoint and endpoints of the zoom range.
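The sticky behavior can be sketched as a snap-to-detent mapping: when the raw zoom level comes within a small window of the minimum, midpoint, or maximum, it snaps to that value, which pairs naturally with a haptic tick on each snap. The range and window values below are assumptions for illustration, not the shipped parameters.

```python
# Hypothetical "sticky" zoom detents for a 360 video.
# The raw zoom snaps to the min, midpoint, or max of the range when it
# comes within STICKY_WINDOW; a haptic tick would fire on each snap.

ZOOM_MIN, ZOOM_MAX = 1.0, 3.0
STICKY_WINDOW = 0.1  # how close raw zoom must be to a detent to snap

def apply_detents(raw_zoom: float) -> float:
    raw_zoom = max(ZOOM_MIN, min(ZOOM_MAX, raw_zoom))  # clamp to range
    midpoint = (ZOOM_MIN + ZOOM_MAX) / 2
    for detent in (ZOOM_MIN, midpoint, ZOOM_MAX):
        if abs(raw_zoom - detent) <= STICKY_WINDOW:
            return detent  # snapped; trigger subtle haptic feedback here
    return raw_zoom
```

Keeping the window small preserves continuous zooming across most of the range while still giving the endpoints and midpoint a tactile "landing" feel.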

Zooming a 360 video on a phone

More to come in YouTube VR and beyond.

Penrose triangle in YouTube's style