Eye tracking will likely become standard for new headsets in the coming year. What does that mean for VR content creators?
I’ve interviewed no fewer than 35 VR storytellers for Immersive Shooter’s Maker Q&A series. Despite a general lack of consensus on anything related to VR production – from what VR even is to what storytelling rules exist (if any) – I’ve noticed a surprising level of solidarity when it comes to identifying VR’s ‘secret sauce’.
Most creators I’ve talked to agree that VR shines brightest when it exists somewhere between a story and a game. But up to this point, most narrative VR has lacked even a basic level of interactivity. More and more creators I’ve chatted with recently, however, have begun outlining their plans to increase interactivity in 2018 through a handful of new technologies, from brain-computer interfaces to eye tracking-equipped headsets.
And while brain-computer interfaces are still on their way, eye tracking is rapidly becoming a reality in virtual reality.
In January 2017, Japanese startup FOVE released the first headset with built-in eye tracking. A few months later, HTC announced that one of the startups from its accelerator programme was developing an eye-tracking accessory for the Vive. At the same time, companies like Apple, Facebook and Google began snapping up eye-tracking startups.
This January, another headset with built-in eye tracking (and brain sensors!) was announced and named CES 2018’s most innovative VR product.
Also at CES 2018, a handful of companies showed off impressive eye-tracking tools for real-world and virtual-world applications. One of them, Tobii, is currently in talks with around 10 headset manufacturers and hopes to see integrated eye-tracking systems launch by the end of this year or in early 2019.
Whether headset integration happens in 2018 or early 2019, it’s safe to say that a) it’s happening, and b) as a content creator, I’m very excited about it.
It’s easy to see why eye tracking is a worthwhile addition to VR headsets.
Tangibly, it can drastically improve headset performance, automate headset adjustments and provide more detailed analytics. Less tangibly, it can improve the user interface and offer a new level of interactivity.
One of the most immediate, promising benefits of eye tracking is the power to improve user experiences through the use of foveated rendering.
Although we don’t notice it, only a very small portion of what we see in the real world – the part that lands on the fovea, a small region at the centre of the retina – is perceived in high-definition clarity. The rest is actually a blur of outlines, textures and colours that our brains ‘fill in’ with sharper detail.
Foveated rendering relies on this quirk of vision to reduce the overall pixel count of VR experiences without sacrificing perceived image quality. It renders only what you’re looking at in full resolution, slightly drops the quality of everything in your periphery, and rapidly re-adjusts as your gaze shifts.
Although this technology isn’t limited to eye tracking-equipped headsets – fixed foveated rendering simply assumes you’re looking at the centre of the lens – eye tracking makes foveated rendering much more efficient and accurate, because the high-resolution region can follow your gaze.
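To make the idea concrete, here’s a minimal sketch of how a gaze-based foveation pass might choose a resolution scale for each screen tile. The renderer shape and fall-off thresholds are hypothetical, illustrative choices, not any shipping headset’s actual pipeline.

```typescript
// A minimal sketch of gaze-based foveated rendering, assuming a
// hypothetical tile-based renderer. The fall-off thresholds are
// illustrative, not taken from any shipping headset.

interface GazeSample {
  x: number; // normalised screen coordinates, 0..1
  y: number;
}

// Pick a resolution scale for a screen tile based on how far it sits
// from the current gaze point.
function resolutionScaleFor(tileX: number, tileY: number, gaze: GazeSample): number {
  const dx = tileX - gaze.x;
  const dy = tileY - gaze.y;
  const distance = Math.sqrt(dx * dx + dy * dy);

  if (distance < 0.1) return 1.0; // foveal region: full resolution
  if (distance < 0.3) return 0.5; // near periphery: half resolution
  return 0.25;                    // far periphery: quarter resolution
}

// Recomputed every frame, so the sharp region follows the eye, not the head.
function foveationMap(tiles: { x: number; y: number }[], gaze: GazeSample): number[] {
  return tiles.map((tile) => resolutionScaleFor(tile.x, tile.y, gaze));
}
```

The key design choice is that the map is recomputed every frame, so the full-resolution ‘fovea’ chases the eye rather than the head.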
The real value of foveated rendering is that it can reduce the GPU load by between 30 and 50 percent, according to Tobii. That means lower-end headsets could run more demanding VR content. It also means potentially smaller and more mobile headset designs, improvements in battery life, and higher fidelity and frame rates at reduced bandwidths. All great news for content creators.
Eye tracking can also be used to automatically adjust a headset’s settings to an individual user. Scan someone’s eyes to immediately load their avatar and – more importantly – set the inter-pupillary distance, or IPD (the distance between the centres of the pupils).
That automatic adjustment means less guesswork in dialling in the IPD and could deliver images optimised for each user’s eyes, meaning content will look better with little to no effort from creators.
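As a sketch, the calibration itself could be this simple – assuming a hypothetical tracker that reports each pupil’s position and a stereo rig whose camera separation we can set (both APIs are invented for illustration):

```typescript
// A minimal sketch of automatic IPD calibration. The tracker sample and
// rig API are invented for illustration; positions are in millimetres.

interface EyeTrackerSample {
  leftPupil: { x: number; y: number };
  rightPupil: { x: number; y: number };
}

// IPD is simply the distance between the two pupil centres.
function measureIpd(sample: EyeTrackerSample): number {
  const dx = sample.rightPupil.x - sample.leftPupil.x;
  const dy = sample.rightPupil.y - sample.leftPupil.y;
  return Math.sqrt(dx * dx + dy * dy);
}

// On startup: read one sample and match the virtual camera separation
// to the measured IPD, skipping the manual adjustment dial entirely.
function calibrate(sample: EyeTrackerSample, rig: { setSeparationMm(mm: number): void }) {
  rig.setSeparationMm(measureIpd(sample));
}
```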
Today, heatmap analytics for immersive content rely on the direction you’re facing, not where you’re actually looking. Sidelong glance be damned.
Eye tracking will enable significantly more accurate and detailed analytics.
In addition to knowing exactly where a user is looking, you could also potentially measure engagement through pupil tracking. When we see something we like, our pupils dilate. These minute changes can also be used to detect emotion and mental strain.
This level of insight could help us craft better, more engaging stories and even offer individualised, branching narratives based on a user’s behaviour.
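For instance, a crude engagement signal could be derived from pupil diameter alone. The sketch below assumes a hypothetical tracker streaming diameters in millimetres; a real system would also have to correct for scene brightness, which affects pupil size far more than emotion does.

```typescript
// A crude pupil-dilation engagement score, assuming a hypothetical tracker
// that streams pupil diameters in millimetres. Baseline and thresholds are
// illustrative; a real system would also correct for scene brightness.

function engagementScore(diametersMm: number[], baselineMm: number): number {
  if (diametersMm.length === 0) return 0;
  const mean = diametersMm.reduce((sum, d) => sum + d, 0) / diametersMm.length;

  // Relative dilation above the viewer's resting baseline, clamped into a
  // 0..1 score (20% dilation counts as fully engaged in this sketch).
  const dilation = (mean - baselineMm) / baselineMm;
  return Math.min(Math.max(dilation / 0.2, 0), 1);
}
```

An experience could sample a score like this at a branch point and choose the more intense path only for viewers who are already leaning in.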
Today’s VR relies on unnatural behaviours to control experiences. We turn our heads, point them toward where we want to go, click the controller and teleport. Our heads become a clunky cursor to guide our experiences. Anyone playing games that rely on aim (i.e. most games) will know how frustrating it can be to track a target this way.
Eye tracking turns this four-step process into a two-step process: look, click. That means each interaction in VR is a bit more natural, more fluid, faster and has a shorter learning curve.
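In code, the whole interaction can collapse to a single gaze raycast, sketched here against an invented scene API:

```typescript
// A minimal sketch of "look, click" selection against an invented scene API.

type Vec3 = { x: number; y: number; z: number };

interface Scene {
  // Returns the first interactive object along the ray, if any.
  raycast(origin: Vec3, direction: Vec3): { activate(): void } | null;
}

// Instead of steering a head-locked cursor, cast a ray along the reported
// gaze direction and activate whatever it hits when the trigger fires.
function onTriggerPressed(scene: Scene, eyeOrigin: Vec3, gazeDirection: Vec3) {
  const target = scene.raycast(eyeOrigin, gazeDirection);
  if (target) target.activate(); // look, click: two steps, not four
}
```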
One of the rules of immersion in narrative VR is eye contact.
I often ask people appearing in my stories to look directly at the camera during the interview, making ‘eye contact’ with the viewer the way a good conversationalist establishes eye contact with a listener. Even though it’s predetermined, this helps establish a sense of presence. Eye tracking could make this more interactive and realistic, in a couple of flavours: gaze-activated and gaze-sensitive experiences.
Gaze-activated experiences let viewers initiate an action within the experience by looking at something. A person begins talking only when you look at them. An athlete takes off only once you’ve spotted them. This could make it easier to ensure the audience doesn’t miss an important moment, as well as offer them greater control.
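One common way to implement this kind of trigger is a dwell timer: the action fires only after the viewer’s gaze has rested on the target for some threshold, so a passing glance doesn’t set off the scene. The sketch below uses invented names, and the 500 ms threshold is an illustrative choice, not a standard.

```typescript
// A gaze-activated story beat built on a dwell timer. All names are
// invented; the 500 ms threshold is an illustrative choice.

class GazeTrigger {
  private dwellMs = 0;
  private fired = false;

  constructor(
    private targetName: string,
    private onActivate: () => void,
    private thresholdMs = 500,
  ) {}

  // Call once per frame with the name of the object currently under the gaze.
  update(gazedObject: string | null, deltaMs: number): void {
    if (this.fired) return;
    // Accumulate dwell time while the viewer keeps looking; reset on
    // look-away, so a passing glance never fires the trigger.
    this.dwellMs = gazedObject === this.targetName ? this.dwellMs + deltaMs : 0;
    if (this.dwellMs >= this.thresholdMs) {
      this.fired = true;
      this.onActivate();
    }
  }
}

// Usage: the character only starts talking once the viewer has actually
// looked at them for half a second.
const dialogue = new GazeTrigger("mainCharacter", () => console.log("dialogue starts"));
// Each frame: dialogue.update(currentlyGazedObjectName, frameDeltaMs);
```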
In addition to gaze-activated experiences, eye tracking can also enable gaze-sensitive interactivity through more natural social interactions. I look at someone, they look back at me. I turn away, they turn away.
Eye tracking could also enable avatars in VR experiences to display more realistic facial expressions: winking, blinking, a raised eyebrow. It’s possible that experiences could even branch based on the user’s eyes (remember how our eyes can convey emotion, engagement and stress?).
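A gaze-sensitive beat like that mutual eye contact might look like the following, assuming a hypothetical avatar API with a look-target and simple expression controls:

```typescript
// A gaze-sensitive eye-contact beat, assuming a hypothetical avatar API
// with a look-target and simple expression controls.

type Vec3 = { x: number; y: number; z: number };

interface Avatar {
  lookAt(target: Vec3 | null): void; // null means "look away"
  setExpression(name: "neutral" | "smile"): void;
}

// I look at someone, they look back at me; I turn away, they turn away.
function updateEyeContact(avatar: Avatar, viewerIsLooking: boolean, viewerHead: Vec3): void {
  if (viewerIsLooking) {
    avatar.lookAt(viewerHead);
    avatar.setExpression("smile");
  } else {
    avatar.lookAt(null);
    avatar.setExpression("neutral");
  }
}
```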
When I interviewed Nicole McDonald, who directed Jaunt’s first VR experience offering six degrees of freedom (the ability to move up/down, forward/back and left/right in a headset), she told me the first step to making immersive experiences more interactive is responsiveness.
For example, in her piece Hue, if you approach the main character in a mean way, he reacts more timidly. Approach him in a friendly manner and he responds accordingly.
“The first step [of interactivity is], if I do something, something else happens and somehow that supports the narrative,” she said.
That partnership between a game and a story? That’s VR’s secret sauce.
Although many virtual experiences may be technologically limited to looking around a space, eye tracking can make this ability feel like a superpower.