The Consumer Electronics Show (CES) is the world’s largest electronics trade show and every year companies from around the world attend the event to show off their upcoming devices. Being developers in the immersive technology space, we were particularly interested in the new trends in virtual reality and augmented reality.
This year, HTC held a press event ahead of CES, in which it revealed a handful of Vive-branded products. HTC is one of the leading companies in the virtual reality industry, and it is one of the few VR hardware makers that is thinking about the enterprise market as much as, or more than, the consumer market.
Naturally, we were excited to learn about HTC’s upcoming product line. In past years at CES, it revealed the HTC Vive headset, the Vive Pro enterprise-level headset, the Vive Tracker accessory platform, and the Vive Wireless Adapters. This year, the company announced two new headsets, including a new consumer-targeted device called the Vive Cosmos.
Eye-tracking technology is considered one of the most important innovations in VR, a holy grail if you will, and developers and consumers alike have been waiting for years to get their hands on headsets with this coveted technology.
Eye-tracking has many benefits for VR, including new ways to interact with digital interfaces, advanced analytics possibilities, improved system performance, and more convincing social interactions. In short, eye-tracking in virtual reality is a bit of a game changer.
The most obvious advantage of eye-tracking technology is the ability to interact with software without requiring physical input from your hands.
Gaze-based menu systems let you interact with software with a simple glance, freeing your hands for other tasks. That can be particularly useful in training simulations like the ones we build here at BSD.
For example, we built a welding training simulation for one of our clients and we fashioned a peripheral to go along with it. Unfortunately, the peripheral doesn’t include a button, so the user must switch between a controller and our custom device to start the training application.
With an eye-tracking system, you wouldn’t need to switch between the wand controller and our welding prop to launch the simulation.
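To make the idea concrete, here is a minimal sketch of dwell-based gaze selection, the common pattern behind gaze-activated menus: if the gaze direction stays within a small angular cone around a target for long enough, the target activates. Everything here (the class, the thresholds, the vector format) is our own illustration, not HTC's SDK.

```python
import math

DWELL_SECONDS = 1.5      # how long the gaze must rest on the target (illustrative)
ANGLE_THRESHOLD = 5.0    # degrees of tolerance around the target (illustrative)

def angle_between(a, b):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    mag = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

class GazeButton:
    """Hypothetical gaze-activated button: fires after sustained eye contact."""

    def __init__(self, direction_to_target):
        self.direction_to_target = direction_to_target
        self.dwell = 0.0
        self.activated = False

    def update(self, gaze_direction, dt):
        # Accumulate dwell time while the gaze stays on target;
        # reset the timer as soon as the user looks away.
        if angle_between(gaze_direction, self.direction_to_target) <= ANGLE_THRESHOLD:
            self.dwell += dt
            if self.dwell >= DWELL_SECONDS:
                self.activated = True
        else:
            self.dwell = 0.0
        return self.activated
```

In a real application, `gaze_direction` would come from the headset's eye-tracking API each frame, and activation would launch the simulation instead of merely flipping a flag.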
Eye-tracking hardware also enables developers to capture analytics data to study the effects of VR simulation applications. By looking at the eye-tracking data, we could determine what trainees pay attention to and when they shift their focus from one object to another.
OvationVR is using the Vive Pro Eye’s eye-tracking hardware in its public speaking VR simulation for exactly that purpose. The company uses the eye-tracking tech to analyze how much time speakers spend looking at the teleprompter instead of the audience they’re supposed to be speaking to.
That type of information would enable us to measure the effectiveness of training simulations to a higher degree than currently possible.
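The analytics side is straightforward to sketch. Assuming the eye-tracker gives us a regularly sampled stream of "which object is the user looking at" labels (an assumption on our part; the sample format is ours, not from any vendor's API), we can aggregate per-object attention time and count focus shifts:

```python
from collections import Counter

def attention_summary(gaze_samples, sample_interval):
    """Aggregate a stream of hypothetical gaze samples (object labels taken
    at a fixed interval) into per-object dwell time and a focus-shift count."""
    dwell = Counter()
    shifts = 0
    previous = None
    for obj in gaze_samples:
        dwell[obj] += sample_interval      # each sample adds one interval of attention
        if previous is not None and obj != previous:
            shifts += 1                    # gaze moved from one object to another
        previous = obj
    return dict(dwell), shifts
```

With data like this, a public-speaking trainer could report how long a speaker watched the teleprompter versus the audience, and how often their focus jumped between the two.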
Eye-tracking technology also opens the door for convincing social interactions in virtual reality.
Right now, you can hop into a multi-user simulation (or multiplayer game) and collaborate with other people in a virtual space. But the avatars we currently use to represent people in virtual experiences are devoid of realistic physical traits. Eye-tracking gets us a step closer to realistic virtual representations of ourselves by animating our true eye movements, which can drive more believable facial expressions and bring an otherwise emotionless character to life.
It may not seem like an important detail, but eye-trackers make interactive experiences much more realistic because they enable eye contact with non-player characters (NPCs) and other avatars. Lifelike eyes on virtual characters make the experience far more convincing. Even when a character looks like a cartoon, you still get the impression that they're there with you in the virtual world.
Perhaps the most important feature that eye-tracking enables is foveated rendering, a technique that delivers higher performance and better image quality at the same time.
Today, if you want to run a VR application, your computer must render the entire scene at full resolution, which is demanding for all but the most powerful machines. With eye-tracking, we can render the area of the scene you're looking at directly in high resolution while reducing the image quality outside your direct line of sight.
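The core policy behind foveated rendering can be sketched in a few lines: shade at full detail inside the fovea (where the eye actually resolves fine detail) and progressively coarser farther into the periphery. The band boundaries below are illustrative values we chose for the example, not figures from any real headset or driver.

```python
def shading_rate(pixel_angle_from_gaze):
    """Toy foveated-rendering policy: map a pixel's angular distance from
    the gaze point (in degrees) to a relative shading resolution."""
    if pixel_angle_from_gaze <= 5.0:     # foveal region: full resolution
        return 1.0
    elif pixel_angle_from_gaze <= 15.0:  # near periphery: half resolution
        return 0.5
    else:                                # far periphery: quarter resolution
        return 0.25
```

A real implementation runs on the GPU (for example via variable rate shading) and updates the foveal center every frame from the eye-tracker, but the idea is the same: spend rendering budget where the eye can actually see the difference.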
At Bit Space Development, we’re not yet experimenting with foveated rendering. However, other development firms are, and they’re excited about what this new technology brings to the table. A company called ZeroLight is working on an automotive visualization tool, and thanks to the Vive Pro Eye’s built-in eye-tracking technology, the company is able to render its application at 9x the resolution of the headset, producing a vivid and crisp image.
We look forward to what we can do with this tech once we get our hands on Vive’s new headset.