Vive’s Facial Tracker Makes VR Experiences More Realistic
HTC recently released the Vive Facial Tracker, which tracks your facial expressions and maps them to your virtual avatar. Now you can fully express yourself in virtual reality. But why would you want to?
Let’s explore a few potential use cases for deploying the Vive Facial Tracker in your business. But first, we should introduce you to the hardware.
Face Tracking Peripheral
The Vive Facial Tracker is a small peripheral with two infrared depth cameras designed to map the movements of your face. The device attaches to your VR headset and hangs in front of your mouth to capture every facial expression. It maps your lips, tongue, chin, and cheeks, which allows you to bring all your facial expressions into virtual environments.
Facial expressions are an essential part of human communication. Our expressions often tell people more about our emotions than our words do. You can tell whether people are interested or bored, excited, happy, sad, angry, or annoyed just by the expressions on their faces. Until now, that expressive side of communication hasn’t been possible in VR.
Last year, we started working with the Vive Pro Eye, a VR headset from HTC that includes built-in eye-tracking cameras from Tobii. Eye tracking in VR has several benefits, including performance improvements such as foveated rendering, heat mapping, and attention analysis. It also gives your virtual avatar a more natural appearance, with animated eyes that mirror your own. In a gaming experience, that may not matter much, but in a multi-user collaboration or training solution, it’s the difference between making eye contact with the person you’re talking to and interacting with an expressionless avatar.
The Vive Facial Tracker is designed to pair with HTC’s Vive Pro lineup, and when paired with the Vive Pro Eye, you can animate your entire virtual face. Officially, the Facial Tracker only fits on Vive Pro headsets, but with a 3D printer, you can make it work with any headset.
What can you do with the Facial Tracker accessory?
The most obvious use for the Vive Facial Tracker is live multi-user interaction. Apps like VRChat and AltspaceVR allow people from around the world to meet in virtual environments. VRChat supports full-body tracking with Vive Trackers attached to your feet and hips, but it doesn’t yet support the Vive Facial Tracker. Imagine having your mouth and eyes tracked to bring your virtual avatar to life.
Neos is currently the only consumer application that supports the Vive Facial Tracker, but we can quickly add support for it to a custom VR experience with HTC Vive’s Facial Tracking SDK.
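To give a sense of what that integration involves: the tracker reports per-frame blendshape weights (values between 0 and 1 for shapes like jaw-open or smile), which you feed into your avatar’s face rig. Raw camera data can jitter, so a common trick is to smooth the weights before applying them. Here’s a minimal Python sketch of that smoothing step; the blendshape names are illustrative, and the actual SDK and engine calls are left out.

```python
# Sketch: smoothing facial blendshape weights before driving an avatar.
# The shape names below are illustrative; the smoothing itself is a
# standard exponential moving average used to reduce camera jitter.

def smooth_weights(previous, current, alpha=0.3):
    """Blend the previous frame's weights with the current frame's.

    alpha controls responsiveness: higher values track the camera
    more closely, lower values suppress more jitter.
    """
    return {
        name: (1 - alpha) * previous.get(name, 0.0) + alpha * weight
        for name, weight in current.items()
    }

# Example frame-to-frame update:
prev = {"Jaw_Open": 0.0, "Mouth_Smile_Left": 0.0}
raw = {"Jaw_Open": 1.0, "Mouth_Smile_Left": 0.5}
smoothed = smooth_weights(prev, raw)  # Jaw_Open eases to 0.3, not 1.0
```

In a real experience, this update would run once per rendered frame, with the smoothed weights passed to the engine’s blendshape (morph target) API.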
Digital animators will love this!
The Vive Facial Tracker brings exciting improvements to live human interaction in VR, but that’s not all it’s good for. The Facial Tracker is a perfect tool for digital animators who want to bring their characters to life. Voice actors can carry their expressions into their characters while acting out their lines, rather than having an animator recreate the facial muscle movements by hand.
Big video game and movie production studios have used expensive rigs to capture facial expressions to animate digital characters for years. The Vive Facial Tracker makes that kind of digital animation accessible to practically any studio. We’re sure our friends at FlipsideXR are excited about this technology.
Facial Expressions in Soft Skills Training
Here at BSD, our primary interest is in making educational and training content, and we think the Vive Facial Tracker will help us create new types of training solutions. For instance, we believe that facial expressions could improve soft skills training in VR in a couple of ways.
A multi-user experience, where two or more people must interact, becomes more realistic when the characters are animated with each user’s actual expressions.
Furthermore, we can monitor how people react to specific scenarios based on their expressions. For example, if we put someone through a management training scenario where they must deliver bad news, such as firing an employee, their facial expressions can tell us a lot about them. If that person is smiling while firing their employee, they may not be the right fit for a management role.
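As a rough illustration of how such a check might work, here is a small Python sketch. The scenario data, blendshape names, and threshold are hypothetical assumptions for the example, not values from our actual training pipeline.

```python
# Sketch: flagging expressions that conflict with a training scenario.
# Shape names and the threshold are illustrative assumptions.

def flag_mismatched_expression(weights, scenario, threshold=0.6):
    """Return True if the tracked expression conflicts with the scenario.

    weights:  dict of blendshape name -> weight in [0.0, 1.0]
    scenario: dict listing which expressions are considered inappropriate
    """
    smile = max(weights.get("Mouth_Smile_Left", 0.0),
                weights.get("Mouth_Smile_Right", 0.0))
    if "smile" in scenario["inappropriate"] and smile > threshold:
        return True
    return False

# A trainee smiling broadly while delivering bad news gets flagged:
scenario = {"name": "deliver_bad_news", "inappropriate": ["smile"]}
frame = {"Mouth_Smile_Left": 0.8, "Mouth_Smile_Right": 0.7}
flag_mismatched_expression(frame, scenario)  # True
```

In practice, a single frame wouldn’t be enough; you’d aggregate flags over the course of the scenario before drawing any conclusion about the trainee.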
Building on the idea of tracking someone’s emotions while they perform a task, we could also monitor facial expressions while putting people through simulated stressful, high-risk situations. Some people aren’t cut out for certain jobs, and assessing them before they reach the job site is very important. We have yet to determine whether facial expressions add meaningful value to that kind of assessment, but it’s an idea we may explore.
Just The Beginning
The Vive Facial Tracker is the first device of its kind on the market, but it won’t be the last. HP is getting ready to release its Reverb G2 Omnicept Edition, a VR headset with a face-tracking camera built in. Megadodo Games is making a headset called the DecaGear 1, which also includes a facial-expression camera. The DecaGear headset is geared toward the consumer market, whereas the Omnicept is strictly meant for the enterprise market.
Over the next year or so, the industry will figure out the real value of facial tracking in VR experiences. We’re excited to be at the forefront of this new advancement.
If you have an idea for a VR training solution that incorporates facial tracking, we’d love to build it for you. Let’s have a chat.