
CES 2019: Eye tracking with Tobii

by: John

So one of the big announcements in the VR space is HTC coming out with the VIVE Pro Eye, which features Tobii's technology for eye tracking. While walking through the halls, I saw the Tobii booth and decided to stop by to see what all the fuss was about. I'd never tried eye tracking, so here was a chance to check it out early in the show schedule.

I was led up to the second floor of the booth, where they had an HTC VIVE Pro Eye ready to go. A quick calibration and the tracking was set. Two mirrors were put in front of me, each showing an avatar with eyes. One mirror had the eye tracking feature turned off; the other had it turned on. The one without eye tracking kept track of my moving head, but it had very lifeless eyes. Looking over to the other mirror, the eyes of the avatar moved along with mine, which at first was kinda creepy. I even tried closing one eye to see if it mimicked that, and it did sometimes. I would alternate closing an eye, and I'd say about 30% of the time I saw the correct eye open and closed in the mirror. The other times, both eyes in the mirror were shut, so it seems a little improvement is still needed to detect when a single eye is closed. What was accurate, though, was how it kept up with where I was looking.

Three short demos were presented to me that really showed off the benefit of this tech.

First up, I was tasked with picking up some rocks and throwing them at bottles at different distances. Now, if you've ever tried to throw objects in VR games, you know it can be tricky at times. So, as usual, my first few throws were really off the mark or fell short. I purposely did a weak toss, as I didn't want to lose control of the Vive wand and damage their equipment.

The eye tracking tech was then activated, and this time I looked at the bottle I wanted to hit and used the same semi-weak toss I did earlier. This time, the rock nailed the close bottle without any problem. I mean, it hit it dead center. Then I looked at a few bottles very far away and did the same strength of toss, not really aiming but just motioning in the general direction of the bottle. And with the accuracy of Hawkeye, the mighty Avenger, I was able to knock it over with the rock.
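Tobii didn't show any code, but the effect felt like a simple aim-assist blend: nudge the throw direction toward whatever you're looking at while keeping the throw strength. Here's a minimal sketch of that idea in Python; the function name, the assist factor, and the whole approach are my assumptions, not the demo's actual implementation.

```python
# Minimal sketch of gaze-assisted throwing (hypothetical, not Tobii's actual code).
# Assumes the eye tracker gives us a 3D gaze target and the controller gives us
# a raw throw velocity; the throw direction gets nudged toward the gazed-at point.
import numpy as np

def assisted_throw_velocity(raw_velocity, hand_pos, gaze_target, assist=0.8):
    """Blend the controller's throw direction toward the gazed-at target.

    assist=0.0 leaves the throw untouched; assist=1.0 aims it dead-on.
    """
    speed = np.linalg.norm(raw_velocity)
    if speed == 0.0:
        return raw_velocity
    raw_dir = raw_velocity / speed
    to_target = gaze_target - hand_pos
    to_target = to_target / np.linalg.norm(to_target)
    blended = (1.0 - assist) * raw_dir + assist * to_target
    blended = blended / np.linalg.norm(blended)
    return blended * speed  # keep the player's original throw strength

# Example: a sloppy toss gets redirected toward the bottle the player is looking at.
v = assisted_throw_velocity(np.array([0.0, 2.0, -3.0]),
                            hand_pos=np.array([0.0, 1.4, 0.0]),
                            gaze_target=np.array([1.0, 0.8, -8.0]))
```

The real demo almost certainly also scales the throw strength to reach far bottles; this sketch only corrects the direction.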

For the next scene, I was put in front of two small robots and a menu for ordering items. Whichever robot I looked at would turn to face me. I didn't even have to turn my head: as my eyes gazed over to the other one, the robot I had been looking at turned away and the one I was now looking at turned toward me. I went back and forth making each robot look at me.

I then looked over to the menu board and, without turning my head, looked at the item I wanted to "order". Once I made my choice and pressed the button, the correct item was dropped in front of me by a drone. Now, I easily could have moved my head to point at the item I wanted, but with eye tracking, selecting items was much more accurate and easier to do.
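Conceptually, gaze selection like this boils down to casting a ray from the eyes and picking whichever menu item sits closest to it within a small angular window. Here's a rough Python sketch of that idea; the item names, positions, and threshold are all made up for illustration.

```python
# Hypothetical sketch of gaze-based menu selection: pick the item whose
# position lies closest to the gaze ray, within a small angular threshold.
import numpy as np

def select_item(eye_origin, gaze_dir, item_positions, max_angle_deg=3.0):
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    best, best_angle = None, np.radians(max_angle_deg)
    for name, pos in item_positions.items():
        to_item = pos - eye_origin
        to_item = to_item / np.linalg.norm(to_item)
        angle = np.arccos(np.clip(np.dot(gaze_dir, to_item), -1.0, 1.0))
        if angle < best_angle:
            best, best_angle = name, angle
    return best  # None if the player isn't looking at any item

menu = {"soda":  np.array([0.5, 1.6, -2.0]),
        "chips": np.array([0.0, 1.6, -2.0]),
        "candy": np.array([-0.5, 1.6, -2.0])}
print(select_item(np.array([0.0, 1.6, 0.0]), np.array([0.24, 0.0, -1.0]), menu))
```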

GunHeart was the last demo, and let me tell you, this one really drove home the effectiveness of eye tracking. I fired weapons without it, and just like normal, the bullets and arrows flew straight without any guidance. With eye tracking turned on, I then gazed at the enemies I wanted to hit. They lit up with a white outline, and when I fired my weapon, the projectile curved to its target and nailed it 100% of the time. I felt pretty powerful with my gaze of death raining hellfire down on the enemies trying to attack my crystal.
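The curving shots read to me like a standard homing-projectile trick, just with the target picked by your eyes instead of a lock-on button: every frame, bend the projectile's velocity a little toward the highlighted enemy. A tiny sketch of that idea, with the turn rate and names being my own assumptions rather than GunHeart's actual code:

```python
# Rough sketch of a gaze-homing projectile: each frame, rotate the projectile's
# velocity slightly toward the gaze-highlighted enemy while keeping its speed.
import numpy as np

def steer_toward_target(pos, velocity, target_pos, turn_rate=0.15):
    speed = np.linalg.norm(velocity)
    desired = target_pos - pos
    desired = desired / np.linalg.norm(desired) * speed
    new_velocity = velocity + turn_rate * (desired - velocity)
    return new_velocity / np.linalg.norm(new_velocity) * speed
```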

Now, a big thing I was interested in was foveated rendering. To show that off, a red box was displayed to mark where my gaze was. With a push of a button, anywhere I wasn't focusing became very pixelated. As I moved my eyes, the pixelated area became clear wherever I looked, while the spots I had just left turned into a garbled mess. The person running the demo then reduced the pixelation, keeping it a little heavy, and I continued to look around. Finally, he removed the heavy pixelation entirely, and you'd be hard-pressed to tell that the areas you weren't looking at weren't being rendered at the same quality as the spot you were.

This is what will help increase picture quality in HMDs while keeping the hardware requirements on your computer lower. It will also help with things like wireless adapters: you can cut the bandwidth needed to send video to the headset, because only the area you're focusing on needs the full high-quality image, while everything else can be sent at a lower quality that eats far less bandwidth.
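To make the idea concrete, foveated rendering basically assigns each region of the screen a quality level based on how far it is from the gaze point, and a wireless link can use the same map to decide how much bandwidth each region deserves. Here's a toy Python sketch of that mapping; the radii and scale factors are invented numbers, not anything Tobii or HTC specified.

```python
# Toy illustration of foveated rendering: render tiles near the gaze point at
# full resolution and progressively drop the resolution farther out.
import numpy as np

def resolution_scale(tile_center, gaze_point, fovea_radius=0.10, mid_radius=0.25):
    """Return the fraction of full resolution to render a screen tile at.

    Coordinates are normalized screen space (0..1); distance is Euclidean.
    """
    d = np.linalg.norm(np.asarray(tile_center) - np.asarray(gaze_point))
    if d < fovea_radius:
        return 1.0    # full quality where the eye is actually looking
    if d < mid_radius:
        return 0.5    # half resolution in the near periphery
    return 0.25       # quarter resolution everywhere else

# Example: with gaze at the center of the screen, a corner tile renders at 25%.
print(resolution_scale((0.95, 0.95), (0.5, 0.5)))
```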

I have to say, the eye tracking of the Vive Pro Eye was amazing, and I can see how this technology would have a lot of benefits. Now, the real question is whether it will be priced at a point consumers won't balk at. My guess for this first iteration is no, considering it's being featured in HTC's Pro line, and, well, new technology is rarely ever priced for consumers.

But that's not what I wanted to focus on here. Tobii's eye tracking is really awesome and I can't wait to see other headsets take advantage of this to improve on the experience that we already have in VR.