The eyes have it: How one startup aims to change the future of VR

12.02.2016
Augmented and virtual reality headsets could be the future of computing, but right now, navigating virtual worlds and interacting with applications is often a clunky experience.

That's the problem Eyefluence aims to solve with a technology that tracks the movement of your eyes. Eyefluence isn’t developing its own VR and AR headsets; it's leaving that to other companies and hopes to put eye tracking inside their devices.

The company is led by CEO Jim Marggraff, an entrepreneur best known as the mind behind the Livescribe smartpen. So far, Eyefluence has raised $21 million in funding from Intel, Motorola Solutions, and others.

Marggraff argues that in the coming years, more companies will have to add eye tracking to their headsets — anything else, he says, would be missing a key method of interaction. After taking the company’s technology for a spin, I’m inclined to agree.

Eyefluence isn't going public yet with exactly how its technology works, but what I can say is that it uses a set of eye gestures to control how users move through menus and interact with apps. It sounds a bit abstract until you try it, but the effect is impressive. During my first demo using a modified Oculus Rift headset, I picked up the system in about five minutes using a tutorial the company built. The second time, I felt right at home and sailed through the tutorial in no time.
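Eyefluence hasn't disclosed how its eye gestures actually work, but a generic sketch can make the idea of gesture-driven selection concrete. The look-then-confirm pattern below, along with the target type and hit test, is an illustrative assumption of mine, not Eyefluence's design:

```python
# One generic way a gaze-driven menu could work: looking at an item
# "arms" it, and a quick glance at a separate confirm target fires it.
# This look-then-confirm pattern, the Target type, and the hit test are
# all illustrative assumptions, not Eyefluence's actual mechanism.
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    x: float       # center, normalized screen coordinates (0..1)
    y: float
    radius: float

def hit(target: Target, gx: float, gy: float) -> bool:
    """True if the gaze sample (gx, gy) lands inside the target."""
    return (gx - target.x) ** 2 + (gy - target.y) ** 2 <= target.radius ** 2

def select_with_confirm(gaze_samples, items, confirm):
    """Scan a stream of gaze samples; return the name of the item that
    was looked at and then confirmed with a glance, or None."""
    armed = None
    for gx, gy in gaze_samples:
        looked = next((t for t in items if hit(t, gx, gy)), None)
        if looked is not None:
            armed = looked                      # looking at an item arms it
        elif armed is not None and hit(confirm, gx, gy):
            return armed.name                   # glancing at confirm fires it
    return None
```

The appeal of a pattern like this over a simple dwell timer is speed: there's no forced wait before an action registers.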

One demo let me zoom and pan around a “Where’s Waldo” scene, searching out the iconic character using my eyes. Instead of just zooming in on the center of the scene, Eyefluence’s tech zoomed to the spot where I was looking, making it easier to get exactly where I wanted to go.
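The arithmetic behind that kind of gaze-anchored zoom is simple: scale the view while solving for a camera offset that keeps the world point under the gaze fixed. Here's a minimal sketch assuming a basic pan-and-zoom camera where world = offset + screen / zoom; the camera model is my assumption, not Eyefluence's implementation:

```python
# Minimal sketch of gaze-anchored zoom: scale the view so the point
# under the user's gaze stays put, instead of zooming on screen center.
# The camera model (world = offset + screen / zoom) is an illustrative
# assumption, not Eyefluence's implementation.

def zoom_at_gaze(offset_x, offset_y, zoom, gaze_x, gaze_y, factor):
    """Return a new (offset_x, offset_y, zoom) that magnifies the view
    by `factor` while keeping the world point under (gaze_x, gaze_y)
    fixed on screen."""
    new_zoom = zoom * factor
    # Solve: offset_new + gaze/new_zoom == offset_old + gaze/zoom
    new_ox = offset_x + gaze_x * (1.0 / zoom - 1.0 / new_zoom)
    new_oy = offset_y + gaze_y * (1.0 / zoom - 1.0 / new_zoom)
    return new_ox, new_oy, new_zoom

# Example: zoom 2x while looking at screen point (800, 450).
print(zoom_at_gaze(0.0, 0.0, 1.0, 800.0, 450.0, 2.0))
# -> (400.0, 225.0, 2.0): the world point that was at (800, 450)
#    is still under the gaze after the zoom.
```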

Marggraff also showed the technology added to a pair of ODG’s R-6 augmented reality glasses. He demonstrated a “World Store” application that snaps a picture of whatever the user is looking at, hands the image off to an outside service for identification, and then buys the item on Amazon.
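As described, that's a three-step pipeline: capture, recognize, order. Here's a hedged sketch of the flow, in which every endpoint and function name is a hypothetical stand-in; the article doesn't say which services Eyefluence actually used:

```python
# Hypothetical sketch of the "World Store" flow as demoed: capture what
# the wearer is looking at, send the image to an external recognition
# service, then place an order. All URLs and response shapes below are
# illustrative stand-ins, not Eyefluence's actual integrations.
import requests

RECOGNIZE_URL = "https://example.com/recognize"  # hypothetical endpoint
ORDER_URL = "https://example.com/order"          # hypothetical endpoint

def world_store_purchase(frame_jpeg: bytes) -> dict:
    """Identify the object in a gaze-centered camera frame and buy it."""
    # 1. Hand the snapshot off to an outside image-recognition service.
    resp = requests.post(RECOGNIZE_URL, files={"image": frame_jpeg})
    resp.raise_for_status()
    product = resp.json()  # e.g. {"name": "...", "sku": "..."}

    # 2. Order the identified item through a storefront API.
    order = requests.post(ORDER_URL, json={"sku": product["sku"], "qty": 1})
    order.raise_for_status()
    return order.json()
```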

Eyefluence was naturally focused on its eye-tracking technology, but I think headset makers will be best off combining eye tracking with head tracking, and even with physical controllers and hand gestures. Using your eyes and head to navigate is fine for some tasks, but something like manipulating a 3D manufacturing model seems better suited to more granular physical input on top of that.

But once you know what eye tracking is capable of, using a headset without it just feels like a lesser experience.

There’s one other snazzy feature of Eyefluence’s technology: it can power foveated rendering, a technique that generates high-quality graphics only at the spot where someone is looking and shows low-res imagery everywhere else.

Headset makers want to get rid of the cables that tether their devices to computers, and one way to do that is to reduce the processing power required. Foveated rendering helps there: Marggraff said it can deliver big performance gains without users perceiving a difference.
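A back-of-envelope calculation shows why the savings can be that dramatic: shade a small, gaze-centered circle at full resolution and the periphery at reduced resolution. The buffer size, foveal radius, and downscale factor below are illustrative assumptions, not Eyefluence's published numbers:

```python
# Back-of-envelope sketch of why foveated rendering saves so much work:
# shade a gaze-centered circle at full resolution and everything else
# at a reduced resolution. All numbers here are illustrative guesses.
import math

def shaded_pixel_fraction(width, height, fovea_radius_px, periphery_scale):
    """Fraction of full-resolution pixel work left after foveation.

    periphery_scale is the linear downscale of the peripheral render:
    0.5 means half resolution per axis, i.e. a quarter of the pixels.
    """
    total = width * height
    fovea = math.pi * fovea_radius_px ** 2            # full-res circle at gaze
    periphery = (total - fovea) * periphery_scale ** 2
    return (fovea + periphery) / total

# One Rift-era eye buffer (~1080x1200) with a ~300 px foveal radius and
# a half-resolution periphery:
print(1 - shaded_pixel_fraction(1080, 1200, 300, 0.5))  # ~0.59
```

With those example numbers, roughly 59 percent of the shading work disappears, in line with the figure Marggraff quoted in the demo that follows.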

He loaded up a rendered scene for me in Unity and switched on foveated rendering. Marggraff said the technology had reduced the load on the computer rendering the scene by 60 percent, yet I didn't notice any difference.

That may have had something to do with the fact that I'd removed my massive nerd glasses to fit my head inside the modified Rift, but even so, the effect was impressive.

That gives Eyefluence multiple angles for selling its technology to headset makers. One company might want a user interface optimized for hands-free eye interaction, with foveated rendering as just a bonus; for another, it could be the other way around.

Eyefluence is counting on partners to bring its technology to market, and it's not yet saying when the technology will show up in a finished product, though Marggraff said it's not imminent.

Blair Hanley Frank
