Dozens of augmented reality HMDs have emerged over the past four years, and in that time I’ve used most of them. As a Human Factors technician at Eyefluence, user experience is one of the first things I think about when I put on an HMD. The current generation of devices is incredible. They demonstrate what the future holds for AR, but they’ve only scratched the surface of what’s possible when you blend the digital and physical worlds together. AR devices are still maturing, still experiencing growing pains, but even as we watch their awkward development it becomes clear they are destined for great things.
Consider as an analogy a particularly gifted child: they’re smart, extraordinary, even expert in particular areas, but they’re still a child. Still immature. That’s AR for me. That’s not to say it’s not amazing! There are products I’ve donned that make me feel like a sorcerer. Take the Microsoft HoloLens, for example. If you’ve ever had that dream of magic, of being able to directly control and manipulate the world around you, the HoloLens makes you feel it. For a moment it really did make me feel like a wizard.
Then, instinctively, I attempted to control the interface with my gaze. I looked at something I wanted rather than turning my head toward it, and when I couldn’t, when the reticle didn’t move to where I was looking, I felt an overwhelming sense of frustration. For a moment I felt trapped, claustrophobic even. I’ve been conditioned, in Pavlovian fashion, to expect my eyes to have control, to be the vehicle through which actions are channeled. When that doesn’t happen the magic dissipates, and while I know the HoloLens is beautiful and extraordinary, I also know it’s incomplete.
Every future AR product will have eye-tracking and eye-interaction. How could it not? When you are wearing a display on your face, your first instinct is to control the interface with your eyes, whether you realize it or not. Every action begins with your eyes, through purposeful and non-purposeful eye movements alike. I’ve experienced this personally and can say that eye-interaction adds the final degree of freedom for AR. Once a user experiences that freedom, it’s impossible to take it away. Eyefluence gaze technology is the Foster to AR’s banana, and the two together make a proper dessert.
Okay, the banana thing is a cheesy sentiment, to be sure, but few things separate man from ape more starkly than the ability to transform the raw materials around us into the complex cultural and technological trappings of society; hence there are few better examples than the transformation of a simple banana into bananas Foster. This, then, is what AR without eye-interaction is for me: a banana. Nothing wrong with a banana, but I find it wanting. At Eyefluence we transform intent into action through your eyes, and AR needs all of that action to push through to the next stage of its evolution.