
Eye-tracking AR systems could give rise to as many problems as they solve

Systems that attempt to make augmented reality less intrusive could throw off our sense of how we’re moving, and might even lead to disorienting motion sickness, says Eric Sabelman, a functional neurosurgery bioengineer at Kaiser Permanente. The problem, says Sabelman, is that shunting alerts and other interface elements to the periphery of our visual field isn’t much better than having them float in our central field of view.

“There seems to be an assumption that peripheral vision is idle and putting icons there will have no impact on visual signal processing. Most of our information on movement/velocity relative to the environment comes from peripheral vision,” Sabelman said in an email.

The research he was addressing, a prototype by Japanese researchers that combines eye tracking with a traditional eyeglasses-based augmented reality projection system to shunt interface elements into peripheral vision, uses as one example the problem of giving GPS navigation directions to a bicycle rider without blocking the rider’s view. That’s precisely the kind of scenario in which placing elements in peripheral vision could be problematic.
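To make the setup concrete, here is a minimal illustrative sketch of the general idea: keeping a navigation cue at a fixed angular offset from wherever the wearer is looking. It is not the researchers’ code; the get_gaze_direction call, the 25-degree offset, and the other names are hypothetical stand-ins for a real eye tracker and real design decisions.

```python
# Illustrative sketch only, not the prototype's actual code. get_gaze_direction()
# is a hypothetical stand-in for a real eye-tracking API, and the 25-degree
# offset is an assumed value, not one taken from the research.

PERIPHERAL_OFFSET_DEG = 25.0   # assumed offset pushing the cue out of central vision
CUE_SIDE = 1.0                 # +1.0 anchors the cue to the right of gaze, -1.0 to the left


def get_gaze_direction():
    """Hypothetical eye-tracker call: returns gaze (yaw, pitch) in degrees."""
    return 0.0, 0.0  # stub: wearer looking straight ahead


def peripheral_anchor(gaze_yaw, gaze_pitch):
    """Place a navigation cue at a fixed angular offset from the current gaze point,
    so it sits in peripheral rather than central vision."""
    return gaze_yaw + CUE_SIDE * PERIPHERAL_OFFSET_DEG, gaze_pitch


yaw, pitch = get_gaze_direction()
print("Render turn arrow at (yaw, pitch):", peripheral_anchor(yaw, pitch))
```

The point of such a design is that the cue never crosses the center of vision; Sabelman’s concern is what that persistent peripheral image does to motion perception while the wearer is moving.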

“[There is] no problem with a static image in the corner of your eye if you are at a desktop, but it will present conflicting information if you are walking or driving,” he adds.

Other complications could arise from interface elements that are fixed in place relative to our gaze. For example, the eye rapidly “accommodates” to an image at a fixed location on the retina, rendering it invisible. Keeping interface elements visible could require jiggling them subtly, which might lead to further visual confusion as the user’s brain interprets such movement as movement of their real-world surroundings.
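As a rough sketch of that workaround (purely hypothetical, not drawn from the prototype or from Sabelman’s comments), a renderer might nudge a gaze-locked element by a fraction of a degree each frame so its image never sits perfectly still on the retina; the amplitude and rate below are assumed values.

```python
import math

# Hypothetical illustration of the "jiggling" workaround described above.
# The 0.2-degree amplitude and 1.5 Hz rate are assumptions, not values from
# the research; a real system would tune them experimentally.

JITTER_AMPLITUDE_DEG = 0.2
JITTER_RATE_HZ = 1.5


def jittered_position(base_yaw, base_pitch, t_seconds):
    """Apply a small circular oscillation to a gaze-locked element's position."""
    phase = 2.0 * math.pi * JITTER_RATE_HZ * t_seconds
    return (base_yaw + JITTER_AMPLITUDE_DEG * math.cos(phase),
            base_pitch + JITTER_AMPLITUDE_DEG * math.sin(phase))


# Example: where the element sits a tenth of a second into rendering.
print(jittered_position(25.0, 0.0, 0.1))
```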

This suggests that using an augmented-reality heads-up display to text or browse the web while walking could be distracting enough to significantly impair a user’s ability to spot hazards in their environment.

Mixing fixed elements into a dynamic real environment could also lead to “simulator sickness” in some users, Sabelman speculates. The factors that contribute to illness in virtual reality environments aren’t completely understood, but static elements in a user’s peripheral vision could be a contributing factor.

This doesn’t mean augmented reality is automatically a boondoggle, any more than iPods and cell phones have proven to be. But it does mean we may someday have to expand on those no-texting-while-driving laws.

“I suppose we could learn to [walk or drive and use an AR system at the same time],” says Sabelman. “We can do all kinds of optical manipulations [on test subjects] and people will learn to interpret them correctly… but it adds extra load to our visual processing.”


