MIT Technology Review



Pollution visualized: Another application developed at Columbia shows carbon monoxide levels projected over New York City. The height of each ball reflects concentrations of the pollutant.

Another potential obstacle for AR is social acceptance. While people already text or check e-mail while they walk, looking through a phone can be awkward. Feiner suggests that well-designed goggles could help. “There’s a very high bar of what people are willing to wear on their heads,” he says.

Last spring, a group at the MIT Media Lab demoed an interface that avoids the need to look at a display altogether. Graduate student Pranav Mistry, a 2009 TR35 winner, developed SixthSense, a device that combines a webcam and a projector worn around the neck, along with colored markers on the fingers, to recognize a user's gestures and project information onto surfaces.

“Your world can be augmented without you having to change your behavior and do anything extra [like] taking out your cell phone and starting an application,” says MIT professor Pattie Maes, who heads the SixthSense project. Maes’s group is also exploring technical applications for AR. “If my car stops working, I might open the hood and an expert might remotely see what I see and [then] project information in front of the engine, saying things like, ‘Open this valve,’” explains Maes.

Nokia’s Mobile Augmented Reality Applications and Mixed Reality Experiences projects aim to combine several kinds of device hardware in AR applications. Ville-Veikko Mattila, the senior research manager at Nokia Research Center, believes that combining visual and audio information could be most practical. “I think it’s clear that people won’t be walking and holding a device upright. Therefore, the use of audio may be more intuitive,” he says.

Mattila adds that AR could potentially combine social information and location-based services to give user-tailored recommendations. For example, an application could show what your friends think of a particular restaurant, instead of providing a guidebook’s reviews.

“There’s a lot of hype obviously,” Feiner says. But ultimately he agrees that AR may be able to help people with their daily lives. “Like being able to get somewhere, find information, or recognize a face of a person you know, but can’t remember the name of,” he says.


Credits: Ohan Oda and Steve Feiner, Columbia University; Sean White and Steve Feiner, Columbia University

Tagged: Computing, displays, augmented reality, location-based services, context-aware computing

