
MIT Technology Review


Update: this article previously identified the firm Metaio as ‘Junaio.’ Turns out Junaio is the app, Metaio its maker.

Most “real world” augmented reality is primitive, at best. Applications like Layar and “rich storytelling” new media startup TagWhat produce software that can overlay, on your smartphone’s camera view, interesting information about places and objects around you, but that’s about it. Their interfaces are relatively static, relying on “tags” or pop-ups to provide information, rather than truly integrating their augmented reality with the real thing.

Not so Metaio. In a video just launched to promote upcoming AR conference insideAR, Metaio is showing off a demonstration technology that overlays augmented reality on top of real reality – with a fidelity of up to 40,000 polygons at once.

The results might look like they were rendered on a Nintendo GameCube, but keep in mind that Metaio is using a mere phone to analyze a scene, estimate the distance to objects and the phone's pose in six degrees of freedom, and project images on top of what it sees – all in real time.
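The last step of that pipeline – projecting imagery onto the camera view once the phone's pose is known – can be sketched with a simple pinhole-camera model. Everything below is an illustrative assumption, not Metaio's actual code: the function names are invented, the rotation is yaw-only for brevity (real 6-DoF tracking also handles pitch and roll), and the focal length is in pixels.

```python
import math

def project_point(point_world, cam_pos, yaw_rad, focal_px):
    """Project a 3-D world point onto the image plane of a camera at
    cam_pos looking along +z after a yaw rotation (hypothetical sketch)."""
    # Translate the point into the camera's frame of reference.
    x = point_world[0] - cam_pos[0]
    y = point_world[1] - cam_pos[1]
    z = point_world[2] - cam_pos[2]
    # Rotate about the vertical axis by -yaw so the camera looks down +z.
    c, s = math.cos(-yaw_rad), math.sin(-yaw_rad)
    xr = c * x + s * z
    zr = -s * x + c * z
    if zr <= 0:
        return None  # point is behind the camera, nothing to draw
    # Pinhole projection: pixel offset = focal length * (X/Z, Y/Z).
    return (focal_px * xr / zr, focal_px * y / zr)
```

A point 10 m straight ahead projects to the image center; a point 1 m to the side at the same depth lands 80 px off-center with an 800 px focal length – which is why small pose errors smear the overlay visibly across the screen.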

The problem that has plagued real-world augmented reality – the kind that, once unleashed, will probably be the most amazing thing anyone has seen since the PC revolution itself – is that it's really hard for a device to know exactly what its camera is pointing at, which it must know before it can overlay reality with a convincing augmented simulacrum.

GPS and similar location services aren’t accurate enough to pinpoint your location, and the orientation sensors in most phones barely know when you’ve rotated the device, much less what its exact pitch and yaw are.

But if a device, like the Tegra-2 and Android-powered Samsung smartphone used in this demo, has stored in it a complete three-dimensional representation of a given physical environment, it can simply analyze the scene visually and map augmented reality on top of it using that information. In this way, GPS and orientation sensors are used solely to narrow down the number of different scenes the device has to search in its map of a given city.
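The two-stage search described above – a coarse GPS filter followed by a visual match against stored scene maps – might look like this in outline. The scene database, the set-based "features", and all names here are hypothetical stand-ins for whatever representation Metaio actually uses:

```python
import math

# Hypothetical scene database: each entry pairs a GPS anchor with a set of
# visual landmark features (an assumed structure, purely for illustration).
SCENE_MAPS = [
    {"name": "plaza",  "lat": 48.137, "lon": 11.575, "features": {"fountain", "arch"}},
    {"name": "museum", "lat": 48.139, "lon": 11.580, "features": {"column", "statue"}},
    {"name": "bridge", "lat": 48.500, "lon": 11.900, "features": {"railing", "arch"}},
]

def gps_distance_km(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two GPS fixes, in km."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def locate_scene(lat, lon, camera_features, radius_km=1.0):
    """Stage 1: keep only scenes near the GPS fix.
    Stage 2: pick the candidate whose stored landmarks best overlap
    what the camera currently sees."""
    candidates = [s for s in SCENE_MAPS
                  if gps_distance_km(lat, lon, s["lat"], s["lon"]) <= radius_km]
    if not candidates:
        return None
    return max(candidates, key=lambda s: len(s["features"] & camera_features))
```

The point of the design is that the expensive step – visual matching – runs only against the handful of scenes the cheap GPS filter lets through, rather than against a whole city.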

This, in short, is what real augmented reality looks like. And when it arrives – in conjunction with displays that fit into eyeglasses, which need to improve significantly before they can be used for this application – it will transform our lives as much as PCs and smartphones have, if not more.

via Smartplanet


Tagged: Computing, augmented reality
