
German Firm Metaio Demonstrates Real Augmented Reality

Overcoming the registration problem in augmented reality is easy – as long as you’re willing to cache a three-dimensional map of an entire city

Update: this article previously identified the firm Metaio as ‘Junaio.’ Turns out Junaio is the app, Metaio its maker.

Most “real world” augmented reality is primitive at best. Applications like Layar and the software from “rich storytelling” new-media startup TagWhat can overlay interesting information about nearby places and objects on your smartphone’s camera view, but that’s about it. Their interfaces are relatively static, relying on “tags” or pop-ups to present information rather than truly integrating the augmented content with the real thing.

Not so Metaio. In a video just launched to promote upcoming AR conference insideAR, Metaio is showing off a demonstration technology that overlays augmented reality on top of real reality – with a fidelity of up to 40,000 polygons at once.

The results might look like they were rendered on a Nintendo GameCube, but keep in mind that Metaio’s software is using a mere phone to analyze a scene, estimate the distance to objects and the phone’s own pose in six degrees of freedom, and project images on top of what it sees – all in real time.
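To make that pipeline concrete, here is a minimal sketch of the per-frame loop such a system has to run, written in Python with entirely hypothetical placeholder functions (the article describes no code): estimate the camera’s six degrees of freedom from the image, then render the virtual geometry from that pose before the next frame arrives.

```python
import numpy as np

def estimate_pose(frame, scene_model):
    """Placeholder: a real tracker matches image features against a cached
    3D model of the scene and solves for the camera's six degrees of freedom
    (three for position, three for orientation)."""
    rotation = np.eye(3)       # identity rotation as a stand-in
    translation = np.zeros(3)  # camera at the map origin as a stand-in
    return rotation, translation

def render_overlay(frame, rotation, translation, virtual_content):
    """Placeholder: project the virtual geometry into the image using the
    estimated camera pose and composite it over the live frame."""
    return frame

def ar_loop(camera_frames, scene_model, virtual_content, show):
    # Everything here has to finish within one frame interval (roughly 33 ms
    # at 30 fps), or the overlay visibly lags behind the real scene.
    for frame in camera_frames:
        rotation, translation = estimate_pose(frame, scene_model)
        show(render_overlay(frame, rotation, translation, virtual_content))
```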

The problem that has plagued real-world augmented reality (the kind that, once unleashed, will probably be the most amazing thing anyone has seen since the PC revolution itself) is registration: it’s very hard for the camera that is supposed to overlay reality with a convincingly augmented simulacrum of it to know exactly what it’s pointing at.

GPS and similar location services aren’t accurate enough to pinpoint your location, and the orientation sensors in most phones barely know when you’ve rotated the device, much less what its exact pitch and yaw are.

But if a device, like the Tegra 2-powered, Android-based Samsung smartphone used in this demo, has a complete three-dimensional representation of a given physical environment stored on it, it can simply analyze the scene visually and map augmented reality on top of it using that information. GPS and the orientation sensors are then used only to narrow down which scenes the device has to search in its stored map of a given city.
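A minimal sketch of that idea, in Python with hypothetical data structures (nothing here comes from Metaio’s actual system): coarse GPS is used only to shortlist which cached scene maps are worth matching against, and the precise registration then comes from comparing the camera image with those few candidates.

```python
import math
from dataclasses import dataclass, field

@dataclass
class SceneMap:
    """One pre-built 3D map tile covering a small patch of the city."""
    name: str
    lat: float
    lon: float
    points_3d: list = field(default_factory=list)  # cached 3D geometry

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in metres (haversine formula);
    accurate enough for shortlisting map tiles."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def shortlist(maps, gps_lat, gps_lon, radius_m=150.0):
    """GPS is only good to tens of metres, so it is used purely to prune the
    search; visual matching then runs against a handful of nearby tiles."""
    return [m for m in maps if distance_m(m.lat, m.lon, gps_lat, gps_lon) <= radius_m]

def visual_match_score(frame, scene_map):
    """Placeholder: a real system compares image features in the frame with
    scene_map.points_3d and returns how well they line up."""
    return 0.0

def register(frame, maps, gps_lat, gps_lon):
    candidates = shortlist(maps, gps_lat, gps_lon)
    if not candidates:
        return None
    # The expensive visual step runs only on the GPS-pruned candidates.
    return max(candidates, key=lambda m: visual_match_score(frame, m))
```

The point of the sketch is only the division of labour described above: cheap, coarse sensors prune the search, and the camera does the precise work.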

This, in short, is what real augmented reality looks like. And when it arrives – in conjunction with displays that fit into eyeglasses, which need to improve significantly before they can be used for this application – it will transform our lives as much as PCs and smartphones have, if not more.

via Smartplanet
