
MIT Technology Review


Being able to see with both eyes comes with a perk: the ability to judge distance in 3D. Say, between a plate of food on the table and the saltshaker, or the space between the front of your car and the bumper of the vehicle ahead of you.

People who’ve lost sight in one eye can still see with the other, but they lack binocular depth perception.

Some of them could benefit from a pair of augmented-reality glasses being built at the University of Yamanashi in Japan that artificially introduce a feeling of depth in a person’s healthy eye.

The group, led by Xiaoyang Mao, started out with a pair of commercially available 3D glasses, the daintily named Wrap 920AR, manufactured by Vuzix Corporation. (Vuzix is also building another AR headset, the M100, which at first sight looks like quite the competitor to Google Glass.)

The Wrap 920AR looks like a pair of regular tinted glasses, but with small cameras poking out of each lens. The lenses are transparent, and the device, Vuzix explains on its website, both captures and projects images, giving the wearer front-row seats to a 2D or 3D AR show transmitted from a computer.

The group at Yamanashi has created software that makes use of the twin cameras. When a person puts the glasses on, each camera scopes out the scene the corresponding eye would see. The images are funneled into software on a computer, which combines the perspectives of both cameras and creates a “defocus” effect: some objects stay in focus while others are blurred, producing a feeling of depth. That version of the scene is then projected to the wearer’s single healthy eye.
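The researchers haven’t published their algorithm, but the basic idea of a defocus effect can be illustrated with a short sketch: blur the pixels whose depth lies away from a chosen focal plane, and leave the rest sharp. The grayscale image, depth map, and simple box blur below are all illustrative assumptions, not the Yamanashi group’s actual method.

```python
def defocus(image, depth, focal_depth, threshold=1.0):
    """Blur pixels whose depth differs from the focal plane.

    image: 2D list of grayscale values
    depth: 2D list of per-pixel depths (same shape as image)
    focal_depth: depth that should remain sharp
    threshold: how far a pixel's depth may stray before it is blurred
    """
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if abs(depth[y][x] - focal_depth) > threshold:
                # Out-of-focus pixel: replace it with a 3x3 box-blur average.
                total, n = 0, 0
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            total += image[ny][nx]
                            n += 1
                out[y][x] = total / n
    return out
```

A real system would estimate the depth map by matching features between the two camera images (stereo disparity) and would use a smoother, distance-weighted blur, but the principle is the same: selective sharpness is what gives the single eye its depth cue.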

Eight volunteers, each with two healthy eyes, tested the setup. They had one task: to pick up a cylindrical peg and place it in a groove in front of them. All but one of the volunteers did this more quickly when the composite image was projected into one lens.

The system isn’t quite ready to be taken for a spin around town yet. It’s still bulky, the creators write, and needs a computer at its side, creating and projecting images in real time. But they note that such computing power is likely to be found on mobile devices soon, and when it is, they’ll be ready.


Credit: Vuzix Corporation

