A View from Nidhi Subbaraman
AR Goggles Restore Depth Perception to People Blind in One Eye
Software written for augmented reality glasses creates and projects images for the healthy eye, giving a wearer the feeling of depth.
Being able to see with both eyes comes with a perk: the ability to judge distance in 3D. Say, between a plate of food on the table and the saltshaker, or the space between the front of your car and the bumper of the vehicle ahead of you.
People who’ve lost sight in one eye can still see with the other, but they lack binocular depth perception.
Some of them could benefit from a pair of augmented reality glasses being built at the University of Yamanashi in Japan that artificially introduces a feeling of depth in a person’s healthy eye.
The group, led by Xiaoyang Mao, started out with a pair of commercially available 3D glasses, the daintily named Wrap 920AR, manufactured by Vuzix Corporation. (Vuzix is also building another AR headset called the M100 that at first sight looks like quite the competitor to Google Glass.)
The Wrap 920AR looks like a pair of regular tinted glasses, but with small cameras poking out of each lens. The lenses are transparent and the device, Vuzix explains on its website, both captures and projects images, giving the wearer of the device front-row seats to a 2D or 3D AR show transmitted from a computer.
The group at Yamanashi has created software that makes use of the twin cameras. When a person puts the glasses on, each camera scopes out the scene that the corresponding eye would see. The images are fed into software on a computer, which combines the perspectives of both cameras and creates a “defocus” effect: some objects stay in focus while others are blurred, resulting in a feeling of depth. That version of the scene is then projected to the wearer’s single healthy eye.
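The core idea of a defocus effect can be sketched in a few lines of code. The example below is a minimal illustration, not the Yamanashi group’s actual pipeline: it assumes a per-pixel disparity map (the depth proxy a real system would estimate from the twin cameras) is already available, and the function name and parameters are hypothetical.

```python
import numpy as np

def defocus_composite(image, disparity, focal_disparity, tolerance=0.1, blur_radius=3):
    """Keep pixels near the focal plane sharp; blur everything else.

    image: 2D grayscale array.
    disparity: per-pixel disparity (depth proxy), same shape as image.
    focal_disparity: the disparity value the viewer is "focused" on.
    """
    # Separable box blur (mean filter) applied along rows, then columns.
    kernel = np.ones(2 * blur_radius + 1) / (2 * blur_radius + 1)
    blurred = image.astype(float)
    for axis in (0, 1):
        blurred = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), axis, blurred)
    # Pixels whose disparity is close to the focal plane keep their
    # original value; the rest take the blurred value, simulating defocus.
    in_focus = np.abs(disparity - focal_disparity) <= tolerance
    return np.where(in_focus, image.astype(float), blurred)
```

Selecting a different `focal_disparity` shifts which depth layer appears sharp, which is the monocular cue that gives the wearer a sense of distance.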
Eight volunteers, each with two healthy eyes, tested the setup. They had one task: to pick up a cylindrical peg and place it in a groove in front of them. All but one of the volunteers did this more quickly when the composite image was projected into one lens.
The system isn’t quite ready to be taken for a spin around town yet. It’s still bulky, the creators write, and needs a computer by its side to create and project images in real time. But they note that such computing power is likely to reach mobile devices soon, and when it does, they’ll be ready.