
MIT Technology Review


A desktop seeing machine created at MIT by a visually impaired artist could help people with poor vision view images, use the Internet, virtually “previsit” unfamiliar buildings, or see the faces of friends.

Its creator, Elizabeth Goldring, is a senior fellow at MIT’s Center for Advanced Visual Studies and has no vision in one eye and little in the other. She got the idea for her invention more than 15 years ago, when she had her eyes examined with a large, expensive machine called a scanning laser ophthalmoscope.

The machine projects an image directly onto a patient’s retina to help determine how much, if any, retinal function he or she has. Someone who still has some healthy retinal cells will be able to see the image when it’s projected onto those cells. Goldring did see images of stick figures, but as a poet as well as an artist, she very much wanted to see a word. At her request, she was shown the word “sun.” She was thrilled.

After that exam, Goldring used the ophthalmoscope–which costs more than $100,000–for a project of her own. She created a “visual language” with hundreds of symbols–representing both nouns and verbs–that could be projected onto the retina. Each symbol is a combination of letters and ­simple graphics; for example, the word “door” is spelled with a d, the outline of a door, and an r. The symbols, ­Goldring says, are more visually economical than their text equivalents.

Goldring’s next goal was to make a cheaper, more portable version of the costly device. With the collaboration of Rob Webb, the ophthalmoscope’s inventor, and dozens of scientists, engineers, and students, that’s what she did. The seeing machine–about the size of a bread box–has an eyepiece, a projector, a computer, a monitor, and a joystick. To cut expenses (the prototype cost about $4,000 to build), she used light-emitting diodes instead of a laser. When a person looks through the machine’s eyepiece, the LEDs project black-and-white images and words from Goldring’s visual language across the entire retina. If any part of the retina is healthy, the person may see the image.

Goldring conducted a pilot clinical trial with 10 visually impaired people. Six correctly interpreted every word-image they were presented with, and all could navigate the corridors of a simulated building using a joystick to move forward, backward, and sideways.

Although the device has been called a “seeing machine,” Goldring is not developing a wearable version to help visually impaired people get around. “It’s too much to expect someone who is visually challenged to see and walk at the same time,” she says. But she is now working on a smaller, even cheaper machine that will allow people to see in color.


Credit: Donna Coveney/MIT
