A desktop seeing machine created at MIT by a visually impaired artist could help people with poor vision view images, use the Internet, virtually “previsit” unfamiliar buildings, or see the faces of friends.
Its creator, Elizabeth Goldring, is a senior fellow at MIT’s Center for Advanced Visual Studies and has no vision in one eye and little in the other. She got the idea for her invention more than 15 years ago, when she had her eyes examined with a large, expensive machine called a scanning laser ophthalmoscope.
The machine projects an image directly onto a patient’s retina to help determine how much, if any, retinal function he or she has. Someone who still has some healthy retinal cells will be able to see the image when it’s projected onto them. Goldring did see images of stick figures, but as a poet as well as an artist, she very much wanted to see a word. At her request, she was shown the word “sun.” She was thrilled.
After that exam, Goldring used the ophthalmoscope, which costs more than $100,000, for a project of her own. She created a "visual language" with hundreds of symbols, representing both nouns and verbs, that could be projected onto the retina. Each symbol is a combination of letters and simple graphics; for example, the word "door" is spelled with a d, the outline of a door, and an r. The symbols, Goldring says, are more visually economical than their text equivalents.
Goldring's next goal was to make a cheaper, more portable version of the costly device. With the collaboration of Rob Webb, the ophthalmoscope's inventor, and dozens of scientists, engineers, and students, that's what she did. The seeing machine, about the size of a bread box, has an eyepiece, a projector, a computer, a monitor, and a joystick. To cut expenses (the prototype cost about $4,000 to build), she used light-emitting diodes instead of a laser. When a person looks through the machine's eyepiece, the LEDs project black-and-white images and words from Goldring's visual language across the entire retina. If any part of the retina is healthy, the person may see the image.
Goldring conducted a pilot clinical trial with 10 visually impaired people. Six correctly interpreted every word-image they were presented with, and all could navigate the corridors of a simulated building using a joystick to move forward, backward, and sideways.
Although the device has been called a “seeing machine,” Goldring is not developing a wearable version to help visually impaired people get around. “It’s too much to expect someone who is visually challenged to see and walk at the same time,” she says. But she is now working on a smaller, even cheaper machine that will allow people to see in color.