The world’s first bionic eyes have now been attached to the retinas of dozens of blind or nearly blind people, and we are only beginning to get a sense of what those patients see.
People with these implants can distinguish light from dark, and they can recognize the outlines of objects in their view. But the artificially created vision is also distorted in certain characteristic ways, says Geoffrey Boynton, a professor of psychology at the University of Washington. New computer-simulated images help illustrate these distortions, says Boynton, who conducted the research with fellow University of Washington psychology professor Ione Fine. The simulations draw on reports from people with retinal implants as well as well-established knowledge of how cells in the retina respond to electrical signals. This information can serve as the basis of future, more advanced models that might help technologists develop next-generation devices with a better chance at re-creating real vision.
The only clinically approved retinal implant is a device called the Argus II, made by the company Second Sight (see “Bionic Eye Implant Approved for U.S. Patients”). It has been used to treat patients with retinitis pigmentosa, a disease characterized by degeneration of photoreceptors, the cells in the retina that are sensitive to light. A camera captures images and the device converts them into electrical pulse patterns, which are then delivered to the retina via an implanted electrode array.
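The core of that pipeline, converting a camera image into a coarse grid of per-electrode pulse amplitudes, can be sketched in a few lines. This is an illustrative simplification, not Second Sight's actual processing; the 6-by-10 grid size and the simple block-averaging scheme are assumptions chosen to reflect the low electrode count of current arrays.

```python
import numpy as np

def image_to_pulse_amplitudes(image, rows=6, cols=10):
    """Downsample a grayscale image (2-D array, values in 0-1) to one
    pulse amplitude per electrode by averaging over blocks of pixels.
    The grid size is an illustrative assumption, not the device's spec."""
    h, w = image.shape
    # Trim the image so it divides evenly into electrode-sized blocks.
    image = image[: h - h % rows, : w - w % cols]
    blocks = image.reshape(rows, image.shape[0] // rows,
                           cols, image.shape[1] // cols)
    return blocks.mean(axis=(1, 3))  # shape (rows, cols)

# A bright vertical bar in the camera image becomes a coarse column
# of strong pulses on the electrode grid.
img = np.zeros((60, 100))
img[:, 45:55] = 1.0
amps = image_to_pulse_amplitudes(img)
```

Even at this toy scale, the downsampling makes clear how much detail is lost before the signal ever reaches the retina: a 100-pixel-wide image collapses to ten electrode columns.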
One challenge to achieving better vision with today’s prosthetics is that because of the retina’s anatomy, the electrode arrays tend to stimulate more cells than the ones they are targeting. This is why patients report seeing streaks, says Boynton. Another difficulty is that today’s implants don’t account for the wide range of cell types in the retina. As a consequence, certain cells fire together that would not do so in a normal eye, making the resulting images difficult to comprehend.
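The streak effect can be illustrated with a crude toy model: because an electrode also excites the axons of cells that merely pass over it, a point of stimulation gets smeared along the axon direction. The code below is a deliberately simplified sketch, not the model Fine and Boynton actually used; the fixed horizontal axon direction and exponential falloff are illustrative assumptions.

```python
import numpy as np

def simulate_streak(stim, axon_len=15, decay_const=5.0):
    """Toy illustration of axonal stimulation: smear each stimulated
    point along one fixed 'axon' direction with exponential falloff,
    producing comet-like streaks instead of clean spots."""
    h, w = stim.shape
    out = np.zeros_like(stim, dtype=float)
    falloff = np.exp(-np.arange(axon_len) / decay_const)
    for d, weight in enumerate(falloff):
        shifted = np.zeros_like(out)
        shifted[:, d:] = stim[:, : w - d]  # shift along the assumed axon path
        out = np.maximum(out, weight * shifted)
    return out

spot = np.zeros((20, 40))
spot[10, 5] = 1.0              # a single stimulated electrode
percept = simulate_streak(spot)  # the spot now trails off to one side
```

In the real retina the axon paths curve toward the optic disc, so the streaks would bend rather than run straight, but the basic point survives the simplification: one electrode lights up an elongated trail of cells, not a single point.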
Many efforts to improve the capacity of retinal prosthetics have focused on increasing the resolution of the electrode array (see “Vision-Restoring Implants That Fit Inside the Eye”). But Fine and Boynton’s simulations assume that the array they are modeling has significantly higher resolution than the Argus II, and the resulting images suggest that other approaches are needed to create more comprehensible perceptual experiences.