With funding from the U.S. Department of Defense, several researchers are making progress toward more humanlike prosthetic hands—ones that give users a sense of control and touch.
Scientists from Stanford announced a new type of pressure sensor: a flat, flexible material that might eventually serve as an artificial skin covering prosthetics, allowing users not just to manipulate objects but also to feel them. The sensors send pulses that the brain interprets as a sense of touch. “It’s directly mimicking the biological system,” says lead researcher Zhenan Bao.
The “skin” is made from plastic printed with a waffle pattern to make it compressible. Embedded inside are carbon nanotubes—tiny cylinders of pure carbon that conduct electricity. Squeezing the material brings the nanotubes closer together, producing more rapid pulses as the pressure increases.
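The key idea is that the sensor encodes pressure as pulse frequency, the same way biological touch receptors do. A toy sketch of that mapping, with invented constants that are purely illustrative and not taken from the paper:

```python
# Illustrative only: a toy model of a sensor that, like the Stanford "skin",
# encodes applied pressure as pulse frequency. The linear mapping and the
# constants below are assumptions for demonstration, not the paper's model.

def pulse_rate_hz(pressure_kpa: float, base_rate: float = 5.0, gain: float = 2.0) -> float:
    """Map applied pressure to a pulse frequency: more pressure, faster pulses."""
    return base_rate + gain * pressure_kpa

# A firm squeeze produces noticeably faster pulses than a light touch.
light_touch = pulse_rate_hz(1.0)   # gentle contact
firm_grip = pulse_rate_hz(50.0)    # handshake-level pressure
assert firm_grip > light_touch
```

In the real device the relationship comes from the physics of the compressed waffle structure rather than a fixed formula, but the principle is the same: a downstream system reads pulse rate, not raw voltage.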
In a paper in this week’s issue of Science, Bao and colleagues claim the sensors can pick up gradations in pressure that are equivalent to the difference between a firm handshake and a limp one. This is just one component of touch, and it wasn’t tested in humans. Instead, Bao and her colleagues sent the signals to slices of mouse brain in vitro—just to show they could get the sensor to communicate with neurons.
Still, it’s a promising step in the quest to make prosthetic limbs feel more real. “It’s impossible to do much of anything with a prosthetic hand if you can’t feel,” says Sliman Bensmaia, a biologist and computational neuroscientist who works on artificial limbs at the University of Chicago. Touch, he says, is hard to re-create because it’s a surprisingly complex sense. We can not only tell silk from satin but also distinguish cheap silk from good silk, because our skin can sense textures down to tens of nanometers.
Eventually the researchers are hoping to channel information from artificial sensors into the peripheral nerves that were once connected to the lost hand. Already they’ve created interfaces that give users the ability to open and close their hands (see “An Artificial Hand With Real Feeling”). What they’d like next is the fine coordination that allows us to move each finger separately.
And that will require electrical pulses to travel both ways—signals from a patient’s muscles and nerves moving the prosthesis, and sensors giving natural-seeming feedback to the patient. It’s a translation task, in part, as they try to get the sensors to speak the language of the nervous system, says Dustin Tyler, a professor of biomedical engineering at Case Western Reserve University.
The hard part is in creating an interface between the prosthetic and the patient that allows this language to be transmitted in all its complexity. “We’re not at the point where we can reproduce natural touch, but we are at the point where we can convey useful touch sensation,” says Bensmaia.