Artificial Skin Provides a Step Toward Bionic Hands
With funding from the U.S. Department of Defense, several researchers are making progress toward more humanlike prosthetic hands—ones that give users a sense of control and touch.

Scientists at Stanford have announced a new type of pressure sensor: a flat, flexible material that might eventually serve as an artificial skin over prosthetics, allowing users not just to manipulate objects but also to feel them. The sensors send pulses that the brain interprets as a sense of touch. “It’s directly mimicking the biological system,” says lead researcher Zhenan Bao.
The “skin” is made from plastic that is printed with a waffle pattern to make it compressible. Embedded inside are carbon nanotubes—tiny cylinders of pure carbon that conduct electricity. Squeezing the material brings the tubes closer together, producing more rapid pulses as the pressure increases.
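The behavior described above—harder pressure yielding faster pulses—can be illustrated with a minimal sketch. This is not the Stanford team's actual model; the function name, baseline rate, and gain are all hypothetical placeholders chosen only to show a monotonic pressure-to-pulse-rate mapping.

```python
def pulse_rate_hz(pressure_kpa: float, base_hz: float = 5.0, gain: float = 2.0) -> float:
    """Hypothetical mapping: more applied pressure -> faster pulses.

    base_hz and gain are illustrative constants, not measured values.
    """
    if pressure_kpa < 0:
        raise ValueError("pressure cannot be negative")
    return base_hz + gain * pressure_kpa

# A limp handshake and a firm one should produce distinguishable pulse rates.
limp = pulse_rate_hz(5.0)   # light grip
firm = pulse_rate_hz(50.0)  # firm grip
assert firm > limp
```

In the real device the pulse train would be generated electrically by the compressed nanotube network; the sketch only captures the qualitative relationship the article reports.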
In a paper in this week’s issue of Science, Bao and colleagues report that the sensors can detect gradations in pressure equivalent to the difference between a firm handshake and a limp one. This is just one component of touch, and it wasn’t tested in humans. Instead, Bao and her colleagues sent the signals to slices of mouse brain in vitro—just to show that the sensor could communicate with neurons.
Still, it’s a promising step in the quest to make prosthetic limbs more lifelike. “It’s impossible to do much of anything with a prosthetic hand if you can’t feel,” says Sliman Bensmaia, a biologist and computational neuroscientist who works on artificial limbs at the University of Chicago. Touch, he says, is hard to re-create because it’s a surprisingly complex sense. We can not only tell silk from satin but also distinguish cheap silk from good silk. We do this because our skin can sense textures down to tens of nanometers.
Eventually the researchers are hoping to channel information from artificial sensors into the peripheral nerves that were once connected to the lost hand. Already they’ve created interfaces that give users the ability to open and close their hands (see “An Artificial Hand With Real Feeling”). What they’d like next is the fine coordination that allows us to move each finger separately.
And that will require electrical pulses to travel both ways—signals from a patient’s muscles and nerves moving the prosthesis, and sensors giving natural-seeming feedback to the patient. It’s a translation task, in part, as they try to get the sensors to speak the language of the nervous system, says Dustin Tyler, a professor of biomedical engineering at Case Western Reserve University.
The hard part is creating an interface between the prosthesis and the patient that allows this language to be transmitted in all its complexity. “We’re not at the point where we can reproduce natural touch, but we are at the point where we can convey useful touch sensation,” says Bensmaia.