Artificial Skin Provides a Step Toward Bionic Hands

New sensors made from plastic and carbon nanotubes could provide an artificial skin that would endow prosthetic hands with a sense of touch.
October 15, 2015

With funding from the U.S. Department of Defense, several researchers are making progress toward more humanlike prosthetic hands—ones that give users a sense of control and touch.

The fingertips on this prosthetic hand are covered with a new form of artificial skin that can pick up slight pressure gradients.

Scientists at Stanford have announced a new type of pressure sensor: a flat, flexible material that might eventually serve as an artificial skin covering prosthetics, allowing users not just to manipulate objects but also to feel them. The sensors emit electrical pulses that the brain could interpret as a sense of touch. “It’s directly mimicking the biological system,” says lead researcher Zhenan Bao.

The “skin” is made from plastic printed with a waffle pattern that makes it compressible. Embedded inside are carbon nanotubes, tiny cylinders of pure carbon that conduct electricity. Squeezing the material brings the nanotubes closer together, and the sensor fires pulses more rapidly as the pressure increases.
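The pressure-to-pulse-rate transduction described above can be sketched as a toy model. Everything here is an illustrative assumption for readers who think in code, not the Stanford team's actual characterization: the linear mapping, the function name, and all the numbers are hypothetical.

```python
# Toy model of the artificial-skin sensor: squeezing the waffle-patterned
# plastic brings the conductive nanotubes closer together, and the sensor
# emits electrical pulses at a higher rate as pressure increases.

def pulse_rate_hz(pressure_kpa, base_rate=5.0, gain=2.0):
    """Map applied pressure (kPa) to a pulse frequency (Hz).

    A hypothetical linear model: base_rate is the resting pulse rate,
    gain controls how strongly pressure raises the firing rate.
    """
    if pressure_kpa < 0:
        raise ValueError("pressure cannot be negative")
    return base_rate + gain * pressure_kpa

# A limp handshake and a firm one produce clearly different pulse rates,
# which is the kind of gradation the paper reports the sensor can resolve.
limp = pulse_rate_hz(5.0)   # light grip
firm = pulse_rate_hz(50.0)  # firm grip
print(limp, firm)
```

In the real device the relationship need not be linear; the point is only that a downstream system (or, eventually, a nerve interface) can read pressure off the pulse frequency rather than a raw voltage.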

In a paper in this week’s issue of Science, Bao and colleagues claim the sensors can pick up gradations in pressure that are equivalent to the difference between a firm handshake and a limp one. This is just one component of touch, and it wasn’t tested in humans. Instead, Bao and her colleagues sent the signals to slices of mouse brain in vitro—just to show they could get the sensor to communicate with neurons.

Still, it’s a promising step in the quest to make prosthetic limbs more lifelike. “It’s impossible to do much of anything with a prosthetic hand if you can’t feel,” says Sliman Bensmaia, a biologist and computational neuroscientist who works on artificial limbs at the University of Chicago. Touch, he says, is hard to re-create because it’s a surprisingly complex sense. We can not only tell silk from satin but also distinguish cheap silk from good silk. We can do this because our skin senses textures down to tens of nanometers.

Eventually the researchers are hoping to channel information from artificial sensors into the peripheral nerves that were once connected to the lost hand. Already they’ve created interfaces that give users the ability to open and close their hands (see “An Artificial Hand With Real Feeling”). What they’d like next is the fine coordination that allows us to move each finger separately.

And that will require electrical pulses to travel both ways—signals from a patient’s muscles and nerves moving the prosthesis, and sensors giving natural-seeming feedback to the patient. It’s a translation task, in part, as they try to get the sensors to speak the language of the nervous system, says Dustin Tyler, a professor of biomedical engineering at Case Western Reserve University.

The hard part is creating an interface between the prosthesis and the patient that can transmit this language in all its complexity. “We’re not at the point where we can reproduce natural touch, but we are at the point where we can convey useful touch sensation,” says Bensmaia.
