Giving Prosthetics a Sense of Touch

A study gives a first demonstration that brain-machine interfaces can include touch feedback.

Brain-machine interfaces have made it possible for monkeys and some humans to control robotic limbs using just their thoughts. But ideally, a person using an artificial limb or other device would not only be able to control the device, but also feel what it’s touching.

Monkey see: In an experiment, monkeys implanted with two interfaces—one that reads their intended movements and another that delivers touch sensations—learned to operate the arm of a virtual monkey.

A new study from the lab of Miguel Nicolelis at Duke University Medical Center takes a first step toward such an interface. In a paper published today in Nature, his team reports that monkeys can learn to operate a virtual-reality hand that incorporates tactile feedback.

Nicolelis says that brain-machine interfaces will only be clinically useful if they use bidirectional signals, with both sensory feedback from the device and motor commands from the user. “It’s not enough to just provide motion,” he says. “You need to sense what you’re touching.”

As a first experiment, monkeys used a joystick to control a virtual “avatar” (a monkey arm and hand) on a computer screen, and were encouraged to use the avatar to grab objects on the screen. The virtual objects had textures, which were conveyed by electrical stimulation delivered through microwire arrays implanted in a part of the brain’s cortex responsible for sensing touch. The monkeys learned to hold the avatar’s hand over objects with a particular texture—conveyed by the frequency of stimulation—in order to be rewarded with food.
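
To make the setup concrete, here is a minimal sketch, in Python, of how a virtual texture might be mapped to a stimulation pulse frequency and how a hold-time reward rule could be checked. The texture labels, frequencies, and hold time are illustrative assumptions, not values reported in the study.

# Hypothetical sketch (not the study's actual code): map a virtual texture
# to an intracortical microstimulation frequency, and check a simple
# hold-time reward criterion.

# Assumed mapping from texture label to stimulation pulse frequency (Hz).
TEXTURE_TO_FREQ_HZ = {
    "rewarded": 200,    # high-frequency pulse train signals the target texture
    "unrewarded": 50,   # low-frequency train signals a distractor
    "none": 0,          # no stimulation for a texture-free object
}

HOLD_TIME_REQUIRED_S = 1.0  # assumed minimum hold time over the target


def stimulation_frequency(texture: str) -> int:
    """Return the pulse frequency used to convey a given virtual texture."""
    return TEXTURE_TO_FREQ_HZ.get(texture, 0)


def is_rewarded(texture_under_hand: str, hold_time_s: float) -> bool:
    """Reward only when the hand rests on the target texture long enough."""
    return texture_under_hand == "rewarded" and hold_time_s >= HOLD_TIME_REQUIRED_S


if __name__ == "__main__":
    print(stimulation_frequency("rewarded"))  # 200
    print(is_rewarded("rewarded", 1.2))       # True
    print(is_rewarded("unrewarded", 2.0))     # False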

In another experiment, the monkeys received the same tactile feedback but controlled the virtual hand using just their thoughts, via microwire arrays implanted in the motor cortex. Although their performance on the task was less accurate, the monkeys improved over time.

Nicolelis says the successful use of a “brain-machine-brain interface” demonstrates that the processes of sensing and responding to tactile sensations can be combined. “We are decoding motor intentions and tactile messages simultaneously,” he says. “That’s never been done before.” Although the stimulation the monkeys receive is artificial, he says, they seem to learn to associate it with tactile information.
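
The “brain-machine-brain” idea can be pictured as a loop that runs in both directions at once: recorded motor-cortex activity is decoded into movement of the avatar hand, while the texture under the hand is encoded back as a stimulation pulse train. The toy loop below, built on placeholder random data and an assumed linear decoder, only illustrates that structure; it is not the study’s actual decoder or stimulator.

# Hypothetical sketch of one cycle of a bidirectional loop, under assumed
# interfaces: decode_velocity() stands in for a motor decoder and
# texture_to_pulse_train() for an intracortical stimulation schedule.

import numpy as np


def decode_velocity(spike_counts: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Toy linear decoder: map a vector of spike counts to a 2-D hand velocity."""
    return weights @ spike_counts


def texture_to_pulse_train(freq_hz: float, duration_s: float) -> np.ndarray:
    """Return pulse times (s) for a constant-frequency stimulation train."""
    if freq_hz <= 0:
        return np.array([])
    return np.arange(0.0, duration_s, 1.0 / freq_hz)


# One simulated cycle of the loop with random placeholder data.
rng = np.random.default_rng(0)
spike_counts = rng.poisson(5, size=100)            # counts from 100 recorded units
weights = rng.normal(0, 0.01, size=(2, 100))       # toy decoder weights
velocity = decode_velocity(spike_counts, weights)  # drives the avatar hand
pulse_times = texture_to_pulse_train(200, 0.05)    # feedback for the texture touched
print(velocity, len(pulse_times))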

The next step is to incorporate the sense of touch into real prosthetics, using pressure sensors to generate similar tactile feedback about real-world objects. Nicolelis says his group hopes to build a simulator to test this approach in humans, then incorporate touch sensation into the prosthetics it is creating for people with reduced mobility.

Nitish Thakor, a biomedical engineer at Johns Hopkins University, says that adding sensory information “is absolutely the next logical step” in brain-machine interface design. Thakor says the experiment not only demonstrates the feasibility of adding touch, but also shows that the monkeys can learn a task using these coupled signals. The caveat, he adds, is that textures in the real world are much more complex, as are body movements, and “whether this is scalable remains to be seen.”
