Playing Piano with a Robotic Hand

Scientists are developing a neural interface that can control the movement of individual fingers on a prosthetic hand.
July 25, 2007

By tapping directly into the brain’s electrical signals, scientists at Johns Hopkins University, in Baltimore, are on their way to developing a prosthetic hand more dexterous than ever before. They have demonstrated for the first time that neural activity recorded from a monkey’s brain can control fingers on a robotic hand, making it play several notes on a piano.

Robotic hand: Scientists are developing a neural interface that can use signals in the brain to control fingers on a robotic hand, shown here.

“We would hope that eventually, we’ll be able to implant similar arrays permanently in the motor cortex of human subjects,” says Marc Schieber, a neuroscientist at the University of Rochester, in New York, who is working on the project. However, researchers caution that a practical human version of the neural interface is still a long way off.

Most prosthetic hands currently available are limited to a clawlike grasping motion. A significantly improved version, which went on the market last week, uses muscle contractions in the arms to individually control fingers. (See “A Hand for the Wounded.”) While this type of design is a huge boon to amputees, translating their intention to move into action via muscle activity requires conscious effort. In the long term, scientists would like to develop a prosthesis that is effortlessly controlled by the user’s thoughts. “If you can tap into the brain, you can record from the brain itself the intent of hand and finger movement,” says Nitish Thakor, a neuroengineer at Johns Hopkins, who is working on the project.

So far, scientists have made neural interfaces that allow monkeys, and in a few experimental cases paralyzed patients, to use their brain activity to reach and grasp with a robotic arm. (See “Brain Chips Give Paralyzed Patients New Powers.”) However, the more-sophisticated prosthetic hands currently being created require finer levels of control. “With the development of the highly dexterous prosthetic hand, we now have a motivation to try to control individual fingers,” says Thakor.

Multimedia

  • Watch the translation of neural activity into robotic finger movements.

  • Watch a robotic hand, controlled by neural activity, play "Frère Jacques."

To make the neural interface, researchers recorded brain-cell activity from monkeys as they moved their fingers in different ways. (A particular part of the motor cortex has previously been shown to control finger movement.) The scientists then created algorithms to decode these brain signals by identifying the specific activity patterns linked to particular movements. When the decoding system was connected to a robotic hand and fed new neural-activity patterns, the fingers on the hand performed the intended movement 95 percent of the time.

“The findings are extremely encouraging,” says Krishna Shenoy, a neuroscientist at Stanford University who isn’t involved in the research. The researchers presented their findings at a neural-engineering conference earlier this year.
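
As a rough illustration of that pipeline, here is a minimal decoding sketch in Python: synthetic firing rates stand in for the recordings, and an off-the-shelf linear classifier stands in for the team’s algorithms. The unit count, movement repertoire, and classifier choice are all assumptions made for illustration, not details from the study.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

N_NEURONS = 96     # recorded motor-cortex units (illustrative)
N_MOVEMENTS = 12   # distinct finger movements in the repertoire (illustrative)
TRIALS = 50        # recorded trials per movement (illustrative)

# Synthetic stand-in for the recordings: each movement evokes a characteristic
# firing-rate pattern across the units, plus trial-to-trial noise.
templates = rng.normal(10.0, 3.0, size=(N_MOVEMENTS, N_NEURONS))
X = np.vstack([t + rng.normal(0.0, 1.5, size=(TRIALS, N_NEURONS))
               for t in templates])
y = np.repeat(np.arange(N_MOVEMENTS), TRIALS)

# Fit the decoder on part of the data, then test it on held-out trials:
# the analogue of feeding the system "new neural-activity patterns."
X_train, X_test, y_train, y_test = train_test_split(
    X, y, random_state=0, stratify=y)
decoder = LinearDiscriminantAnalysis().fit(X_train, y_train)
print(f"Held-out decoding accuracy: {decoder.score(X_test, y_test):.0%}")
```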

These initial experiments have been performed “off-line,” meaning that brain activity was recorded and then fed into the system at a later time. But researchers are planning a live demonstration within the next six months. Monkeys implanted with an array of recording electrodes will be hooked up directly to a virtual version of a prosthetic arm, which is currently under development. Scientists will then determine how well these animals, which are trained to perform specific hand movements, can use their brain activity to control the virtual hand in real time.
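
A hypothetical sketch of what such a closed loop might look like, with stub functions standing in for the recording array, the decoder, and the virtual hand; none of these names reflect the team’s actual software.

```python
import numpy as np

rng = np.random.default_rng(1)
N_NEURONS, N_MOVEMENTS = 96, 12

# Pretend we already learned one template firing pattern per movement.
templates = rng.normal(10.0, 3.0, size=(N_MOVEMENTS, N_NEURONS))

def read_spike_counts():
    """Stub for the implanted array: returns one fresh bin of firing rates."""
    true_movement = rng.integers(N_MOVEMENTS)
    return templates[true_movement] + rng.normal(0.0, 1.5, size=N_NEURONS)

def decode(rates):
    """Nearest-template classifier, a stand-in for the real decoder."""
    return int(np.argmin(np.linalg.norm(templates - rates, axis=1)))

class VirtualHand:
    """Stub for the virtual prosthetic arm under development."""
    def move(self, movement_id):
        print(f"virtual hand performs movement {movement_id}")

hand = VirtualHand()
for _ in range(10):   # each pass decodes one new bin as it arrives
    hand.move(decode(read_spike_counts()))
```

The key difference from the off-line experiments is in the loop: each bin of activity is decoded the moment it is recorded, so the animal sees the hand respond to its own brain activity.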

While preliminary results are exciting, scientists have a long way to go before they can mimic the true dexterity of the hand. “Each finger has three or four degrees of freedom that need to be controlled: flexion and extension at each of three joints, as well as adduction and abduction,” says Schieber. Added to that is the complexity of moving five individual fingers, sometimes in unison and sometimes independently.
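
Putting Schieber’s numbers together: roughly four controllable axes per finger, across five fingers, means the whole hand presents on the order of twenty degrees of freedom. The tally below is back-of-the-envelope arithmetic, not a figure from the article.

```python
# Rough tally of the control problem Schieber describes: flexion/extension
# at three joints plus one adduction/abduction axis per finger. The thumb
# is anatomically more complex, so treat the totals as illustrative.
FINGERS = 5
FLEXION_EXTENSION_JOINTS = 3   # one flex/extend axis at each of three joints
ABDUCTION_ADDUCTION_AXES = 1   # side-to-side motion of the whole finger

dof_per_finger = FLEXION_EXTENSION_JOINTS + ABDUCTION_ADDUCTION_AXES
total_dof = FINGERS * dof_per_finger
print(f"{dof_per_finger} DOF per finger, roughly {total_dof} for the hand")
```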

Scientists don’t yet know if the decoding system they have built will be able to execute unique actions: movements that were not part of the original repertoire used to create the decoder. “In the long run, we want [the monkey] to be able to do anything he can think of in the moment,” says Schieber. “But getting the decoding algorithm to generalize like that is another challenge.”
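
One way to frame the generalization problem (a framing of ours, not the researchers’): a decoder that classifies among a fixed repertoire can, by construction, never output a movement outside that list, whereas a decoder that regresses neural activity onto continuous joint angles can at least interpolate to novel postures. A toy contrast, with every quantity synthetic:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(2)
N_NEURONS, N_JOINT_ANGLES = 96, 20

# Assume, purely for illustration, a noisy linear relationship between
# firing rates and the hand's joint angles.
true_map = rng.normal(size=(N_NEURONS, N_JOINT_ANGLES))
X_train = rng.normal(10.0, 3.0, size=(500, N_NEURONS))
Y_train = X_train @ true_map + rng.normal(0.0, 0.5, size=(500, N_JOINT_ANGLES))

regressor = Ridge(alpha=1.0).fit(X_train, Y_train)

# A never-before-seen activity pattern still yields a posture estimate,
# something a classifier over a fixed movement list cannot provide.
novel_pattern = rng.normal(10.0, 3.0, size=(1, N_NEURONS))
print("predicted joint angles:",
      np.round(regressor.predict(novel_pattern)[0, :4], 2), "...")
```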
