
By tapping directly into the brain’s electrical signals, scientists at Johns Hopkins University, in Baltimore, are on their way to developing a prosthetic hand more dexterous than ever before. They have demonstrated for the first time that neural activity recorded from a monkey’s brain can control fingers on a robotic hand, making it play several notes on a piano.

“We would hope that eventually, we’ll be able to implant similar arrays permanently in the motor cortex of human subjects,” says Marc Schieber, a neuroscientist at the University of Rochester, in New York, who is working on the project. However, researchers caution that a practical human version of the neural interface is still a long way off.

Most prosthetic hands currently available are limited to a clawlike grasping motion. A significantly improved version, which went on the market last week, uses muscle contractions in the arms to individually control fingers. (See “A Hand for the Wounded.”) While this type of design is a huge boon to amputees, translating their intention to move into action via muscle activity requires conscious effort. In the long term, scientists would like to develop a prosthesis that is effortlessly controlled by the user’s thoughts. “If you can tap into the brain, you can record from the brain itself the intent of hand and finger movement,” says Nitish Thakor, a neuroengineer at Johns Hopkins, who is working on the project.

So far, scientists have made neural interfaces that allow monkeys, and in a few experimental cases paralyzed patients, to use their brain activity to reach and grasp with a robotic arm. (See “Brain Chips Give Paralyzed Patients New Powers.”) However, the more sophisticated prosthetic hands currently being created require finer levels of control. “With the development of the highly dexterous prosthetic hand, we now have a motivation to try to control individual fingers,” says Thakor.

To make the neural interface, researchers recorded brain-cell activity from monkeys as they moved their fingers in different ways. (A particular part of the motor cortex has previously been shown to control finger movement.) The scientists then created algorithms to decode these brain signals by identifying the specific activity patterns linked to particular movements. When the decoding system was connected to a robotic hand and fed new neural-activity patterns, the fingers on the hand performed the intended movement 95 percent of the time.

“The findings are extremely encouraging,” says Krishna Shenoy, a neuroscientist at Stanford University who isn’t involved in the research. The researchers presented their findings at a neural-engineering conference earlier this year.
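The decoding step described above amounts to pattern classification: mapping a vector of recorded firing rates to one of several discrete finger movements. The minimal sketch below illustrates that idea with synthetic data and a standard linear classifier; the neuron counts, movement classes, and data are illustrative assumptions, not the Johns Hopkins team’s actual method.

```python
# A minimal sketch (not the actual decoder from the study) of mapping
# neural-activity patterns to discrete finger movements. All names,
# dimensions, and the synthetic data are illustrative assumptions.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

n_neurons = 100      # recorded motor-cortex units (assumed)
n_trials = 1200      # training trials (assumed)
n_movements = 12     # e.g. flexion/extension of individual fingers (assumed)

# Synthetic firing-rate features: each movement class has its own mean
# activity pattern plus trial-to-trial noise.
class_means = rng.normal(0.0, 1.0, size=(n_movements, n_neurons))
labels = rng.integers(0, n_movements, size=n_trials)
firing_rates = class_means[labels] + rng.normal(0.0, 1.5, size=(n_trials, n_neurons))

X_train, X_test, y_train, y_test = train_test_split(
    firing_rates, labels, test_size=0.25, random_state=0)

# Fit a linear decoder that learns which activity pattern goes with which
# movement, then check how often held-out patterns are decoded into the
# intended movement.
decoder = LinearDiscriminantAnalysis()
decoder.fit(X_train, y_train)
predicted = decoder.predict(X_test)
print(f"decoded correctly on {accuracy_score(y_test, predicted):.0%} of held-out trials")

# In a real system, each decoded label would be sent as a command to the
# corresponding finger of the robotic hand.
```

In practice the hard part is the recording itself: a real decoder works on noisy, nonstationary signals from implanted electrode arrays rather than the clean synthetic features used here.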


Credit: Soumyadipta Acharya, Vikram Aggarwal, and Francesco Tenore at Johns Hopkins

Tagged: Biomedicine, robotics, neuroscience, prosthesis
