MIT Technology Review

Monkey Thinks Robot into Action

A monkey is able to feed itself with a robotic arm.

In a dramatic display of the potential of prosthetic arms, a monkey at the University of Pittsburgh was able to use his brain to directly control a robotic arm and feed himself a marshmallow. The research, published today in the journal Nature, is the first to show that an interface that converts brain signals directly into action is sophisticated enough to perform a practical function: eating. Researchers who led the work have just begun human tests of a related technology.

Brain power: A monkey with an array of tiny electrodes implanted into his brain uses his thoughts to control a robotic arm, grabbing a piece of marshmallow and bringing it to his mouth. Scientists ultimately hope that this type of brain-machine interface will help paralyzed people perform everyday tasks, like feeding themselves or brushing their hair.

“It’s the first time a monkey – or a human – is directly, with their brain, controlling a real prosthetic arm,” says Krishna Shenoy, a neuroscientist at Stanford University who was not involved in the research.

People who suffer strokes or spinal cord injuries, or who have neurodegenerative diseases such as amyotrophic lateral sclerosis (ALS), are often left paralyzed. But their cerebral cortices, the parts of the brain that control movement, planning, and other functions, may remain largely intact. Scientists hope to capitalize on that by developing brain-machine interfaces: devices that convert brain activity into action, such as moving a cursor on a computer screen.

People who are completely paralyzed can now use brain-machine interfaces that noninvasively measure signals from the surface of the scalp, but those devices are slow and require sustained concentration to operate. Creating a prosthesis that works like a real arm (the user thinks about moving his arm, and it moves) will most likely require recording electrical activity directly from the brain.

That has become possible in recent years, thanks to advances in the tiny arrays of electrodes used to record neural signals. In previous research, John Donoghue and his colleagues at Brown University showed that electrodes implanted into the brain of a paralyzed man could be used to move a cursor on a computer screen and even make a simple movement with a robotic arm. But that and other research have been limited to one- or two-dimensional movements and, aside from a few cases using a mechanical arm or gripper, have been carried out virtually, on a screen.

In the latest research, headed by neuroscientist Andrew Schwartz at the University of Pittsburgh, the monkey was able to perform a more complicated task. “Andy has taken this one step further, to a practical device that could be of use in the real world,” says John Kalaska, a neuroscientist at the University of Montreal, in Canada, who wrote a commentary accompanying the publication. “The animal can simply, through a kind of mental practice, get the robot to move toward where the [food] is, close the hand, and bring it back to the mouth and let him eat it.”

To achieve the feat, two monkeys had a grid of microelectrodes implanted into the motor cortex, the part of the brain that controls motor planning and execution. The animals had previously been trained to move an anthropomorphic robotic arm, with movable joints at the shoulder, elbow, and wrist, using a joystick. To learn to control the prosthesis with their minds, the monkeys had their arms temporarily restrained as they watched a computer move the arm through the required motions: extending the arm to the piece of food, gripping it, bringing it to the mouth, and releasing it. “They imagine themselves doing the task, like athletes do for sports,” says Schwartz. “The neurons are active as they observe the movement, and then we can capture the [neural signals] and use them for our own control.”
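The observation-based training Schwartz describes can be pictured as a calibration step: record neural activity while the computer drives the arm, then fit a mapping from that activity to the arm's motion. The sketch below is a minimal illustration of that idea; the array shapes, variable names, and least-squares fit are assumptions for illustration, not the study's actual algorithm.

```python
# Minimal sketch of the calibration idea described above: while the monkey
# watches the computer drive the arm, record neural activity and the arm's
# motion, then fit a mapping from one to the other. The array shapes and the
# least-squares fit are assumptions for illustration, not the study's method.
import numpy as np

def fit_observation_decoder(firing_rates, arm_velocity):
    """Fit linear weights mapping binned firing rates to 3-D arm velocity.

    firing_rates : (n_timebins, n_neurons) spike counts recorded while the
                   monkey observes the automated arm movements
    arm_velocity : (n_timebins, 3) velocity of the automated arm (x, y, z)
    """
    # Add a constant column so the decoder can absorb baseline firing rates.
    X = np.hstack([firing_rates, np.ones((len(firing_rates), 1))])
    # Ordinary least squares: weights minimizing ||X @ W - arm_velocity||^2.
    W, *_ = np.linalg.lstsq(X, arm_velocity, rcond=None)
    return W  # shape: (n_neurons + 1, 3)
```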

Schwartz and his team used relatively simple algorithms to decode the patterns of neural activity recorded during the observation phase, and then used that information to control the robotic arm in real time. (Scientists can deduce both direction and speed of an intended movement from the activity of ensembles of neurons in the motor cortex: activity of specific collections of cells indicates direction, while the amplitude of the overall signal dictates speed.)
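The parenthetical above captures the core of population-based decoding: which cells are active points to a direction, and the overall strength of the signal sets the speed. The sketch below illustrates that idea with a simple population-vector-style step; the preferred directions, baseline rates, and speed gain are hypothetical quantities for illustration, not parameters reported in the paper.

```python
# Illustrative population-vector-style decoding step for one time bin: the
# pattern of active cells gives the direction, the overall modulation gives
# the speed. All parameters (preferred directions, baselines, gain) are
# hypothetical.
import numpy as np

def decode_velocity(rates, preferred_dirs, baselines, speed_gain=0.05):
    """Convert one time bin of firing rates into a 3-D velocity command.

    rates          : (n_neurons,) current firing rates (Hz)
    preferred_dirs : (n_neurons, 3) unit vectors, each cell's preferred direction
    baselines      : (n_neurons,) resting firing rates (Hz)
    """
    modulation = rates - baselines               # how far each cell is above baseline
    direction = modulation @ preferred_dirs      # weighted sum of preferred directions
    norm = np.linalg.norm(direction)
    if norm < 1e-9:
        return np.zeros(3)                       # no net drive: hold the arm still
    speed = speed_gain * np.clip(modulation, 0, None).sum()  # amplitude sets speed
    return speed * (direction / norm)            # velocity command sent to the arm
```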

After just two days of training, the monkeys learned to control the arm in three dimensions and to control the gripper placed at the end that functions as a hand. The animals even learned to use the arm in ways in which they hadn’t been trained: an accompanying video shows an animal using the arm to push a piece of food into his mouth. In a second video, the monkey brings the gripper back to his mouth and licks it, ignoring another piece of food. “He gets so good at using the tool that he may think about it as part of his own body,” says Schwartz. He likens the training process to learning to use a mouse to control a computer cursor. After a certain learning period, “you’re not thinking about how you have to activate a muscle in an index finger to push the left mouse button,” he says. “In that way, you’ve embodied the cursor on the screen.”

Schwartz and his collaborators are now testing the technology in humans. The first test run, begun just last week, is in an epilepsy patient who is undergoing a diagnostic procedure known as electrocorticography, in which electrodes are surgically placed on the surface of the brain to try to identify the source of seizures. The surface electrodes are more precise than noninvasive scalp recordings and less invasive than electrodes implanted into the brain, although they offer a cruder level of control than fully implanted arrays. The scientists will piggyback on that diagnostic test and try to use the signals recorded from the electrodes to control a computer program.
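One common way surface recordings like these are turned into a control signal is to band-pass each channel for high-frequency activity and summarize it as power before feeding it to a decoder. The sketch below shows that preprocessing step; the frequency band, sampling rate, and choice of feature are general assumptions, since the article does not describe how the human signals will be decoded.

```python
# Illustrative preprocessing for surface (ECoG) recordings: band-pass each
# channel for high-gamma activity and summarize it as power per channel.
# The band edges, sampling rate, and choice of feature are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

def high_gamma_power(ecog, fs=1000.0, band=(70.0, 150.0)):
    """Return per-channel high-gamma power for one window of ECoG data.

    ecog : (n_samples, n_channels) voltage traces from the surface electrodes
    fs   : sampling rate in Hz
    band : (low, high) frequency band in Hz
    """
    nyquist = fs / 2.0
    b, a = butter(4, [band[0] / nyquist, band[1] / nyquist], btype="band")
    filtered = filtfilt(b, a, ecog, axis=0)   # zero-phase band-pass filter
    return (filtered ** 2).mean(axis=0)       # mean squared amplitude per channel
```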

If successful, the researchers will begin testing the technology in ALS patients. In the end stages of this disease, patients are completely paralyzed; a brain-controlled computer program could help them do basic things, such as write an e-mail. “We think this could give them a way of communicating with others that’s faster than existing methods,” says Schwartz. “We hope to be able to create an interface for moderate typing speed, about 30 to 40 words a minute.”

The researchers aim to test fully implanted electrodes, like those used in the monkey to control the robotic arm, in humans within the next two years. “With humans, I fully expect to get a lot better control,” says Schwartz. In addition to being easier to train, humans will, he hopes, be able to explain what’s difficult or what needs to be improved.

Even if those tests are successful, significant hurdles remain before such devices can be routinely used in patients. The electrodes currently in use aren’t ideal for long-term recording: the signals degrade over time. And the entire system will ultimately need to be made portable and wireless, or at least user-friendly. “We need to make it easy enough so that patients can practice anytime they want to, rather than having a technician come out to the house and set up complicated equipment,” says Schwartz. “We hope there will be improvement in electrode arrays – everything from bioactive coatings to telemetry. In two years, a lot of that should be in place.”
