Doctors will attempt to reanimate a patient’s paralyzed arm with a pioneering surgery that involves capturing signals from his brain and restoring movement through a fine network of electronics linked to arm muscles.
The new effort, being planned by researchers at Case Western Reserve University, will use a brain-computer interface, or BCI, developed by researchers at Brown University and Massachusetts General Hospital. In previous work, patients have used this interface to control a computer cursor or a robotic arm (see “Brain Chip Helps Quadriplegics Move Robotic Arms with Their Thoughts” and “Patient Shows New Dexterity with a Mind-Controlled Robot Arm”).
The new effort will use the same technology to control the patient’s actual arm with a system called functional electrical stimulation (FES). This will send signals to as many as 18 arm and hand muscles to allow the subject, who is paralyzed from the neck down, to perform tasks such as eating and nose-scratching.
“This will be the first time someone has hooked up a BCI to an FES device,” says Daniel Moran, a neuroscientist at Washington University in St. Louis who is not involved in the study. “They’re putting the whole system together.” The surgery may occur this year or next, according to Case Western researchers.
Muscle activation technology has long been tested in paralyzed patients. Some patients can press a button to activate muscles in their otherwise paralyzed legs, allowing them to stand and even move about with a walker as the stimulation stiffens their legs and swings them forward. If a patient cannot use his hands, the stimulation can instead be triggered by movements he can still control in his arm, cheek, or neck. The new effort will use the brain itself to send these signals.
At the heart of the new device is the brain implant—a small probe four millimeters on each side with 96 hair-like electrodes that penetrate 1.5 millimeters into a portion of the motor cortex that controls arm movements. The implant records the impulses of dozens of neurons corresponding to a patient’s intent to move.
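The article does not describe the decoding algorithm, but the classic textbook illustration of turning the firing of direction-tuned motor-cortex neurons into an intended movement direction is the population vector. The sketch below is purely illustrative: the cosine-tuning model, the baseline and gain values, and the random preferred directions are all assumptions, not details of the BrainGate system (whose decoders are more sophisticated).

```python
import numpy as np

# Hypothetical illustration: decoding an intended 2-D movement direction
# from the firing rates of direction-tuned neurons (population vector).
# Each simulated neuron has a "preferred direction"; its rate rises when
# the intended movement aligns with that direction (cosine tuning).

rng = np.random.default_rng(0)
n_neurons = 96  # matches the implant's 96 electrodes, for flavor only

# Assumed tuning model (not from the article): unit-length preferred
# directions drawn at random.
preferred = rng.normal(size=(n_neurons, 2))
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

def firing_rates(intended, baseline=20.0, gain=10.0):
    """Cosine tuning: rate = baseline + gain * (preferred . intended)."""
    return baseline + gain * (preferred @ intended)

def decode(rates, baseline=20.0):
    """Population vector: sum each neuron's preferred direction,
    weighted by its above-baseline firing, then normalize."""
    vec = (rates - baseline) @ preferred
    return vec / np.linalg.norm(vec)

true_dir = np.array([1.0, 1.0]) / np.sqrt(2)  # "up and to the right"
est = decode(firing_rates(true_dir))
```

With dozens of neurons whose preferred directions roughly cover the plane, the weighted sum points close to the intended direction, which is why even a noisy, partial recording of the motor cortex can drive a cursor or an arm.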
In preparation for reconnecting real arm muscles, researchers have recently shown that the brain chip can control a virtual representation of those arm muscles. The ongoing clinical trial is known as BrainGate2.
The signals from the brain encode a direction of intended movement; an algorithm translates them into carefully coordinated contractions of as many as 18 muscles, based on a model of those muscles and the arm's degrees of freedom.
“The patient thinks ‘up and to the right,’ and we have a controller that actually figures out the correct muscle activations to move in that direction,” says Robert Kirsch, the project’s principal investigator, chair of biomedical engineering at Case Western, and executive director of the Department of Veterans Affairs Functional Electrical Stimulation Center.
The current version of that model includes 29 muscles, divided into 138 muscle elements, and 11 joints. On a screen, the patient sees an image of the virtual arm, and works to generate brain commands that ultimately move the virtual arm to touch a red spot, making it turn green.
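One way to picture the controller Kirsch describes, turning "up and to the right" into a set of muscle activations, is as a constrained least-squares problem: find bounded, nonnegative stimulation levels whose combined effect on the arm model matches the intended direction. The sketch below is a toy version under stated assumptions; the matrix `M`, the solver, and all the numbers are invented stand-ins for the project's far richer 29-muscle, 11-joint model.

```python
import numpy as np

# Hypothetical sketch: solve for muscle activations that produce an
# intended movement direction. M is an assumed linear map from each
# muscle's activation to the 2-D endpoint movement it contributes
# (a stand-in for the real biomechanical arm model).

rng = np.random.default_rng(1)
n_muscles = 18                        # FES stimulates up to 18 muscles
M = rng.normal(size=(2, n_muscles))   # invented muscle-to-movement map

def activations_for(direction, iters=500, lr=0.01):
    """Projected gradient descent: minimize ||M a - direction||^2
    subject to 0 <= a <= 1, since stimulation levels are bounded
    and a muscle cannot be 'negatively' contracted."""
    a = np.zeros(n_muscles)
    for _ in range(iters):
        grad = M.T @ (M @ a - direction)
        a = np.clip(a - lr * grad, 0.0, 1.0)
    return a

intended = np.array([1.0, 1.0]) / np.sqrt(2)  # "up and to the right"
a = activations_for(intended)
produced = M @ a                              # movement the model predicts
```

Because there are many more muscles than directions of movement, many activation patterns can produce the same motion; the real controller must also respect fatigue, joint limits, and the coupled dynamics of 11 joints, which is what makes the virtual-arm testing milestone significant.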
Leigh Hochberg, a neurologist at Massachusetts General Hospital, associate professor of engineering at Brown, and one of the leaders of the underlying research collaboration, says the experiment on the virtual arm, first conducted in 2011, was a crucial milestone. That, together with recent advances in monkey experiments “provide encouragement that the goal is within reach” to connect brain chips to muscle stimulation, he says.
Even if successful, the reanimated arm itself would still not be able to convey a sense of touch back to the wearer. In a separate set of experiments, researchers at Case Western are testing a system that provides a sense of touch thanks to sensors on a prosthetic hand wired to peripheral nerves in the patient’s arm (see “An Artificial Hand with Real Feelings”). In theory, such sensory feedback could be delivered directly to the brain, too.
Neuroscientists are also working on better brain implants. Current interfaces used in the project essentially collect someone’s intent to move something in a certain direction. Next-generation versions would actually collect more natural muscle-movement commands from the brain itself—a more challenging task but one that promises more realistic control. Another advance under development is a wireless interface between the skull connector and the system that reads and interprets the signals from the brain (see “A Wireless Brain Computer Interface”).