A woman who is completely paralyzed below the neck has regained the ability to reach out and interact with the world around her thanks to the most advanced brain-computer interface for operating a robotic arm so far.
In February, surgeons implanted two four-millimeter-by-four-millimeter electrode arrays into the participant’s motor cortex, the region of the brain that initiates movements. Each chip has 96 electrodes and is wired through the skull to a computer that translates her thoughts into signals for the robotic arm. The work, performed by researchers from the University of Pittsburgh, is reported in the latest issue of The Lancet.
The work is the latest advance to show how brain-controlled interface technology can restore some movement to quadriplegics. In May of this year, researchers at Brown University described how a paralyzed patient could use a robotic limb to perform basic tasks, including giving herself a drink of coffee (see “Brain Chip Helps Quadriplegics Move Robotic Arms with Their Thoughts”). The participant in the new study has twice as many electrodes in her brain as the woman in the Brown study and can demonstrate more complex hand movements with her robotic limb.
“We are reproducing more of a natural and realistic movement of the arm and hand,” says Andrew Schwartz, a neuroscientist at the University of Pittsburgh and the senior author on the study.
Some experts, however, caution that it’s hard to draw conclusions about the technology’s potential from a single case.
Miguel Nicolelis, a brain-machine interface researcher at Duke University, notes that recording from more neurons makes it possible to improve precision and complexity in the movements of connected devices. However, he adds that it is hard to say just how many neurons the Pittsburgh team was actually recording from. “There is little documentation of the brain signal,” says Nicolelis of the Lancet paper describing the work. “It would be really great if they had reached the 200-neuron mark, but there appears to be no documentation of that,” he says.
The Lancet study describes the progress of the woman as she operated the robot arm over 13 weeks. After the electrodes were implanted into her brain, she began her training by watching the arm move and imagining that she was controlling it. All the while, the computer was recording neural activity in her motor cortex, and this information was used to better decode her intentions into movements of the robot arm. “Then we started to give her some control,” says Jennifer Collinger, a biomedical engineer at Pittsburgh and the first author on the study. “That generates a feedback loop—she can see whether what she is thinking is moving the arm in the right direction or not. Eventually, we took off those training wheels and gave her full control.”
By the second day of use, the participant was able to move the arm in three dimensions on her own. With practice, she was able to move cubes and other objects around a table and even pick up a two-pound rock. The woman continues to work with the researchers. She recently was able to pick up a piece of chocolate and feed herself, says Schwartz.
Like a spinal cord, the robotic arm used in the study has some ability to control its own movement. Years of study in primates on how the motor cortex coördinates hand movements helped the team develop the technology that could translate the participant’s thoughts into more “fluid and natural” movements, says Grégoire Courtine, a neuroscientist at the Swiss Federal Institute of Technology Lausanne in Switzerland.
“When animals move, they follow certain sets of rules, and it turns out we can pick that up in the neural signals that we record from the motor cortex,” says Schwartz.
The arm, which was developed under a Defense Advanced Research Projects Agency contract, has 17 motors that control 26 joints in what is the most sophisticated artificial limb system in the world. “The arm was designed to be able to mimic a human limb,” says Michael McLoughlin, program manager for the Modular Prosthetic Limb project, which is based at Johns Hopkins University in Maryland. The Johns Hopkins team has built six of the robotic limbs that are in use by different research groups in the U.S., says McLoughlin.
A crucial next step for the Pittsburgh team will be incorporating sensory feedback into the prosthetic. The arm has over 100 sensors, says McLoughlin, capable of detecting vibration, pressure, temperature, and more. The team is also developing a wireless version of the brain-machine interface so that participants won’t have electronics protruding from their heads.
The researchers also hope to recruit more participants to work with the prosthetic, and to continue to improve the technology so that one day the “laboratory oddity can be translated into therapeutic use,” says Schwartz.