Neuroscientists dream of creating neural prosthetics that would allow paralyzed patients to regain control over their arms and legs. While that goal is still far off, researchers at Brown University and Massachusetts General Hospital are reporting a promising step forward.
In a study published in the journal Nature this week, the researchers describe how two paralyzed patients with a surgically implanted neural device successfully controlled a computer and, in one case, a robotic arm – using only their minds.
It is the first time such results have been achieved with neural implants in humans. The researchers are now refining the experimental system into a commercial product – one that could help patients in their daily lives. They plan to make the device wireless and fully implantable and to improve the speed and complexity of movements that patients using the implant can perform.
“It’s a landmark study because it shows that even years after injury, you can still record useful signals from the brain and use them to drive a device,” says Joseph Pancrazio, program director for neural engineering research at the National Institute of Neurological Disorders and Stroke in Bethesda, MD. “This group has really pushed the frontier.”
In spinal-cord injuries and some types of stroke, the information relay system between the brain and muscles is disrupted. Neural devices, such as the one used in the study, aim to record and process the brain’s existing signals and use them to control a computer cursor, robotic arm, or even a paralyzed limb. The Brown/MGH researchers first implanted a brain chip in a human in June 2004. And while there have been signs of success since then, the Nature paper is the first peer-reviewed publication describing in detail what paralyzed patients can do with the implant. (Technology Review reported on Donoghue’s work with the first patient, Matthew Nagle, last year, in “Implanting Hope,” March 2005.)
The brain-computer interface used in the study, made by Cyberkinetics Neurotechnology Systems in Foxborough, MA, consists of a tiny silicon chip containing 100 electrodes that record signals from hundreds of neurons in the motor cortex. A computer algorithm then translates this complex pattern of activity into a signal used to control an external device.
The first patient implanted with the device, a 25-year-old man who was paralyzed after a knife wound in 2001, successfully learned to control a computer cursor, moving adeptly through an e-mail program and using the computer to turn on a television and change the channel. When the device was hooked up to a robotic hand, he quickly learned to control the hand, picking up and dropping a piece of candy into a technician’s hand. “It was exciting because he picked up on that very quickly – around ten minutes,” says John Donoghue, senior scientist on the project, founder of Cyberkinetics, and a neuroscientist at Brown University in Providence, RI.
Two other patients in the trial, both with different types of injuries, also learned to manipulate a computer program, although they have not yet tried the robotic arm. “The results show it’s feasible to use these devices in a real-world setting, but we’ve got a long way to go before everyday use,” says Donoghue.
Neuroscientists have used similar devices in monkeys and other animals for several years, but Donoghue’s trial is the first to test surgically implanted electrode arrays in human patients. “It’s a big leap to bring this technology into humans,” says Stephen Scott, a neuroscientist at Queen’s University in Kingston, Ontario, who wrote a commentary accompanying the paper. “This was pretty successful for a first attempt – the patients showed some impressive capabilities.”
While the results are promising, experts caution that the technology is in its early stages. “This is still far from being a useful device that actually increases the quality of life for this patient,” says Andrew Schwartz, a neuroscientist at the University of Pittsburgh who studies similar devices in animals. The same technology works better in monkeys, suggesting that more work needs to be done in designing the recording electrodes and software filters, he says.
Currently available assistive devices for paralyzed patients, such as computer programs activated by voice or eye movements, rely on a secondary signal to carry out the command, and require both a training period and a high level of concentration. An implanted device has the potential to aid patients in a much more natural way. It “taps into all the information the brain uses to move [the muscles],” says Donoghue. Because it mimics the brain’s normal processing system, patients can control a cursor and talk at the same time, he says.
Donoghue and colleagues are now adapting the experimental system into a device for broader use. The current system has wires connecting the implant to an external computer through the skull, which carries the risk of infection. The researchers plan to miniaturize the hardware and make it wireless, so the entire system can be implanted.
The team is also developing new analysis software, which they hope will allow more sophisticated types of movement. Currently, patients can navigate an e-mail program or make crude movements with a robotic arm, but they can’t perform more complex tasks, such as using the robotic arm to type on a keyboard or to eat a bowl of soup.
To accomplish such complicated movements, scientists must first create a better “decoder,” the algorithm that interprets the brain’s neural signals. When the brain prepares to move, say, a hand from left to right, millions of neurons in the brain’s motor cortex fire in a specific way. The researchers generate the decoder by asking patients to imagine moving their hand in a circle, which triggers neurons to fire as if the paralyzed limb was moving. A computer program then records and processes this information, ultimately creating a filter that translates subsequent neuronal activity into the desired actions.
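The article doesn’t specify the team’s actual decoding algorithm, but a linear filter fit by least squares is a common baseline for this kind of neural decoding, and it illustrates the calibration step described above. In the sketch below, the unit count, the synthetic neural “tuning,” and the imagined circular trajectory are all illustrative assumptions, not the study’s data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration data: firing rates of 100 recorded units over T time
# bins, collected while the patient imagines moving a cursor along a known
# circular path (as in the imagined-movement calibration described above).
T, n_units = 500, 100
angles = np.linspace(0, 4 * np.pi, T)
velocity = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # known 2-D path

# Each unit's (hidden) tuning to movement direction, plus recording noise.
true_tuning = rng.normal(size=(n_units, 2))
rates = velocity @ true_tuning.T + 0.5 * rng.normal(size=(T, n_units))

# "Train the decoder": solve rates @ W ~= velocity by least squares, giving a
# linear filter W that maps population activity to intended cursor velocity.
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Apply the filter to neural activity to recover the intended movement.
decoded = rates @ W
mean_error = np.mean(np.linalg.norm(decoded - velocity, axis=1))
```

With enough units relative to the noise, the decoded trajectory closely tracks the imagined one; the engineering challenge the researchers describe is doing this robustly from far fewer neurons than the millions the brain itself uses.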
But the filter still has a much more limited ability to translate information than the brain does. It uses data from hundreds of neurons rather than millions and collects information from a single part of the brain. Donoghue and colleagues are now developing different types of algorithms to see which are most adaptable and make the best use of the available neural signals.
“We can test different algorithms and patients can tell us which are easiest or feel most natural,” says Leigh Hochberg, a neurologist at Massachusetts General Hospital and lead author of the study. “I suspect that if we can continue to improve the decoding from just a small area and perhaps record from multiple areas of the brain, we might be able to further improve the variety of control systems available to people.”
Other scientists are also developing ways to make brain interfaces much faster. For a patient, that could mean the difference between struggling to write an e-mail and composing one with little effort. Working with primates, Krishna Shenoy and colleagues at Stanford University in Stanford, CA, were able to quadruple rates of information transfer using a similar implant, but recording from a different part of the brain. For a human being, that would translate into typing 15 words per minute instead of just four.
Donoghue eventually plans to adapt his system to perform an even grander feat. The team is collaborating with scientists at Case Western Reserve University in Cleveland, OH, to create a device that uses signals from the brain to electrically stimulate paralyzed muscle, potentially allowing patients to move their limbs.
Not surprisingly, this is what people want the most. When Donoghue asked a patient if he would prefer to be able to make sophisticated movements with a prosthetic arm or crude movements with his own arm, he chose the latter. “The idea of reanimating his own body was much more important than how sophisticated the movement could be,” says Donoghue.