Paralyzed patients dream of the day when they can once again move their limbs. That dream is edging closer to reality, thanks to a neural implant developed by John Donoghue and colleagues at Brown University and Cyberkinetics Neurotechnology Systems.
In 2004, Matthew Nagle, who is paralyzed due to a spinal-cord injury, became the first person to test the device, which translated his brain activity into action (see “Implanting Hope,” March 2005, and “Brain Chips Give Paralyzed Patients New Powers”). Nagle’s experience with the prosthetic was exciting but very preliminary: he could move a cursor on a computer screen and make rough movements with a robotic arm. Now Donoghue and his team are pushing ahead with their quest to develop a commercially available product, testing the device in two new patients: one with a neurodegenerative disease and the other suffering the effects of a stroke.
With spinal-cord injuries and some types of stroke and neurodegenerative disease, the information-relay system between the brain and muscles is disrupted. The Cyberkinetics device consists of a tiny chip containing 100 electrodes that record signals from hundreds of neurons in the motor cortex. A computer algorithm then translates this complex pattern of activity into a signal used to control a computer cursor, a robotic arm, and perhaps eventually the patient’s own limbs.
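The article does not describe the decoding algorithm itself; a common approach in the neural-prosthetics literature is a linear filter that maps binned spike counts from the electrode array to a two-dimensional cursor velocity. The sketch below is purely illustrative: the weights, baseline rates, and bin width are invented stand-ins, and the simulated spike counts take the place of real recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

N_UNITS = 100  # one recorded unit per electrode (illustrative)
BIN_S = 0.05   # 50 ms spike-count bins

# Hypothetical decoding weights, of the kind that would be fit during a
# calibration session in which the patient imagines tracking a cursor.
W = rng.normal(scale=0.01, size=(2, N_UNITS))  # maps rate deviations -> (vx, vy)
baseline = rng.poisson(10, size=N_UNITS).astype(float)  # per-unit mean counts

def decode_velocity(spike_counts):
    """Translate one bin of spike counts into a 2-D cursor velocity."""
    deviation = spike_counts - baseline  # activity relative to baseline
    return W @ deviation                 # linear read-out

# Simulated use: integrate decoded velocity bin by bin to move the cursor.
cursor = np.zeros(2)
for _ in range(20):
    counts = rng.poisson(baseline)       # stand-in for recorded spikes
    cursor += decode_velocity(counts) * BIN_S
print(cursor)
```

A real system would fit `W` by regressing recorded neural activity against intended movement, and more sophisticated decoders (such as Kalman filters) are often used in place of a fixed linear map.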
The researchers have now tested the device in two new patients, one with ALS, a progressive neurodegenerative disease, and the other with brain-stem stroke, a particularly devastating type of stroke that paralyzes the body but leaves the mind intact. The scientists presented their latest results at the Society for Neuroscience conference this week in Atlanta, GA. At the conference, Donoghue, founder of Cyberkinetics and a neuroscientist at Brown, and Leigh Hochberg, a neurologist at Massachusetts General Hospital who works with the patients in the study, talked with Technology Review about the latest developments in neural prosthetics and their plans for the future.
Technology Review: Who are your two newest patients?
Leigh Hochberg: One patient is a 53-year-old woman who had a brain-stem stroke nine years ago. She has no use of her hands or legs and can’t speak, but she can move her head and usually uses a button on her wheelchair to communicate. The other patient is a 37-year-old man with advanced ALS. He can’t speak or move his arms or legs.
TR: Are the new patients testing a new, improved device, or is it the same one used in Matthew Nagle’s trial?
John Donoghue: The device is the same, but we’re using a new filter [a piece of software that decodes neural signals and transmits the command to a user interface, e.g., a computer]. Now it’s possible to get quite a good level of control. Patients can move the cursor much more cleanly, and they can point to a target and click on it, just like you would with a mouse.
LH: The new filter does a much better job of stabilizing the cursor. The patients imagine moving their wrist to move the cursor and squeezing their hand to click on a target. Once you have the capacity to move a cursor in two dimensions and point and click, you can imagine a very powerful tool. Patients could control any computer-based device. For example, we could use the same point-and-click concept with a typing board.
We’re also working with a company called Rolltalk, which has developed a powerful interface. It was built for people who use eye-based controls [devices that convert directed eye movements into specific commands], but we’re adapting it for brain control. One patient has already used it to control the movement of a wheelchair.
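Hochberg describes a two-part control scheme: imagined wrist movement drives the cursor, and an imagined hand squeeze triggers a click. The article does not say how the click is detected; one simple, hypothetical approach is to fire a click only when the decoded "squeeze" signal stays above a threshold for several consecutive bins, which guards against accidental clicks from momentary noise. All names and values below are assumptions for illustration.

```python
import numpy as np

THRESHOLD = 0.8    # hypothetical decoder output treated as a squeeze
DEBOUNCE_BINS = 3  # bins of sustained activation required to click

def detect_clicks(grasp_signal, threshold=THRESHOLD, debounce=DEBOUNCE_BINS):
    """Return indices of bins where a click fires: the decoded grasp
    signal has stayed above threshold for `debounce` consecutive bins."""
    above = np.asarray(grasp_signal) > threshold
    clicks = []
    run = 0  # length of the current above-threshold streak
    for i, is_above in enumerate(above):
        run = run + 1 if is_above else 0
        if run == debounce:  # fire once per sustained activation
            clicks.append(i)
    return clicks

# A brief dip at bin 4 resets the streak, so two clicks fire in total.
signal = [0.1, 0.9, 0.95, 0.92, 0.2, 0.85, 0.9, 0.95, 0.97]
print(detect_clicks(signal))  # [3, 7]
```

Firing only once per sustained activation (when the streak first reaches the debounce length) means holding the squeeze does not produce repeated clicks, mirroring how a physical mouse button registers a single press.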
TR: What have you learned from testing the device in these two new patients?
JD: When we first started working with other patients, we weren’t sure how similar their responses would be to Matthew’s. But we were struck by the similarity. We found that the same types of cells were present, and patients were able to modulate them. All were able to achieve control, with some variability.
TR: What about the ALS patient? ALS is a neurodegenerative disease, a very different problem from spinal-cord injury or brain-stem stroke.
LH: ALS affects motor neurons, but it also affects the motor cortex directly, so there was some question about whether we could use signals from these cells [to control the implant]. However, we saw lots of signals in the motor cortex, and the patient was able to modulate those signals. In fact, he was able to move the cursor immediately, even though he hadn’t used those cells in a while.
JD: This also gives us an unprecedented view into the disease. It’s the first opportunity to track neurons in the intact nervous system. Will we be able to see neurons degenerate? That’s a whole other potential use of this technology.
TR: What do the patients think?
LH: We run sessions with the patients twice a week, and we get feedback from them every day. They’re guiding us in development as much as anyone else is. When we test a new filter [to decode their neural signals], we ask them how it feels. Sometimes they’ll tell us it feels natural, or sometimes that it doesn’t feel right. We learn so much from each participant.
JD: Our stroke patient says she likes this system better than the one she was using, which involved banging her head against her wheelchair.
TR: How close are you to having a commercially available treatment?
JD: We first have to complete the current pilot trials. Then we’ll move on to a larger, multicenter trial. If we show that more people can use the device effectively, then the FDA [Food and Drug Administration] could approve it. In terms of safety, we now have more than 1,500 days of testing, and we have seen no significant device-related adverse events.
TR: What other improvements are in the works?
JD: We want to automate the system; right now a technician has to run it. And we want to make the system fully implantable, both to reduce the chance of infection [via the hole in the skull] and to make life more normal for the patient. The current setup is somewhat analogous to the first cardiac pacemakers of the 1950s, which required a big cart with an oscilloscope that the patient had to wheel around.
TR: Have you made any progress in developing a wireless system?
JD: We’re working on two wireless systems. They both use the same electrode array, but in one case, the array is connected to a titanium can modeled after the cochlear implant. The can, which is also implanted, contains electronics that can amplify the neural signals and transmit them outside the body. We can then integrate that system with our computer decoder and use the Rolltalk interface to control a wheelchair, lights, or TV. We have a bench version of the system that works, but we haven’t assembled all the parts yet. Because the system is modeled after an FDA-approved cochlear device, we hope that it can move quickly into patients. In the second system, the electronics are actually mounted onto the array, which is connected to a fiber-optic cable. Both power and neural signals could be transmitted in and out via this cable. We hope to start tests of the implantable devices in monkeys this winter.