I was about 15 minutes late for my first phone call with Jan Scheuermann. When I tried to apologize for keeping her waiting, she stopped me. “I wasn’t just sitting around waiting for you, you know,” she said, before catching herself. “Well, actually I was sitting around.”
Scheuermann, who is 54, has been paralyzed for 14 years. She had been living in California and running a part-time business putting on mystery-theater dinners, where guests acted out roles she made up for them. “Perfectly healthy, married, with two kids,” she says. One night, during a dinner she’d organized, it felt as if her legs were dragging behind her. “I chalked it up to being a cold snowy night, but there were a couple of steps in the house, and boy, I was really having trouble,” she says.
Anguished months of doctors’ visits and misdiagnoses followed. A neurologist said she had multiple sclerosis. By then, she was using an electric wheelchair and “fading rapidly.” She thought she was dying, so she moved home to Pittsburgh, where her family could take care of her children. Eventually she was diagnosed with a rare disease called spinocerebellar degeneration. She can feel her body, but the nerves that carry signals out from her brain no longer work. Her brain says “Move,” but her limbs can’t hear.
Two and a half years ago, doctors screwed two ports into Scheuermann’s skull (she calls them “Lewis and Clark”). The ports allow researchers to insert cables that connect with two thumbtack-size implants in her brain’s motor cortex. Two or three times a week, she joins a team of scientists at the University of Pittsburgh and is plugged into a robotic arm that she controls with her mind. She uses it to move blocks, stack cones, give high fives, and pose for silly pictures, doing things like pretending to knock out a researcher or two. She calls the arm Hector.
Scheuermann, who says that in her dreams she is not disabled, underwent brain surgery in 2012 after seeing a video of another paralyzed patient controlling a robotic arm with his thoughts. She immediately applied to join the study. During the surgery, doctors used an air gun to fire the two tiny beds of silicon needles, called the Utah Electrode Array, into her motor cortex, the slim strip of brain that runs over the top of the head to the jaws and controls voluntary movement. She awoke from the surgery with a pounding headache and “the worst case of buyer’s remorse.” She couldn’t believe she’d had voluntary brain surgery. “I thought, Please, God, don’t let this be for nothing. My greatest fear was that it wouldn’t work,” she says. But within days, she was controlling the robotic arm, and with unexpected success: “I was moving something in my environment for the first time in years. It was gasp-inducing and exciting. The researchers couldn’t wipe the smile off their faces for weeks either.”
Scheuermann is one of about 15 to 20 paralyzed patients who have joined long-term studies of implants that can convey information from the brain to a computer. She is the first subject at Pittsburgh. Nine others, including people in the advanced stages of ALS, have undergone similar tests in a closely related study, called BrainGate. Another four “locked-in” patients, unable to move or speak, have regained some ability to communicate thanks to a different type of electrode developed by a Georgia company called Neural Signals.
A third of these patients have undergone surgery since 2011, when the U.S. Food and Drug Administration said it would loosen rules for testing “truly pioneering technologies” such as brain-machine interfaces. More human experiments are under way. One, at Caltech, aims to give a patient “autonomous control over the Google Android tablet operating system.” A team at Ohio State University, in collaboration with the R&D organization Battelle, put an implant in a patient in April with the intention of using the patient’s brain signals to control stimulators attached to his arm. Battelle describes the idea as “reanimating a paralyzed limb under voluntary control by the participant’s thoughts.”
These nervy first-of-a-kind studies rely on the fact that recording the electrical activity of a few dozen cells in the brain can give a fairly accurate picture of where someone intends to move a limb. “We are technologically limited to sampling a couple of hundred neurons, from billions in your brain, so it’s actually amazing they can get a signal out at all,” says Kip Ludwig, director of the neural engineering program at the National Institute of Neurological Disorders and Stroke.
The technology being used at Pittsburgh was developed in physiology labs to study animals, and it is plainly still experimental. The bundled wires lead from Scheuermann’s cranium to a bulky rack of signal processors, amplifiers, and computers. The nine-pound robotic arm, paid for by the military, has a dexterous hand and fingers that can make lifelike movements, but it is finicky, breaks frequently, and is somewhat dangerous. When things don’t work, graduate students hunt among tangles of wires for loose connections.
John Donoghue, the Brown University neuroscientist who leads the longer-running BrainGate study, compares today’s brain-machine interfaces to the first pacemakers. Those early models also featured carts of electronics, with wires punched through the skin into the heart. Some were hand-cranked. “When you don’t know what is going on, you keep as much as possible on the outside and as little as possible on the inside,” says Donoghue. Today, though, pacemakers are self-contained, powered by a long-lasting battery, and installed in a doctor’s office. Donoghue says brain-machine interfaces are at the start of a similar trajectory.
For brain-controlled computers to become a medical product, there has to be an economic rationale, and the risks must be offset by the reward. So far, Scheuermann’s case has come closest to showing that these conditions can be met. In 2013, the Pittsburgh team reported its work with Scheuermann in the medical journal the Lancet. After two weeks, they reported, she could move the robot arm in three dimensions. Within a few months, she could make seven movements, including rotating Hector’s hand and moving the thumb. At one point, she was filmed feeding herself a bite of a chocolate bar, a goal she had set for herself.
The researchers tried to show that they were close to something practical—helping with the “activities of daily living” that most people take for granted, like brushing teeth. During the study, Scheuermann’s abilities were examined using the Action Research Arm Test, the same kit of wooden blocks, marbles, and cups that doctors use to evaluate hand dexterity in people with recent injuries. She scored 17 out of 57, or about as well as someone with a severe stroke. Without Hector, Scheuermann would have scored zero. The findings made 60 Minutes.
Since the TV cameras went away, however, some of the shortcomings of the technology have become apparent. At first Scheuermann kept demonstrating new abilities. “It was success, success, success,” she says. But controlling Hector has become harder. The reason is that the implants, over time, stop recording. The brain is a hostile environment for electronics, and tiny movements of the array may also cause scar tissue to build up. The effect is well known to researchers and has been observed hundreds of times in animals: one by one, neurons drop out, until fewer and fewer can be detected.
Scheuermann says no one told her. “The team said that they were expecting loss of neuron signals at some point. I was not, so I was surprised,” she says. She now routinely controls the robot in only three to five dimensions, and she has gradually lost the ability to open and close its thumb and fingers. Was this at all like her experience of becoming paralyzed? I asked her the question a few days later by e-mail. She replied in a message typed by an aide who stays with her most days: “I was disappointed that I would probably never do better than I had already done, but accepted it without anger or bitterness.”
The researcher who planned the Pittsburgh experiment is Andrew Schwartz, a lean Minnesotan whose laboratory occupies a sunlit floor dominated by three gray metal towers of equipment that are used to monitor monkeys in adjacent suites. Seen on closed-circuit TVs, the scenes from inside the experimental rooms defy belief. On one screen, a metal wheel repeatedly rotates, changing the position of a bright orange handle. After each revolution, an outsize robotic hand reaches up from the edge of the screen to grab the handle. Amid the spinning machinery, it’s easy to miss the gray and pink face of the rhesus monkey that is controlling all this from a cable in its head.
The technology has its roots in the 1920s, with the discovery that neurons convey information via electrical “spikes” that can be recorded with a thin metal wire, or electrode. By 1969, a researcher named Eberhard Fetz had connected a single neuron in a monkey’s brain to a dial the animal could see. The monkey, he discovered, learned to make the neuron fire faster to move the dial and get a reward of a banana-flavored pellet. Although Fetz didn’t realize it at the time, he had created the first brain-machine interface.
Schwartz helped extend that discovery 30 years ago when physiologists began recording from many neurons in living animals. They found that although the entire motor cortex erupts in a blaze of electrical signals when an animal moves, a single neuron will tend to fire fastest in connection with certain movements—say, moving your arm left or up, or bending the elbow—but less quickly otherwise. Record from enough neurons and you can get a rough idea of what motion a person is making, or merely intending. “It’s like a political poll, and the more neurons you poll, the better the result,” he says.
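Schwartz’s polling analogy corresponds to a simple algorithm known as population-vector decoding. The sketch below is purely illustrative—the cosine-tuning model is the textbook simplification, and every number in it is an assumption, not the Pittsburgh recordings—but it shows the core idea: simulate a population of tuned neurons, then recover an intended movement direction by summing each neuron’s preferred direction, weighted by its firing rate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 100 neurons with cosine tuning (the classic model): each
# neuron fires fastest when movement heads in its "preferred" direction.
n_neurons = 100
preferred = rng.uniform(0, 2 * np.pi, n_neurons)   # preferred angles (rad)
baseline, depth = 20.0, 15.0                       # spikes per second

def firing_rates(movement_angle):
    """Noisy population response to a movement in the given direction."""
    rates = baseline + depth * np.cos(movement_angle - preferred)
    return rates + rng.normal(0, 2.0, n_neurons)

def population_vector(rates):
    """'Poll' the neurons: sum the preferred directions, each weighted by
    how far that neuron's rate rises above (or dips below) baseline."""
    w = rates - baseline
    x = np.sum(w * np.cos(preferred))
    y = np.sum(w * np.sin(preferred))
    return np.arctan2(y, x) % (2 * np.pi)

true_angle = np.pi / 3                             # intended direction: 60 degrees
decoded = population_vector(firing_rates(true_angle))
print(f"intended {np.degrees(true_angle):.0f} deg, decoded {np.degrees(decoded):.0f} deg")
```

With a hundred simulated neurons the decoded angle lands close to the intended one despite noisy rates—the same reason Schwartz says polling more neurons gives a better result.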
The 192 electrodes on Scheuermann’s two implants have recorded more than 270 neurons at a time, which is the most ever simultaneously measured from a human being’s brain. Schwartz says this is why her control over the robot has been so good.
The neuronal signals are interpreted by software called a decoder. Over the years, scientists built better and better decoders, and they tried more ambitious control schemes. In 1999, the Duke University neuroscientist Miguel Nicolelis trained a rat to swing a cantilever with its mind to obtain a reward. Three years later, Donoghue had a monkey moving a cursor in two dimensions across a computer screen, and by 2004 his BrainGate team had carried out the first long-term human test of the Utah array, showing that even someone whose limbs had been paralyzed for years could control a cursor mentally. By 2008, Schwartz had a monkey grabbing and feeding itself a marshmallow with a robotic hand.
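At bottom, a decoder of this kind is a mapping fitted from recorded firing rates to intended movement. Here is a minimal sketch under simplifying assumptions—simulated linearly tuned neurons and a plain least-squares fit; the labs themselves use more sophisticated, adaptive filters—showing the two-step recipe: fit on calibration data, then predict intent from fresh neural activity.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fabricated calibration data: 500 time bins of intended 2-D cursor
# velocity, observed through 96 noisy, linearly tuned "neurons" (one per
# channel of a hypothetical array -- illustrative numbers throughout).
n_bins, n_neurons = 500, 96
velocity = rng.normal(0, 1, (n_bins, 2))           # intended (vx, vy)
tuning = rng.normal(0, 1, (n_neurons, 2))          # each neuron's tuning
rates = velocity @ tuning.T + rng.normal(0, 0.5, (n_bins, n_neurons))

# The "decoder": a least-squares map from firing rates back to velocity.
weights, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decode a held-out intention from its noisy population response.
intended = np.array([0.8, -0.5])
observed = intended @ tuning.T + rng.normal(0, 0.5, n_neurons)
decoded = observed @ weights
print("intended:", intended, "decoded:", decoded.round(2))
```

Better decoders, in this picture, are better-chosen mappings—which is why software improvements alone produced much of the progress from swinging a cantilever to feeding oneself chocolate.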
Scheuermann has been able to quickly attempt many new tasks. She has been asked to control two robot arms at once and lift a box (“I only managed it once or twice,” she says). Some results are strange: Scheuermann is able to close Hector’s fingers around a plastic cone, but often only if she shuts her eyes first. Is the presence of the cone somehow reflected in the neurons’ firing patterns? Schwartz has spent months trying to figure it out. Behind such points of uncertainty may lie major discoveries about how the brain prepares and executes actions.
Scheuermann once had her aide dress her in stick-on rat whiskers and a fuzzy tail to greet researchers. It was a darkly humorous way of acknowledging that these experiments depend on human volunteers. “They are not nearly as hard to train as these guys,” Schwartz says, jerking a thumb to the row of monkey rooms.
These volunteers are trapped; some of them desperately hope science can provide an escape. Realistically, that is unlikely in their lifetimes. The first BrainGate volunteer was a 25-year-old named Matt Nagle, who had breathed through a ventilator since his spinal cord was severed during a knife fight. He was able to move a cursor on a screen in 2004. But Nagle also wanted to commit suicide and tried to get others to help him do it, according to The Man with the Bionic Brain, a book written by his doctor. He died of an infection in 2007. On online chat boards where paralyzed people trade hopeful news about possible cures, like stem cells, some dismiss brain-machine interfaces as wacky. Others are starting to think it’s their best chance. “I’ll take it! Cut off my dead arm and give me a robotic one that I can FEEL with please!” wrote one.
Schwartz says he hopes to generate physical sensations from the robotic arm this year, if he can find another quadriplegic volunteer. Like Scheuermann, the next patient will receive two arrays in the motor cortex to control the robotic arm. But Schwartz says surgeons will place two additional implants into the volunteer’s sensory cortex; these will receive signals from pressure sensors attached to the robotic fingertips. Studies by Nicolelis’s Duke laboratory recently showed that animals do sense and respond to such electrical inputs. “We don’t know if the subject will feel it as touch,” says Schwartz. “It’s very crude and simplistic and an undoubtedly incorrect set of assumptions, but you can’t ask the monkey what he just felt. We think it will be a new scientific finding. If the patient can say how it feels, that is going to be news.”
Another key aim, shared by Schwartz and the BrainGate researchers, is to connect a volunteer’s motor cortex to electrodes placed in his or her limbs, which would make the muscles contract—say, to open and close a hand. In April, the Ohio surgeons working with Battelle announced that they would be the first to try it. They put a brain implant in a man with a spinal-cord injury. And as soon as the patient recovers, says Battelle, they’ll initiate tests to “reanimate” his fingers, wrist, and hand. “We want to help someone gain control over their own limb,” says Chad Bouton, the engineer in charge of the project, who previously collaborated with the BrainGate group. “Can someone pick up a TV remote and change the channel?” Although Battelle has not won approval from regulators to attempt it, Bouton says the obvious next step is to try a bidirectional signal to and from a paralyzed limb, combining control and sensation.
Brain-machine interfaces may seem as if they’re progressing quickly. “If you fast-forward from the first video of that monkey to someone moving a robot in seven dimensions, picking things up, putting them down, it’s pretty dramatic,” says Lee Miller, a neurophysiologist at Northwestern University. “But what hasn’t changed, literally, is the array. It’s the Stanley Steamer of brain implants. Even if you demonstrate control, it’s going to peter out in two to three years. We need an interface that will last 20 years before this can be a product.”
The Utah array was developed in the early 1990s as a way to record from the cortex, initially of cats, with minimal trauma to the brain. It’s believed that scar tissue builds up around the needle-like recording tips, each 1.5 millimeters long. If that interface problem is solved, says Miller, he doesn’t see any reason why there couldn’t be 100,000 people with brain implants to control wheelchairs, computer cursors, or their own limbs. Schwartz adds that if it’s also possible to measure from enough neurons at once, someone could even play the piano with a thought-controlled robotic arm.
Researchers are pursuing several ideas for improving the brain interface. There are efforts to develop ultrathin electrodes, versions that are more compatible with the body, or sheets of flexible electronics that could wrap around the top of the brain. In San Francisco, doctors are studying whether such surface electrodes, although less precise, could be used in a decoder for speech, potentially allowing a person like Stephen Hawking to speak via a brain-computer interface. In an ambitious project launched last year at the University of California, Berkeley, researchers are trying to develop what they call “neural dust.” The goal is to scatter micron-size piezoelectric sensors throughout the brain and use reflected sound waves to capture electrical discharges from nearby neurons.
Jose Carmena, a Berkeley researcher who, like Schwartz, works with monkeys to test the limits of thought control, now meets weekly with a group of a dozen scientists to outline plans for better ways to record from neurons. But whatever they come up with would have to be tested in animals for years before it could be tried in a person. “I don’t think the Utah array is going to become the pacemaker of the brain,” he says. “But maybe what we end up using is not that different. You don’t see the newest computer in space missions. You need the most robust technology. It’s the same kind of thing here.”
To succeed, any new medical device needs to be safe, useful, and economically viable. Right now, brain-machine interfaces don’t meet these requirements. One problem is the riskiness of brain surgery and the chance of infection. At Brown, Donoghue says the BrainGate team is almost finished developing a wireless transmitter, about the size of a cigarette lighter, that would go under a person’s skin and cut the infection risk by getting rid of the pedestals and wires that make brain-machine interfaces so unwieldy. Donoghue says that with a wireless system, implants could be a realistic medical option soon.
But that raises another tricky problem: what will patients control? The arm Scheuermann controls is still a hugely expensive prototype, and it often breaks. She worries that not everyone could afford one. Instead, Leigh Hochberg, a neurologist at Massachusetts General Hospital who runs the BrainGate study with Donoghue, thinks the first users will probably be “locked-in” patients who can neither move nor speak. Hochberg would consider it a “breakthrough” to afford such patients reliable thought control over a computer mouse. That would let them type out words or change the channel on a television.
Yet even locked-in patients can often move their eyes. This means they have simpler ways to communicate, like using an eye tracker. A survey of 61 ALS patients by the University of Michigan found that about 40 percent of them would consider undergoing surgery for a brain implant, but only if it would let them communicate more than 15 words a minute (a fifth of the people who responded to the survey were already unable to speak). BrainGate has not yet reported speeds that high.
All the pieces of the technology “have at some level been solved,” says Andy Gotshalk, CEO of Blackrock Microsystems, which manufactures the Utah array and has acquired some of the BrainGate technology. “But if you ask me what is the product—is it a prosthetic arm or is it a wheelchair you control?—then I don’t know. There is a high-level product in mind, which is to make life for quadriplegics a lot easier. But exactly what it would be hasn’t been defined. It’s just not concrete. The scientists are getting some high-level publications, but I have to think about the business plan, and that is a problem.”
Without a clear product to shoot for, no large company has ever jumped in. And the risks for a business are especially high because there are relatively few patients with complete quadriplegia—about 40,000 in the U.S.—and even fewer with advanced ALS. A company Donoghue created, Cyberkinetics, went out of business after raising more than $30 million. Researchers instead get by on small grants that are insignificant compared with a typical commercial effort to develop a new medical device, which can cost $100 million. “There is not a single company out there willing to put the money in to create a neuroprosthetic for quadriplegics, and the market is not big enough for a venture capitalist to get in,” says Gotshalk. “The numbers don’t add up.”
Others think the technology behind brain-machine interfaces may have unexpected applications, far removed from controlling robot arms. Many researchers, including Carmena and the team at Battelle, are trying to determine whether the interfaces might help rehabilitate stroke patients. Because stroke patients are a far larger market, success there “would be a game changer,” Carmena says. Some of the recording technologies could also be useful for understanding psychiatric diseases like depression or obsessive-compulsive disorder.
In Scheuermann’s case, at least, her brain-machine interface has proved to be powerful medicine. When she first arrived at Pittsburgh, her doctors say, her affect was flat, and she didn’t smile. But being part of the experiment energized her. “I was loving it. I had coworkers for the first time in 20 years, and I felt needed,” she says. She finished dictating a mystery novel, Sharp as a Cucumber, that she’d started before she became ill and published it online. Now she’s working on a second one. Scheuermann told me she’d like to have a robotic arm at home. She’d be able to open the door, roll onto her lawn, and talk to neighbors. Maybe she’d open the refrigerator and grab a sandwich that her aide had prepared for her.
Our call was ending. The moment was awkward. I could hang up the phone, but she couldn’t. Her husband had gone out shopping. Hector was back in the lab. She was alone and couldn’t move. “That’s all right,” Scheuermann said. “I’ll just let the phone slip to the floor. Good-bye.”