Using neural activity recorded from a sheet of electrodes laid directly on the surface of a patient’s brain, scientists can predict the movement of fingers, as well as which of several sounds the patient is imagining. Eventually, researchers hope to use the findings to develop intuitive neural prostheses, such as a robotic hand that moves its fingers with as little mental effort as it takes to move real ones, or a computer interface that detects imagined words. To realize this vision, scientists are also developing smaller, more flexible technology, which could be more easily implanted and make better contact with the brain. Details of the latest brain-computer interface technology were presented this week at the Society for Neuroscience conference in Washington, DC.
“It could create the basis for a brain-computer interface that is very intuitive, and a recording platform that is very robust,” says Gerwin Schalk, a research scientist at the Wadsworth Center, in Albany, NY, who led one of the projects.
Schalk and his colleagues studied epilepsy patients undergoing a procedure known as electrocorticography (ECoG), in which a flat array of electrodes is laid over an exposed section of cortex to record electrical activity. Normally, surgeons use this information to pinpoint the source of seizures and to map the location of specific brain functions, which must be avoided during surgery. The technique provides better spatial resolution than electroencephalography (EEG), a noninvasive approach that records activity through the scalp. ECoG is now being explored for use in brain-computer interfaces. “There’s a growing interest in use of ECoG signals because nothing penetrates into the brain, and that appeals to people more than penetrating electrodes,” says Marc Schieber, a physician and scientist at the University of Rochester Medical School, who was not involved in the research.
Schalk and his collaborators recorded electrical activity from the motor cortex and Broca’s area, a part of the brain involved in speech, in five patients as they moved their hands and fingers in specific ways and vocalized or imagined specific sounds. The researchers then used specially developed algorithms to search the neural activity for patterns relating to a certain movement or sound. “We can tell you how they are flexing each of their fingers,” says Schalk. What’s more, the researchers could determine in real time which of two sounds a patient was imagining. This kind of information could be used to control a brain-computer interface, providing a lifeline for people with severe paralysis, such as that associated with end-stage amyotrophic lateral sclerosis, a neurodegenerative disease, or locked-in syndrome, the result of a specific kind of stroke that leaves the patient unable to move or communicate.
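The article does not describe the team’s actual algorithms, but the basic idea, learning which spatial pattern of electrode activity goes with each imagined sound, can be sketched with a toy nearest-centroid decoder. Everything here is an illustrative assumption: the channel count, the simulated band-power features, and the two-class setup are invented for the example, not taken from the Wadsworth group’s pipeline.

```python
# Toy sketch of two-class neural decoding, as one might apply to ECoG
# recordings: each imagined sound is assumed to evoke a slightly
# different spatial pattern of activity across the electrode grid.
# All numbers and features are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 8   # electrodes over the recorded cortical area (assumed)
N_TRIALS = 100   # trials per imagined sound (assumed)

# Simulate per-channel band-power features for two imagined sounds,
# each a fixed pattern plus trial-to-trial noise.
pattern_a = rng.normal(0, 1, N_CHANNELS)
pattern_b = rng.normal(0, 1, N_CHANNELS)
X_a = pattern_a + rng.normal(0, 0.8, (N_TRIALS, N_CHANNELS))
X_b = pattern_b + rng.normal(0, 0.8, (N_TRIALS, N_CHANNELS))

def train(trials_a, trials_b):
    """Learn the mean activity pattern (centroid) for each sound."""
    return trials_a.mean(axis=0), trials_b.mean(axis=0)

def decode(trial, centroid_a, centroid_b):
    """Classify a single trial by its nearest learned centroid."""
    if np.linalg.norm(trial - centroid_a) < np.linalg.norm(trial - centroid_b):
        return "a"
    return "b"

# Train on the first 80 trials of each class, test on the rest.
c_a, c_b = train(X_a[:80], X_b[:80])
test_set = [(t, "a") for t in X_a[80:]] + [(t, "b") for t in X_b[80:]]
accuracy = np.mean([decode(t, c_a, c_b) == label for t, label in test_set])
print(f"decoding accuracy on held-out trials: {accuracy:.0%}")
```

Because each trial is decoded independently, the same `decode` call could run on a live feature stream, which is the sense in which such a classifier supports real-time brain-computer interface control.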