Turning Thoughts into Words
A new approach allows more information to be extracted from the brain.
Brain-computer interfaces could someday provide a lifeline to “locked-in” patients, who are unable to talk or move but are aware and awake. Many of these patients can communicate by blinking their eyes, but turning blinks into words is time-consuming and exhausting.
Scientists in Utah have now demonstrated a way to determine which of 10 distinct words a person is thinking by recording the electrical activity from the surface of the brain.
The new technique involves training algorithms to recognize specific brain signals picked up by an array of nonpenetrating electrodes placed over the language centers of the brain, says Spencer Kellis, one of the bioengineers who carried out the work at the University of Utah, in Salt Lake City. The approach used is known as electrocorticography (ECoG). The group was able to identify the words “yes,” “no,” “hot,” “cold,” “thirsty,” “hungry,” “hello,” “goodbye,” “more,” and “less” with an accuracy of 48 percent.
“The accuracy definitely needs to be improved,” says Kellis. “But we have shown the information is there.”
Individual words have been decoded from brain signals in the past using functional magnetic resonance imaging (fMRI), says Eric Leuthardt, director of the Center for Innovation in Neuroscience and Technology at Washington University School of Medicine in St. Louis, Missouri. This is the first time that the feat has been performed using ECoG, a far more practical and portable approach than fMRI, he says.
Working with colleagues Bradley Greger and Paul House, Kellis placed 16 electrodes on the surface of the brain of a patient being treated for epilepsy. The electrodes recorded signals from the facial motor cortex–an area of the brain that controls face muscles during speech–and over Wernicke’s area, part of the cerebral cortex that is linked with language. To train the algorithm, the team analyzed these signals as the patient was asked to repeatedly utter each of the 10 words.
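The article does not name the classification algorithm the Utah team used, but the training step it describes can be sketched in broad strokes: label each recorded signal epoch with the word being spoken, extract a feature per electrode, and fit a classifier. The sketch below is purely illustrative, using synthetic data, assumed trial counts, and linear discriminant analysis (a common choice in brain-computer-interface work) as a stand-in for whatever method the researchers actually applied.

```python
# Illustrative sketch only: synthetic "band-power" features and an LDA
# classifier stand in for the study's real signals and (unspecified) method.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
WORDS = ["yes", "no", "hot", "cold", "thirsty",
         "hungry", "hello", "goodbye", "more", "less"]
N_ELECTRODES = 16        # grid size reported in the article
TRIALS_PER_WORD = 20     # assumed repetition count (not reported)

# One feature vector per utterance: mean power per electrode, with a
# word-specific offset standing in for real neural discriminability.
X, y = [], []
for label in range(len(WORDS)):
    centroid = rng.normal(0.0, 1.0, N_ELECTRODES)
    X.append(centroid + rng.normal(0.0, 2.0, (TRIALS_PER_WORD, N_ELECTRODES)))
    y.append(np.full(TRIALS_PER_WORD, label))
X, y = np.vstack(X), np.concatenate(y)

# Cross-validated accuracy; chance level for 10 words is 10 percent.
acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
print(f"decoding accuracy: {acc:.0%}")
```

For context, the 48 percent figure the group reports should be read against that 10 percent chance level for a 10-word vocabulary: well above chance, but far short of clinical reliability.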
ECoG has long been used to locate the source of epileptic seizures in the brain. But the electrodes used clinically are typically several hundred microns in size and are positioned centimeters apart, says Kellis. “The brain is doing processing at a much finer spatial scale than is really detectable by these standard clinical electrodes,” he says. The Utah team used a new type of microelectrode array developed by PMT Neurosurgical. The electrodes are much smaller–40 microns in size–and are separated by a couple of millimeters.
It’s possible to use less invasive techniques, such as electroencephalography (EEG), which places electrodes on the scalp, to enable brain-to-computer communications. Adrian Owen, a senior scientist in the Cognition and Brain Sciences Unit at the University of Cambridge, UK, has shown that EEG signals can be used to allow people in a persistent vegetative state to communicate “yes” and “no.”
But with EEG, many of the signals are filtered out by the skull, says Leuthardt. “What’s really nice about ECoG is its potential to give us a lot more information,” he says.
Decoding 10 words is “very cool,” says Owen, but the accuracy will need to improve dramatically, given the patients the technology is aimed at. “I don’t think even 60 percent or 70 percent accuracy is going to work for patients who cannot communicate in any other way and where there is no other margin for verification,” he says.
Ultimately, the hope is that ECoG will enable much more sophisticated communication. Last year Leuthardt showed that ECoG could be used to decode vowel and consonant sounds–an approach that might eventually be used to reconstruct a much larger number of complete words.