Could the tongue help a paraplegic pick up a slippery glass or a soft, squishy kitten? Justin Williams and his colleagues at the University of Wisconsin are testing electrical stimulation of the tongue as an adjunct to visual feedback for brain-controlled computer interfaces, such as those used to control prosthetic arms. In research presented at the Society for Neuroscience conference in Washington, DC, last week, the team showed that volunteers can control the movement of a cursor on a computer screen using electrical stimulation of the tongue just as well as they can using visual feedback.
“The tongue device opens new ways for devices to interact with the brain and the body in a collaborative way,” says Gerwin Schalk, a research scientist at the Wadsworth Center in Albany, NY, who was not involved in the research. “If we have more ways of providing feedback, it would enhance performance.”
The tongue stimulator consists of a thin-film array of 144 electrodes, a bit larger than a quarter, that sits on the surface of the tongue. A stimulator delivers electrical signals based on visual information: in this case, the movement of a dot on a computer screen. "It acts like a low-resolution monitor with a 12-by-12 array of pixels," says Williams. A similar device is already in use for people with balance disorders, in which tongue stimulation tells the user whether her head is upright, and is also being tested as a visual aid for the blind.
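To make the "low-resolution monitor" analogy concrete, here is a minimal sketch of how a screen position might be mapped onto a 12-by-12 electrode grid. The function name, normalized coordinates, and grid layout are assumptions for illustration, not details of the Wisconsin group's actual device:

```python
def cursor_to_electrode(x, y, grid_size=12):
    """Map a normalized cursor position (x, y in 0.0-1.0) to the
    (row, col) of an electrode on a hypothetical grid_size x grid_size
    tongue array."""
    # Clamp so a cursor at the far edge still maps to a valid electrode.
    col = min(int(x * grid_size), grid_size - 1)
    row = min(int(y * grid_size), grid_size - 1)
    return row, col

# A cursor in the center of the screen lands on the middle of the array.
print(cursor_to_electrode(0.5, 0.5))  # -> (6, 6)
```

Activating only the electrode under the mapped position (or a small neighborhood around it) would render the dot's location as a localized tingle, much as a single lit pixel marks a position on a monitor.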
For the current project, scientists attached the tongue device to a cap that reads electrical activity from the brain through the scalp using electroencephalography (EEG). The EEG cap is being developed as part of a neural prosthesis for people who are severely paralyzed. Specially developed software can detect when the user is thinking of moving his or her arms or feet, which can then be used to control a cursor on the screen.
Users usually determine how well they are performing a task, such as using brain activity to move a ball to a target on a screen, by looking at the position of the ball. Williams and his colleagues found that the volunteers could perform just as quickly and accurately when the ball’s location was represented as a pattern of electrical activity on the tongue with no visual information.
The researchers hope to use the device to augment prosthetic limbs. When we pick up a cup, we use visual signals to bring our hand to it, but we use tactile and other information, such as its heaviness or texture, to judge how hard it is to grasp. This type of feedback is missing from existing prosthetic limbs. “A prosthetic hand may be able to do wonderful things mechanically, but if the subject cannot feel anything on the fingertips, it vastly limits the amount of manipulation they may be able to do,” says Marc Schieber, a physician and scientist at the University of Rochester Medical School, who was not involved in the research.
While the tongue may seem like an odd location to deliver sensory information, Williams says that it is actually an ideal spot. "It is one of the most densely innervated organs; it has receptors for taste, pain, and temperature," he says. Under some conditions, the device can even stimulate tastes, like sweet and sour. As for how the electrical jolts feel: "Sort of like Pop Rocks," says Williams.