Researchers in California have created a way to place a call on a cell phone using just your thoughts. Their new brain-computer interface is almost 100 percent accurate for most people after only a brief training period.
The system was developed by Tzyy-Ping Jung, a researcher at the Swartz Center for Computational Neuroscience at the University of California, San Diego, and colleagues. Besides acting as an ultraportable aid for severely disabled people, the system might one day have broader uses, he says. For example, it could create the ultimate hands-free experience for cell-phone users, or be used to detect when drivers or air-traffic controllers are getting drowsy by sensing lapses in concentration.
Like many other such interfaces, Jung’s system relies on electroencephalogram (EEG) electrodes on the scalp to analyze electrical activity in the brain. An EEG headband is hooked up to a Bluetooth module that wirelessly transmits the signals to a Nokia N73 cell phone, where software processes them.
Participants were trained on the system via a novel visual-feedback scheme. They were shown images on a computer screen that flashed on and off almost imperceptibly, each at a slightly different rate. These flicker frequencies can be detected in a midline region of the occipital lobe, the brain’s visual-processing area. Jung and his colleagues exploited this by displaying a keypad on a large screen with each number flashing at a slightly different frequency: “1” flashed at nine hertz, “2” at 9.25 hertz, and so on. Because the flicker frequency of whichever key the subject fixates on shows up in the EEG signal, Jung says, the system can tell which number the subject is looking at.
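The frequency-tagging idea described above can be sketched in a few lines: take the power spectrum of an EEG window and pick the keypad digit whose flicker frequency carries the most power. This is a minimal illustration of the general technique, not Jung’s actual implementation; the sampling rate, window length, and all digit frequencies other than the 9-hertz and 9.25-hertz values mentioned in the article are assumptions.

```python
import numpy as np

FS = 256  # EEG sampling rate in Hz (assumed)

# Each digit is tagged with its own flicker frequency. The article gives
# "1" -> 9 Hz and "2" -> 9.25 Hz; the rest of the mapping is hypothetical.
DIGIT_FREQS = {str(d): 9.0 + 0.25 * (d - 1) for d in range(1, 10)}
DIGIT_FREQS["0"] = 11.25

def classify_digit(eeg: np.ndarray, fs: int = FS) -> str:
    """Return the digit whose flicker frequency dominates the EEG spectrum."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)

    def power_at(f: float) -> float:
        # Power in the spectral bin nearest the tagged frequency.
        return spectrum[np.argmin(np.abs(freqs - f))]

    return max(DIGIT_FREQS, key=lambda d: power_at(DIGIT_FREQS[d]))

# Toy usage: an 8-second window gives 0.125 Hz frequency resolution,
# enough to separate tags spaced 0.25 Hz apart. A noisy 9.5 Hz
# oscillation should be classified as digit "3".
t = np.arange(0, 8, 1.0 / FS)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 9.5 * t) + 0.5 * rng.standard_normal(t.size)
print(classify_digit(signal))  # prints "3"
```

In a real system the window must be long enough that neighboring flicker frequencies fall into distinct spectral bins, which is one reason closely spaced tags (0.25 hertz apart) trade off against response speed.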
“From our experience, anyone can do it. Some people have a higher accuracy than others,” says Jung, who himself can only reach around 85 percent accuracy. But in an experiment published in the Journal of Neural Engineering, 10 subjects were asked to input a 10-digit phone number, and seven of them achieved 100 percent accuracy.
In theory, the approach could be used to help severely disabled people communicate, says Jung. But he believes the technology doesn’t have to be limited to such applications. “I want to target larger populations,” he says.
“It’s interesting work,” says Rajeev Raizada, a cognitive neuroscientist at Dartmouth College who published work last year on a similar concept called the Neurophone. “People have used this sort of visually evoked response before, but the notion of making it small, cheap, and portable for a cell phone is attractive.”
The Neurophone used a brain signal known as the P300. This signal is triggered by a range of different stimuli and is used by other brain-computer interfaces to gauge when something has caught a person’s attention. But P300-based systems typically require a longer training period.
However, Eric Leuthardt, director of the Center for Innovation and Neuroscience Technology at Washington University, is less convinced. “Reducing the size of the processors to a cell phone is a natural step,” he says. The kind of visually evoked response used in Jung’s research has been around for years, he notes, but it usually requires a large visual stimulus, which small cell-phone displays are unlikely to provide.