The Epoc+ is an $800 brain-wave-sensing headset marketed as being able to detect emotional states such as frustration or excitement, and to let you control robots with your thoughts.
Nitesh Saxena, an associate professor at the University of Alabama at Birmingham, has shown that it can also help software guess PINs and passwords by monitoring a person’s brain waves. The study joins a small but growing body of evidence on brain-interface security that researchers say shows even the limited headsets available today need better security.
“I would say it’s a risk for today’s devices, and with more advanced devices much more could be done in future,” says Saxena, of the prospects for private data being stolen with a brain interface. “People need to think through the privacy and security models of these interfaces.” Facebook and a new startup from Elon Musk are among those working on more advanced brain interfaces that would come with greater security risks (see “With Neuralink, Elon Musk Promises Human-to-Human Telepathy. Don’t Believe It”).
The Epoc+, made by Emotiv, is one of a handful of devices on the market that use a headset with electrodes to detect voltage changes in the outer layer of the brain, an approach known as electroencephalography, or EEG. The gadgets are used in research and medicine for tasks such as steering robots and diagnosing concussion, and are sold to consumers as game controllers (see “Controlling VR with Your Mind”).
EEG signals can’t be used to simply read out what a person is thinking or doing, and the control they can provide as interfaces is relatively crude. But the University of Alabama experiments add to evidence that they can still spill private information.
The new study tested the idea that a person who paused a gaming session and logged into a bank account while still wearing an EEG headset could be at risk from malicious software snooping on personal credentials via brain waves.
People first entered random PINs and passwords while wearing the headset, allowing software to learn the link between their typing and brain waves. Saxena says this training step could be achieved in the real world by a game that asked users to enter text or codes as part of gameplay, for example.
After observing a person enter about 200 characters, algorithms could make educated guesses at new characters a person entered just by watching the EEG data. That could let a malicious game, say, snoop on someone taking a break to go on the Web. It is far from perfect, but it shortens the odds of guessing a four-digit numerical PIN from one in 10,000 to one in 20, and increases the chance of guessing a six-letter password by around 500,000 times, to roughly one in 500.
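The improvement factors reported above can be sanity-checked with back-of-envelope arithmetic (a sketch using only the figures quoted in the article; the search-space sizes assume a numeric four-digit PIN and a six-lowercase-letter password):

```python
# Check the reported guessing odds against blind brute-force baselines.

# Four-digit numeric PIN: 10,000 equally likely possibilities.
pin_space = 10 ** 4
pin_odds_after = 1 / 20              # reported odds with EEG snooping
pin_gain = pin_odds_after * pin_space  # improvement over 1-in-10,000 guessing
# pin_gain == 500, i.e. a 500x better chance than random guessing

# Six-letter lowercase password: 26^6 ≈ 309 million possibilities.
pw_space = 26 ** 6
pw_gain = 500_000                    # reported improvement factor
pw_odds_after = pw_gain / pw_space   # ≈ 1/618, "roughly one in 500"
```

The numbers are internally consistent: boosting a 1-in-308,915,776 chance by a factor of 500,000 yields odds of about 1 in 618, in line with the "roughly one in 500" figure.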
When asked about the study, a spokesperson for Emotiv said that such an attack would be impractical. Users would become suspicious if a program tried to lead them through the training exercise needed for software to be able to guess at characters they enter, and Emotiv approves all software that connects to its headsets, the spokesperson said. But Alejandro Hernández, a security researcher with IOActive, who has reviewed the security of EEG hardware and related software, considers the Alabama attack “100 percent feasible.” His research indicated that a lot of EEG software in use today isn’t well designed, and is easily hackable.
Researchers at the University of Washington have demonstrated another way to extract private information using an EEG headset. They created games that subliminally flashed up images such as bank logos and noted when a person’s brain waves registered recognition. That could provide data valuable for phishing campaigns or ads, or even elicit information about a person’s sexual orientation, says Tamara Bonaci, a researcher who was involved in the work.
The Washington group says one motivation for its research is the way companies have aggressively gathered broad data on people’s use of the Web and from mobile devices—for example, to target ads.
Even without access to brain data, companies already mine text for clues to people’s emotional states, and documents leaked to The Australian newspaper show that Facebook has considered targeting ads at teens on the basis of their emotions. Last month, a lawyer and an ethicist at the University of Zurich called for the development of new legal frameworks around neurotechnology, including a “right to mental privacy.”
Bonaci says companies working on EEG headsets should engage with these issues now, because the stakes are rising as advances in machine learning are helping researchers extract more and more from EEG data. “The improvements have been tremendous over the last few years, and I expect that to continue,” she says.