Mobile Phone Mind Control
A new device from Dartmouth College lets users select and dial a contact’s phone number just by thinking about it.
NeuroPhone was developed by assistant professor of computer science Tanzeem Choudhury (a 2008 TR35 winner), professor Andrew Campbell, and others. It uses neural signals detected by an off-the-shelf wireless EEG headset from Emotiv to control an iPhone. A mind-controlled contact-dialing app flashes photos of contacts; when the user sees the contact she wants to call, a specific signature of brain activity triggers the system, which tells the iPhone to dial that person automatically.
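The flash-and-select loop described above can be sketched roughly as follows. This is a hypothetical simplification, not the researchers' actual implementation: the contact names, the threshold value, and the `eeg_response` stub (which stands in for a real classifier reading the Emotiv headset) are all illustrative assumptions.

```python
import random

CONTACTS = ["Alice", "Bob", "Carol"]
TARGET = "Bob"  # the contact the user is thinking about (simulation only)

def eeg_response(contact):
    """Stand-in for an EEG classifier: returns a strong score when the
    flashed photo matches the contact the user wants to call, and a weak
    score otherwise. A real system would analyze headset signals here."""
    base = 0.9 if contact == TARGET else 0.2
    return base + random.uniform(-0.05, 0.05)

def select_contact(contacts, threshold=0.6):
    """Flash each contact's photo in turn; select the first one whose
    brain-signal response crosses the threshold, or None if none does."""
    for contact in contacts:
        score = eeg_response(contact)
        if score > threshold:
            return contact
    return None

print(select_contact(CONTACTS))  # prints "Bob" in this simulation
```

In practice the hard part is the classifier: distinguishing the brain's response to the desired photo from background activity is far noisier than this threshold check suggests.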
In contrast to voice activation or eye tracking, the team says, brain control could allow for easy, silent phone use and for conveying emotional states to other users. The team also showed that the system works more reliably when it instead detects the electrical signals from muscle movement produced as a user winks to select a contact.
Other computer-human interaction devices based on EEG are used to give vegetative patients a way to communicate, let users play hands-free video games, and even allow robot-owners to convey messages to Roombas. But while most consumers probably wouldn’t want to walk around with an EEG-reading headband, if such interfaces become smaller, less intrusive, and cheaper (the one used in the experiment was about $300), this sort of device might well take off.
The researchers demo the device in the video below: