
Samsung Demos a Tablet Controlled by Your Brain

An easy-to-use EEG cap could expand the number of ways to interact with your mobile devices.

One day, we may be able to check e-mail or call a friend without ever touching a screen or even speaking to a disembodied helper. Samsung is researching how to bring mind control to its mobile devices with the hope of developing ways for people with mobility impairments to connect to the world. The ultimate goal of the project, say researchers in the company’s Emerging Technology Lab, is to broaden the ways in which all people can interact with devices.

Thought launch: A Samsung researcher tests an EEG-controlled app on a tablet.

In collaboration with Roozbeh Jafari, an assistant professor of electrical engineering at the University of Texas at Dallas, Samsung researchers are testing how people can use their thoughts to launch an application, select a contact, select a song from a playlist, or power a Samsung Galaxy Note 10.1 up or down. While Samsung has no immediate plans to offer a brain-controlled phone, the early-stage research, which involves a cap studded with EEG-monitoring electrodes, shows how a brain-computer interface could help people with mobility issues complete tasks that would otherwise be impossible.

Brain-computer interfaces that monitor brainwaves through EEG have already made their way to market. NeuroSky’s headset uses EEG readings along with electromyography to gauge a person’s level of concentration, which can then control toys and games (see “Next-Generation Toys Read Brain Waves, May Help Kids Focus”). Emotiv Systems sells a headset that reads EEG signals and facial expressions to enhance the gaming experience (see “Mind-Reading Game Controller”).

To use EEG-detected brain signals to control a smartphone, the Samsung and UT Dallas researchers monitored a well-known pattern of brain activity that occurs when people are shown repetitive visual stimuli. In their demonstration, the researchers found that people could launch an application and make selections within it by concentrating on an icon that was blinking at a distinctive frequency.
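This paradigm is what BCI researchers commonly call a steady-state visual evoked potential (SSVEP) interface: each icon flickers at its own rate, and EEG recorded over the visual cortex shows elevated power at the frequency of whichever icon the user is attending to. The article doesn’t describe Samsung’s actual decoder, but a minimal sketch of the standard approach, comparing spectral power at each candidate flicker rate, might look like the following (the sampling rate, single-channel setup, and icon frequencies are illustrative assumptions):

```python
from scipy.signal import welch

FS = 256  # sampling rate in Hz -- an assumption, not from the article
# Hypothetical flicker rates (Hz) for three on-screen icons
ICONS = {"mail": 8.0, "music": 11.0, "contacts": 13.0}

def classify_ssvep(eeg_window, fs=FS, icons=ICONS, tol=0.5):
    """Return the icon whose flicker frequency shows the strongest
    response in this EEG window (one occipital channel assumed)."""
    # Welch periodogram; 2-second segments give 0.5 Hz resolution.
    freqs, psd = welch(eeg_window, fs=fs, nperseg=2 * fs)
    scores = {}
    for label, f0 in icons.items():
        band = (freqs >= f0 - tol) & (freqs <= f0 + tol)
        scores[label] = psd[band].mean()  # mean power near this flicker rate
    return max(scores, key=scores.get)

# Usage: pass a few seconds of filtered samples from one electrode.
# choice = classify_ssvep(samples)
```

A real decoder would add artifact rejection and a confidence threshold so the system stays idle when the user isn’t looking at any icon.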

Robert Jacob, a human-computer interaction researcher at Tufts University, says the project fits into a broader effort by researchers to find more ways to communicate with small devices like smartphones. “This is one of the ways to expand the type of input you can have and still stick the phone in the pocket,” he says.

Finding new ways to interact with mobile devices has driven the project, says Insoo Kim, Samsung’s lead researcher. “Several years ago, a small keypad was the only input modality to control the phone, but nowadays the user can use voice, touch, gesture, and eye movement to control and interact with mobile devices,” says Kim. “Adding more input modalities will provide us with more convenient and richer ways of interacting with mobile devices.”

Still, it will take considerable research for a brain-computer interface to become a new way of interacting with smartphones, says Kim. The team’s initial focus was to develop signal-processing methods that could extract, from weak and noisy EEG signals, the information needed to control a device, and to get those methods running on a mobile device.
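The article doesn’t spell out that pipeline, but a common first stage in almost any EEG front end, offered here as a sketch under assumed parameters rather than as Samsung’s method, is to notch out power-line hum and bandpass the signal to the range where the flicker response lives:

```python
from scipy.signal import butter, filtfilt, iirnotch

FS = 256  # sampling rate in Hz (assumed)

def clean_eeg(raw, fs=FS, lo=5.0, hi=45.0, mains=60.0):
    """Typical EEG pre-cleaning: remove mains interference, then keep
    only the band carrying the visual response and its harmonics.
    All cutoffs here are illustrative."""
    # Narrow notch at the power-line frequency (60 Hz in the U.S.).
    bn, an = iirnotch(mains / (fs / 2), Q=30.0)
    x = filtfilt(bn, an, raw)
    # 4th-order Butterworth bandpass, run forward and backward
    # (filtfilt) so the filtering adds no phase delay.
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)
```

Running filters like these on a phone is the easy part; the harder problem the article alludes to is that EEG amplitudes are on the order of microvolts, so the useful signal sits very close to the noise floor.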

Jafari’s research is addressing another challenge—developing more convenient EEG sensors. Classic EEG systems have gel or wet contact electrodes, which means a bit of liquid material has to come between a person’s scalp and the sensor. “Depending on how many electrodes you have, this can take up to 45 minutes to set up, and the system is uncomfortable,” says Jafari. His sensors, however, do not require a liquid bridge and take about 10 seconds to set up, he says. But they still require the user to wear a cap covered with wires.

The concept of a dry EEG is not new, and it can carry the drawback of lower signal quality, but Jafari says his group is improving the system’s processing of brain signals. Ultimately, if reliable EEG contacts were convenient to use and slimmed down, a brain-controlled device could look like “a cap that people wear all day long,” says Jafari.

Kim says how quickly a person can control the tablet with the EEG system varies from user to user. In the team’s limited experiments, users could, on average, make a selection once every five seconds, with accuracy ranging from 80 to 95 percent.
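Those two numbers can be combined into the standard benchmark for BCI throughput, the Wolpaw information transfer rate. The article doesn’t say how many targets were on screen, so the four-icon menu below is purely hypothetical, included only to show the arithmetic:

```python
from math import log2

def itr_bits_per_min(n_targets, accuracy, secs_per_selection):
    """Wolpaw information transfer rate (valid for accuracy < 1):
    bits/selection = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))"""
    n, p = n_targets, accuracy
    bits = log2(n) + p * log2(p) + (1 - p) * log2((1 - p) / (n - 1))
    return bits * 60.0 / secs_per_selection

# One selection every 5 seconds, as reported, on a hypothetical 4-icon menu:
print(itr_bits_per_min(4, 0.80, 5.0))  # ~11.5 bits/min at 80% accuracy
print(itr_bits_per_min(4, 0.95, 5.0))  # ~19.6 bits/min at 95% accuracy
```

Even the optimistic end of that range is slow next to touch or voice input, which is consistent with Kim’s caution that considerable research remains.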

“It is nearly impossible to accurately predict what the future might bring,” says Kim, “but given the broad support for initiatives such as the U.S. BRAIN initiative, improvements in man-machine interfaces seem inevitable” (see “Interview with BRAIN Project Pioneer: Miyoung Chun”).
