
A Computer Interface that Takes a Load Off Your Mind

A wearable brain scanner could give computers insight into how hard you’re thinking.

Conversations between people include a lot more than just words. All sorts of visual and aural cues indicate each party’s state of mind and make for a productive interaction.

Mental load: A user tries the Brainput system.

But a furrowed brow, a gesticulating hand, and a beaming smile are all lost on computers. Now, researchers at MIT and Tufts are experimenting with a way for computers to gain a little insight into our inner world.

Their system, called Brainput, is designed to recognize when a person’s workload is excessive and then automatically modify a computer interface to make it easier. The researchers used a lightweight, portable brain monitoring technology, called functional near-infrared spectroscopy (fNIRS), that determines when a person is multitasking. Analysis of the brain scan data was then fed into a system that adjusted the user’s workload at those times. A computing system with Brainput could, in other words, learn to give you a break.
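To make the idea concrete, here is a minimal sketch of that kind of feedback loop: a window of brain-sensor readings is mapped to a workload estimate, and the interface lightens itself when the estimate crosses a threshold. This is purely illustrative; the function names, thresholds, and the simple averaging "classifier" are assumptions, not the researchers' actual pipeline.

```python
# Hypothetical sketch of a brain-driven adaptive interface (not the Brainput code).
from dataclasses import dataclass
from typing import Sequence


@dataclass
class InterfaceState:
    # Hypothetical knobs an application might expose.
    notifications_enabled: bool = True
    automation_level: float = 0.0  # 0 = fully manual, 1 = fully automated


def estimate_workload(fnirs_window: Sequence[float]) -> float:
    """Placeholder for a trained classifier over fNIRS features.

    The mean of the window stands in for a workload score in [0, 1];
    a real system would use a model fit to labeled brain data.
    """
    if not fnirs_window:
        return 0.0
    return max(0.0, min(1.0, sum(fnirs_window) / len(fnirs_window)))


def adapt_interface(state: InterfaceState, workload: float, threshold: float = 0.7) -> InterfaceState:
    """Lighten the interface when estimated workload crosses a threshold."""
    overloaded = workload >= threshold
    return InterfaceState(
        notifications_enabled=not overloaded,
        automation_level=0.8 if overloaded else 0.0,
    )


if __name__ == "__main__":
    # Simulated fNIRS windows: low load, then high load.
    for window in ([0.2, 0.3, 0.25], [0.8, 0.9, 0.85]):
        load = estimate_workload(window)
        ui = adapt_interface(InterfaceState(), load)
        print(f"workload={load:.2f} -> automation={ui.automation_level}, "
              f"notifications={'on' if ui.notifications_enabled else 'off'}")
```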

There are other ways that a computer could detect when a person’s mental workload is becoming overwhelming. It could, for example, log errors in typing or speed of keystrokes. It could also use computer vision to detect facial expressions. “Brainput tries to get closer to the source, by looking directly at brain activity,” says Erin Treacy Solovey, a postdoctoral researcher at MIT. She presented the results last Wednesday at the Computer-Human Interaction conference in Austin, Texas.

For an experiment, Treacy Solovey and her team incorporated Brainput into virtual robots designed to adapt to the mental state of their human controller. The main goal was for each operator, capped with fNIRS headgear, to guide two different robots through a maze to find a location where a Wi-Fi signal was strong enough to send a message. But here’s what made it tough: the drivers had to constantly switch between the two robots, trying to keep track of both their locations and keep them from crashing into walls.

As the research subjects drove their robots toward the strongest Wi-Fi signal, their fNIRS sensors transmitted information about their mental state to the robots. The robots, for their part, were programmed to focus on a state of mind called branching, in which a person is simultaneously working on two goals that require attention. (Previous studies have correlated certain fNIRS signals to this sort of mental state.) When the robots sensed that the driver was branching, they took on more of the navigation themselves.
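A rough sketch of that control policy might look like the following. Everything here is hypothetical: the branching detector, its threshold, and the control-step logic are stand-ins for the study’s actual classifier and robot software, chosen only to illustrate how a detected branching state could hand more navigation over to the robot.

```python
# Hypothetical control loop: increase robot autonomy when branching is detected.
import random


def branching_detected(fnirs_signal: float, threshold: float = 0.6) -> bool:
    """Stand-in for the classifier that flags a 'branching' mental state."""
    return fnirs_signal >= threshold


def control_step(operator_command: str, fnirs_signal: float) -> str:
    """Decide how much of the navigation the robot handles on this time step."""
    if branching_detected(fnirs_signal):
        # Operator is juggling both robots: the robot takes on more navigation.
        return "autonomous: avoid walls, continue toward stronger Wi-Fi"
    # Operator has spare attention: follow their command directly.
    return f"manual: execute '{operator_command}'"


if __name__ == "__main__":
    random.seed(0)
    for step in range(5):
        signal = random.random()   # simulated fNIRS-derived reading
        command = "turn left"      # simulated operator input
        print(f"step {step}: signal={signal:.2f} -> {control_step(command, signal)}")
```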

The researchers found that when the robots’ autonomous mode kicked in, the overall performance of the human-robot team improved. The drivers didn’t seem to notice or get frustrated by the autonomous behavior of the robot when they were multitasking. The researchers also tried increasing the autonomy of the robots when Brainput did not indicate that users were mentally overloaded. When they did this, they found that overall performance decreased. In other words, increased autonomy only helped when users were struggling to cope.

“A good chunk of computer and human-computer interaction research these days is focused on giving computers better senses so they can either implicitly or explicitly augment our intellect and assist with our tasks,” says Desney Tan, a researcher at Microsoft Research. “This work is a wonderful first step toward understanding our changing mental state and designing interfaces that dynamically tailor themselves so that the human-computer system can be as effective as possible.”

Treacy Solovey suggests that such a system could potentially be used to help drivers, pilots, and supervisors of unmanned aerial vehicles. She says future work will investigate other cognitive states that can be reliably measured using fNIRS.
