MIT Technology Review

Computers Get in Touch with Your Emotions

Machines that respond to your emotional state could help you focus better on the task at hand.

Computers could be a lot more useful if they paid attention to how you felt. With the emergence of new tools that can measure a person’s biological state, computer interfaces are starting to do exactly that: take users’ feelings into account. So claim several speakers at Blur, a conference this week in Orlando, Florida, that focused on human-computer interaction.

Look out: A system developed by Design Interactive monitors a Marine at Camp Lejeune in North Carolina as he scans images for weapons.

Kay Stanney, owner of Design Interactive, an engineering and consulting firm that works with the Defense Advanced Research Projects Agency and the Office of Naval Research, says that a lot of information about a user’s mental and physiological state can be measured, and that this data can help computers cater to that user’s needs.


Design Interactive is prototyping Next Generation Interactive Systems, or NexIS, a system that will place biological sensors on soldiers. If a sensor detects that a soldier’s pulse is weakening, for example, the system might call for help or administer adrenaline. Similar technology could prove useful in civilian settings, Stanney says: sensors on air traffic controllers or baggage screeners could help prevent errors and lapses in performance.
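Design Interactive has not published how NexIS makes that call. As a rough illustration of the idea, a threshold alert over streaming vital signs might look like the sketch below; the sensor fields, the 40 bpm cutoff, and the alert text are all invented for the example, not details of the actual system.

```python
# Illustrative sketch only: the sensor fields, cutoff, and alert path are
# assumptions for this example, not details of Design Interactive's NexIS.
from dataclasses import dataclass

PULSE_ALERT_BPM = 40  # assumed threshold for a "weakening pulse"

@dataclass
class VitalsReading:
    soldier_id: str
    pulse_bpm: float

def check_vitals(reading: VitalsReading) -> str | None:
    """Return an alert message if the pulse drops below the cutoff."""
    if reading.pulse_bpm < PULSE_ALERT_BPM:
        return (f"ALERT: {reading.soldier_id} pulse at "
                f"{reading.pulse_bpm:.0f} bpm; request medical support")
    return None

print(check_vitals(VitalsReading("alpha-3", 36.0)))
```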


Design Interactive is working on another project called Auto-Diagnostic Adaptive Precision Training for Baggage Screeners (Screen-ADAPT), which would aid in training by using measurements including electroencephalography, eye tracking, and heart-rate monitoring to assess performance. The idea is to learn how successful screeners scan an image so others can apply similar techniques.

Stanney admits this is challenging, because not every successful baggage screener does the job in exactly the same way. “This will really come down to the art of the algorithm—what it is that we’re trying to optimize,” she says. Sensors can already detect when a person is drowsy, distracted, overloaded, or engaged. But it would be ideal to be able to determine other states such as frustration, or even to distinguish between different types of frustration.
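As a toy illustration of what such state detection could look like, the rule-based classifier below fuses a few sensor features into one of the four labels Stanney mentions. The feature names and cutoffs are invented for the example; this is not Screen-ADAPT’s actual algorithm.

```python
# Toy rule-based fusion of eye-tracking, heart-rate, and EEG features into
# a coarse operator state. All feature names and thresholds are invented.
def classify_state(blink_rate_hz: float,
                   heart_rate_bpm: float,
                   eeg_engagement: float) -> str:
    """Map fused sensor features to one of four coarse operator states."""
    if blink_rate_hz > 0.5 and eeg_engagement < 0.3:
        return "drowsy"       # heavy blinking plus low engagement
    if eeg_engagement < 0.3:
        return "distracted"   # low engagement without the blink signature
    if heart_rate_bpm > 100 and eeg_engagement > 0.8:
        return "overloaded"   # high arousal and maxed-out engagement
    return "engaged"

print(classify_state(blink_rate_hz=0.2, heart_rate_bpm=72, eeg_engagement=0.9))
# -> "engaged"
```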

Some companies are already applying these ideas. Mercedes, for example, has developed algorithms that watch how a driver operates the steering wheel to detect signs of drowsiness. Stanney says the approach could also make personal computers more useful. For example, a computer might eventually be able to detect when a user is overloaded and then suggest focusing on one application.
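Mercedes has not published the details of its steering algorithms. A crude stand-in for the idea is sketched below, flagging the classic drowsiness signature of long steering stillness broken by one abrupt correction; the window and both thresholds are invented for the example.

```python
# Crude stand-in for steering-based drowsiness detection. Alert drivers make
# frequent small corrections; drowsy ones hold the wheel still, then jerk it.
# Both thresholds here are invented for illustration.
def looks_drowsy(steering_angles_deg: list[float],
                 stillness_threshold_deg: float = 0.5,
                 jerk_threshold_deg: float = 15.0) -> bool:
    """Flag a trace that is mostly still but contains one sharp correction."""
    deltas = [abs(b - a) for a, b in zip(steering_angles_deg,
                                         steering_angles_deg[1:])]
    still_fraction = sum(d < stillness_threshold_deg for d in deltas) / len(deltas)
    return still_fraction > 0.9 and max(deltas) > jerk_threshold_deg

# A nearly flat steering trace ending in one sharp correction gets flagged.
trace = [0.1] * 50 + [20.0]
print(looks_drowsy(trace))  # True
```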

Hans Lee, chief technical officer of EmSense, a San Francisco company that measures users’ cognitive and emotional states for market research, says there are plenty of potential applications for a computer that can read a human’s mood. “No matter what you do, emotion matters,” he says.

Lee says studies suggest that 40 percent of people verbally abuse their computers. A device capable of recognizing a user’s frustration and addressing it could make workers more efficient, and mean fewer broken monitors. “What if your computer could apologize to you?” he says.

