Computers could be a lot more useful if they paid attention to how you felt. With the emergence of new tools that can measure a person’s biological state, computer interfaces are starting to do exactly that: take users’ feelings into account. So claim several speakers at Blur, a conference this week in Orlando, Florida, that focused on human-computer interaction.
Kay Stanney, owner of Design Interactive, an engineering and consulting firm that works with the Defense Advanced Research Projects Agency and the Office of Naval Research, says that a lot of information about a user’s mental and physiological state can be measured, and that this data can help computers cater to that user’s needs.
Design Interactive is prototyping Next Generation Interactive Systems, or NexIS, a system that will place biological sensors on soldiers. If a sensor detects that a soldier’s pulse is weakening, for example, the system might call for help or administer adrenaline. Similar technology could prove useful in civilian settings, Stanney says. Sensors on air traffic controllers or baggage screeners could help prevent errors or poor performance, she says.
Design Interactive is working on another project called Auto-Diagnostic Adaptive Precision Training for Baggage Screeners (Screen-ADAPT), which would aid in training by using measurements including electroencephalography, eye tracking, and heart-rate monitoring to assess performance. The idea is to learn how successful screeners scan an image so others can apply similar techniques.
Stanney admits this is challenging, because not every successful baggage screener does the job in exactly the same way. “This will really come down to the art of the algorithm—what it is that we’re trying to optimize,” she says. Sensors can already detect when a person is drowsy, distracted, overloaded, or engaged. But it would be ideal to be able to determine other states such as frustration, or even to distinguish between different types of frustration.
Some companies are already applying these ideas. Mercedes, for example, has developed algorithms that watch how a driver operates the steering wheel to detect signs of drowsiness. Stanney says the approach could also make personal computers more useful. For example, a computer might eventually be able to detect when a user is overloaded and then suggest focusing on one application.
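The steering-wheel approach can be illustrated with a toy heuristic. Mercedes has not published its algorithm, so the sketch below is purely hypothetical: it flags a long quiet stretch of micro-corrections followed by one abrupt correction, a pattern sometimes associated with a driver briefly nodding off. The function name, thresholds, and window size are all invented for illustration.

```python
from statistics import pstdev

def drowsy_alert(angles, quiet_std=1.0, jerk_deg=5.0, window=10):
    """Hypothetical drowsiness heuristic (not Mercedes' actual method).

    angles: recent steering-wheel angles in degrees, oldest first.
    Returns True when a quiet stretch (low variability over `window`
    samples) is followed by one abrupt correction over `jerk_deg`.
    """
    if len(angles) < window + 1:
        return False  # not enough history to judge
    quiet = pstdev(angles[-window - 1:-1]) < quiet_std
    jerk = abs(angles[-1] - angles[-2]) > jerk_deg
    return quiet and jerk
```

A production system would of course fuse many more signals (speed, lane position, time of day) and tune thresholds per driver; the point here is only that a behavioral sensor stream can be reduced to a simple state estimate.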
Hans Lee, chief technical officer of EmSense, a San Francisco company that measures users’ cognitive and emotional state for the purpose of market research, says there are plenty of potential applications for a computer that can read a human’s mood. “No matter what you do, emotion matters,” he says.
Lee says studies suggest that 40 percent of people verbally abuse their computers. A device capable of recognizing a user’s frustration and addressing it could make workers more efficient and result in fewer broken monitors. “What if your computer could apologize to you?” he says.