A Smart Phone that Knows You're Angry
New system detects emotions based on variables ranging from typing speeds to the weather.
Researchers at Samsung have developed a smart phone that can detect people’s emotions. Rather than relying on specialized sensors or cameras, the phone infers a user’s emotional state from how the device is being used.
For example, it monitors certain inputs, such as the speed at which a user types, how often the “backspace” or “special symbol” buttons are pressed, and how much the device shakes. These measures let the phone infer whether the user is happy, sad, surprised, fearful, angry, or disgusted, says Hosub Lee, a researcher with Samsung Electronics and the Samsung Advanced Institute of Technology’s Intelligence Group, in South Korea, who led the work on the new system. He says that such inputs may seem to have little to do with emotions, but there are subtle correlations between these behaviors and one’s mental state, which the software’s machine-learning algorithms can detect with an accuracy of 67.5 percent.
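To make the inputs concrete, here is a minimal sketch of how raw usage data of the kind Lee describes might be reduced to classifier features. The class name, field names, and rate calculations are illustrative assumptions, not Samsung’s actual schema:

```python
from dataclasses import dataclass

# Hypothetical per-session measurements of the sort the article mentions:
# typing speed, backspace and special-symbol presses, and device shake.
@dataclass
class TypingSession:
    chars_typed: int
    duration_sec: float
    backspace_count: int
    special_symbol_count: int
    shake_magnitude: float  # e.g., variance of accelerometer readings

    def features(self) -> dict:
        """Convert raw counts into rate-style features a classifier could use."""
        keys = max(self.chars_typed, 1)  # avoid division by zero
        return {
            "typing_speed": self.chars_typed / max(self.duration_sec, 1e-6),
            "backspace_ratio": self.backspace_count / keys,
            "special_symbol_ratio": self.special_symbol_count / keys,
            "shake": self.shake_magnitude,
        }

session = TypingSession(chars_typed=120, duration_sec=60.0,
                        backspace_count=18, special_symbol_count=4,
                        shake_magnitude=0.8)
print(session.features())
```

A session with many corrections and heavy shaking would yield a high `backspace_ratio` and `shake` value, which is the kind of pattern the system correlates with emotional state.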
The prototype system, to be presented in Las Vegas next week at the Consumer Communications and Networking Conference, is designed to work as part of a Twitter client on an Android-based Samsung Galaxy S II. It enables people in a social network to view symbols alongside tweets that indicate the sender’s emotional state. But there are many more potential applications, says Lee. The system could trigger different ringtones on a phone to convey the caller’s emotional state or cheer up someone who’s feeling low. “The smart phone might show a funny cartoon to make the user feel better,” he says.
Further down the line, this sort of emotion detection is likely to have a broader appeal, says Lee. “Emotion recognition technology will be an entry point for elaborate context-aware systems for future consumer electronics devices or services,” he says. “If we know the emotion of each user, we can provide more personalized services.”
Samsung’s system has to be trained to work with each individual user. During this stage, whenever the user tweets something, the system records a number of easily obtained variables, including actions that might reflect the user’s emotional state, as well as contextual cues, such as the weather or lighting conditions, that can affect mood, says Lee. The subject also records his or her emotion at the time of each tweet. This is all fed into a type of probabilistic machine-learning algorithm known as a Bayesian network, which analyzes the data to identify correlations between different emotions and the user’s behavior and context.
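The training loop described above — labeled sessions fed into a probabilistic model that learns correlations between behavior and emotion — can be sketched with a Gaussian naive Bayes classifier, the simplest form of Bayesian network. The feature values and emotion labels below are invented for illustration; the real system uses more variables (weather, lighting) and a richer network structure:

```python
import math
from collections import defaultdict

# Hypothetical training data: (typing speed, backspace ratio, shake magnitude)
# paired with the emotion the user self-reported at tweet time.
TRAIN = [
    ((4.1, 0.02, 0.1), "happy"),
    ((3.8, 0.05, 0.2), "happy"),
    ((1.9, 0.18, 0.9), "angry"),
    ((2.1, 0.22, 1.1), "angry"),
    ((2.8, 0.10, 0.3), "sad"),
    ((2.5, 0.12, 0.2), "sad"),
]

def fit(samples):
    """Estimate a class prior and per-feature mean/variance for each emotion."""
    by_label = defaultdict(list)
    for x, y in samples:
        by_label[y].append(x)
    model = {}
    for label, rows in by_label.items():
        n = len(rows)
        means = [sum(col) / n for col in zip(*rows)]
        variances = [sum((v - m) ** 2 for v in col) / n + 1e-3  # smoothed
                     for col, m in zip(zip(*rows), means)]
        model[label] = (n / len(samples), means, variances)
    return model

def log_gauss(x, mean, var):
    """Log-density of a Gaussian — log-space avoids numeric underflow."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def predict(model, x):
    """Return the emotion with the highest posterior log-probability."""
    best, best_score = None, float("-inf")
    for label, (prior, means, variances) in model.items():
        score = math.log(prior) + sum(
            log_gauss(v, m, s) for v, m, s in zip(x, means, variances))
        if score > best_score:
            best, best_score = label, score
    return best

model = fit(TRAIN)
print(predict(model, (2.0, 0.20, 1.0)))  # slow, error-prone, shaky typing
```

The prediction step is a direct application of Bayes’ rule: the emotion whose learned behavior profile best explains the observed session wins. A full Bayesian network would additionally model dependencies between the variables themselves.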
The accuracy is still pretty low, says Lee, but then the technology is still at a very early experimental stage, and has only been tested using inputs from a single user. Samsung won’t say whether it plans to commercialize this technology, but Lee says that with more training data, the process can be greatly improved. “Through this, we will be able to discover new features related to emotional states of users or ways to predict other affective phenomena like mood, personality, or attitude of users,” he says.
Reading emotion indirectly through normal cell phone use and context is a novel approach, and, despite the low accuracy, one worth pursuing, says Rosalind Picard, founder and director of MIT’s Affective Computing Research Group, and cofounder of Affectiva, which last year launched a commercial product to detect human emotions. “There is a huge growing market for technology that can help businesses show higher respect for customer feelings,” she says. “Recognizing when the customer is interested or bored, stressed, confused, or delighted is a vital first step for treating customers with respect.”