Digital Summit: First Emotion-Reading Apps for Kids with Autism

Software meant to help people interpret emotions will soon be available in several apps.

The first mobile apps that use emotion-reading software to help kids with autism are nearing release, a startup reported today at the MIT Technology Review Digital Summit in San Francisco.

Emo-app: Rana el Kaliouby says the first apps that use facial emotion-reading software are coming soon.

One of the apps is a game that challenges kids to match a face to the emotion it is projecting, said Rana el Kaliouby (see “Innovators Under 35: Rana el Kaliouby”), chief science officer of Affectiva, which is based in Waltham, Massachusetts. Another will let people submit face pictures, such as “selfies,” and get a readout on the mood of the person in the photo; this could be used for social sharing of people’s moods in different locations. A third, intended for anyone, would let people make music with their facial expressions: raise or lower the eyebrows to make a tone rise or fall, or smile or frown to make music that sounds happy or sad.
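As a rough illustration of that last idea, here is a minimal Swift sketch of how per-frame expression scores could drive a melody, assuming a detector that reports values for smiles, frowns, and eyebrow position; the ExpressionFrame type and the mapping rules are hypothetical stand-ins, not Affectiva’s actual API.

```swift
import Foundation

// Hypothetical per-frame expression scores in the range 0.0–1.0.
// This is an illustrative stand-in, not Affectiva's real SDK output.
struct ExpressionFrame {
    let smile: Double
    let frown: Double
    let browRaise: Double   // eyebrows raised
    let browLower: Double   // eyebrows lowered / furrowed
}

// Map one frame to a simple MIDI-style chord: brow position bends the root
// note up or down around middle C (60), and smiling versus frowning picks a
// major or minor third, so the music "sounds happy or sad."
func chord(for frame: ExpressionFrame) -> [Int] {
    let bend = Int(((frame.browRaise - frame.browLower) * 12).rounded())  // up to ±1 octave
    let root = 60 + bend
    let third = frame.smile >= frame.frown ? 4 : 3
    return [root, root + third, root + 7]
}

// Example: raised eyebrows and a smile produce a higher-pitched major chord.
print(chord(for: ExpressionFrame(smile: 0.8, frown: 0.1, browRaise: 0.6, browLower: 0.0)))
// [67, 71, 74]
```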

Affectiva grew out of emotion-detecting research at MIT’s Media Lab. The company’s software, called Affdex, analyzes images of faces to detect features such as smiles, frowns, raised eyebrows, furrowed brows, and smirks. Though the early academic research focused on applications such as helping people with autism, so far the technology has been used commercially to help marketers understand whether ads are effective (see “Startup Gets Computers to Read Faces, Seeks Purpose Beyond Ads”).

Then, last year, the company released the software to app developers for iOS, the operating system used in iPhones and iPads. And now the first apps are coming, said el Kaliouby. “Autistic kids have trouble reading and understanding social and emotional cues,” she said. “Just as people with hearing problems benefit from a hearing aid, people with social and emotional problems can benefit from systems that help them understand emotions. We started out with research on autism, and we went out and did this commercial stuff.” But now, she says, others “can take it and apply it back to autism again.”
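To make concrete how an app built on such a toolkit might turn raw feature scores into the emotion labels a matching game needs, here is a hedged Swift sketch; the types, the rule-based scoring, and the game check are assumptions for illustration, not the real Affdex iOS interface.

```swift
import Foundation

// Hypothetical feature scores of the kind an emotion-detection SDK might report
// per face (smiles, frowns, raised or furrowed brows); not Affdex's real API.
struct FaceFeatures {
    let smile: Double
    let frown: Double
    let browRaise: Double
    let browFurrow: Double
}

enum Emotion: String, CaseIterable {
    case happy, sad, surprised, angry
}

// Toy rule-based mapping from feature scores to a single dominant emotion,
// standing in for whatever trained model the SDK actually uses.
func dominantEmotion(in face: FaceFeatures) -> Emotion {
    let scores: [Emotion: Double] = [
        .happy: face.smile,
        .sad: face.frown,
        .surprised: face.browRaise,
        .angry: face.browFurrow
    ]
    return scores.max { $0.value < $1.value }!.key
}

// One round of the matching game: does the child's guess match what the
// detector saw in the photo?
func isCorrect(guess: Emotion, for face: FaceFeatures) -> Bool {
    return dominantEmotion(in: face) == guess
}

let photo = FaceFeatures(smile: 0.9, frown: 0.1, browRaise: 0.2, browFurrow: 0.05)
print(isCorrect(guess: .happy, for: photo))   // true
```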

The advertising work helped make the software more accurate by “training” it, she added. After three years of analyzing faces captured by webcams, Affectiva’s database now holds more than a billion facial expressions.
