Kinect-Powered Depression Detector is Amazing and Creepy

By analyzing a surprisingly simple set of facial tics, a depth camera can see right into your soul.
April 1, 2013

People have been baring their souls to soulless machines ever since ELIZA successfully impersonated a Rogerian therapist using nothing but a text display in the 1960s. Still, any value in those interactions was projected entirely by the human user; the dumb code couldn’t actually interpret your feelings. Now, thanks to a Kinect depth camera and some ingenious computer vision algorithms, a machine exists that can actually tell whether you’re depressed, with 90% accuracy.

The system, called SimSensei, uses an interactive digital avatar to conduct a verbal interview with the person being screened for depression. In some respects, the avatar’s performance isn’t much better than old ELIZA’s: it asks leading questions, lingers in silence to prompt you to elaborate, and generally exhibits a close-but-no-cigar facility with normal conversational rhythms. The avatar’s appearance is pretty decent—nowhere near sophisticated enough to stumble into the dreaded uncanny valley, but not so crude that it’s distracting, either. It’s not difficult to imagine the system being effective at eliciting “real” enough conversation from the human it’s screening.

But it’s the machine vision under the hood—which SimSensei uses to actively sense and analyze your emotional state in real time—that’s at once amazing and kind of disturbing. The YouTube demo of SimSensei exposes all the algorithmic pattern-matching at work, and it looks like some kind of unholy self-driving Voight-Kampff machine. Skeletal polygonal overlays map the depressed human’s posture, gaze direction, and even “smile level” (yes, that’s an actual graph in the upper right-hand corner), converting his ineffable, phenomenological states into a flat stream of bits. It reminded me of Timo Arnall’s eerie “Robot Readable World”: both fascinating and alienating.

Are these coarse signals, captured by a cheap piece of commodity technology, really all it takes to detect that another conscious being is mentally suffering? Apparently so. What’s weird is that when we humans intuit that another person is depressed, our own pattern-matching “software” may not be doing anything much more sophisticated. (After all, we can’t see “into” another person’s subjective awareness any more than SimSensei can; all we can do is interpret their outward behavior.) SimSensei, like any anthropomimetic technology, may be just as useful as a “user interface” for understanding our own wetware as it is for outsourcing it. ELIZA, eat your heart out.
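To make “a flat stream of bits” concrete, here is a minimal, purely illustrative sketch of the kind of coarse feature extraction the demo visualizes: per-frame landmark geometry reduced to a “smile level,” then aggregated with gaze and posture into a single score. Everything in it is an assumption for illustration only: the FaceFrame fields, the 10-pixel smile normalization, and the linear weights in risk_score are invented and have nothing to do with SimSensei’s actual features, thresholds, or classifier.

```python
# Toy sketch only: crude "smile level" and session score from hypothetical
# facial-landmark signals. All field names, thresholds, and weights are
# invented for illustration; this is NOT SimSensei's actual pipeline.

from dataclasses import dataclass


@dataclass
class FaceFrame:
    """One frame of (hypothetical) tracked signals from a depth camera."""
    left_mouth_corner_y: float   # pixel rows; smaller = higher in frame
    right_mouth_corner_y: float
    upper_lip_y: float
    gaze_down: bool              # is the subject looking downward?
    head_pitch_deg: float        # forward slump, in degrees


def smile_level(frame: FaceFrame) -> float:
    """Crude smile proxy: how far the mouth corners rise above the upper lip.

    Returns a value clamped to [0, 1]; 0 = neutral/frowning, 1 = broad smile.
    """
    corner_y = (frame.left_mouth_corner_y + frame.right_mouth_corner_y) / 2
    lift = frame.upper_lip_y - corner_y  # positive when corners are raised
    return max(0.0, min(1.0, lift / 10.0))  # assume 10 px lift = full smile


def risk_score(frames: list[FaceFrame]) -> float:
    """Combine coarse session-level signals into a single toy score in [0, 1]."""
    n = len(frames)
    avg_smile = sum(smile_level(f) for f in frames) / n
    gaze_down_rate = sum(f.gaze_down for f in frames) / n
    avg_slump = sum(max(0.0, f.head_pitch_deg) for f in frames) / n
    # Made-up linear weights; a real system would learn these from data.
    return min(1.0, 0.5 * (1 - avg_smile)
               + 0.3 * gaze_down_rate
               + 0.2 * min(1.0, avg_slump / 30.0))


if __name__ == "__main__":
    session = [
        FaceFrame(105.0, 106.0, 104.0, gaze_down=True, head_pitch_deg=18.0),
        FaceFrame(104.0, 105.0, 104.0, gaze_down=True, head_pitch_deg=20.0),
        FaceFrame(101.0, 100.0, 104.0, gaze_down=False, head_pitch_deg=5.0),
    ]
    print("smile levels:", [round(smile_level(f), 2) for f in session])
    print(f"toy session score: {risk_score(session):.2f}")
```

The point of the toy is the same one the demo makes: a handful of crude geometric signals, averaged over a session, is enough to produce a number that claims to describe a mind.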
