
Kinect-Powered Depression Detector is Amazing and Creepy

By analyzing a surprisingly simple set of facial tics, a depth camera can see right into your soul.
April 1, 2013

People have been baring their souls to soulless machines ever since ELIZA successfully impersonated a Rogerian therapist using nothing but a text display in the 1960s. Still, any value derived from those interactions was projected entirely by the human user; the dumb code couldn’t actually interpret your feelings. Now, thanks to a Kinect depth camera and some ingenious computer vision algorithms, a machine exists that can diagnose whether you’re depressed with roughly 90% accuracy.

The system, called SimSensei, uses an interactive digital avatar to conduct a verbal interview with the person being screened for depression. In some respects, the performance of this avatar isn’t much better than old ELIZA’s: it asks leading questions, lingers in silence to prompt you to elaborate, and generally exhibits a close-but-no-cigar facility with normal conversational rhythms. The avatar’s appearance is pretty decent—nowhere near sophisticated enough to stumble into the dreaded uncanny valley, but not so crude that it’s distracting, either. It’s not difficult to imagine the system eliciting “real” enough conversation from the human it’s screening.

But it’s the machine vision under the hood—which SimSensei uses to actively sense and analyze your emotional state in real time—that’s at once amazing and kind of disturbing. The YouTube demo of SimSensei exposes all the algorithmic pattern-matching at work, and it looks like some kind of unholy self-driving Voight-Kampff machine. Skeletal polygonal overlays map the depressed human’s posture, gaze direction, and even “smile level” (yes, that’s an actual graph in the upper right-hand corner), converting his ineffable, phenomenological states into a flat stream of bits. It reminded me of Timo Arnall’s eerie “Robot Readable World”: both fascinating and alienating. Are these coarse signals, captured by a cheap piece of commodity technology, really all it takes to detect that another conscious being is mentally suffering? Apparently so. What’s weird is that when we humans intuit that another person is depressed, our own pattern-matching “software” may not be doing anything much more sophisticated. (After all, we can’t see “into” another person’s subjective awareness any more than SimSensei can; all we can do is interpret their outward behavior.) SimSensei, like any anthropomimetic technology, may be just as useful as a “user interface” for understanding our own wetware as it is for outsourcing it. ELIZA, eat your heart out.
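
To make that idea concrete, here is a minimal, hypothetical sketch of what classifying “depressed vs. not” from such coarse signals could look like. The feature names (smile level, gaze aversion, slumped posture) are assumptions loosely drawn from the demo overlay, and the logistic-regression model trained on synthetic data is a stand-in for illustration, not SimSensei’s actual pipeline.

# Hypothetical sketch: classifying "depressed vs. not" from coarse,
# camera-derived signals like those shown in the SimSensei demo overlay.
# The feature names and the logistic-regression model are illustrative
# assumptions on synthetic data, not SimSensei's actual method.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Each row summarizes one screening session as three per-session averages:
# [mean smile level (0-1), fraction of time gaze is averted, slumped-posture score]
n = 200
labels = rng.integers(0, 2, size=n)                  # 1 = depressed (synthetic label)
features = np.column_stack([
    rng.normal(0.6 - 0.3 * labels, 0.1),             # depressed -> lower smile level
    rng.normal(0.3 + 0.3 * labels, 0.1),             # depressed -> more gaze aversion
    rng.normal(0.2 + 0.4 * labels, 0.1),             # depressed -> more slumped posture
])

# Fit a simple linear classifier on the first 150 sessions...
model = LogisticRegression().fit(features[:150], labels[:150])

# ...and check how often it gets the remaining 50 right.
accuracy = model.score(features[150:], labels[150:])
print(f"held-out accuracy: {accuracy:.0%}")

The point of the sketch is how little it takes: a handful of per-session averages is plausibly enough for a simple linear model to separate the two groups, which is roughly the intuition the demo makes visible.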
