A View from John Pavlus
Kinect-Powered Depression Detector is Amazing and Creepy
By analyzing a surprisingly simple set of facial tics, a depth camera can see right into your soul.
People have been baring their souls to soulless machines ever since ELIZA successfully impersonated a Rogerian therapist using nothing but a text display in the 1960s. Still, any value derived from those interactions was entirely projected by the human user; the dumb code couldn’t actually interpret your feelings. Now, thanks to a Kinect depth camera and some ingenious computer vision algorithms, a machine exists that can diagnose whether you’re depressed with roughly 90% accuracy.
The system, called SimSensei, uses an interactive digital avatar to conduct a verbal interview with the person being screened for depression. In some respects, the performance of this avatar isn’t much better than old ELIZA’s: it asks leading questions, lingers in silence to prompt you to elaborate, and generally exhibits a close-but-no-cigar facility with normal conversational rhythms. The avatar’s appearance is pretty decent—nowhere near sophisticated enough to stumble into the dreaded uncanny valley, but not so crude that it’s distracting, either. It’s not difficult to imagine the system eliciting “real” enough conversation from the human it’s screening.
But it’s the machine vision under the hood—which SimSensei uses to actively sense and analyze your emotional state in real time—that’s at once amazing and kind of disturbing. The YouTube demo of SimSensei exposes all the algorithmic pattern-matching at work, and it looks like some kind of unholy self-driving Voight-Kampff machine. Skeletal polygonal overlays map the depressed human’s posture, gaze direction, and even “smile level” (yes, that’s an actual graph in the upper-right corner), converting his ineffable, phenomenological states into a flat stream of bits. It reminded me of Timo Arnall’s eerie “Robot Readable World”: both fascinating and alienating. Are these coarse signals, captured by a cheap piece of commodity technology, really all it takes to detect that another conscious being is mentally suffering? Apparently so. What’s weird is that when we humans intuit that another person is depressed, our own pattern-matching “software” may not be doing anything much more sophisticated. (After all, we can’t see “into” another person’s subjective awareness any more than SimSensei can; all we can do is interpret their outward behavior.) SimSensei, like any anthropomimetic technology, may be just as useful as a “user interface” for understanding our own wetware as it is for outsourcing it. ELIZA, eat your heart out.
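For the curious: a metric like the demo’s “smile level” graph likely reduces to simple geometry on tracked facial landmarks. Here’s a toy sketch of the idea—not SimSensei’s actual code, and the landmark names are made up—measuring mouth-corner spread relative to eye spacing, so the score doesn’t change as the subject moves closer to or farther from the camera:

```python
import math

def dist(a, b):
    """Euclidean distance between two 2D landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def smile_level(landmarks):
    """Crude smile score: mouth width normalized by inter-eye distance.

    A wider mouth relative to a fixed facial baseline (eye spacing)
    yields a higher score, regardless of how far the face is from
    the camera. Landmark keys here are hypothetical.
    """
    mouth_width = dist(landmarks["mouth_left"], landmarks["mouth_right"])
    eye_spacing = dist(landmarks["eye_left"], landmarks["eye_right"])
    return mouth_width / eye_spacing

# Two made-up landmark sets: a neutral face and a smiling one.
neutral = {"eye_left": (100, 100), "eye_right": (160, 100),
           "mouth_left": (115, 160), "mouth_right": (145, 160)}
smiling = {"eye_left": (100, 100), "eye_right": (160, 100),
           "mouth_left": (105, 158), "mouth_right": (155, 158)}

print(smile_level(neutral))   # 30 / 60 = 0.5
print(smile_level(smiling))   # 50 / 60 ≈ 0.833
```

Run frame-by-frame over a video feed, a signal this simple already produces the kind of rolling graph shown in the demo—which is exactly the point the paragraph above makes: the observable signals are coarse, and yet apparently sufficient.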