Electrodes for Your Face Bring Your Emotions to Augmented and Virtual Reality
Tej Tadi is really excited about the foam face pad that sits inside a virtual-reality headset on a desk in his office.
It’s the kind of mundane cushioning found on many head-mounted displays, but the one built by his startup, MindMaze, is different. It’s embedded with electrodes that pick up the electrical signals produced by your facial muscle movements, and it’s connected to a computer where software analyzes those signals to reproduce your expressions on an on-screen avatar.
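MindMaze hasn’t disclosed how its software works, but a common textbook approach to this kind of muscle-signal (EMG) classification is to window the raw voltage from each electrode, extract an energy feature per channel, and match the resulting feature vector against stored templates for each expression. The sketch below is purely illustrative; the channel layout, features, and template values are invented for the example:

```python
import numpy as np

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per electrode channel.

    window: array of shape (n_channels, n_samples) holding one
    short slice of the raw muscle-signal voltages.
    """
    return np.sqrt(np.mean(window ** 2, axis=1))

def classify(features: np.ndarray, templates: dict) -> str:
    """Return the expression whose stored template is nearest (Euclidean)."""
    return min(templates, key=lambda label: np.linalg.norm(features - templates[label]))

# Toy example: two hypothetical channels (cheek, brow) and two expressions.
templates = {
    "smile":   np.array([0.8, 0.1]),    # strong cheek activity, quiet brow
    "neutral": np.array([0.05, 0.05]),  # everything quiet
}
window = np.array([
    0.7  * np.sin(np.linspace(0, 20, 200)),  # active cheek channel
    0.04 * np.sin(np.linspace(0, 20, 200)),  # quiet brow channel
])
print(classify(rms_features(window), templates))  # → smile
```

A real system would need many more channels and expressions, per-user calibration, and subtraction of a reference signal (which is what the earlobe electrodes described later provide), but the match-against-templates idea is the same.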
Tadi sees the device, called Mask, as a way to bring natural-looking grimaces, smiles, and eyebrow raises to virtual characters without adding much bulk to headsets. Making it easier for users to express emotions, and to interact with each other, in virtual reality could encourage more people to try it out, he thinks, and make it more effective.
“The only way to do it is to bring emotions back into the game,” he says. “That makes us human, right? The non-verbal cues.”
For now, virtual and augmented reality headsets are still a tough sell to most consumers for many reasons: they’re annoying to wear, don’t yet have many clear practical applications, and can feel isolating, to name just a few.
Switzerland-based MindMaze, which has raised $100 million, already offers virtual-reality hardware that combines features like games, motion tracking, and brain-signal monitoring to help rehabilitate stroke victims; it’s currently used in several dozen hospitals in Europe.
Mask builds on the company’s existing work, Tadi says, and can currently recognize 10 different expressions, including winking, smiling, smirking, grimacing, and eyebrow-raising. And with a microphone attached to it, it can mimic the wearer’s mouth while they’re speaking, too.
Tadi expects such a gadget will be available to consumers later this year, either as a product from MindMaze itself or from a headset maker.
As I watched a member of the MindMaze team try out a prototype of the device, it seemed to work quite well: with an OSVR virtual-reality headset outfitted with MindMaze’s technology, he made a variety of expressions that a cartoony male character also made on a desktop display. The speaking mimicry didn’t seem much better than basic “talking” motions I’ve seen on virtual characters’ mouths, however.
There was no need to calibrate the headset for me to try it, but it had issues recognizing some of my facial expressions. Tadi and his team explained that this may have been due to stray hairs getting in the way of one of the electrodes.
And it’s still unwieldy and messy: the electrode-laden foam insert was connected to some wires and electronics as well as to a computer, and a gel-swabbed electrode was connected to each of my earlobes to serve as a reference.
Tadi says the ear electrodes could be replaced by dry ones that are embedded in headphones connected to the headset, and that the electronics could be whittled down. But the fact remains that the feature would be adding even more baggage to headsets that are already criticized for their size—it remains to be seen whether the addition of emotional expression can outweigh that issue.