One of the greatest living psychologists is an American called Paul Ekman. In the 1970s, Ekman and a colleague developed a way to categorise and assess human facial expressions.
At the time, many psychologists believed that the expressions conveying specific emotions varied from one culture to another. But in a ground-breaking set of experiments carried out with cultures all over the world, Ekman showed that all humans share the same facial expressions for six basic emotions: anger, fear, joy, surprise, disgust and sadness.
He went on to develop a taxonomy of facial expressions called the Facial Action Coding System, or FACS, which identifies the facial muscle movements associated with each expression.
This work has been hugely influential. FACS is particularly useful for psychologists studying the role that emotion plays in everyday life.
But there is another group that has benefited too: animators. FACS provides a straightforward way to give computer-generated characters realistic expressions. Indeed, it has inspired an MPEG-4 standard for encoding facial expressions in computer-generated characters.
That in turn has helped psychologists, who can now produce exactly reproducible emotions on demand in the expressions of virtual characters. That's hugely useful in research projects.
However, there’s a problem. While plenty of people have evaluated and calibrated expressions in humans, nobody has done the same for virtual characters. That’s significant because humans may not interpret facial expressions in virtual characters in the same way as they do in humans.
What's more, since researchers generate their own virtual characters, the way expressions vary from one project to another may mean the results are not comparable.
All that could be solved with a standard set of expressions that have been comprehensively evaluated and calibrated by real human subjects.
Today, Joost Broekens and pals at the Man-Machine Interaction department at Delft University in the Netherlands do exactly that.
These guys have created a set of six virtual expressions based on FACS. Each expression is a set of vectors that together specify how different parts of an animated face should move to simulate a basic emotion. A virtual character simply imports these vectors to take on that expression.
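The idea of an expression as an importable set of vectors can be sketched in code. This is a hypothetical illustration, not the team's actual data format: it represents each expression as a mapping from FACS action units (AUs) to activation intensities, which a character could use to drive the corresponding facial muscles. The AU numbers are real FACS codes, but the intensity values here are made up.

```python
# Illustrative FACS-style expression vectors: each maps action units
# (AUs) to activation intensities in [0, 1]. Values are invented for
# illustration, not the paper's calibrated data.
JOY = {"AU6": 0.8, "AU12": 1.0}                 # cheek raiser, lip corner puller
ANGER = {"AU4": 1.0, "AU5": 0.6, "AU23": 0.7}   # brow lowerer, upper lid raiser, lip tightener

def scale(expression, intensity):
    """Scale an expression vector to an overall intensity level."""
    return {au: value * intensity for au, value in expression.items()}

# A character showing joy at half intensity:
half_joy = scale(JOY, 0.5)
print(half_joy)  # {'AU6': 0.4, 'AU12': 0.5}
```

A virtual character would then map each active AU onto its rig's facial controls, which is what makes the same vector reproducible across different characters.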
They then asked human volunteers to evaluate each expression, determining which emotion it represented and how intense it was, both when the virtual character was near and further away and when it was viewed from the side.
The results show that these virtual expressions communicate emotions in more or less the same way as human faces. There are one or two minor differences: a fearful expression also tends to look surprised, and disgust can be confused with anger, something that other researchers have also found. But these are minor concerns.
As a further check, the team also asked the volunteers to evaluate blends of two basic expressions to produce so-called blended emotions. For example, joy and anger together communicate evil or naughtiness but this has never been properly measured in virtual characters before.
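One simple way to blend two basic expressions, assuming the vector representation above, is a weighted merge of their action-unit activations. This is a hedged sketch: the team's actual blending procedure may well differ, and the AU values are illustrative.

```python
# Hypothetical blending of two FACS-style expression vectors: shared
# action units are interpolated, and AUs present in only one expression
# are scaled by that expression's weight.
def blend(a, b, weight=0.5):
    """Merge AU vectors a and b; weight=0.5 gives an even blend."""
    aus = set(a) | set(b)
    return {au: a.get(au, 0.0) * (1 - weight) + b.get(au, 0.0) * weight
            for au in aus}

JOY = {"AU6": 0.8, "AU12": 1.0}
SURPRISE = {"AU1": 0.9, "AU2": 0.9, "AU5": 0.6, "AU26": 0.8}

# An even joy + surprise blend, roughly "enthusiasm" in the paper's terms:
enthusiasm = blend(JOY, SURPRISE)
```

The appeal of this kind of scheme is that a blended emotion is just another vector, so it can be calibrated and reproduced exactly like the six basic ones.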
One of the team's important findings is that the volunteers were all able to identify anger easily in the tests. However, other blended emotions, such as enthusiasm (joy + surprise), did not fare so well.
An important point here is that Broekens and buddies are not attempting to create the most realistic or believable expressions. Instead, they have produced a set of clearly calibrated expressions that are easy to reproduce exactly in more or less any virtual environment.
That will be hugely useful for researchers wanting to produce comparable data in a variety of different settings and tests. It will also be useful for any animators who want an easy way to make their characters convey a very specific emotion.
For those who want to download the expressions, the team has made them publicly available from http://www.joostbroekens.com. The Microsoft paper clip may never look the same again.
Ref: arxiv.org/abs/1211.4500: Dynamic Facial Expression of Emotion Made Easy