Humans and technology

The metaverse is the next venue for body dysmorphia online

Some people are excited to see realistic avatars that look like them. Others worry it might make body image issues even worse.

November 16, 2021
Illustration: Daniel Zender

In Facebook’s vision of the metaverse, we will all interact in a mashup of the digital and physical worlds. Digital representations of ourselves will eat, talk, date, shop, and more. That’s the picture Mark Zuckerberg painted as he rebranded his company Meta a couple of weeks ago.

The Facebook founder’s typically awkward presentation used a cartoon avatar of himself doing things like scuba diving or conducting meetings. But Zuckerberg ultimately expects the metaverse to include lifelike avatars whose features would be much more realistic, and which would engage in many of the same activities we do in the real world—just digitally.

“The goal here is to have both realistic and stylized avatars that create a deep feeling that we’re present with people,” Zuckerberg said at the rebranding.

If avatars really are on their way, then we’ll need to face some tough questions about how we present ourselves to others. How might these virtual versions of ourselves change the way we feel about our bodies, for better or worse?

Avatars are not a new concept, of course. Gamers have used them for decades: the pixelated, boxy creatures of Super Mario have given way to the hyperrealistic forms of Death Stranding, which emote and move eerily like a living, breathing human.

But how we use avatars becomes more complicated when we expect them to act as representations of ourselves beyond the context of a particular game. It’s one thing to inhabit the overalls and twang of Mario. It’s another to create an avatar that acts as your ambassador, your representation, your very self. The avatars of the metaverse will be participating in situations that might involve higher stakes than treasure in a race. In interviews or meetings, this self-presentation might play a bigger, far more consequential role. 

For some people, avatars that reflect who they are would be a powerful source of validation. But creating one can be a struggle. Gamer Kirby Crane, for example, recently ran an experiment where he tried to do one simple thing: make an avatar that looked like him in 10 different video games.

“My goal wasn’t so much to explore the philosophy of avatars but more to explore the representation that’s available in current avatars and see if I could portray myself accurately,” says Crane, who describes himself as a “fat, gay, pre–medical transition trans man.”

Some games allowed him to bulk up his body but bizarrely had him burst out of his clothes if he tried to make the character fat. Other games didn’t allow for an avatar to be male with breasts, which Crane found isolating, as it suggested that the only way to be male was to be male-presenting.

None of the avatars, in the end, felt like Crane—a result he wasn’t surprised by. “Not that I need validation from random game developers, but it’s dehumanizing to see the default man and the accepted parameters of what it means,” he says. 

Crane’s experiment isn’t scientific, nor is it any indication of how the metaverse will operate. But it offers a peek into why avatars in the metaverse could have far-reaching consequences for how people feel and live in the real, physical world. 

What complicates the issue further is Meta’s announcement of Codec Avatars, a project within Facebook’s VR/AR research arm, Reality Labs, that is working toward making photorealistic avatars. Zuckerberg highlighted some of the advances the group has made in making avatars seem more human, such as clearer emotions and better rendering of hair and skin.

“You’re not always going to want to look exactly like yourself,” he said. “That’s why people shave their beards, dress up, style their hair, put on makeup, or get tattoos, and of course, you’ll be able to do all of that and more in the metaverse.”

That hyperpersonalization could allow avatars to realistically portray the lived experience of millions of people who, like Crane, have thus far found the technology limiting. But people might also do the opposite and create avatars that are idealized, unhealthy versions of themselves: puffing out their lips and butt to Kardashian-ify their appearance, lightening their skin to play into racist stereotypes, whitewashing their culture by changing features outright.

In other words, what happens if the avatar you present isn’t who you are? Does it matter?

Jennifer Ogle of Colorado State University and Juyeon Park of Seoul National University conducted a small study this year that might shed light on how avatars affect body image. They recruited 18 women between the ages of 18 and 21 who said they had some body image concerns but had not received any treatment for them. The women were separated into two groups. One attended a body positivity program before creating a virtual avatar that looked exactly like themselves; the other only participated in the body positivity program.

The results illustrated how difficult it was for women to see themselves from a third-person point of view. One woman said, “I did not like how [my avatar looked] … I don’t know, I just didn’t think I looked like that … it kind of made me feel self-conscious. Just kind of bad about myself.” The body positivity courses led to a momentary rise in self-esteem, but it was nullified once they saw their avatars.

That doesn’t bode well for the metaverse, where avatars are likely to be the primary way we communicate and interact with each other. Noelle Martin, a legal researcher at the University of Western Australia and coauthor of a forthcoming paper on Meta’s metaverse, is raising just such concerns. “If people are able to customize their 3D hyperrealistic virtual human avatars, or alter, filter, and manipulate their digital identities, [there is] a concerning potential to impact body dysmorphia, selfie dysmorphia, and eating disorders … [producing] ‘unrealistic and unattainable’ standards of beauty, particularly for young girls,” she said via email.

That fear is not unfounded. Facebook has been criticized for silencing internal research indicating that Instagram has a toxic effect on body image for teenage girls. A report in the Wall Street Journal found that the app’s content focus on body and lifestyle leaves users more susceptible to body dysmorphia. But in the metaverse, where avatars will be the main way to present oneself in many situations, vulnerable people could feel even more pressure to adjust the way they look. And Martin says that customizable avatars in the metaverse may be used to “inflame racial injustices and inequities” as well.

Meta spokesperson Eloise Quintanilla said that the company is aware of potential problems: “We’re asking ourselves important questions such as how much modification makes sense to ensure avatars are a positive and safe experience.” Microsoft, which recently announced its own metaverse plans, has also been studying avatar use, though its research has been heavily focused on workplace settings like meetings.

The prospect of metaverse avatars for kids raises a whole other set of legal and ethical questions. Roblox, the wildly successful gaming platform whose primary market is children, has long used avatars as the primary means by which players interact with each other. And the company announced its own plans for a metaverse last month; CEO and founder David Baszucki declared that Roblox’s metaverse would be a place “where you can be whoever you want to be.” Thus far, Roblox avatars have been playful, but Baszucki said that the company is pursuing completely customizable ones: “Any body, any face, any hair, any clothing, any motion, any facial tracking, all coming together … We have a hunch that if we do this right, we will see an explosion of creativity, not just among our creators but also our users.”

Ultimately, avatars represent how we want to be seen. Yet there is no plan for what might happen if and when things inevitably go wrong. The technology has to walk a fine line, staying realistic enough to be true to people’s identities without threatening the mental health of the humans behind the avatars. As Park says: “We won’t be able to stop the … metaverse. So we should wisely prepare.” If the Facebook papers show anything, it’s that social media companies are well aware of the health effects of their technology, but governments and social safety nets are behind in protecting the most vulnerable.

Crane understands the risks of more realistic avatars for those who might have body dysmorphia, but he says the power of being able to see himself in the virtual world would be indescribable. “For me, the joy of seeing myself represented accurately would mean that I am not the only person who believes my existence is valid,” he says. “It means a team of developers also see the potential of me existing, as I look, as a man.”


