
How digital beauty filters perpetuate colorism

An ancient form of prejudice about skin color is flourishing in the modern internet age.

Conceptual illustration by Joan Wong: a young Black woman's face, in black and white with pink highlights, with circles zooming in on certain features.

When Lise was a young teenager in Georgia, her classmates bullied her relentlessly. She had moved with her family from Haiti a few years earlier, and she didn’t fit in with the other students. They teased her about her accent, claimed she “smelled weird,” and criticized the food she ate.  But most often they would attack her with remarks about her dark complexion. Sometimes teachers would send her home from school because she couldn’t stop crying. “I remember going home and I would take those copper wire things that you scrub dishes with,” she says. “I would go to the bathroom and I would take my mom’s bleach cream and scrub my skin with it.” 

And it wasn’t just white classmates. Black students harassed her too—for being an outsider, for being too different. She remembers them asking, “Why is she so dark?” 


Just when she thought it couldn’t get worse, the phone in her palm became an endless stream of pictures of beautiful, lighter-skinned women getting dozens, hundreds, or even thousands of likes and affirming comments. She slowly began to notice that the world wanted parts of her—like her curves and her lips—but not things like her dark skin or her hair. Not her whole self, all together. 

As she struggled to cope with the abuse, Lise convinced herself that the darkness of her skin was to blame. And social media platforms and the visual culture of the internet suggested the same thing. 

Even among those closest to her, the undesirability of her darkness was reinforced. She grew to realize that her mom, aunts, and friends all used the skin-lightening creams she’d borrowed after school, many of which contain toxins and even carcinogens. It was confusing: her community fought hard against racism, but some of the prejudice she experienced came from Black people themselves. 

And social media was just making it worse.

The prejudice Lise experienced—colorism—has a long history, driven by European ideals of beauty that associate lighter skin with purity and wealth, and darker tones with sin and poverty. Though related to racism, it's distinct in that it can affect people regardless of their race, and can have different effects on people of the same background. 

Colorism exists in many countries. In India, people with darker skin were traditionally ranked lower in the caste system. In China, light skin is linked to beauty and nobility. In the US, people across many races experience colorism, since it is prejudice rooted primarily in complexion rather than race. Historically, when African Americans were enslaved, those with lighter skin were often assigned domestic tasks, while those with darker skin were more likely to work in the fields.

These prejudices have been part of the social and media landscape for a long time, but the advent of digital images and Photoshop created new ways for colorism to manifest. Notoriously, in June 1994, Newsweek and Time both ran cover images of O.J. Simpson's mug shot during his murder trial, but on Time's cover his skin was markedly darker. The difference sparked outrage: Time had darkened the image in what the magazine's photo illustrator claimed was an attempt to evoke a more "dramatic tone." But the editing played into a long-standing bias: the darker the man, the more criminal the American public assumes him to be. 

This association has very real consequences. A 2011 study from Villanova University found a direct link between the severity of sentences for 12,000 incarcerated women and the darkness of their complexion.

And today, thanks to the prevalence of selfies and face filters, digital colorism has spread. With Snapchat, Instagram, TikTok, and Facebook part of billions of people's everyday lives, far more pictures of us are being seen than ever before. But there are biases built into these systems. At a basic level, the imaging chips found in most personal cameras have preset ranges for skin tones, making it technically impossible to accurately capture the real variety of complexions. 


And the images that do get taken are often subject to alteration. Snapchat reports that over 200 million people use its filter product, Lenses, every day. Some of them use it to lighten their skin tone; other filters and automatic enhancing features can do the same on Instagram and TikTok. Photo technologies and image filters can do this in ways that are almost imperceptible. Meanwhile, social media algorithms reinforce the popularity of people with lighter skin to the detriment of those with darker skin. Just this week, Twitter’s image-cropping algorithm was found to prefer faces that are lighter, thinner, and younger.  


We’ve reported before on the ways in which digital technologies are narrowing beauty standards. The phenomenon has led to the concept of the “Instagram face,” a particular look that’s easily accessible through the proliferation of editing tools. Photos reflecting this look, with a small nose, big eyes, and fuller lips, attract more comments and likes, leading recommendation algorithms to prioritize them. We also interviewed researchers who say beauty ideals are narrowing even more dramatically and quickly than they expected—with especially profound effects on the way young girls, in particular, see themselves and shape their identity. 

But it could be particularly catastrophic for women with darker complexions, says Ronald Hall, a professor at Michigan State University and an expert on colorism. As European looks are increasingly held up as an ideal, "these young girls imitate these behaviors, and those who are super dark-complected see no way out," he says. "Those are the ones who are most at risk for harming themselves." 

That harm can involve bleaching or other risky body treatments: the skin-lightening industry has grown rapidly and is now worth more than $8 billion worldwide each year. But beyond physical risks, researchers and activists have also begun documenting troubling emotional and psychological effects of online colorism.

Amy Niu researches selfie-editing behavior as part of her PhD in psychology at the University of Wisconsin, Madison. In 2019, she conducted a study to determine the effect of beauty filters on self-image for American and Chinese women. She took pictures of 325 college-aged women and, without telling them, applied a filter to some photos. She then surveyed the women to measure their emotions and self-esteem when they saw edited or unedited photos. Her results, which have not yet been published, found that Chinese women viewing edited photos felt better about themselves, while American women (87% of whom were white) felt about the same whether their photos were edited or not.

Niu believes that the results show there are huge differences between cultures when it comes to “beauty standards and how susceptible people are to those beauty filters.” She adds, “Technology companies are realizing it, and they are making different versions [of their filters] to tailor to the needs of different groups of people.” 

This has some very obvious manifestations. Niu, a Chinese woman living in America, uses both TikTok and Douyin, the Chinese version (both are made by the same company and share many of the same features, though not the same content). Both apps have "beautify" modes, but they differ: Chinese users are given more extreme smoothing and complexion-lightening effects. 

She says the differences don’t just reflect cultural beauty standards—they perpetuate them. White Americans tend to prefer filters that make their skin tanner, teeth whiter, and eyelashes longer, while Chinese women prefer filters that make their skin lighter.  

Niu worries that the vast proliferation of filtered images is making beauty standards more uniform over time, especially for Chinese women. “In China, the beauty standard is more homogeneous,” she says, adding that the filters “erase lots of differences to our faces” and reinforce one particular look. 

“It’s really bad”

Amira Adawe has observed the same dynamic in the way young girls of color use filters on social media. Adawe is the founder and  executive director of Beautywell, a Minnesota-based nonprofit aimed at combating colorism and skin-lightening practices. The organization runs programs to educate young girls of color about online safety, healthy digital behaviors, and the dangers of physical skin lightening. 

Adawe says she often has to inform the girls in her workshops that their skin is being lightened by social media filters. “They think it’s normal. They’re like, ‘Oh, this is not skin lightening, Amira. This is just a filter,’” she says. “A lot of these young girls use these filters and think, ‘Oh my God, I look beautiful.’”


It’s so easy to do—with a few clicks, users can make their appearance more similar to everyone else’s ideal—that many young women end up assuming a lighter-skinned identity online. This makes it easier to find acceptance in the digital world, but it can also make it harder for them to identify with their real complexion. 

When Adawe explains how using a face filter can be part of a cycle of colorism, she is often met with resistance. The filters have become essential to the way some girls see themselves. 

"It's really bad," she says. "And it's contributing to this notion that you're not beautiful enough." 

And it’s complicated regardless of your skin tone.

Halle, a single biracial woman in her mid-20s, thinks a lot about her own racial identity. She says most people would use the term “ambiguous” to describe her appearance. “I have whiter features,” she says. “My skin complexion is lighter than some other mixed-race girls’, and my hair is less curly.” She also used to be a regular user of dating apps. And from conversations with her friends who have darker complexions, she realized that her experience on dating apps was very different from theirs.

“Quite candidly, we compare matches and number of matches,” she says. “That is where I started to realize: wait a minute, there’s something going on here. My friends who identify as Black or Afro-Latina don’t get as many matches.” 

It’s already known that beauty-scoring algorithms, which rank the attractiveness of images, give higher scores to whiter women. In March, we reported on how the world’s largest face recognition company, Face++, sells a racially biased beauty scoring algorithm that it markets to digital platforms, and online dating sites in particular.

Halle says her experience on these apps reflects the wider world, too. “This is deeply rooted in racism, colorism, and everything that’s happening in our society,” she says. The experience became so frustrating for her that she deleted all her dating apps. MIT Technology Review has reached out to many dating sites to ask whether they use beauty-scoring algorithms for matches, but none will confirm or deny. 

Even if they do not use systems like Face++, however, they do use recommendation algorithms to learn user preferences over time. And this is another way that colorism and bias can creep in and be perpetuated. 

Recommendations based on user preferences often reflect the biases of the world—in this case, the diversity problems that have long been apparent in media and modeling. Those biases have in turn shaped the world of online influencers, so that many of the most popular images are, by default, of people with lighter skin. An algorithm that interprets your behavior inside such a filter bubble might assume that you dislike people with darker skin. And it gets worse: recommendation algorithms are also known to have an anchoring effect, in which their output reinforces users’ unconscious biases and can even change their preferences over time. 

Meanwhile, platforms including TikTok have been accused of intentionally "shadow-banning" content from some Black creators, especially those discussing the Black Lives Matter movement or racism in general. That diminishes their reach, and the cycle reinforces itself further. (In a statement, a TikTok spokesperson said, "We unequivocally do not moderate content or accounts on the basis of race.")

Michigan State’s Ronald Hall says he’s “extremely worried” about the impact on women of color in particular: “Women of color are constantly bombarded with these messages that you gotta be light in order to be attractive.”

Adawe, meanwhile, thinks the only solution is an all-out ban on filters that lighten faces. She says she has emailed Snapchat asking for just that. “Social media companies keep [creating] filters because the demand is so high,” she says. “But to me, I think they’re promoting colorism, whether they realize it and whether it’s intentional or not.” 

A spokesperson for Snap told MIT Technology Review, “Our goal is to build products that are fully inclusive of all Snapchatters, and we’ve put in place a number of processes and initiatives to help us do that. Our guidelines for all Snapchatters—which also apply to Lens submissions—prohibit discrimination and the promotion of stereotypes, and we have an extensive review process in place for Lenses, which includes testing them on a wide range of skin tones.” 

The company says it is partnering with experts for advice, and earlier this year it launched an initiative to build an "inclusive camera," which is meant to be better at capturing a broader range of skin tones.

A completely different lens

Lise, who now lives in Minnesota, struggled with the effects of colorism for a long time. She went to therapy, watched endless YouTube tutorials on photo editing, and even bought a $600 camera that she hoped would make her look less dark in photos. Eventually she came to realize how harmful it had been.

“Now I just view everyone’s social media page with a completely different lens,” she says.

Today, she’s a new mom: when we spoke via Zoom, I was greeted by her cooing and wiggling baby. I was delighted, but Lise apologized profusely while she adjusted the lens. 

She says she wants to see more raw photos online that show beautiful women who look like her. She no longer edits her skin color in photos, and she tries hard to stop the negative thoughts in her head, though it can be hard. “Oh, I’ll be darned if I see someone saying anything to a beautiful dark-skinned woman,” she says. “I don’t care if it’s online, I don’t care if it’s in person—I’m going to call you out. I just can’t be quiet about it anymore, but it’s taken years. I’m going to be more conscious about what I’m teaching my son.”

Correction: We have clarified language to make it clear that colorism affects people of all races.
