
A horrifying new AI app swaps women into porn videos with a click

Deepfake researchers have long feared the day this would arrive.

September 13, 2021
[Conceptual illustration showing various women's faces being scanned. Illustration: Pedro Nekoi]

Update: On September 14, a day after this story was published, Y posted a new notice saying the service is now unavailable. We will continue to monitor the site for further changes.

The website is eye-catching for its simplicity. Against a white backdrop, a giant blue button invites visitors to upload a picture of a face. Below the button, four AI-generated faces allow you to test the service. Above it, the tag line boldly proclaims the purpose: turn anyone into a porn star by using deepfake technology to swap the person’s face into an adult video. All it requires is the picture and the push of a button.

MIT Technology Review has chosen not to name the service, which we will call Y, or to use any direct quotes or screenshots of its contents, to avoid driving traffic to the site. It was discovered and brought to our attention by deepfake researcher Henry Ajder, who has been tracking the evolution and rise of synthetic media online.

For now, Y exists in relative obscurity, with a small user base actively giving the creator development feedback in online forums. But researchers have feared that an app like this would emerge, breaching an ethical line no other service has crossed before.

From the beginning, deepfakes, or AI-generated synthetic media, have primarily been used to create pornographic representations of women, who often find this psychologically devastating. The Reddit user who originally popularized the technology swapped female celebrities’ faces into porn videos. To this day, the research company Sensity AI estimates, between 90% and 95% of all online deepfake videos are nonconsensual porn, and around 90% of those feature women.

As the technology has advanced, numerous easy-to-use no-code tools have also emerged, allowing users to “strip” the clothes off female bodies in images. Many of these services have since been forced offline, but the code still exists in open-source repositories and has continued to resurface in new forms. The latest such site received over 6.7 million visits in August, according to the researcher Genevieve Oh, who discovered it. It has yet to be taken offline.

There have been other single-photo face-swapping apps, like ZAO or ReFace, that place users into selected scenes from mainstream movies or pop videos. But as the first dedicated pornographic face-swapping app, Y takes this to a new level. It’s “tailor-made” to create pornographic images of people without their consent, says Adam Dodge, the founder of EndTAB, a nonprofit that educates people about technology-enabled abuse. This makes it easier for the creators to improve the technology for this specific use case and entices people who otherwise wouldn’t have thought about creating deepfake porn. “Anytime you specialize like that, it creates a new corner of the internet that will draw in new users,” Dodge says.

Y is incredibly easy to use. Once a user uploads a photo of a face, the site opens up a library of porn videos. The vast majority feature women, though a small handful also feature men, mostly in gay porn. A user can then select any video to generate a preview of the face-swapped result within seconds—and pay to download the full version.

The results are far from perfect. Many of the face swaps are obviously fake, with the faces shimmering and distorting as they turn at different angles. But to a casual observer, some are subtle enough to pass, and the trajectory of deepfakes has already shown how quickly they can become indistinguishable from reality. Some experts argue that the quality of the deepfake doesn’t really matter, because the psychological toll on victims can be the same either way. And many members of the public remain unaware that such technology exists, so even low-quality face swaps can fool people.

Y bills itself as a safe and responsible tool for exploring sexual fantasies. The language on the site encourages users to upload their own face. But nothing prevents them from uploading other people’s faces, and comments on online forums suggest that users have already been doing just that.

The consequences for women and girls targeted by such activity can be crushing. At a psychological level, these videos can feel as violating as revenge porn—real intimate videos filmed or released without consent. “This kind of abuse—where people misrepresent your identity, name, reputation, and alter it in such violating ways—shatters you to the core,” says Noelle Martin, an Australian activist who has been targeted by a deepfake porn campaign.


And the repercussions can stay with victims for life. The images and videos are difficult to remove from the internet, and new material can be created at any time. “It affects your interpersonal relations; it affects you with getting jobs. Every single job interview you ever go for, this might be brought up. Potential romantic relationships,” Martin says. “To this day, I’ve never been successful fully in getting any of the images taken down. Forever, that will be out there. No matter what I do.”

Sometimes it’s even more complicated than revenge porn. Because the content is not real, women can doubt whether they deserve to feel traumatized and whether they should report it, says Dodge. “If somebody is wrestling with whether they’re even really a victim, it impairs their ability to recover,” he says.

Nonconsensual deepfake porn can also have economic and career impacts. Rana Ayyub, an Indian journalist who became a victim of a deepfake porn campaign, received such intense online harassment in its aftermath that she had to minimize her online presence and thus the public profile required to do her work. Helen Mort, a UK-based poet and broadcaster who previously shared her story with MIT Technology Review, said she felt pressure to do the same after discovering that photos of her had been stolen from private social media accounts to create fake nudes.

The Revenge Porn Helpline, a service funded by the UK government, recently received a case from a teacher who lost her job after deepfake pornographic images of her were circulated on social media and brought to her school’s attention, says Sophie Mortimer, who manages the service. “It’s getting worse, not better,” Dodge says. “More women are being targeted this way.”

Y’s option to create deepfake gay porn, though limited, poses an additional threat to men in countries where homosexuality is criminalized, says Ajder. This is the case in 71 jurisdictions globally, 11 of which punish the offense by death.

Ajder, who has discovered numerous deepfake porn apps in the last few years, says he has contacted Y’s hosting service in an attempt to force the site offline. But he’s pessimistic about preventing similar tools from being created. Already, another site has popped up that seems to be attempting the same thing. He thinks banning such content from social media platforms, and perhaps even making its creation or consumption illegal, would prove a more sustainable solution. “That means that these websites are treated in the same way as dark web material,” he says. “Even if it gets driven underground, at least it puts that out of the eyes of everyday people.”

Y did not respond to multiple requests for comment at the press email listed on its site. The registration information associated with the domain is also blocked by the privacy service Withheld for Privacy. On August 17, after MIT Technology Review made a third attempt to reach the creator, the site put up a notice on its homepage saying it’s no longer available to new users. As of September 12, the notice was still there.

