
Deepfake Putin is here to warn Americans about their self-inflicted doom

AI-generated synthetic media is being used in a political ad campaign—not to disrupt the election, but to save it.
September 29, 2020
[Image: A deepfake of Putin standing at a podium. Credit: Mischief / RepresentUs]

The news: Two political ads will run on social media today, featuring deepfake versions of Russian President Vladimir Putin and North Korean leader Kim Jong-un. Both deepfake leaders deliver the same message: America doesn’t need any election interference from them; it will ruin its democracy all by itself.

What are they for? Yes, the ads sound creepy, but they’re meant for a good cause. They’re part of a campaign from the nonpartisan advocacy group RepresentUs to protect voting rights during the upcoming US presidential election, amid President Trump’s attacks on mail-in voting and suggestions that he may refuse a peaceful transition of power. The goal is to shock Americans into recognizing the fragility of their democracy and prompt them to take action, including checking their voter registration and volunteering to work the polls. It flips the script on the usual narrative around political deepfakes, which experts worry could be abused to confuse voters and disrupt elections.

How they were made: RepresentUs worked with the creative agency Mischief at No Fixed Address, which came up with the idea of using dictators to deliver the message. The team filmed two actors with the right face shapes and convincing accents reciting the script, then worked with a deepfake artist who used an open-source face-swapping algorithm to replace the actors’ faces with Putin’s and Kim’s. A post-production crew cleaned up the algorithm’s leftover artifacts to make the videos look more realistic. All in all, the process took only 10 days; attempting the equivalent with CGI would likely have taken months, the team says, and could have been prohibitively expensive.
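For readers curious about what a face swap involves under the hood, here is a minimal, illustrative sketch of the classic landmark-based approach: detect facial landmarks in both images, warp one face onto the other, and blend the seam. The campaign has not disclosed its actual tooling beyond “an open-source algorithm,” so the library choices (dlib and OpenCV), the pretrained landmark model, and the file names below are assumptions made purely for illustration.

```python
# Minimal, illustrative landmark-based face swap (assumption: dlib + OpenCV,
# with dlib's pretrained "shape_predictor_68_face_landmarks.dat" model on disk).
# This is a sketch of the general technique, not the campaign's actual pipeline.
import cv2
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def landmarks(img):
    """Return the 68 facial landmark points of the first detected face."""
    faces = detector(img, 1)
    if not faces:
        raise ValueError("no face detected")
    shape = predictor(img, faces[0])
    return np.array([[p.x, p.y] for p in shape.parts()], dtype=np.float32)

def swap_face(source_img, target_img):
    """Warp the face in source_img onto the face in target_img and blend."""
    src_pts = landmarks(source_img)
    dst_pts = landmarks(target_img)

    # Estimate a similarity transform aligning the source face to the target face.
    M, _ = cv2.estimateAffinePartial2D(src_pts, dst_pts)
    h, w = target_img.shape[:2]
    warped = cv2.warpAffine(source_img, M, (w, h))

    # Mask the target face region using the convex hull of its landmarks.
    hull = cv2.convexHull(dst_pts.astype(np.int32))
    mask = np.zeros((h, w), dtype=np.uint8)
    cv2.fillConvexPoly(mask, hull, 255)

    # Poisson blending hides the seam between the warped face and the target frame.
    x, y, bw, bh = cv2.boundingRect(hull)
    center = (x + bw // 2, y + bh // 2)
    return cv2.seamlessClone(warped, target_img, mask, center, cv2.NORMAL_CLONE)

if __name__ == "__main__":
    leader = cv2.imread("leader_photo.png")   # hypothetical input file names
    actor = cv2.imread("actor_frame.png")
    cv2.imwrite("swapped_frame.png", swap_face(leader, actor))
```

Open-source deepfake tools typically go further than this single-image warp, training an autoencoder on many images of both faces so the swap holds up frame to frame, which is why the results still need post-production cleanup.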

Are we ready? The ads were supposed to air on Fox, CNN, and MSNBC in their Washington, DC, markets, but the networks pulled them at the last minute. A spokesperson for the campaign said they were still waiting for an explanation. The ads end with a disclaimer: “The footage is not real, but the threat is.” Given the sensitive nature of using deepfakes in a political context, though, it’s possible the networks felt the American public just wasn’t ready.
