
How to avoid sharing bad information about Russia’s invasion of Ukraine

Even well-meaning attempts to participate in the news can play into bad actors’ campaigns.

February 25, 2022

The fast-paced online coverage of the Russian invasion of Ukraine on Wednesday followed a pattern that has become familiar in other recent crises around the world: photos, videos, and other information are posted and reshared across platforms far faster than they can be verified.

The result is that falsehoods are mistaken for truth and amplified, even by well-intentioned people. This can help bad actors to terrorize innocent civilians or advance disturbing ideologies, causing real harm.

Disinformation has been a prominent and explicit part of the Russian government’s campaign to justify the invasion. Russia falsely claimed that Ukrainian forces in the Donbas, a region in eastern Ukraine that is home to a large number of pro-Russian separatists, were planning violent attacks, engaging in antagonistic shelling, and committing genocide. Fake videos of those nonexistent attacks became part of a domestic propaganda campaign. (The US government, meanwhile, has been working to debunk and “prebunk” these lies.)

Even people who are not part of such government campaigns may intentionally share bad, misleading, or false information about the invasion to promote ideological narratives, or simply to harvest clicks, with little care about the harm they cause. In other cases, honest mistakes made amid the fog of war take off and go viral.

Already, bad information about the Russian invasion has found large audiences on platforms fundamentally designed to promote content that gets engagement.

On TikTok, a 2016 video of a training exercise was repurposed to create the false impression that Russian soldiers were parachuting into Ukraine; it was viewed millions of times. A mistranslated statement that circulated widely on Twitter, and was shared by journalists, falsely claimed that fighting near Chernobyl had disturbed a nuclear waste site (the original statement warned only that fighting might disturb nuclear waste).

Harmful propaganda and misinformation are often inadvertently amplified as people face the firehose of breaking news and interact with viral posts about a terrible event. This guide is for those who want to avoid helping bad actors.

We’ve published some of this advice before—during the Black Lives Matter protests in 2020, and again before the US election later that year. The information below has been updated and expanded to include some specific considerations for news coming out of Ukraine.

Your attention matters …

First, realize that what you do online makes a difference. “People often think that because they’re not influencers, they’re not politicians, they’re not journalists, that what they do [online] doesn’t matter,” Whitney Phillips, an assistant professor of communication and rhetorical studies at Syracuse University, told me in 2020. But it does matter. Sharing dubious information with even a small circle of friends and family can lead to its wider dissemination.

… and so do your angry quote tweets and duets.

While an urgent news story is developing, well-meaning people may quote-tweet, share, or duet with a post on social media to challenge and condemn it. Twitter and Facebook have introduced new rules, moderation tactics, and fact-checking provisions to try to combat misinformation. But interacting with misinformation at all risks amplifying the content you’re trying to minimize, because it signals to the platform that you find it interesting. Instead of engaging with a post you know to be wrong, try flagging it for review by the platform where you saw it.

Stop.

Mike Caulfield, a digital literacy expert, developed a method for evaluating online information that he calls SIFT: “Stop, Investigate the source, Find better coverage, and Trace claims, quotes, and media to the original context.” When it comes to news about Ukraine, he says, the emphasis should be on “Stop”—that is, pause before you react to or share what you’re seeing.

“There’s just a human impulse to be the first person in your group to share the story and get known as the person who reported this thing,” he says. And while this impulse is a daily hazard for journalists, it applies to everyone, particularly during moments of information overload.

Shireen Mitchell, a disinformation researcher and digital analyst, says that if you’re consuming news about Ukraine and want to do something to help, “what you should be doing is following people from Ukraine who are telling their stories about what’s happening to them.”

Don’t just retweet anything you see that seems to be from Ukraine, though. Only share information from authentic accounts. Journalists have worked to verify TikTok videos that appear to show Russian military movements, and they are sharing tweets from those who appear to be in Ukraine documenting their own stories.

Even then, experts urge you to be extra cautious. Disinformation researcher Kate Starbird tweeted an excellent thread on how to vet social media posts about the invasion, noting that this is a situation in which even reliable sources within your network may be “moving fast and maybe not vetting so well.”

In her thread, Starbird also points out a few clues that can suggest an account is inauthentic.

Pick a role you can handle.

If you’re reading this guide, you’re probably not a breaking news reporter or an expert in Ukraine-Russia relations. Experts caution against trying to act like one right now by evaluating and spreading information you find online. While it’s always good to try to verify the information you’re seeing, think hard about actually sharing any new findings or theories with your networks.

“People feel that they’ve been able to do their own research” on the internet, Mitchell says, driven in part by an increase in attention to the spread of disinformation and other forms of bad information on social media. “And now that they think that they’ve garnered some skill sets, they think that they can do it at this moment.” Neither assumption is necessarily true. And bad actors have a well-documented history of exploiting the impulse to “do your own research” to draw people into coordinated networks of disinformation.


“Just frankly, the language issue is a big issue,” Caulfield says, referring to English-speaking people trying to fact-check news out of Ukraine in real time. Trying to determine what’s authentic should not mean scanning “video of places that you don’t know, stuff that’s narrated in languages you don’t understand,” he says.

Before you share, ask yourself: Can you personally translate the language being spoken? Are you equipped to research and analyze videos and photos from sources you’ve never encountered before? Although citizen journalism is often deeply valuable, it requires real skill and training to do well. Be realistic about what you’re able to do, and why.

Remind yourself that getting the story wrong has consequences. Sharing false or misleading information about a developing situation could get people hurt or killed.

Instead, spread good information and amplify reliable sources.

One of the best things you can do in a situation like this, for your own sanity and for the people who listen to you on the internet, is to find your anchor to reality. Who are the reliable sources posting English-language coverage? Who can you follow and amplify to spread good information?

Journalists like Jane Lytvynenko, who is herself Ukrainian and has a background in misinformation reporting, are identifying and sharing resources for those who want to donate to support Ukrainian charities and media outlets, and providing vital context on the invasion. Others have crowdsourced a list of propaganda-laden news outlets and social media accounts to avoid. Bellingcat has a running public spreadsheet of debunked claims. The news outlet Kyiv Independent is tweeting constant updates.

“Maybe your role in this is not being the sort of beat reporter who’s breaking stories for your friends,” says Caulfield. There are plenty of people doing that work who are very good at it, he says. “Maybe your role in this is finding a reputable lecture that explains what’s going on … maybe it is providing people background on how Russia used disinformation in Crimea in 2014.”

If you do make a mess, clean up after yourself.

Anyone is capable of sharing misinformation, including experts in detecting it. If you’re going to share information about a developing situation—no matter what your role, or the size of your platform—be prepared to responsibly correct it and handle the fallout if you get something wrong.

Both Mitchell and Caulfield outlined similar best practices here: If you share bad information on Twitter, screenshot your mistake, post a correction by replying to or quote-tweeting the incorrect information, and then delete the tweet that contains the misinformation. 

Although TikTok works differently, similar principles apply: delete the misinformation, acknowledge why the video was deleted, post a correction, and encourage your followers to share that correction.

Mitchell added that everyone should be prepared to take accountability for getting it wrong by reaching out with the correct information to those who reshared the mistake.

Consider logging off.

Sometimes when an important and horrible thing is happening in the world, looking away or taking a break feels like apathy. It’s not. Stop doomscrolling.
