A leaked excerpt of TikTok moderation rules shows how political content gets buried

November 25, 2019
AP / Da Qing

German publication Netzpolitik has posted an excerpt from TikTok’s new moderation guidelines. The document shows that moderators don’t take down political and protest content, but are still told to prevent these videos from becoming popular. 

The background: Chinese-owned TikTok is a wildly popular video app and the target of a US national security investigation. Politicians fear the app could become a source of foreign-controlled disinformation and have expressed concern over a Guardian report claiming the company censors political videos Beijing doesn’t like. TikTok’s parent company, ByteDance, insisted that the guidelines cited in the Guardian report were outdated and that the policies had changed. That may be the case, but the new rules obtained by Netzpolitik aren’t much better.

The news: According to a whistleblower who spoke with Netzpolitik, controversial content on the app is divided into the categories “deleted,” “visible to self” (meaning other users can’t see it), “not recommended,” and “not for feed.” Videos in the last two categories aren’t surfaced by TikTok’s main discovery engine, and “not for feed” also makes a video harder to find in search.

According to the guidelines, most political content during election periods should be marked “not recommended.” Political content includes everything from partisan speeches to party banners. Police content—including filming inside a police station or jail—is marked “not for feed.”

The document also shows some changes TikTok has made. Previously, content about riots and protests, including references to Tibet, Taiwan, Northern Ireland, and Tiananmen Square, would be marked “not recommended.” According to the guidelines, that category has since been replaced by one covering content that might result in “real-world harm,” which moderators are told to mark “not for feed.”

What does TikTok say? In response to Netzpolitik, TikTok claims that it does not moderate content because of political sensitivities and that its moderation decisions are not influenced by any foreign government. TikTok has also said that it does not remove content related to political demonstrations, which, of course, doesn’t address the question of whether it keeps these videos from finding an audience.
