Tech policy

A leaked excerpt of TikTok moderation rules shows how political content gets buried

November 25, 2019
Photo: AP / Da Qing

German publication Netzpolitik has posted an excerpt from TikTok’s new moderation guidelines. The document shows that moderators don’t take down political and protest content, but are still told to prevent these videos from becoming popular. 

The background: Chinese-owned TikTok is a wildly popular video app, and also the target of a US national security investigation. Politicians fear that the app could become a source of foreign-controlled disinformation and have expressed concern over a Guardian report that claimed the company censors political videos Beijing doesn’t like. TikTok's parent company insisted that the guidelines from the Guardian report were outdated and that the policies had changed. That may be the case, but the new rules obtained by Netzpolitik aren’t much better. 

The news: According to a whistleblower who spoke with Netzpolitik, controversial content on the app is divided into the categories of “deleted,” “visible to self” (meaning other users can’t see it), “not recommended,” and “not for feed.” Videos in these last two categories won’t be curated by the main TikTok discovery engine, and “not for feed” also makes a video harder to find in search.

According to the guidelines, most political content during election periods should be marked “not recommended.” Political content includes everything from partisan speeches to party banners. Police content—including filming inside a police station or jail—is marked “not for feed.”

The document also shows some changes that TikTok has made. Previously, content about riots and protests—including references to Tibet, Taiwan, Northern Ireland, and Tiananmen Square—was marked “not recommended.” That category has since been replaced by one covering content that might result in “real-world harm,” according to the guidelines. Moderators are told to mark this “not for feed.” 

What does TikTok say? In response to Netzpolitik, TikTok claims that it does not moderate content because of political sensitivities and that its moderation decisions are not influenced by any foreign government. TikTok has also said that it does not remove content related to political demonstrations—which, of course, doesn’t address the question of whether it keeps these videos from finding an audience. 

