Tech policy

A leaked excerpt of TikTok moderation rules shows how political content gets buried

November 25, 2019
AP / Da Qing

German publication Netzpolitik has posted an excerpt from TikTok's new moderation guidelines. The document shows that moderators don't take down political and protest content, but they are told to keep those videos from becoming popular.

The background: Chinese-owned TikTok is a wildly popular video app and the target of a US national security investigation. Politicians fear the app could become a source of foreign-controlled disinformation, and they have expressed concern over a Guardian report claiming that the company censors political videos Beijing doesn't like. TikTok's parent company insisted that the guidelines cited by the Guardian were outdated and that its policies had changed. That may be the case, but the new rules obtained by Netzpolitik aren't much better.

The news: According to a whistleblower who spoke with Netzpolitik, controversial content on the app is divided into the categories of “deleted,” “visible to self” (meaning other users can’t see it), “not recommended,” and “not for feed.” Videos in these last two categories won’t be curated by the main TikTok discovery engine, and “not for feed” also makes a video harder to find in search.

According to the guidelines, most political content during election periods should be marked “not recommended.” Political content includes everything from partisan speeches to party banners. Police content—including filming inside a police station or jail—is marked “not for feed.”

The document also shows some changes TikTok has made. Previously, content about riots and protests—including references to Tibet, Taiwan, Northern Ireland, and Tiananmen Square—would be marked “not recommended.” That category has since been replaced by one covering content that might result in “real-world harm,” according to the guidelines. Moderators are told to mark such videos “not for feed.”

What does TikTok say? In response to Netzpolitik, TikTok claims that it does not moderate content because of political sensitivities and that its moderation decisions are not influenced by any foreign government. TikTok has also said that it does not remove content related to political demonstrations—which, of course, doesn’t address the question of whether it keeps those videos from finding an audience.
