The news: Video platform TikTok has published a set of new, more detailed guidelines governing which videos will be deleted from the app. It says it will take down videos promoting terrorism, crime, violence, hate speech, or self-harm, for example.
The rules also ban “misleading information” that could cause harm to either an individual or the general public, going further than US competitors like Facebook, which have (controversially) tried to avoid making those sorts of judgments. TikTok also explicitly bans denying the reality of “well-documented and violent events” like the Holocaust, while Facebook permits it.
What’s next: Writing the policies is the easy part—enforcing them is much harder, and TikTok hasn’t provided much detail on how it does that. However, the German publication Netzpolitik published leaked moderation guidelines at the end of last year, which showed how TikTok algorithmically suppresses certain videos by making them harder for users to find, preventing them from becoming popular. Controversially, this included videos created by people with disabilities. TikTok was also criticized for banning a girl who had posted a video attacking the Chinese state’s treatment of Uighur people.