TikTok has announced an update to its moderation system that introduces account strikes, similar to the strikes in YouTube's Community Guidelines system. The company says it is making the change because the previous system could be confusing for creators and could be exploited by bad actors.
If TikTok removes a piece of content, such as a video or comment, for violating a community guideline, the account behind it will receive a strike, which expires after 90 days. There are multiple types of strikes: you can get them for specific product features like Comments or LIVE, or for specific sections of TikTok's policies (so strikes for taking part in a dangerous challenge are not necessarily counted alongside strikes for leaving a spammy comment).
Accumulating enough strikes in any one category results in a permanent ban, though the threshold varies depending on "the potential of a violation to cause harm to our community members". TikTok doesn't say exactly what those limits are, possibly to keep people from skirting right up to the line. The company also says it will ban accounts that rack up "a high number of cumulative strikes across policies and features."
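The system described above can be sketched in code: strikes carry a category and a timestamp, expire after 90 days, and a ban triggers on either a per-category threshold or a cumulative one. The threshold numbers below are hypothetical, since TikTok doesn't disclose the real limits.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

STRIKE_TTL = timedelta(days=90)  # strikes expire after 90 days (per TikTok)

# Hypothetical thresholds -- TikTok does not publish the actual numbers.
# Higher-harm policy areas would presumably get stricter limits.
CATEGORY_THRESHOLDS = {"dangerous_acts": 3, "spam": 5}
CUMULATIVE_THRESHOLD = 8  # also hypothetical

@dataclass
class Account:
    # Each strike is a (category, timestamp) pair.
    strikes: list = field(default_factory=list)

    def add_strike(self, category: str, when: datetime) -> None:
        self.strikes.append((category, when))

    def active_strikes(self, now: datetime) -> list:
        # Strikes older than 90 days no longer count.
        return [(c, t) for c, t in self.strikes if now - t < STRIKE_TTL]

    def is_banned(self, now: datetime) -> bool:
        active = self.active_strikes(now)
        # A per-category threshold bans on repeat violations of one policy.
        for category, limit in CATEGORY_THRESHOLDS.items():
            if sum(1 for c, _ in active if c == category) >= limit:
                return True
        # A large cumulative total across all categories also bans.
        return len(active) >= CUMULATIVE_THRESHOLD
```

The 90-day expiry means an account that breaks a rule once in a while never approaches a ban, while a repeat offender accumulates active strikes faster than they expire.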
The company says the strike system does not apply to "severe violations": anyone caught posting child sexual abuse material, threatening real-world violence, or sharing other extreme content can still be banned immediately.
TikTok says it is rolling out an update to the app's Safety Center that will let creators view and appeal strikes, and that will warn users when they are approaching a permanent ban. It's also testing a feature that tells you if one of your videos has been flagged as ineligible for recommendation to people's For You feeds, along with an explanation of why.
The company frames all of this as a move to increase transparency around its moderation decisions and to distinguish creators who occasionally break a rule by accident from repeat offenders. It also concedes that the previous enforcement system, which involved temporary bans and restrictions, could be confusing for some creators.
TikTok has faced plenty of criticism over transparency and accountability, especially around how it recommends content. The company also faces the prospect of an outright ban in the US as a growing number of lawmakers block the app on government-owned devices.