TikTok has shared its latest report on the volume and nature of violations on its platform and added new safety features to protect users.
Following the release of its Q2 Community Guidelines Enforcement Report, TikTok has shared further updates on its ongoing work to protect people from abusive behavior.
The report details that TikTok removed more than 80 million violative videos in the three months from April 2021. Though the figure looks high, it represents less than 1% of all videos uploaded to TikTok.
Compared to the previous period, the platform has also become faster at removing offensive content: TikTok took down 94% of violative posts before any user reported them, or even saw them.
The platform’s proactive detection has been increasingly able to pick up on hateful behavior, bullying, and harassment. Its systems proactively flag hate symbols, words, and other abuse signals and bring the content up for review by its safety teams.
TikTok’s safety teams are regularly trained to detect hate speech and to understand nuanced, contextual issues that are typically harder to catch, such as the reappropriation of terms or slurs, or the use of satire.
Furthermore, the company has rolled out unconscious bias training for its moderators and hired policy experts in civil rights, equity, and inclusion.
In addition to improving moderation, the platform offers a range of tools and resources to empower people to customize their experience, including filtering comments on content, deleting or reporting multiple comments at once, and blocking accounts in bulk.
TikTok has also added prompts that ask users to reconsider potentially unkind words before posting, an intervention that has led nearly 4 in 10 users to edit or delete their comments.
TikTok is also building safety into live streaming. The recent announcement mentions improved muting settings for comments and questions during livestreams: hosts (or their trusted assistants) can temporarily mute an unkind viewer.
If an account is muted for a long time, its entire comment history will also be removed. This adds to the existing ability for LIVE hosts to turn off comments or to limit potentially harmful comments using a keyword filter.