TikTok has shared its latest report on the volume and nature of violations on its platform and added new safety features to protect users.
Following the release of its Q2 Community Guidelines Enforcement Report, which details the volume and nature of guideline violations on the platform, TikTok has shared further updates on its ongoing work to protect people from abusive behavior.
The report details that TikTok removed more than 80 million violative videos between April and June 2021. While the figure sounds high, it represents less than 1% of all videos uploaded to TikTok.
Compared with the previous quarter, the platform has improved the speed at which it removes offensive content, taking down 94% of violative posts before any user reported or even saw them.
The platform’s proactive detection has become increasingly effective at picking up on hateful behavior, bullying, and harassment. Its systems automatically flag hate symbols, words, and other abuse signals and surface the content for review by its safety teams.
TikTok’s safety teams receive regular training in detecting hate speech and in understanding nuanced, contextual issues that are typically harder to catch, such as the reappropriation of terms, slurs, and satire.
The company has also rolled out unconscious bias training for its moderators and hired policy experts in civil rights, equity, and inclusion.
In addition to improving moderation, the platform offers a range of tools that let people customize their experience, including filtering comments on their content, deleting or reporting multiple comments at once, and blocking accounts in bulk.
TikTok has also added prompts asking users to reconsider potentially unkind comments before posting, an intervention that has led nearly 4 in 10 users to edit or delete their comments.
TikTok is also building safety features around livestreaming. The announcement notes improved muting settings for comments and questions during livestreams: hosts (or their trusted helpers) can temporarily mute an unkind viewer.
If an account is muted for an extended period, its entire comment history is also removed. The feature adds to hosts' existing ability on LIVE to turn off comments or limit potentially harmful comments with a keyword filter.
Lastly, TikTok, a participant in the Malmö International Forum on Holocaust Remembrance and Combating Antisemitism, announced its continued commitment to combating antisemitic content.
Photo by Salman Hossain Saif on Unsplash