TikTok has shared its latest report on the volume and nature of violations on its platform and added new safety features to protect users.
Following the release of its Q2 Community Guidelines Enforcement Report, TikTok has shared further updates on its continuing work to protect people from abusive behavior.
The report details that TikTok removed more than 80 million violative videos in the three months from April 2021. While the figure may seem high, it represents less than 1% of all videos uploaded to TikTok.
Compared to the previous reporting period, the platform has also gotten faster at removing offensive content: TikTok took down 94% of violative posts before any user reported them, or even saw them.
The platform’s proactive detection has been increasingly able to pick up on hateful behavior, bullying, and harassment. Its systems proactively flag hate symbols, words, and other abuse signals and bring the content up for review by its safety teams.
TikTok’s safety teams are regularly trained to detect hate speech and to understand nuanced, contextual issues that are typically harder to catch, such as the reappropriation of a term, slurs, or satire.
Furthermore, the company has rolled out unconscious bias training for its moderators and hired policy experts in civil rights, equity, and inclusion.
In addition to improving moderation, the platform offers a range of tools and resources to empower people to customize their experience, including filtering comments on content, deleting or reporting multiple comments at once, and blocking accounts in bulk.
TikTok has also added prompts that ask users to reconsider potentially unkind words before posting, an intervention that has led nearly 4 in 10 users to edit or delete their comments.
TikTok is also building safety into live streaming. The announcement notes that TikTok is improving muting settings for comments and questions during livestreams: hosts (or their trusted assistants) can temporarily mute an unkind viewer. If an account is muted for a long time, that account's entire comment history will also be removed. The feature adds to hosts' existing ability on LIVE to turn off comments or limit potentially harmful comments with a keyword filter.
Lastly, TikTok, a participant in the Malmö International Forum on Holocaust Remembrance and Combating Antisemitism, reaffirmed its commitment to combating antisemitic content.