YouTube reports that less than 1% of the videos pulled from its platform in the first half of 2021 contained sexually explicit content involving children.
In written testimony to a US Senate subcommittee hearing regarding “Protecting Kids Online,” YouTube’s VP of government affairs and public policy, Leslie Miller, outlines the platform’s continued efforts to ensure the online safety and well-being of children.
In the testimony, Miller states that YouTube removed more than 120,000 videos containing sexually explicit content involving children in the first half of 2021, while continuing to improve its machine learning technology to detect and remove explicit content and misinformation early.
The figure represents less than 1% of the 15.8 million videos YouTube removed for policy violations in the same period. Of the 1.87 million videos removed in the second quarter of 2021, 85% were pulled before ever reaching 10 views.
Miller states that YouTube “invested extensively in industry-leading machine learning technologies that identify potential harms quickly and at scale,” adding, “some speculate that we hesitate to address problematic content or ignore the well-being of youth online because it benefits our business; this is simply not true.”
Miller added that YouTube reports all videos found to violate its child safety policy to the National Center for Missing and Exploited Children.
The report on the state of the platform’s progress in detecting and removing violative content comes as social media platforms continue to combat the propagation of illegal or dangerous content.
Earlier this summer, Twitter started testing misinformation labels on tweets, and Facebook began penalizing individual accounts and groups that repeatedly share fake news on the platform.
YouTube itself started blocking the spread of videos containing false information about vaccines last month. “We’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines,” the company’s announcement reads. “This would include content that falsely says that approved vaccines cause autism, cancer or infertility, or that substances in vaccines can track those who receive them.”
Photo by Caleb Woods on Unsplash