YouTube has updated the enforcement of its live streaming policy to prevent young children from going live unless they are supervised by an adult.
In response to widespread criticism that it is not doing enough to curb inappropriate content involving young children, YouTube is taking further steps to limit what kids can do on its platform.
Apart from limiting recommendations of videos that show “minors in risky situations,” YouTube has also updated its policies on minors, restricting live-streaming features for younger children.
From now on, kids will not be able to stream live on YouTube unless they are “clearly accompanied by an adult.” The company explained last week that “channels not in compliance with this policy may lose their ability to live stream.”
YouTube also announced the launch of “new classifiers (machine learning tools that help […] identify specific types of content) on […] live products to find and remove more of this content.”
The announcement comes in the wake of a New York Times report that found YouTube’s recommendation system “has been suggesting videos of ‘prepubescent, partially clothed children’ to users who had watched sexually themed content.” According to YouTube, it has now applied several new restrictions to its algorithm-based recommendations system, curbing recommendations for “tens of millions of videos” featuring minors.
YouTube says that an all-out ban of videos with children “would hurt creators who rely on the recommendation engine to generate views.”
“Responsibility is our number one priority, and chief among our areas of focus is protecting minors and families,” the company explains in the blog post. “With this update, we’ll be able to better identify videos that may put minors at risk and apply our protections.”
Furthermore, YouTube explains that most videos featuring minors “do not violate […] policies and are innocently posted.” However, a large number of videos still do: the company says that in the first quarter of 2019 alone, it removed more than “800,000 videos for violations of […] child-safety policies,” with the majority of those reportedly deleted before they had ten views.
Taking it one step further, YouTube also announced that it already works with “law-enforcement agencies to investigate crimes against children.” As it explains in the blog post, “reports sent to the National Center for Missing and Exploited Children prompted more than 6,000 such investigations in the past two years.”