YouTube has updated the enforcement of its live streaming policy to prevent young children from going live unless they are supervised by an adult.
In response to widespread criticism that it’s not doing enough to curb the spread of inappropriate content involving young children, YouTube is taking further steps to limit what kids can do on its platform.
Apart from limiting the recommendation of videos that show “minors in risky situations,” YouTube also updated its policies that pertain to minors, restricting live features for younger minors.
From now on, unless they are “clearly accompanied by an adult,” kids will not be able to stream live on YouTube. The company also warned last week that “channels not in compliance with this policy may lose their ability to live stream.”
Furthermore, YouTube also announced the launch of “new classifiers (machine learning tools that help […] identify specific types of content) on […] live products to find and remove more of this content.”
The announcement comes in the wake of a New York Times report that found YouTube’s recommendation system “has been suggesting videos of ‘prepubescent, partially clothed children’ to users who had watched sexually themed content.” According to YouTube, it has now applied several new restrictions to its algorithm-based recommendations system, curbing recommendations of videos with minors on “tens of millions of videos.”
YouTube says that an all-out ban of videos with children “would hurt creators who rely on the recommendation engine to generate views.”
“Responsibility is our number one priority, and chief among our areas of focus is protecting minors and families,” the company explains in the blog post. “With this update, we’ll be able to better identify videos that may put minors at risk and apply our protections.”
Furthermore, YouTube explains that most videos featuring minors “do not violate […] policies and are innocently posted.” However, a large number of videos still do: The company says that in the first quarter of 2019 alone, it removed more than “800,000 videos for violations of […] child-safety policies,” claiming the majority of those were deleted before they had ten views.
Taking it one step further, YouTube also announced that it already works with “law-enforcement agencies to investigate crimes against children.” As it explains in the blog post, “reports sent to the National Center for Missing and Exploited Children prompted more than 6,000 such investigations in the past two years.”