YouTube has updated the enforcement of its live streaming policy to prevent young children from going live unless they are supervised by an adult.
In response to widespread criticism that it’s not doing enough to curb the spread of inappropriate content and actions related to young children, YouTube is taking further steps to limit what kids can do on its platform.
Apart from limiting the recommendation of videos that show “minors in risky situations,” YouTube also updated its policies that pertain to minors, restricting live features for younger minors.
From now on, unless they are “clearly accompanied by an adult,” kids will not be able to stream live on YouTube, with the company explaining last week that “channels not in compliance with this policy may lose their ability to live stream.”
Furthermore, YouTube also announced the launch of “new classifiers (machine learning tools that help […] identify specific types of content) on […] live products to find and remove more of this content.”
The announcement comes in the wake of a New York Times report that found YouTube’s recommendation system “has been suggesting videos of ‘prepubescent, partially clothed children’ to users who had watched sexually themed content.” According to YouTube, it has now applied several new restrictions to its algorithm-based recommendations system, limiting recommendations on “tens of millions of videos” featuring minors.
YouTube says that an all-out ban on recommending videos with children “would hurt creators who rely on the recommendation engine to generate views.”
“Responsibility is our number one priority, and chief among our areas of focus is protecting minors and families,” the company explains in the blog post. “With this update, we’ll be able to better identify videos that may put minors at risk and apply our protections.”
Furthermore, YouTube explains that the vast majority of videos featuring minors “do not violate […] policies and are innocently posted.” However, a large number of videos still do: The company says that in the first quarter of 2019 alone, it removed more than “800,000 videos for violations of […] child-safety policies,” claiming the majority of those were deleted before they had ten views.
Taking it one step further, YouTube also announced that it already works with “law-enforcement agencies to investigate crimes against children.” As it explains in the blog post, “reports sent to the National Center for Missing and Exploited Children prompted more than 6,000 such investigations in the past two years.”