The Challenge Of Keeping Families Safe On YouTube And YouTube Kids

November 29, 2017 • YouTube

YouTube has been facing some issues lately, with non-family-friendly content appearing where it shouldn’t. As a result, it’s getting tougher about how it enforces its guidelines – but that’s not all.

There’s been a growing trend on YouTube in the last few months of video content that is uploaded as family-friendly – but absolutely is not. So, YouTube is doing its best to remove the content in question (especially from YouTube Kids) and to make sure that it doesn’t return. In a recent blog post, Johanna Wright, Vice President of Product Management at YouTube, describes what the company is doing to address these issues.

YouTube has always had strict policies against child endangerment, but it has now expanded its enforcement guidelines, removing content featuring minors that may endanger a child, even if that was not the uploader’s intent. Wright explains that in the last week, YouTube has “terminated over 50 channels and [has] removed thousands of videos under these guidelines.” It has also implemented age-restriction policies on content featuring family entertainment characters but “containing mature themes or adult humor.” Additionally, it is applying machine learning to help find and flag content like this at scale.

In June, YouTube announced an update to its advertiser-friendly guidelines, under which it would remove ads from “content depicting family entertainment characters engaged in violent, offensive, or otherwise inappropriate behavior, even if done for comedic or satirical purposes.” Wright explained that since then, the company has removed ads from roughly 3 million videos under the new policy, and has strengthened its enforcement to remove ads from a further 500,000 videos classified as “violative.”

In addition to the above, YouTube is also “blocking inappropriate comments on videos featuring minors.” In the past, the company has used a combination of human flagging and automation to review and remove comments on videos featuring minors. The comments YouTube removes range from sexual to predatory, and as Wright explains, the company also works closely “with NCMEC to report illegal behavior to law enforcement.” This week, YouTube will start to take an even more aggressive stance, turning off comments altogether on videos of minors where these types of comments are found.

In terms of keeping viewers safe on YouTube Kids, the company will be releasing a “comprehensive guide” for creators to make content that is enriching and suitable for kids.

Finally, as there has been a rise in content that is more “nuanced” and therefore harder to classify, YouTube is making a serious point of making such videos available only to users who are over 18. While cartoons that target adults and “feature characters doing things you normally wouldn’t want your children to see” are fine on YouTube.com, they are definitely not appropriate for children. The challenge is how to identify this content at scale. For this, YouTube relies on experts and a growing number of Trusted Flaggers.

Monitoring the huge volume of content that is uploaded to YouTube and YouTube Kids every day is not easy. However, with the measures above (and more), YouTube is working hard to make sure it can enforce its policies more effectively.

