With its second quarterly Community Guidelines Enforcement Report, YouTube is showing how its efforts to flag and remove offensive content are paying off.
YouTube is expanding its Community Guidelines Enforcement Report to include additional data, such as channel removals, the number of comments removed, and the policy reason behind each video or channel removal. The results show that YouTube is getting faster at flagging and removing content, and is also tackling comments.
YouTube has always used a “mix of human reviewers and technology to address violative content” on its platform, and over a year ago it began using improved machine learning to flag content for review. The combination of the two has allowed YouTube’s teams to enforce its policies much faster. As YouTube explains in a recent announcement, “finding all violative content on YouTube is an immense challenge, but we see this as one of our core responsibilities and are focused on continuously working towards removing this content before it is widely viewed.”
Its efforts are paying off:
- From July to September 2018, its teams removed 7.8 million videos
- 81% of these videos were first detected by machines
- Of those detected by machines, 74.5% of videos had never received a single view
When YouTube detects a video that violates its Guidelines, it removes the video and applies a strike to the channel. It will also “terminate entire channels if they are dedicated to posting content” that is prohibited, or if they “contain a single egregious violation, like child sexual exploitation.” YouTube admits that “the vast majority of attempted abuse comes from bad actors trying to upload spam or adult content: over 90% of the channels and over 80% of the videos that we removed in September 2018 were removed for violating our policies on spam or adult content.”
Furthermore, when looking at “the most egregious, but low-volume areas, like violent extremism and child safety,” YouTube says its “significant investment in fighting this type of content is having an impact,” noting that “over 90% of the videos uploaded in September 2018 and removed for Violent Extremism or Child Safety had fewer than 10 views.”
In addition to removing offensive content, YouTube has also worked hard on making comments on its platform safer. The company uses the same combination of “smart detection technology and human reviewers to flag, review, and remove spam, hate speech, and other abuse in comments.” It has also added tools that let creators moderate comments on their videos more effectively. Over one million creators now use these tools to moderate their channel’s comments.
YouTube has also increased its enforcement against “violative comments”:
- From July to September of 2018, YouTube’s teams removed over 224 million comments for violating its Community Guidelines.
- The majority of those removals were for spam, and the total represents only a small fraction of the billions of comments posted on the platform.
One might expect this enforcement to cause the comments ecosystem to shrink. According to YouTube, this is not the case. In fact, the platform’s daily users are 11% more likely to comment than they were last year.