In a new quarterly report released to show how it’s enforcing its community guidelines, YouTube demonstrates just how successful its use of AI to flag videos has been so far.
YouTube, just like most other big platforms, has had a big problem with objectionable content being uploaded and shared on its platform. The problem certainly hasn’t gone away, but it is now being dealt with – rather effectively, it seems. In its first “Community Guidelines Enforcement Report,” YouTube shows the progress it’s making “in removing violative content” from its platform over time.
Among other things, the report shows just how successful YouTube has been in removing “violative videos before they are ever viewed.” With only humans to flag videos, this has been and continues to be a huge challenge. But with the introduction of machines to the process, YouTube has been able to flag and review content effectively, and at scale.
For example, at the beginning of 2017, only 8% of videos “flagged and removed for violent extremism” were taken down before they reached 10 views. After machine learning flagging was introduced in the summer of 2017, that number jumped to over 50%.
In Q4 2017, of the 8 million videos removed (mostly spam and adult content), 6.7 million were flagged by machine learning – not humans. And 76% of those were removed before receiving even a single view!
However, it’s not all about machines. YouTube values the human contribution to this process. Ultimately, its “systems rely on human review to assess whether content violates” policies. The company is committed to increasing its staff working on addressing violative content to 10,000 across Google by the end of the year.
It has also already “hired full-time specialists with expertise in violent extremism, counterterrorism, and human rights,” and continues to invest in its network of academics, NGOs, and government partners.
Finally, users have an important role to play as well. To highlight this, YouTube is also introducing a “Reporting History dashboard” that lets each user see the status of every video they have flagged.