Have you ever wondered just how much “bad stuff” goes up on Facebook, and how the platform enforces its Community Standards? Well, now you can take a good look at the numbers. Published for the first time ever, they are a real eye-opener.
There’s a lot of “bad stuff” out there on Facebook, and the sheer size of the platform makes it hard to moderate what stays up and what comes down. Reviewers often have just a few seconds to decide, and that’s where much of the problem lies. Facebook has simply gotten too big for its own good and struggles to enforce its Community Standards. It is enforcing them, though, and it now has the numbers to prove it.
Three weeks ago, the company published the internal standards it uses to decide whether something stays up or is taken down; now, for the first time ever, it has published its enforcement of those standards, in numbers. The report, which spans Facebook‘s enforcement efforts from October 2017 to March 2018, covers six areas that are also part of its Community Standards for easy reference: graphic violence, adult nudity and sexual activity, terrorist propaganda, hate speech, spam, and fake accounts.
The numbers in the report show how much content people saw that violated those standards; how much content Facebook removed; and how much content it detected proactively using AI, before people reported it.
Most of the action it took to remove the “bad stuff” involved spam. In fact, Facebook took down 837 million pieces of spam in Q1 2018 — nearly 100% of which was found and flagged before anyone on the platform was able to report it. Facebook also disabled about 583 million fake accounts which spread the spam in question. Most of these were “disabled within minutes of registration.” These fake accounts join the millions that Facebook prevents from registering for its platform. Still, Facebook estimates that “around 3 to 4% of the active Facebook accounts on the site during this period were still fake.”
In Q1 2018, Facebook also took down “21 million pieces of adult nudity and sexual activity” — 96% of which was found and flagged by AI before users reported it. The company estimates that out of every 10,000 pieces of content viewed on its platform, “7 to 9 views were of content that violated our adult nudity and pornography standards.”
In the same period, Facebook removed 3.5 million pieces of violent content — 86% of which was identified by AI before users reported it. When it comes to hate speech, Facebook says its “technology still doesn’t work that well,” so flagged content still needs to be checked by its review teams. Even so, it removed 2.5 million pieces of hate speech in Q1 2018 — 38% of which was flagged by its “technology.”
You can access the first Community Standards Enforcement Report here.