Have you ever wondered just how much “bad stuff” goes up on Facebook, and how the platform enforces its Community Standards? Well, now you can take a good look at the numbers. Published for the first time ever, they are a real eye-opener.
There’s a lot of “bad stuff” out there on Facebook, and the sheer size of the platform makes it hard to moderate what stays up and what comes down. Reviewers often have just a few seconds to decide, and that is frequently where the problem lies. Facebook has simply grown too big for its own good and struggles to enforce its Community Standards. However, it is enforcing them, and it now has the numbers to prove it.
Three weeks ago, the company published the internal standards it uses to decide whether something stays up or is taken down; now, for the first time ever, it has published the numbers behind its enforcement of those standards. The report, covering Facebook’s enforcement efforts from October 2017 to March 2018, spans six areas that map onto its Community Standards for easy reference: graphic violence, adult nudity and sexual activity, terrorist propaganda, hate speech, spam, and fake accounts.
The numbers in the report show how much content people saw that violated those standards, how much content Facebook removed, and how much content it detected proactively using AI, before people reported it.
Most of the action it took against the “bad stuff” involved spam. In fact, Facebook took down 837 million pieces of spam in Q1 2018 — nearly 100% of which was found and flagged before anyone on the platform was able to report it. Facebook also disabled about 583 million fake accounts that spread the spam in question, most of them “disabled within minutes of registration.” These fake accounts join the millions that Facebook prevents from registering in the first place. Still, Facebook estimates that “around 3 to 4% of the active Facebook accounts on the site during this period were still fake.”
In Q1 2018, Facebook also took down “21 million pieces of adult nudity and sexual activity” — 96% of which was found and flagged by AI before users reported it. The company estimates that out of every 10,000 pieces of content viewed on its platform, “7 to 9 views were of content that violated our adult nudity and pornography standards” — roughly 0.07% to 0.09% of all views.
In the same period, Facebook removed 3.5 million pieces of violent content — 86% of which was identified by AI before users reported it. When it comes to hate speech, Facebook says its “technology still doesn’t work that well,” so flagged content still needs to be checked by its review teams. Even so, it removed 2.5 million pieces of hate speech in Q1 2018 — 38% of which was flagged by its “technology.”
You can access the first Community Standards Enforcement Report here.