How Does Facebook Moderate Content? [Infographic]

April 15, 2015 • Facebook

A huge amount of content is created and shared on Facebook every minute, and most of it goes up without being moderated by anyone. As a result, we've all come across content that definitely isn't supposed to be there; many of us see it on a daily basis. Some of us report it, and some don't. What we do know is that there is a lot of inappropriate content out there, and it inevitably ends up on Facebook pages, groups, and personal accounts. So, who moderates content on the platform, what are their challenges, what type of content is moderated, and how does the process work?


The infographic below answers all these questions and more. To make it easier, we give you a brief rundown first.


Facebook employs roughly 800–1,000 moderators who, between them, speak 24 languages, to moderate 4.75 billion shared posts and 300 million photo uploads from the 864 million logins each day. Many are based in countries like India or the Philippines, whereas most of the Facebook moderators in the US are recent college graduates. They don't seem to last long in the position, though: most leave within 3–6 months of starting.



It has been suggested that moderators can end up with PTSD because of the nature of the content they view. What kind of content can give you PTSD, you ask? Content related to pedophilia, necrophilia, and other inappropriate sexual topics; violence such as beheadings, suicides, murders, and animal abuse; domestic abuse; drug use; and abusive threats. We've all come across some of this, which suggests that moderation might not be effective enough, and many complain that Facebook has inconsistent policies regarding what it censors and what it doesn't.



You didn't think 1,000 people sift through each and every bit of content, did you? Well, they don't. Most content is examined by an algorithm before anyone sees it. Then there is everything we users report. Reports are sorted by category and sent to the relevant moderation teams. The categories are: safety (graphic violence), hate and harassment, access (hacker and impostor issues), and abusive content (sexually explicit material). Moderators then decide whether the content violates community standards, and they can choose to delete, ignore, or escalate each report.
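The triage flow described above can be sketched in a few lines of Python. The categories and the three moderator decisions come straight from the article; every function and field name here is hypothetical, a toy illustration rather than anything resembling Facebook's actual system.

```python
from dataclasses import dataclass

# Report categories, mirroring the article's breakdown.
CATEGORIES = {
    "safety": "graphic violence",
    "hate_and_harassment": "hate speech, bullying",
    "access": "hacker and impostor issues",
    "abusive_content": "sexually explicit material",
}

@dataclass
class Report:
    content_id: int
    category: str

def route_report(report: Report) -> str:
    """Send a user report to the moderation team for its category."""
    if report.category not in CATEGORIES:
        raise ValueError(f"unknown category: {report.category}")
    return f"team:{report.category}"

def moderate(report: Report, violates_standards: bool, needs_review: bool) -> str:
    """A moderator's three possible outcomes: delete, ignore, or escalate."""
    if needs_review:
        return "escalate"  # e.g. passed up to a more senior team
    return "delete" if violates_standards else "ignore"

r = Report(content_id=101, category="safety")
print(route_report(r))                      # routed to the safety team
print(moderate(r, violates_standards=True, needs_review=False))
```

The point of the sketch is simply that routing (by category) and judgment (the three-way decision) are separate steps, which matches the two-stage process the infographic describes.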

More info in the infographic below.

