A huge amount of content is created and shared on Facebook every minute, and most of it goes up without being moderated by anyone. As a result, we've all come across content that clearly shouldn't be there. In fact, most of us see it on a daily basis. Some of us report it, some don't. What we do know is that there is a lot of inappropriate content out there, and it inevitably ends up on Facebook pages, groups, or personal accounts. So, who moderates content on the platform, what are their challenges, what type of content is moderated, and how does the process work?
This infographic from WhoIsHostingThis.com answers all these questions and more… To make it easier, we give you a brief rundown below.
Who?
Facebook employs roughly 800-1,000 moderators who collectively speak 24 languages (no single moderator speaks them all) to handle the 4.75 billion posts shared and 300 million photos uploaded by the 864 million people who log in each day. Many are based in countries like India or the Philippines, while most of Facebook's US-based moderators are recent college graduates. They don't seem to last long in the position, though: most leave within 3-6 months of starting.
What?
It has been suggested that moderators end up with PTSD because of the nature of the content they view. What kind of content can give you PTSD, you ask? Content related to pedophilia, necrophilia, or other inappropriate sexual topics; violence such as beheadings, suicides, murders, or animal abuse; domestic abuse; drug use; and abusive threats. We've all come across some of this, which suggests that moderation might not be effective enough, and many complain that Facebook has inconsistent policies regarding what it censors and what it doesn't.
How?
You didn't think that 1,000 people sift through each and every bit of content, did you? Well, they don't. Most content is examined by an algorithm before anyone sees it. Then there is everything we users report. Reports are sorted by category and sent to the relevant moderation teams. The categories are: safety (graphic violence), hate and harassment, access (hacker and impostor issues), and abusive content (sexually explicit material). Moderators then decide whether the content violates the community standards, and they can choose to delete, ignore, or escalate each report, as the sketch below illustrates.
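To make the triage flow concrete, here is a minimal Python sketch of the process described above. The four report categories and the three moderator actions come from the infographic; the function names, keyword rules, and report structure are hypothetical illustrations, not Facebook's actual system.

```python
from enum import Enum

# Report categories named in the infographic; member names are illustrative.
class Category(Enum):
    SAFETY = "safety"                        # graphic violence
    HATE_AND_HARASSMENT = "hate_and_harassment"
    ACCESS = "access"                        # hacker and impostor issues
    ABUSIVE_CONTENT = "abusive_content"      # sexually explicit material

# The three actions a moderator can take on each report.
class Decision(Enum):
    DELETE = "delete"
    IGNORE = "ignore"
    ESCALATE = "escalate"

def route_report(report: dict) -> Category:
    """Sort a user report into one of the four moderation queues.
    The keyword rules below are placeholder heuristics for illustration."""
    reason = report.get("reason", "").lower()
    if "violence" in reason:
        return Category.SAFETY
    if "hate" in reason or "harassment" in reason:
        return Category.HATE_AND_HARASSMENT
    if "hacked" in reason or "impostor" in reason:
        return Category.ACCESS
    return Category.ABUSIVE_CONTENT

def moderate(violates_standards: bool, needs_review: bool) -> Decision:
    """Model the moderator's choice: escalate tricky cases,
    delete violations, ignore everything else."""
    if needs_review:
        return Decision.ESCALATE
    return Decision.DELETE if violates_standards else Decision.IGNORE

# Example: a report flagged for graphic violence that clearly breaks the rules.
report = {"post_id": 12345, "reason": "graphic violence"}
queue = route_report(report)
decision = moderate(violates_standards=True, needs_review=False)
print(queue.name, decision.name)  # SAFETY DELETE
```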
More info in the infographic below.