A huge amount of content is created and shared on Facebook every minute, and most of it goes up completely unmoderated. As a result, we've all come across content that clearly shouldn't be there; many of us see it on a daily basis. Some of us report it, and some don't. What we do know is that there is a lot of inappropriate content out there, and it inevitably ends up on Facebook pages, groups, or personal accounts. So, who moderates content on the platform, what challenges do they face, what type of content is moderated, and how does the process work?
This infographic from WhoIsHostingThis.com answers all these questions and more… To make it easier, we give you a brief rundown below.
Who?
Facebook employs roughly 800-1,000 moderators who, between them, speak 24 languages. They moderate the 4.75 billion shared posts and 300 million photo uploads generated by 864 million logins each day. Many moderators are based in countries like India or the Philippines, while most Facebook moderators in the US are recent college graduates. They don't seem to last long in the position, though: most leave within 3-6 months of starting.
What?
It has been suggested that moderators end up with PTSD because of the nature of the content they view. What kind of content can give you PTSD, you ask? Content related to pedophilia, necrophilia, or other inappropriate sexual topics; violence like beheadings, suicides, murders, or animal abuse; domestic abuse; drug use; and abusive threats. We've all come across some of this, which suggests that moderation might not be effective enough, and many complain that Facebook has inconsistent policies about what it censors and what it doesn't.
How?
You didn't think that 1,000 people sift through each and every bit of content, did you? Well, they don't. Most content is examined by an algorithm before anyone sees it. Then there is everything we users report. Reports are sorted by category and sent to moderation teams. The categories are safety (graphic violence), hate and harassment, access (hacker and impostor issues), and abusive content (sexually explicit material). Moderators then decide whether the content violates community standards, and they can choose to delete, ignore, or escalate each report.
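To make the flow concrete, here is a minimal toy sketch of the triage process described above: a report is sorted into one of the four category queues, and a moderator then picks one of the three possible decisions. The category names come from the infographic; all function names, field names, and report reasons are hypothetical illustrations, not Facebook's actual systems.

```python
from dataclasses import dataclass

# Hypothetical mapping from a user's report reason to the four
# moderation queues named in the article.
CATEGORIES = {
    "graphic_violence": "safety",
    "hate_speech": "hate_and_harassment",
    "harassment": "hate_and_harassment",
    "hacked_account": "access",
    "impostor": "access",
    "sexually_explicit": "abusive_content",
}

@dataclass
class Report:
    content_id: int
    reason: str

def triage(report: Report) -> str:
    """Sort a user report into a moderation queue by category."""
    return CATEGORIES.get(report.reason, "uncategorized")

def moderate(report: Report, violates_standards: bool, needs_review: bool) -> str:
    """A moderator's three possible decisions on a queued report."""
    if needs_review:
        return "escalate"  # hand off to a more senior team
    return "delete" if violates_standards else "ignore"

# Example: a hacked-account report lands in the "access" queue.
queue = triage(Report(content_id=101, reason="hacked_account"))
print(queue)  # access
```

The three-way decision (delete, ignore, escalate) mirrors the article's description; in practice the real pipeline also includes the algorithmic pre-screening step, which is omitted here.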
More info in the infographic below.