Ever Wondered How Facebook Moderates Its Content?

May 24, 2017

The Guardian was recently able to get its hands on content moderation training manuals and other materials that show just how difficult it is for Facebook to moderate its content.

I report a lot of stuff on Facebook on a daily basis. A LOT! So, I do get annoyed when I get the canned response that Facebook will not remove the content in question, as it doesn’t go against any specific guidelines. The Guardian recently obtained “more than 100 internal training manuals, spreadsheets and flowcharts,” offering insight into the processes involved in moderating content on Facebook.

In a recent article, Nick Hopkins reported that content moderators face an overwhelming workload, having to deal with over 6.5 million FNRP (“fake, not real person”) reports alone every day. That leaves them only around 10 seconds to process a piece of content and make a decision based on the site’s guidelines. He also explained that sexual content could easily be the most “inconsistent, complex and confusing” area to police.

A massive challenge is, of course, context. So, while content can violate specific policies in one context, it could be OK in others. Facebook’s head of global policy management, Monika Bickert, who spoke to Hopkins, explained:

We have a really diverse global community, and people are going to have very different ideas about what is OK to share. No matter where you draw the line, there are always going to be some grey areas. For instance, the line between satire and humor and inappropriate content is sometimes very grey.

As Hopkins explains in his analysis of the leaked materials, “Someone shoot Trump” should be deleted, as the POTUS is in a “protected category.” However, saying “To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat” would not be seen as a credible threat. Similarly, some videos of violent deaths may simply be marked as disturbing rather than deleted, as Facebook says they can generate awareness of certain issues.

Also, as long as certain images or videos are not sadistic or celebratory in their violence, they may not be deleted. Finally, imagery of animal abuse is allowed on the site, except for what Facebook calls “disturbing” imagery, which is extremely upsetting. You can find the entire cache of leaked documents here.

Facebook acknowledges that its moderators face a “challenging and difficult job.” That in itself is an understatement: the “high turnover of moderators, who say they suffer from anxiety and post-traumatic stress” shows just how much of a toll the job can take on their mental health.

