Under intense scrutiny due to a series of scandals, Facebook is now releasing its internal enforcement guidelines that govern what content is allowed on its platform and what is not. More importantly, the company is working on a new appeals process for reported content.
Nearly a year ago, The Guardian got hold of Facebook’s internal content moderation guidelines and made them public. What was in those leaked documents shocked many people, particularly because of the way Facebook deals with obviously objectionable content. Now, Facebook is publishing an expanded version of those guidelines, opening up the rules to input from users around the world.
Facebook is also introducing a new appeals process, both for those who feel their content has been removed in error and for those who have reported content that Facebook has declined to act on.
The community standards, found here, span 27 pages and cover a wide variety of topics, explaining what Facebook’s rules are in specific cases pertaining to uploaded content. As you would expect, topics covered include bullying, violent threats, self-harm, and nudity, among others.
The guidelines, which Facebook says were created together with experts and advocacy groups, have been translated into over 40 languages and apply to every country in which Facebook operates. Of course, they are not set in stone and will evolve.
The policies themselves are not new, however. What is changing is “the level of explanation about how [it applies] those policies,” as Monika Bickert, head of global policy management at Facebook, explains.
To enforce the policies, Facebook has also promised to double its 10,000-person safety and security team that deals with reports by the end of the year. Finally, Facebook announced that it is working on a better process for users who have had content taken down to appeal its decision. If a post has been taken down, users will be notified and given the option to “request review,” which Facebook promises to complete within 24 hours.
Also by the end of the year, a user who has reported a post but been told that it “does not violate the community standards” will be able to request a review of that decision as well.