Under intense scrutiny due to a series of scandals, Facebook is now releasing its internal enforcement guidelines that govern what content is allowed on its platform and what is not. More importantly, the company is working on a new appeals process for reported content.
Nearly a year ago, The Guardian got hold of Facebook’s internal content moderation guidelines and made them public. What was in those leaked documents shocked many people, particularly the way in which Facebook deals with obviously objectionable content. Now, Facebook is publishing an expanded version of those guidelines, opening up the rules to input from users around the world.
Facebook is also introducing a new appeals process for those who feel their content has been removed in error, as well as for those who have reported content that Facebook has declined to act on.
The community standards, found here, span 27 pages and cover a wide variety of topics, explaining what Facebook’s rules are in specific cases pertaining to uploaded content. As you would expect, topics covered include bullying, violent threats, self-harm, and nudity, among others.
The guidelines, which Facebook says were created together with experts and advocacy groups, have also been translated into over 40 languages and apply to every country in which Facebook operates. Of course, they are not set in stone and will evolve.
The policies themselves are not new, however. What is changing, as Monika Bickert, head of global policy management at Facebook, explains, is “the level of explanation about how [it applies] those policies.”
To enforce the policies, Facebook has also promised to double its 10,000-person safety and security team that deals with reports by the end of the year. Finally, Facebook announced that it’s working on a better process for users who have had content taken down to appeal Facebook’s decision. If a post has been taken down, users will be notified and given the option to “request review.” Facebook promises to review the request within 24 hours.
Also, by the end of the year, users who have reported a post only to be told that it “does not violate the community standards” will be able to request a review of that decision as well.