Facebook announced updates to its “remove, reduce, and inform” strategy, used to manage problematic content across Facebook, Instagram, and Messenger.
It’s been clear for years that Facebook has an issue with questionable content on its platform. That’s why, since 2016, it has been using a so-called “remove, reduce, and inform” strategy: remove content that violates its policies, reduce the spread of content that is problematic but doesn’t violate any policies, and inform people with “additional information,” allowing them to decide what to click on, read, or share. Now, Facebook is announcing several updates that improve this strategy.
The announcement came in the form of a meeting between Guy Rosen, Facebook’s VP of Integrity, Tessa Lyons, the company’s Head of News Feed Integrity, and a small group of journalists, in Menlo Park earlier this week.
Facebook is rolling out a new section within its Community Standards site where users can track the updates made each month. It is also updating the enforcement policy for Facebook Groups and launching a new Group Quality feature as part of the Safe Communities Initiative. Group Quality offers an overview of content removed and flagged for most violations, and includes a section specifically for false news found in a group. This makes the enforcement of Facebook’s Community Standards more transparent.
Also, from now on, Facebook will be holding the admins of Facebook Groups more accountable for Community Standards violations. As explained by Rosen and Lyons, “when reviewing a group to decide whether or not to take it down, we will look at admin and moderator content violations in that group, including member posts they have approved, as a stronger signal that the group violates our standards.”
Problematic content that doesn’t violate Facebook’s Community Standards won’t be removed, but its spread still needs to be reduced. To that end, Facebook is now working with outside experts to find new ways to fight fake news more quickly, and is expanding the content that the Associated Press will review as a third-party fact-checker. The fact-checking program now includes 45 certified fact-checking partners reviewing content in 24 languages.
To reduce misinformation, Facebook will also be reducing the reach of Facebook Groups that repeatedly share such content. Finally, Facebook is incorporating a “Click-Gap” signal into News Feed ranking to make sure that people see “less low-quality content in their News Feed.” As explained by Rosen and Lyons,
“This new signal, Click-Gap, relies on the web graph, a conceptual “map” of the internet in which domains with a lot of inbound and outbound links are at the center of the graph and domains with fewer inbound and outbound links are at the edges. Click-Gap looks for domains with a disproportionate number of outbound Facebook clicks compared to their place in the web graph. This can be a sign that the domain is succeeding on News Feed in a way that doesn’t reflect the authority they’ve built outside it and is producing low-quality content.”
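The idea described above can be sketched in a few lines of code. This is a purely illustrative toy, assuming hypothetical domain names, link-share figures, and a made-up threshold; Facebook has not published its actual Click-Gap formula.

```python
# Illustrative sketch of a "Click-Gap"-style signal. All data and the
# threshold below are hypothetical, not Facebook's real implementation.

def click_gap_score(fb_click_share: float, web_graph_share: float,
                    epsilon: float = 1e-6) -> float:
    """Ratio of a domain's share of outbound Facebook clicks to its
    share of links in the web graph. A high ratio suggests the domain
    gets far more News Feed traffic than its off-platform authority
    would predict."""
    return fb_click_share / (web_graph_share + epsilon)

# Hypothetical data: (domain, share of FB clicks, share of web-graph links)
domains = [
    ("established-news.example", 0.30, 0.400),
    ("clickbait-farm.example",   0.25, 0.001),
]

for name, clicks, links in domains:
    score = click_gap_score(clicks, links)
    flagged = score > 10  # illustrative threshold, chosen arbitrarily
    print(f"{name}: click-gap={score:.1f}, flagged={flagged}")
```

In this toy version, the well-linked news site scores below 1 (its Facebook traffic matches its web-graph standing), while the poorly-linked site scores in the hundreds and would be down-ranked.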
On Instagram, the algorithm will automatically lower the reach of posts that contain problematic content, such as sexually suggestive images.
To inform people more effectively, Facebook is expanding the News Feed Context Button to images that have been reviewed by its third-party fact-checkers. It is also adding Trust Indicators to the Context Button, a feature initially launched in April of last year.
The change will first apply to English and Spanish content.
As Rosen and Lyons explain, “Trust Indicators are standardized disclosures, created by a consortium of news organizations known as the Trust Project, that provide clarity on a news organization’s ethics and other standards for fairness and accuracy. The indicators we display in the context button cover the publication’s fact-checking practices, ethics statements, corrections, ownership and funding and editorial team.”
The other updates on Facebook and Messenger are:
- Adding more information to the Facebook Page Quality tab.
- Allowing people to remove their posts and comments from a Facebook Group after they leave the group.
- Combatting impersonations by bringing the Verified Badge from Facebook into Messenger.
- Launching Messaging Settings and an Updated Block feature on Messenger for greater control.
- Launching a Forward Indicator and Context Button on Messenger to help prevent the spread of misinformation.