A new academic study found that misinformation on Facebook received six times more engagement than real news.
According to a recent report from The Washington Post, researchers at New York University and the Université Grenoble Alpes have compiled a peer-reviewed study demonstrating how Facebook’s algorithm has fuelled the spread of misinformation.
The study, which covers the period between August 2020 and January 2021, establishes that unreliable news sources known for spreading misinformation received six times as many likes, shares, and interactions as official news sources such as CNN or the World Health Organization (WHO).
As the 2016 election saw a massive spread of highly engaging “fake news” on Facebook, it comes as no surprise that such information receives more attention than official news.
The forthcoming study, however, represents one of the first comprehensive attempts to measure the impact of misinformation across 2,500 news publishers on the social media platform. Furthermore, the findings support the argument that Facebook rewards publishers that put out fake and fabricated information.
According to The Washington Post report, although pages that regularly post misinformation drew higher engagement across the political spectrum, right-wing publishers reportedly displayed a much higher propensity to share misleading information than other publishers.
Facebook’s response suggests that engagement alone doesn’t tell the whole story. According to the report, a Facebook spokesperson explained that the study falls short of presenting a full picture because it only looks at engagement, and not “reach.”
However, Facebook apparently hasn’t made reach data available to researchers. Without access to it, researchers have only been able to draw information from CrowdTangle, a tool that is itself owned by Facebook.
At least that was the case until August this year, when Facebook cut off researchers’ access to this data and to the platform’s library of political ads. The official reason is that giving third-party researchers continuous access to the data could put Facebook in violation of a settlement it reached with the Federal Trade Commission after the Cambridge Analytica scandal, a claim the FTC called “inaccurate.”
Meanwhile, New York Times reporter Kevin Roose has regularly listed right-wing pages known for posting large amounts of misinformation as the most engaged news sources on Facebook, and he did so using CrowdTangle.
In August this year, Facebook addressed public concerns by releasing a “transparency report” where it listed the most-viewed posts of the second quarter of the year. However, according to The New York Times, Facebook intentionally skipped the release of a report about the first quarter because the most-viewed post in that period was an article that wrongly linked the coronavirus vaccine to a Florida doctor’s death.
The popular post was widely used by right-wing pages to support claims about the inefficacy of the vaccine. The researchers will present the study at the 2021 Internet Measurement Conference in November.