In its fight against fake news, Facebook has announced two changes that it hopes will help ensure the accuracy of information on its platform and shed more light on how people decide what information is accurate or not.
Facebook has been on the warpath against fake news for over a year now, and it has actually done quite a lot in that time to ensure that the information its users see is as accurate as possible. After all, accurate information generates meaningful conversations, which Facebook loves, while false news undermines people’s ability to connect. But fearing criticism that it is acting as a censor, Facebook needs to tread carefully. So now, instead of identifying false news with Disputed Flags, it will use Related Articles “to help give people more context” about a story.
Facebook started using Disputed Flags in March as a way of marking false news, but recent academic research has found that “putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs” — exactly the opposite of what Facebook intended.
So, instead of Disputed Flags, Facebook is now favouring Related Articles, which, “by contrast, are simply designed to give more context.” Facebook’s research shows that context is a much more effective way to help people determine the facts: it found that showing “Related Articles next to a false news story leads to fewer shares than when the Disputed Flag is shown.”
The second update concerns a new effort to understand “how people decide whether information is accurate or not based on the news sources they depend upon.” The initiative will help Facebook better measure its success in improving the quality of information on its platform, but it will not directly impact News Feed for the time being.
Facebook says that it’s constantly “investing in better technology and more people to help prevent the spread of misinformation,” and overall, I think we can agree that it’s doing a pretty good job. One of its best weapons is demoting false news that has been identified as such by fact-checkers. As Facebook explains, “demoted articles typically lose 80 percent of their traffic. This destroys the economic incentives spammers and troll farms have to generate these articles in the first place.”