The idea that fake news on Facebook heavily influenced the U.S. election might sound ludicrous to many, but that doesn’t make it a total impossibility. And it certainly doesn’t mean Facebook is free from blame. We all know the platform has a serious fake news problem; what’s new is that the company is finally planning to address it.
When Mark Zuckerberg first spoke about “fake news,” I bet he was hoping that would be enough. But the issue didn’t go away, and this is now the third statement he has released about it. Sure, Facebook takes “misinformation very seriously.” Acknowledging the problem is a start, but now Facebook is actually putting forward a plan to tackle fake news.
[quote]Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information,[/quote]
We agree, but how is Facebook going to do this? Well, for a start, its News Feed clearly needs more than a tweak, and the company is already updating its News Feed prediction algorithm to that end. Zuckerberg explains that,
[quote]the most important thing [Facebook] can do is improve [its] ability to classify misinformation. This means better technical systems to detect what people will flag as false before they do it themselves.[/quote]
Facebook already uses AI and machine learning to automatically filter out offensive images – it may simply be a case of applying that same technology to fake news. But Facebook has become a very politicised arena for discussion, and it’s going to be hard to restrict content without a proper strategy in place. There’s always the issue of news being biased, whether it comes from an official news source or not, and that is something users will always argue about. The goal is to get rid of misinformation without alienating Facebook’s users.
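Facebook hasn’t published any details of how such a classifier would work, but the underlying idea – predict whether a post is likely to be flagged as false before anyone actually flags it – can be sketched with a toy text classifier. Everything below (the training examples, the Naive Bayes approach, the function names) is an illustrative assumption, not Facebook’s actual method:

```python
# Toy sketch: predict which posts are likely to be flagged as false.
# NOT Facebook's system -- just an illustration of "classify before
# users flag it", using a tiny Naive Bayes model over word counts.
from collections import Counter
import math

def train(examples):
    """examples: list of (text, was_flagged) pairs."""
    counts = {True: Counter(), False: Counter()}
    totals = {True: 0, False: 0}
    for text, flagged in examples:
        for word in text.lower().split():
            counts[flagged][word] += 1
            totals[flagged] += 1
    return counts, totals

def flag_probability(model, text):
    counts, totals = model
    # Log-space Naive Bayes with add-one smoothing and a uniform prior.
    vocab = set(counts[True]) | set(counts[False])
    scores = {}
    for label in (True, False):
        score = 0.0
        for word in text.lower().split():
            score += math.log((counts[label][word] + 1) /
                              (totals[label] + len(vocab)))
        scores[label] = score
    # Convert the two log scores into a probability for "flagged".
    m = max(scores.values())
    exp = {k: math.exp(v - m) for k, v in scores.items()}
    return exp[True] / (exp[True] + exp[False])

# Invented training data, purely for illustration.
model = train([
    ("shocking secret cure doctors hate", True),
    ("you won't believe this miracle trick", True),
    ("city council approves new budget", False),
    ("local team wins championship game", False),
])
print(flag_probability(model, "miracle cure trick"))  # high (likely flagged)
print(flag_probability(model, "council budget vote")) # low (likely fine)
```

A real system would of course be trained on millions of user flag reports rather than four sentences, but the shape of the problem – score content before the community reacts to it – is the same.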
Zuckerberg himself admits that it’s not going to be easy:
[quote]The problems here are complex, both technically and philosophically. We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible. We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.[/quote]
So, Facebook plans not to pass judgement or delete content itself, but rather to rely on the community to detect and flag it. It will also work with third-party fact-checking journalists and organisations to spot inconsistencies, and improve its reporting system for users. We all know Facebook’s reporting system is pretty terrible, so that last part is definitely something we need in place, soon!