Facebook is assigning its users a trustworthiness score on a scale from 0 to 1. This previously unreported system has been running for over a year.
Remember Black Mirror? Well, it’s already a reality and you did not even know it. A recent report from The Washington Post shows that Facebook started assigning a reputation score to its users a little over a year ago. The objective seems clear: identify malicious actors and fight the spread of fake news. However, many are questioning the hidden motives and consequences of such a ranking system.
Tessa Lyons, the product manager in charge of fighting misinformation, confirmed in an interview that this system was developed to evaluate a user’s credibility, in order to fight the spread of fake news and problematic content.
Since Facebook has never really managed to fact-check information at scale, evaluating the credibility of the user who shares the content could seem like a good idea. It becomes an even better idea when you consider the new information warfare trend:
“It’s not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher,” Lyons said.
Facebook’s trustworthiness score is not meant to be the absolute indicator of a user’s credibility. Lyons explained that Facebook monitors over one thousand behavioral clues. Now, is that really good news?
Every user has a score. But this information is not public, nor does Facebook intend to share it with you or any third party, for that matter. However, in light of the recent scandals and personal data breaches, can we trust that this information will never be used against us? Probably not, but the world has changed and there is little we can do about it.
Now, let’s make sure we keep in mind all the life lessons we took from watching Black Mirror, shall we?