Facebook has begun assigning its users a reputation score that predicts their trustworthiness on a scale from zero to one.
As part of a broader effort to reduce the spread of misinformation, the social media platform has been calculating and assigning reputation scores to users who report content as fake, the company confirmed. Facebook then takes a user’s score into account when the individual flags future stories as false or misleading.
The goal of the system is to account for instances in which users report accurate news stories as false simply because they disagree with them, Facebook product manager Tessa Lyons said. The score is presumably also meant to counteract organized disinformation campaigns that rely on mass reporting of unwanted posts.
Right now, it isn’t clear whether the trust score is used for anything beyond reports on news stories and reports that another Facebook user has posted something inappropriate or otherwise in need of the company’s attention.