In a hideous reflection of China’s already-prevalent ‘Social Credit’ system – which is a rating assigned to each citizen based on government data regarding their economic and social status – The Washington Post reports that Facebook has begun to assign its users a reputation score, predicting their trustworthiness on a scale from zero to one.
Under the guise of its effort to combat ‘fake news’, WaPo notes (citing an interview with Tessa Lyons, the product manager in charge of fighting misinformation) that the previously unreported ratings system, developed by Facebook over the past year, has evolved to include measuring the credibility of users in order to help identify malicious actors.
Users’ trustworthiness score between zero and one isn’t meant to be an absolute indicator of a person’s credibility, Lyons told the publication, nor is there a single unified reputation score that users are assigned.
“One of the signals we use is how people interact with articles,” Lyons said in a follow-up email.
“For example, if someone previously gave us feedback that an article was false and the article was confirmed false by a fact-checker, then we might weight that person’s future false news feedback more than someone who indiscriminately provides false news feedback on lots of articles, including ones that end up being rated as true.”
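Facebook has not published how this weighting works; the following is a hypothetical sketch of the idea Lyons describes, under the assumption of a simple accuracy-based scheme. The function names (`reporter_accuracy`, `weighted_report_score`) and the Laplace-smoothed formula are illustrative inventions, not Facebook's actual algorithm.

```python
# Hypothetical sketch (NOT Facebook's actual system): a reporter whose past
# "false news" reports were usually confirmed by fact-checkers gets more
# weight on future reports than one who flags articles indiscriminately.

def reporter_accuracy(confirmed: int, total: int) -> float:
    """Fraction of a user's past reports later confirmed false by
    fact-checkers. Laplace smoothing (+1/+2) keeps brand-new users
    near a neutral 0.5 rather than at an extreme."""
    return (confirmed + 1) / (total + 2)

def weighted_report_score(reporters) -> float:
    """Total weight behind reports on one article: each report counts
    in proportion to that reporter's historical accuracy.
    `reporters` is a list of (confirmed, total) report histories."""
    return sum(reporter_accuracy(c, t) for c, t in reporters)

careful = (9, 10)         # 9 of 10 past reports confirmed false
indiscriminate = (1, 10)  # flags everything; rarely confirmed

print(weighted_report_score([careful]))         # ~0.83
print(weighted_report_score([indiscriminate]))  # ~0.17
```

On this toy scheme, one report from the careful user carries roughly five times the weight of one from the indiscriminate user, matching the behavior Lyons describes.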
The score is one measurement among thousands of behavioral clues that Facebook now takes into account as it seeks to understand risk.
“I like to make the joke that, if people only reported things that were [actually] false, this job would be so easy!” said Lyons in the interview. “People often report things that they just disagree with.”