Social Score: Facebook Now Rating Trustworthiness of Users

By Joseph Jankowski

Facebook is now rating the trustworthiness of its users as part of its effort to tackle “fake news.”

The new trustworthiness score, which was first reported by The Washington Post, will work on a scale of zero to 1 and is one of thousands of new behavioral signals Facebook uses to monitor its users.

From the WaPo:

Facebook developed its reputation assessments as part of its effort against fake news, Tessa Lyons, the product manager who is in charge of fighting misinformation, said in an interview. The company, like others in tech, has long relied on its users to report problematic content — but as Facebook has given people more options, some users began falsely reporting items as untrue, a new twist on information warfare for which it had to account.

It’s “not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher,” Lyons said.

Users’ trustworthiness score between zero and 1 isn’t meant to be an absolute indicator of a person’s credibility, Lyons said, nor is there a single unified reputation score that users are assigned. Rather, the score is one measurement among thousands of new behavioral clues that Facebook now takes into account as it seeks to understand risk. Facebook is also monitoring which users have a propensity to flag content published by others as problematic and which publishers are considered trustworthy by users.

According to Lyons, one of the signals used in the company’s rating decisions will be how users’ interactions with articles are handled:

For example, if someone previously gave us feedback that an article was false and the article was confirmed false by a fact-checker, then we might weight that person’s future false news feedback more than someone who indiscriminately provides false news feedback on lots of articles, including ones that end up being rated as true.
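Facebook has not published the formula, but the heuristic Lyons describes can be read as a reporter-reputation weight: users whose past “false” flags were confirmed by fact-checkers count more than users who flag indiscriminately. The Python sketch below is purely illustrative; the function names, the smoothing prior, and the numbers are assumptions, not anything Facebook has disclosed.

```python
# Illustrative sketch (not Facebook's actual system): weight a user's
# "this is false" reports by how often their past reports were confirmed
# by fact-checkers, so indiscriminate flaggers carry less weight.

from dataclasses import dataclass


@dataclass
class ReporterHistory:
    reports_confirmed_false: int  # past flags a fact-checker upheld
    reports_total: int            # all past flags by this user


def reporter_weight(history: ReporterHistory,
                    prior: float = 0.5,
                    prior_strength: int = 4) -> float:
    """Return a 0-to-1 weight based on a smoothed flag-accuracy rate.

    The smoothing keeps brand-new users near `prior` instead of 0 or 1;
    the prior values here are arbitrary assumptions.
    """
    hits = history.reports_confirmed_false + prior * prior_strength
    total = history.reports_total + prior_strength
    return hits / total


def weighted_flag_score(reporters: list[ReporterHistory]) -> float:
    """Sum the weights of everyone who flagged an article as false.

    In this sketch, a higher score would make the article a stronger
    candidate for review by third-party fact-checkers.
    """
    return sum(reporter_weight(r) for r in reporters)


if __name__ == "__main__":
    careful_user = ReporterHistory(reports_confirmed_false=18, reports_total=20)
    indiscriminate_user = ReporterHistory(reports_confirmed_false=2, reports_total=40)

    print(f"careful user weight:        {reporter_weight(careful_user):.2f}")
    print(f"indiscriminate user weight: {reporter_weight(indiscriminate_user):.2f}")
    print(f"article flag score:         "
          f"{weighted_flag_score([careful_user, indiscriminate_user]):.2f}")
```

In this toy version, the careful user’s flags end up weighted roughly ten times more heavily than the indiscriminate user’s, which matches the behavior Lyons describes even though the real system presumably combines many more signals.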

This previously unreported rating system has been in development over the past year.

As the Washington Post notes, Facebook is likely wary of discussing the trustworthiness rating process in detail, in part because doing so might invite further gaming.

At a time when Facebook is subjectively enforcing policies that are wiping right-leaning journalists and commentators off its platform, the move to hand out a social reputation score will likely be perceived as a partisan weapon.

It wasn’t long ago that former Facebook employees revealed the company was rigging its trending news section to prevent conservative media outlets from ever reaching top ranks.

While Facebook has yet to openly announce where its bias stands (although actions speak louder than words), Twitter CEO Jack Dorsey may have shined a light on the overall big-tech attitude in Silicon Valley when he admitted this week that the bias over at his company leaned left.

The move to rank users’ trustworthiness follows a decision Facebook made at the beginning of the year to use a similar scoring system for news organizations. All in the name of combating “fake news.”

“There’s too much sensationalism, misinformation and polarization in the world today,” Facebook CEO Mark Zuckerberg said in a January post. “Social media enables people to spread information faster than ever before, and if we don’t specifically tackle these problems, then we end up amplifying them.”

********

Joseph Jankowski is a contributor to PlanetFreeWill.com. His works have been published by globally recognizable news sites like Infowars.com, ZeroHedge.com, GlobalResearch.ca, and ActivistPost.com.

Follow PFW on Minds, Twitter, Steemit, Gab and sign up for our NEWSLETTER

Image credit: Anthony Freda Art

