Facebook is secretly giving you the side-eye and ranking how trustworthy you are

Could this help win the fake news war?

Tom Victor
23 August 2018

You might think you’re the most trustworthy person around, but Facebook is not going to believe you until it’s had a proper look, according to new reports.

The social network has been waging a war on ‘fake news’, something that has seemingly led to more people getting their news from WhatsApp than Facebook Messenger.

However, while choosing to do things differently is one thing, having your movements monitored as part of a crackdown is another entirely.

Dealing with ‘fake news’ has become a priority in the Trump era

According to the Washington Post, Facebook has begun ranking its users based on the level to which they can be trusted to share accurate news.

It’s not quite on the ‘Black Mirror becoming real life’ level of China’s ‘Social Credit System’, but you can see why comparisons are being made.

Users are given a trustworthiness rating on a scale of 0 to 1, according to insiders, but the basis for the ratings is currently unclear.

There have been complaints, for example, about the lack of transparency when it comes to just how the judging system works, and whether it can be trusted to remain impartial.

Tessa Lyons, product manager at Facebook, explained that the company cannot rely purely on the reports of users, as many will cry ‘fake news’ in response to things which, while true, they simply don’t agree with.

Consequently, users’ overall habits are taken into account when assessing how much weight to give their claims that others on their timeline are spreading falsehoods.

“For example, if someone previously gave us feedback that an article was false and the article was confirmed false by a fact-checker, then we might weight that person’s future false-news feedback more than someone who indiscriminately provides false-news feedback on lots of articles, including ones that end up being rated as true,” Lyons said.

The Washington Post notes that the score is “among thousands of new behavioral clues” used by Facebook, though it is unclear how much each of these clues factors into the overall assessment.

It could be one of those things where the mere presence of a trustworthiness rating is enough to get users to take more care about the ‘news’ they share, though we’re not counting on it. Perhaps they’ll just listen to Brian Acton, the co-founder of WhatsApp, and delete Facebook altogether.

(Images: Getty/Netflix)