Facebook staff mount secret push to tackle fake news, reports say

Facebook is facing increasing pressure to improve the way it deals with fake news in the wake of the shock 2016 US presidential election result, amid reports that even some of its own staff have formed an unofficial task force to address the problem.

Employees from across the company have secretly come together to try to tackle the problem, BuzzFeed reported on Monday, despite Facebook publicly playing down the role of fake news in the election. CEO Mark Zuckerberg insisted on Sunday that more than 99% of what people see on the platform is authentic, rejecting the “crazy idea” that fake news swayed voters.

“It’s not a crazy idea. What’s crazy is for him to come out and dismiss it like that, when he knows, and those of us at the company know, that fake news ran wild on our platform during the entire campaign season,” one Facebook employee told BuzzFeed.

Privately, however, the Guardian understands that fake news is being taken very seriously and has been debated at Facebook for months.

According to Gizmodo, Facebook executives have been reviewing the company’s products to eliminate the appearance of political bias. One source said high-ranking officials were briefed on a planned news feed update that would have identified fake or hoax news stories, but that it was found to disproportionately affect rightwing news sites. The update was consequently shelved, Gizmodo claims, although Facebook denies this.

The scrutiny over Facebook’s treatment of editorial content has been intensifying for months, reflecting the site’s unrivaled power and influence in distributing news alongside everything else its users share on the site.

Fake or misleading news spreads like wildfire on Facebook because of confirmation bias, a quirk of human psychology that makes us more likely to accept information that conforms to our existing world views. The conspiracy theories are also amplified by a network of highly partisan media outlets with questionable editorial policies. These include the Denver Guardian, a website peddling false stories about Clinton murdering people, and a cluster of pro-Trump sites founded by teenagers in Veles, Macedonia, motivated only by the advertising dollars they accrue when enough people click on their links.

The Pew Research Center found that 62% of Americans get all or some of their news from social media, of which Facebook accounts for the lion’s share. Yet an analysis by BuzzFeed found that 38% of posts shared on Facebook by three large rightwing politics pages included “false or misleading information”, while three large leftwing pages did so 19% of the time.

Earlier in 2016, Facebook faced criticism for bias against conservative news after former Facebook workers revealed that its trending news team was run by human curators who were told to routinely suppress stories on conservative topics. That followed public comments Zuckerberg made at the F8 developer conference in April 2016 criticizing “fearful voices calling for building walls” and halting immigration.

Facebook denied the allegations and fired the trending topics team, replacing them with an algorithm that subsequently trended several fake stories – including one that labelled Fox News host Megyn Kelly a “closet liberal who actually wants Hillary to win”.

According to the New York Times, this episode “paralysed Facebook’s willingness to make any serious changes to its products that might compromise the perception of its objectivity”. Of course, even technology isn’t objective. The algorithms that power Facebook are embedded with biases based on a series of decisions made by humans, so claiming otherwise is disingenuous.

Plus we know that Facebook can already identify truly fake news – Zuckerberg pointed this out over the weekend. He said that 99% of all content on the social network is “authentic”, which implies that the company knows which 1% of content isn’t.

The problem is not unique to Facebook. If you believed the top Google result for “final election results”, you’d think that Trump won the popular vote in the 2016 election. He did not.

The slip-up was widely reported on Monday, demonstrating that though Google’s algorithms are also susceptible to fake news, the company wants to be seen as better at tackling it than Facebook. “The goal of search is to provide the most relevant and useful results for our users,” a spokeswoman said. “In this case we clearly didn’t get it right, but we are continually working to improve our algorithms.”

Google has since revealed it is also working on a policy update to restrict its ads from being placed on fake news sites, subtly highlighting its ability to discern misleading news. “We will restrict ad serving on pages that misrepresent, misstate, or conceal information about the publisher, the publisher’s content, or the primary purpose of the web property,” Google spokeswoman Andrea Faville told Reuters.

Part of the reason Google is better at tackling fake news than Facebook is that it lacks a popular social network (its own Google Plus service lags an order of magnitude behind Facebook) where stories are shared among users and misinformation can spread like wildfire. It’s also because Google’s search engine favors webpages that are linked to by other established sites, which typically means fake news ranks lower.

Furthermore, Google handpicks the publications that appear within its news aggregator Google News. Around 75,000 websites in 45 countries have been approved to appear in this list (although these include questionable alt-right media outlets like Breitbart News).

Of course, it’s not always a clear distinction between “fake” or “real” news. As pointed out by former Facebook designer Bobby Goodlatte on Medium, you can fact-check stories, but it’s not always easy to filter out quasi-truths or highly editorialized takes on events – what he describes as “bullshit”.

This article was written by Olivia Solon in San Francisco, for theguardian.com on Tuesday 15 November 2016. © Guardian News and Media Limited 2010
