Facebook and fake news
In a post-truth ecosystem, truth needs its arbiters more than ever. Sites such as Facebook need to acknowledge their professional and ethical responsibilities
In August this year, Facebook replaced the entire team that curated its Trending Topics section with an algorithm. This came soon after news reports claimed the curators were suppressing news from the conservative side of the U.S. political spectrum. Though an internal investigation found no systematic effort to suppress news, a statement from Facebook said the algorithm “allows our team to make fewer individual decisions about topics”.
There was a hiccup, though. Soon after it went active, the algorithm picked up and promoted a story about a conservative television anchor supporting Hillary Clinton that turned out to be highly controversial, and entirely false. The algorithm was designed to look for what most people were talking about or were interested in, and it did its job.
A platform for news
This would not be alarming but for a recent report from the Pew Research Centre which found that 44 per cent of American adults now get their news on Facebook. Read this together with the fact that 64 per cent of them depend on just one site for the news and we begin to grasp the grip Facebook has over the information available to the average, voting-age American.
The political role of Facebook and the other Internet giant, Google, has come under scrutiny after Donald Trump won the U.S. presidential polls. Several fingers are pointing at fake news stories online as a reason for the election going off script. According to an analysis by BuzzFeed, a false post saying Pope Francis had endorsed Mr. Trump, put out by the fake news site WTOE 5, had a Facebook engagement of 960,000 (the sum of all comments, likes and shares for a post).
The highest engagement for a real news story on the election was 849,000, for The Washington Post’s “Trump’s history of corruption is mind-boggling. So why is Clinton supposedly the corrupt one?” According to BuzzFeed’s Craig Silverman, the top 20 false news items in the three months prior to the election had a collective Facebook engagement of 8,711,000, while the top 20 real news items garnered 7,367,000 engagements.
Facebook’s initial reaction to the allegation was denial, with founder Mark Zuckerberg saying it was “pretty crazy” to think fake Facebook posts could swing elections. They may not have, but the BuzzFeed numbers on false news are no small sums. And the Pew numbers are an indicator of how many people probably consumed those as real news. And, considering the margins by which the election was won, no number is too small.
The impact of false news is much greater given how Facebook delivers it to us. In June this year, Facebook changed its news feed algorithm to give more weight to ‘friends and family’. That is, if someone on your friends list liked or commented on a post, the chances of you seeing that post are now much higher. Then, as you engage with those posts more, Facebook further fine-tunes your news feed to show similar posts. This streamlines your news feed to the interests of your circle, limited or diverse as it is. Essentially, the algorithm places us in a jury of our peers, locking us in echo chambers of thought, validating each other in an infinite loop. Throw a false news story into this mix and, if it aligns with the ideological tilt of your clique, it goes viral (the disease metaphor being most apt here).
Facebook later accepted the dangers of fake news, with Mr. Zuckerberg posting a blog saying Facebook would “penalise this content in News Feed so it’s much less likely to spread.” Both Facebook and Google have also decided to disconnect fake news websites from their powerful advertisement networks, the money from which is the main incentive for fake news and click-bait articles.
However, Mr. Zuckerberg went on to say that the problem was “complex, both technically and philosophically,” and that Facebook did not want to curtail people’s ability to “share what they want whenever possible.” The high point of his argument was: “We do not want to be arbiters of truth ourselves.”
Arbiter of truth
Truth and its arbiters are not a highly regarded lot nowadays. We are now in the ‘post-truth’ era, where truth is “a matter of perspective” and those who seek to tell the truth are “agenda driven”. Mr. Zuckerberg is smart to distance himself from being an “arbiter of truth”. In fact, both Facebook and Google have always avoided taking on a news media tag, though they are for all practical purposes the biggest news media outlets in the world. Being in the news media business brings with it legal, professional and ethical responsibilities to be an arbiter of truth.
In his mea culpa on fake news, Mr. Zuckerberg has said that he will reach out to journalists and news media organisations for tips on content verification. He is forgetting that the media is already there, fighting it out for a piece of the content distribution pie offered by the platform. These media outlets are putting expensive, fact-checked reports on your platform for free, Mr. Zuckerberg; all you have to do is give them precedence over uncorroborated news. Right now they are pushing listicles and click-bait to break into the aforementioned cliques that Facebook’s algorithms have created.
Of course, trust in the mainstream media is not exactly a distinctive feature of the Internet. We have terms like “presstitute” and “Lügenpresse” to describe the extent to which the mainstream media is distrusted by the champions and foot soldiers of the politics of populism sweeping the globe. But the fact remains that the journalist who goes out on to the street to physically verify a fact remains one of the best arbiters of truth. And truth needs its arbiters, now more than ever.
george.pj@thehindu.co.in
Source: The Hindu, 29-11-2016