Monday, April 11, 2022

Social media has a serious disinformation problem. But it can be fixed

 Social media platforms have effectively supplanted traditional information networks in India. The dialectical relationship between online content, traditional media and political networks means that the messages propagated online effectively touch even those who are not yet online.

This ubiquity could have been a golden moment for India — democratising access to information, fostering community, increasing citizen participation and reducing the distance between ordinary people and decision-makers. However, social media platforms have adopted design choices that have led to a proliferation and mainstreaming of misinformation while allowing themselves to be weaponised by powerful vested interests for political and commercial benefit. The consequent free flow of disinformation, hate and targeted intimidation has led to real-world harm and the degradation of democracy in India: it has mainstreamed anti-minority hate, polarised communities and sown confusion, making it difficult to establish a shared foundation of truth.

Organised misinformation (disinformation) has a political and/or commercial agenda. Yet, even though there is growing recognition of the political motivations and impact of disinformation, the discourse in India has remained apolitical and episodic — focused on individual pieces of content and events, and on generalised outrage against big tech, instead of locating the problem in the larger political context or in structural design issues. The global discourse on misinformation, too, has become mired in the details of content standards, enforcement, fact-checking, takedowns, deplatforming, etc — a framework that lends itself to bitter partisan contests over individual pieces of content while allowing platforms to disingenuously conflate the discourse on moderating misinformation with safeguards for freedom of expression. These issues, however, are adjunct to the real issue of disinformation, and our upcoming report establishes that the current system of content moderation is more a public relations exercise for platforms than a system geared to stop the spread of disinformation.

A meaningful framework to combat disinformation at scale must be built on the understanding that it is a political problem: the issue is as much about bad actors as about individual pieces of content. Content distribution and moderation are interventions in the political process. There is thus a need for a comprehensive transparency law to enforce relevant disclosures by social media platforms. Moreover, content moderation and allied functions such as standard setting, fact-checking and de-platforming must be embedded in the sovereign bipartisan political process if they are to have democratic legitimacy. If this is not to degrade into legal sanction for government censorship, any regulatory body must be grounded in democratic principles — its own and those of the platforms.

Given the political polarisation in our country (and most others), constituting such a regulator and securing its operational legitimacy will be difficult. However, the failure of a polarised political ecosystem to come to a consensus is not a free pass for the platforms. Platforms are responsible for the speed and reach of disinformation, and for the design choices that have made it ubiquitous and indistinguishable from vetted information. It is thus the responsibility of the platforms to tamp down on the distribution of disinformation and its weaponisation. We argue that platforms are aware of the users and content they host and bear responsibility for their distribution choices. Moreover, just as any action against content is seen as an intervention in the political process, artificially increasing the distribution of content (amplification) also has political and commercial value.

We recommend three approaches to distribution that can be adopted by platforms: constrain distribution to organic reach (chronological feed); take editorial responsibility for amplified content; or amplify only credible sources (irrespective of ideological affiliation). The current approach to misinformation, which relies on fact-checking a small subset of content in a vast ocean of unreviewed material, is inadequate for the task and needs to be supplemented by a review of content creators themselves.

Finally, as the country with the largest youth population in the world, we must actively think about how we want our youth to engage in our democratic processes and about the role of social media platforms in that engagement. There are three notable effects of social media on our politics that require deliberation.

First, social media has led to a dislocation of politics, with people weighing in on abstractions online while remaining disengaged from their immediate surroundings. Second, social media has led to a degradation of our political discourse, where serious engagement has been supplanted by “hot takes” and memes. Third, it has obscured the provenance of consequential interventions in our political discourse because of opacity in the technology.

Meaningful politics, especially in democracies, is rooted in local organisation, discussion and negotiation. However, the structure of social media has facilitated a perception of engagement without organisation, and of action without consequence. This wasn’t and isn’t inevitable — there are more thoughtful ways to structure platforms that would help connect and root people in their own communities instead of isolating them locally while “connecting” them virtually.

Instead of a move towards more grounded communities, there is an acceleration towards greater virtuality through the “metaverse”. Social media cannot be wished away. But its structure and manner of use are choices we must make as a polity after deliberation, instead of accepting them as a fait accompli or simply being overtaken by developments along the way.

Written by Ruchi Gupta

Source: Indian Express, 11/04/22