News: “A group of biomedical scientists have found that countries without universal policies of BCG vaccination have been more severely affected compared to countries with universal and longstanding BCG policies.”
Social media inference over the next several days: “Indians are immune to coronavirus.”
In the party game of Chinese Whispers (also known as the Telephone Game), a player whispers a story into the next person’s ear. It is relayed to the next person, and so on down the line. When the last player says it out loud, the message is very different from the original. The game is a classic example of how people attribute different meanings to a message and how distortion creeps in along the way.
The damaging effects of fake news are well-documented. Advice is even available on coping with the mental health toll of information overload, which feeds false interpretations of Covid-19 information. CNN urges us to “slim down” our information consumption and trust only a few verified sources.
But most studies of fake news revolve around its creation and transmission, i.e., information that is false at its origin. Fake news-busting portals educate the consumer, and verification that comes after the information has diffused at best sets the record straight. But is this enough?
Fake news is not always fiction that masquerades as news from the moment of its origin. Often it is the product of a cycle of nudges through which it acquires fakeness, and genuine information suffers at the hands of this cycle.
Information Osmosis
News, by its nature, travels in steps. The original news diffusion studies described it as a two-step process: from the media to a first set of receivers, and then from those people to the rest of us.
Over the past decade of social media, this two-step flow, from the media to influential users who relay messages onwards, has only picked up momentum, although the definition of an influencer may have broadened on the new medium.
The text that we write before forwarding a message is called a ‘post’. When news spreaders add such text, they add their two cents’ worth, exerting interpersonal influence over their followers. That influence, coupled with individual biases (the ‘echo chambers’ and ‘filter bubbles’), determines the direction in which distortion takes place. Trust in the source is of little consequence; rather, consonance with one’s own beliefs is what decides whether information gets shared.
As news gets distorted along its path of diffusion, technology and its usage do the rest. All it takes is a small nudge from each user, conveniently adding or removing a detail from the original. This creates a snowball of false interpretations, driving the message further and further from the original.
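The arithmetic of this snowball is easy to illustrate. The toy simulation below is purely illustrative (it is not a model proposed here or in any cited study): each re-poster shaves a small, random amount off the message’s accuracy, and over a long enough chain the message drifts far from the original even though no single hop changed much.

```python
import random

random.seed(1)

def relay(accuracy: float, hops: int, max_nudge: float = 0.05) -> float:
    """Degrade a message's accuracy a little at every hop in the sharing chain."""
    for _ in range(hops):
        # Each re-poster adds or removes a small detail, shaving off up to 5%.
        accuracy *= 1 - random.uniform(0, max_nudge)
    return accuracy

print(round(relay(1.0, hops=1), 2))   # still close to 1.0 after a single hop
print(round(relay(1.0, hops=40), 2))  # far lower after forty small nudges
```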
If the post is merely an opinion, it does not by itself add fakeness. But we have often seen on social media how opinion is read as information, and the next nudge may fold that opinion in as fact. As the diffusion continues, the nudges from people who receive and reproduce the post can create an increasingly inaccurate understanding of the information.
It is true that some receivers may push back against those interpretations. But if a post confirms our existing beliefs, we are more likely to absorb it, react to it, and perhaps even forward it with the next little nudge of our own.
Can ‘nudge’ be the solution?
Nudge Theory proposes that small systemic tweaks can change minds over time. Although I have used the term ‘nudge’ liberally and in a general sense so far in this article, Nobel laureate Richard Thaler originally used the word for gentle interventions that reinforce positive behaviour. The effect is achieved by changing the ‘choice architecture’: the design of how choices are presented to a consumer and how that presentation shapes decisions.
If a government wants to promote organ donation, for example, it can make donating the default option in a digital hospital application. The user then needs to consciously untick that option to opt out. When several European governments tried this method, opt-in rates shot up from 15% to 90%.
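For those who build such systems, the change is small. The sketch below uses a hypothetical registration form, not any government’s actual application, to show the essence of an opt-out choice architecture: the desired option is pre-selected, so doing nothing keeps the user enrolled.

```python
from dataclasses import dataclass

@dataclass
class RegistrationForm:
    # The pro-social choice is pre-selected; inaction keeps the user enrolled.
    organ_donor: bool = True

def submit(form: RegistrationForm, user_unticked_box: bool = False) -> bool:
    """Return the final donor status once the form is submitted."""
    if user_unticked_box:
        form.organ_donor = False
    return form.organ_donor

print(submit(RegistrationForm()))                          # True: default kept
print(submit(RegistrationForm(), user_unticked_box=True))  # False: explicit opt-out
```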
Studies on the subject show that nudging can help overcome cognitive bias. If nudges can explain how distortion happens, can they also be used to resolve or undo it?
Yes, say some experts, who point to early technology for curbing the reproduction of fake news through nudges. Researchers in Cyprus and Portugal, for example, claim to have identified 23 nudging mechanisms that remind social media users about fake news, a collection they call the Nudge Deck. One such nudge is a reminder of the consequences of the action: users underestimate the consequences of sharing, and these reminders help them see those consequences more accurately. Some nudges bring forward a possible future regret by projecting it before the action is taken.
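A consequence-reminder nudge of this kind is easy to picture in code. The sketch below is my own illustration, not the Nudge Deck itself: before a post is forwarded, the platform surfaces a reminder of what sharing might lead to and asks the user to confirm.

```python
def share_with_nudge(post: str, confirm) -> bool:
    """Show a consequence reminder before sharing; share only if the user confirms.

    `confirm` stands in for a UI dialog: any callable that takes the reminder
    text and returns True (share anyway) or False (cancel).
    """
    reminder = (
        "This post has not been verified. Forwarding it may spread a "
        "distorted version of the original to everyone who follows you. "
        "Share anyway?"
    )
    return bool(post) and confirm(reminder)

# A user who shares reflexively versus one who reconsiders after the reminder.
print(share_with_nudge("BCG countries are immune to Covid-19", lambda msg: True))   # True
print(share_with_nudge("BCG countries are immune to Covid-19", lambda msg: False))  # False
```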
Separately, a group of researchers in Belgium claims to have developed a tool that offers the user credible choices of sources, spanning both information and varied opinions (a distinction that users often fail to make).
If a nudge can create or enhance that tiny bit of fakeness, a nudge can also help raise awareness of the bias at play as we snowball fake news.
As prosumers, we both receive and relay information. That is reason enough for policymakers to focus on methods that address the information (re)producer in us. Whether the available technology and methods are being tested and furthered, and whether governments are even taking anti-fake news policies seriously, are moot points. There is no short-term solution to fake news, but it is fair to say that policymakers are not doing enough to address it head-on; after all, fake news can be beneficial to some party in a news item. We can only hope that valuable projects that aim to nudge out fake news are picked up, funded, scaled up and plugged into existing systems. Policymakers in India must work towards real and sustainable achievement of these goals by nudging prosumers to recognise the social consequences of the fakeness they produce.
The author has led media institutes of repute and is the founder of media literacy organisation BeingResponsible