Countering Fake News? Learn from Cognitive Science!

05 November 2019   ·   Stephan Lewandowsky

Cognitive science has shown that if people know that they might be misled before any misinformation is presented, they become more resilient to being misinformed. To successfully debunk fake news, the German government could enhance its horizon-scanning capabilities to identify the strategies and tools of disinformers. It should also support NGOs that tackle misinformation.

Truth isn’t the first casualty of war – it’s the casualty that often causes war. Iraq was invaded in 2003 based on the presumed threat posed by its “weapons of mass destruction” or WMDs. Those weapons did not in fact exist, but their presence was conjured up in a campaign of deception by the U.K. and U.S. governments. The Rwandan genocide of 1994, which claimed up to a million victims, was fanned by radio broadcasts that encouraged Hutu hatred and slaughter of the Tutsis based on invented incidents and accusations. And misinformation spread via Facebook fueled the ethnic cleansing of the Rohingya in Myanmar in 2017.

A primary task of conflict prevention must therefore involve the fight against misinformation and “fake news”. Unfortunately, this is no easy task.

Any correction must be accompanied by an alternative explanation

Misinformation sticks. Erasing “fake news” from our memories is as difficult as getting rid of the residue from a price tag on a birthday present. After the constant drumbeat of “WMDs, WMDs” in the lead-up to the invasion of Iraq, it seemed not to matter that none were found once the country had been thoroughly searched. More than 10 years after the absence of WMDs became the official U.S. position, over 40 per cent of the American public continued to believe that U.S. forces had discovered an active WMD program in Iraq.

Misinformation can stick even when people acknowledge and accept a correction. In a study conducted during the initial stages of the invasion of Iraq, colleagues and I presented participants with specific war-related news items, some of which had subsequently been corrected, and asked them to rate their belief in each item as well as their memory of the original information and its correction. We found that U.S. participants who were certain that the information had been retracted continued to believe it to be true.

So is there any way to unstick misinformation?

There is broad scientific agreement that combating misinformation requires that a correction be accompanied by an alternative explanation. People may accept that there were no WMDs in Iraq if they consider an alternative, namely that the war was launched for reasons other than to rid the country of its weapons. Consistent with this, our study found that people who were sceptical of the official WMD-related reason for the war were more accurate in their processing of war-related information.

Illustrate how misinformation and fake news work

An even better way to combat misinformation is to prevent it from sticking in the first place.

This approach is known as inoculation, and it works much like a vaccination: if people are warned that they might be misled before the misinformation is presented, and if they are exposed to a small, illustrative dose of that misinformation, they tend to become more resilient to being misinformed afterwards.

Even a simple up-front warning that people may be misled has been shown to be sufficient to reduce – but not eliminate – subsequent reliance on misinformation. A more thorough and effective variant of inoculation not only provides an explicit warning of the impending threat of misinformation, but additionally refutes an anticipated fallacious argument. For example, if people are reminded of the “fake experts” strategy used by the tobacco industry to defang actual medical experts, then people subsequently become resilient to the same strategy being used to undermine the scientific consensus on climate change. Once you’ve heard how others have been fooled, you are not easily fooled yourself.

In a nutshell, there are two successful debunking strategies: a correction can be accompanied by an alternative explanation for why the misinformation arose in the first place, or people can be warned in advance and shown how they are likely to be misled. How can these findings from cognitive science be translated into practice?

Enhance capabilities to identify strategies and rhetorical tools of disinformers

An immediate policy implication for governments such as Germany’s is to enhance their horizon-scanning capabilities: if publics are to be inoculated against misinformation, then it is essential to recognize the strategies and rhetorical tools employed by disinformers the moment they arise, before they have found a foothold. Recent work by Italian researchers has shown considerable promise, permitting automatic identification of 77 per cent of topics that are susceptible to misinformation online. Along similar lines, the European Parliament has proposed the creation of a European observatory of disinformation.
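To make the idea of automated horizon scanning a little more concrete, here is a minimal, hypothetical sketch of how flagging misinformation-prone topics might work. It is not the method used by the Italian researchers mentioned above; the toy headlines, the TF-IDF features, and the review threshold are all illustrative assumptions.

```python
# Hypothetical sketch: flag incoming headlines whose topics resemble
# topics that previously attracted misinformation. The training data
# and the 0.5 threshold below are illustrative assumptions only.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy labeled data: 1 = topic previously attracted misinformation, 0 = it did not.
texts = [
    "miracle cure suppressed by doctors",
    "vaccine ingredients cause hidden illness",
    "election ballots secretly destroyed overnight",
    "city council approves new bus schedule",
    "local library extends weekend opening hours",
    "university publishes annual enrollment figures",
]
labels = [1, 1, 1, 0, 0, 0]

# Represent each headline as TF-IDF features and fit a simple linear classifier.
vectorizer = TfidfVectorizer()
features = vectorizer.fit_transform(texts)
classifier = LogisticRegression().fit(features, labels)

# Score new, unseen headlines; high scores would be queued for human review.
incoming = ["secret lab leak behind new outbreak", "museum announces summer exhibit"]
scores = classifier.predict_proba(vectorizer.transform(incoming))[:, 1]
for headline, score in zip(incoming, scores):
    flag = "REVIEW" if score > 0.5 else "ok"
    print(f"{score:.2f}  {flag}  {headline}")
```

In practice, any such classifier would only be a first filter: flagged topics would still need human review before an inoculation campaign is launched.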

Armed with knowledge of human cognition, and supported by an early-warning system, the same media technologies that have in the past contributed to conflicts can also help prevent violence from breaking out in the first place. However, these technologies are more likely to work as a tool of prevention if they are not employed by governments directly. The U.S. government, for example, has attempted to counter jihadist radicalization by debunking misinformation and propaganda with a “Counter-Misinformation Team.” Yet those efforts have not only failed to yield measurable success, but may even have been counterproductive by causing further alienation in predominantly Muslim countries.

Support independent NGOs that tackle misinformation

A better approach is to support suitable local non-governmental organizations (NGOs) that tackle misinformation independently and under their own control.

Successful precedents for this approach can be found in Kenya, whose 2007 presidential election was marred by widespread post-election violence. In the lead-up to the subsequent election in 2013, a team of NGOs analyzed how the earlier rumors had spread, discovering that this occurred mainly via SMS. The NGOs built a 65,000-person mobile network spanning Kenya, setting up an infrastructure that allowed messages emphasizing peace to outpace messages spreading rumors. The network could respond immediately to trigger events, broadcasting warnings against misinformation (thus inoculating the public against being misled) and messages emphasizing local unity and the terrible consequences of violence (thus providing an alternative to the misinformation). A quantitative evaluation of the network’s impact found that the messages helped prevent the spread of rumors and contributed to the elections remaining peaceful.

A similar initiative took place in the Tana Delta region of Kenya, which had recently experienced outbreaks of violence. The Una Hakika platform provides crowd-sourced rumor verification: users submit reports of potential misinformation via their mobile phones, and the platform then checks the information using a network of 200 “community ambassadors”. The provision of accurate information has reduced tensions among residents in an area where misinformation had previously stirred up violence. The Una Hakika platform was launched by the Sentinel Project, a Canadian non-profit organization financed by, among other agencies, the Canadian Government.

Finally, if efforts to contain or reduce tensions fail, research on the effects of mass media can at least point to avenues for reconciliation. In Rwanda, a year-long field experiment conducted 10 years after the genocide showed the power of radio to foster reconciliation. A Dutch NGO produced a radio soap opera that combined entertainment with education about the origins and prevention of intergroup violence. Communities around Rwanda were assigned either to a treatment group, which listened to the soap opera, or to a control group, which listened to a show about general health issues. Listeners of the soap opera showed greater empathy with members of other groups and greater awareness of the traumatic impact of violence.


Stephan Lewandowsky

Stephan Lewandowsky is a Professor of Cognitive Psychology at the University of Bristol and a member of the UK Academy of Social Sciences. His research focuses on people’s responses to disinformation and populism. @STWorg