Economic and Social Research Council

Inoculating against fake news?

Benjamin Franklin is said to have coined the phrase 'an ounce of prevention is worth a pound of cure'. This applies to many things, even combating 'fake news' and other forms of misinformation.

By Stephan Lewandowsky, Sander van der Linden and John Cook

Misinformation sticks. Erasing 'fake news' from your memory is as difficult as getting jam off your fingers after a Devonshire tea. Once you hammer into people that there were Weapons of Mass Destruction (WMDs) in Iraq, it doesn't matter that none were found after the country was thoroughly scoured by the invading forces. The constant drumbeat of 'WMD, WMD, WMD' in the lead-up to the invasion, followed by innumerable media reports of 'preliminary tests' that tested positive for chemical weapons during the early stages of the conflict – but were ultimately never confirmed by more thorough follow-up tests – created a powerful impression that those weapons had been discovered. An impression so powerful that four years after the absence of WMDs became the official US position, 60% of Republicans and 20% of Democrats believed either that the US had found WMDs or that Iraq had them but had hidden the weapons so well that they escaped detection.

Misinformation can stick even when people acknowledge a correction and know that a piece of information is false. In a study conducted during the initial stages of the invasion of Iraq, we and our colleagues presented participants with specific war-related items from the news media, some of which had subsequently been corrected, and asked for ratings of belief as well as memory for the original information and its correction. We found that US participants who were certain that the information had been retracted continued to believe it to be true.

This 'I know it's false but I think it's true' behaviour is the signature of the stickiness of misinformation. Misinformation sticks even in situations in which people have no ideological or motivational incentive to stick to their erroneous beliefs. In the laboratory, the original misinformation shines through in people's responses to inference questions when they are presented with entirely fictional but plausible scripts about various events. For example, people will act as though a fictitious warehouse fire was due to negligence even if, later in the script, they are told the evidence pointing to negligence turned out to be false.

Is there any way to unstick information?

There is broad agreement in the literature that combating misinformation requires that the correction be accompanied by a causal alternative. Telling people that negligence was not a factor in a warehouse fire is insufficient – but telling them that arson was to blame instead will successfully prevent any future reliance on the negligence idea.

Another way to combat misinformation is to prevent it from sticking in the first place. An ounce of inoculation turns out to be worth a pound of corrections and causal alternatives. There is evidence that if people are made aware, before misinformation is presented, that they might be misled, they become resilient to it.

This process is variously known as 'inoculation' or 'prebunking' and it comes in a number of different forms. At the most general level, an upfront warning may be sufficient to reduce – but not eliminate – subsequent reliance on misinformation. In one of our studies, led by Ullrich Ecker, we found that telling participants at the outset that 'the media sometimes does not check facts before publishing information that turns out to be inaccurate' reduced reliance modestly (but significantly) in comparison to a retraction-only condition. A more specific warning that explained that 'research has shown that people continue to rely on outdated information even when it has been retracted or corrected', by contrast, reduced subsequent reliance on misinformation to the same level as was observed with a causal alternative.

A more involved variant of inoculation not only provides an explicit warning of the impending threat of misinformation, but additionally refutes an anticipated argument, exposing the fallacy before it is encountered. In the same way that a vaccination stimulates the body to generate antibodies by imitating an infection, which can then fight the real disease when an actual infection occurs, psychological inoculation stimulates the generation of counter-arguments that prevent subsequent misinformation from sticking.

The inoculation idea can be illustrated with an example from climate change. Although there is a pervasive scientific consensus – reliant on 150-year-old basic physics and 15,000 modern scientific articles – that the Earth is warming from the burning of fossil fuels, political operatives often seek to undermine that consensus to introduce doubt about those scientific facts in the public's mind.

Together with Ullrich Ecker, we showed that people can be inoculated against those disinformation efforts by presenting them with (1) a warning that attempts are made to cast doubt on the scientific consensus for political reasons, and (2) an explanation that one disinformation technique involves appeals to dissenting 'fake experts' to feign a lack of consensus. We illustrated the 'fake-expert' approach by revealing the attempts of the tobacco industry to undermine the medical consensus about the health risks from smoking with advertising claims such as '20,679 Physicians say "Luckies are less irritating"'.

Exposing the fake-expert disinformation strategy at the outset defanged the subsequent misinformation (in this case, the feigned lack of consensus on climate change): people's responses did not differ from those of a control condition that received no misinformation about the consensus, whereas in the absence of inoculation that misinformation had a detrimental effect.

Misinformation sticks and is hard to dislodge. But we can prevent it from sticking in the first place by alerting people to how they might be misled.

Further information

This article was published in the winter 2018 issue of the Society Now magazine. It first appeared in CREST Security Review issue 8.

  • Stephan Lewandowsky is Professor of Cognitive Psychology at the University of Bristol. Dr Sander van der Linden is a lecturer in psychology at the University of Cambridge, and Dr John Cook is a research assistant professor at the Center for Climate Change Communication at George Mason University.
  • The Centre for Research and Evidence on Security Threats (CREST) is a national hub for understanding, countering and mitigating security threats. CREST Security Review has a new website and mobile app. @crest_research

Channel website: http://www.esrc.ac.uk

Original article link: https://esrc.ukri.org/news-events-and-publications/news/news-items/inoculating-against-fake-news/
