We all resist changing our beliefs about the world, but what happens when some of those beliefs are based on misinformation? Is there a right way to correct someone when they believe something that's wrong?
Stephan Lewandowsky and John Cook set out to review the science on this topic, and even carried out a few experiments of their own. This effort led to their "Debunking Handbook", which gives practical, evidence-based techniques for correcting misinformation about, say, climate change or evolution. Yet the findings apply to any situation where you find the facts are falling on deaf ears.
The first thing their review turned up is the importance of “backfire effects” - when telling people that they are wrong only strengthens their belief. In one experiment, for example, researchers gave people newspaper corrections that contradicted their political views, on topics ranging from tax reform to the existence of weapons of mass destruction. The corrections were not only ignored – they entrenched people’s pre-existing positions.
Backfire effects pick up strength when you have no particular reason to trust the person you are talking to. This perhaps explains why climate sceptics with more scientific education tend to be the most sceptical that humans are causing global warming.
The irony is that understanding backfire effects requires that we debunk a false understanding of our own. Too often, argue Lewandowsky and Cook, communicators assume a "deficit model" in their interactions with the misinformed. This is the idea that we have the right information, and all we need to do to make people believe it is to somehow "fill in" the deficit in their understanding: simply presenting the evidence, on this view, should be enough to replace their false beliefs. Beliefs don't work like that.
Psychological factors affect how we process information – such as what we already believe, who we trust and how we remember. Debunkers need to work with these factors, rather than against them, if they want the best chance of being believed.
The most important thing is to provide an alternative explanation. An experiment by Hollyn Johnson and Colleen Seifert shows how this can be done. These two psychologists recruited participants to listen to news reports about a fictional warehouse fire, and then answer some comprehension questions.
Some of the participants were told that the fire was started by a short circuit in a closet near some cylinders containing potentially explosive gas. Yet when this information was corrected – by saying the closet was empty – participants still clung to the original explanation.
A follow-up experiment showed how to correct such misinformation effectively. It was identical to the first, except that some participants were offered a plausible alternative explanation: evidence had been found suggesting arson. Only those given this alternative were able to let go of the misinformation about the gas cylinders.
Lewandowsky and Cook argue that experiments like these show the dangers of arguing against a misinformed position. If you try to debunk a myth, you may end up reinforcing that belief, strengthening the misinformation in people's minds without giving the correct information a chance to take hold.
What you must do, they argue, is start with the plausible alternative (the one you believe is correct). If you must mention the myth, mention it second, and only after clearly warning people that you're about to discuss something that isn't true.
This debunking advice is also worth bearing in mind if you find yourself clinging to your own beliefs in the face of contradictory facts. You can’t be right all of the time, after all.
Read more about the best way to win an argument.
If you have an everyday psychological phenomenon you'd like to see written about in these columns please get in touch @tomstafford or firstname.lastname@example.org. Thanks to Ullrich Ecker for advice on this topic.