Evidence-based debunking

Fed up with futile internet arguments, a bunch of psychologists investigated how best to correct false ideas. Tom Stafford discovers how to debunk properly.

We all resist changing our beliefs about the world, but what happens when some of those beliefs are based on misinformation? Is there a right way to correct someone when they believe something that’s wrong?

Stephan Lewandowsky and John Cook set out to review the science on this topic, and even carried out a few experiments of their own. This effort led to their “Debunking Handbook”, which gives practical, evidence-based techniques for correcting misinformation about, say, climate change or evolution. Yet the findings apply to any situation where you find the facts are falling on deaf ears.

The first thing their review turned up is the importance of “backfire effects” – when telling people that they are wrong only strengthens their belief. In one experiment, for example, researchers gave people newspaper corrections that contradicted their views and politics, on topics ranging from tax reform to the existence of weapons of mass destruction. The corrections were not only ignored – they entrenched people’s pre-existing positions.

Backfire effects pick up strength when you have no particular reason to trust the person you are talking to. This perhaps explains why climate sceptics with more scientific education tend to be the most sceptical that humans are causing global warming.

The irony is that understanding backfire effects requires that we debunk a false understanding of our own. Too often, argue Lewandowsky and Cook, communicators assume a ‘deficit model’ in their interactions with the misinformed. This is the idea that we have the right information, and all we need to do to make people believe is to somehow “fill in” the deficit in their understanding – that simply telling people the evidence for the truth will be enough to replace their false beliefs. Beliefs don’t work like that.

Psychological factors affect how we process information – such as what we already believe, who we trust and how we remember. Debunkers need to work with these factors, rather than against them, if they want the best chance of being believed.

The most important thing is to provide an alternative explanation. An experiment by Hollyn Johnson and Colleen Seifert shows how to persuade people better. These two psychologists recruited participants to listen to news reports about a fictional warehouse fire, and then answer some comprehension questions.

Some of the participants were told that the fire was started by a short circuit in a closet near some cylinders containing potentially explosive gas. Yet when this information was corrected – participants were later told the closet had been empty – they still clung to the belief that the gas cylinders had played a role in the fire.

A follow-up experiment showed how to correct such misinformation effectively. It was similar to the first, except that participants were given a plausible alternative explanation: evidence had been found that the fire was caused by arson. Only those given this plausible alternative were able to let go of the misinformation about the gas cylinders.

Lewandowsky and Cook argue that experiments like these show the dangers of arguing against a misinformed position. If you try to debunk a myth, you may end up reinforcing that belief, strengthening the misinformation in people’s minds without making the correct information take hold.

What you must do, they argue, is start with the plausible alternative (the one you, obviously, believe is correct). If you must mention the myth, mention it second, and only after clearly warning people that you’re about to discuss something that isn’t true.

This debunking advice is also worth bearing in mind if you find yourself clinging to your own beliefs in the face of contradictory facts. You can’t be right all of the time, after all.

Read more about the best way to win an argument.

If you have an everyday psychological phenomenon you’d like to see written about in these columns please get in touch @tomstafford or ideas@idiolect.org.uk. Thanks to Ullrich Ecker for advice on this topic.

This is my BBC Future column from last week; the original is here.

12 thoughts on “Evidence-based debunking”

  1. Geneticists.
    Biologists.
    Neurologists.
    Physicists.
    Alchemists.

    They can determine the cause, but psychology is everyone’s job.

    Including the supposedly impaired.

    Women love psychology jobs, just sayin’, bros.

    1. Modern psychology, as exemplified by this article, is based on science; for example, Carol Dweck’s research on growth vs fixed mindset:

      “According to Dweck, individuals can be placed on a continuum according to their implicit views of where ability comes from. Some believe their success is based on innate ability; these are said to have a “fixed” theory of intelligence (fixed mindset). Others, who believe their success is based on hard work, learning, training and doggedness are said to have a “growth” or an “incremental” theory of intelligence (growth mindset).”

      Modern psychology uses science to make models of the brain that are predictive enough to be very useful, but makes no promises that they capture how the brain actually works. High-level AI isn’t within reach yet[1] because psychological science is still working top-down, trying to understand people by watching what they do; but it is effective at understanding how to behave around people.

      [1] Yet! Voice recognition in phones is improving because Google and Apple have made heavy use of research on deep-neural-network-based machine learning, and from it they have been able to build impressive low-level AI that recognizes speech. Computational neuroscience is approaching the problem from a direction that allows more effective machine-learning integration; I’m hopeful we’ll be able to make real high-level AI someday.

  2. These days, in an era when information is so abundant and easily available, many folks are responding by simply ‘choosing’ their own ‘facts’. So debunking ideologically held beliefs (about climate change or evolution, say) is no more about ‘facts’ or ‘persuasion’ than trying to convince a door-to-door evangelist.

  3. Great – like all interesting studies it brings up more questions than it answers, I think.

    It was only those who were given a plausible alternative that were able to let go of the misinformation

    Although believing it could have been arson does not hinge on the subjects renouncing the beliefs closest to their conservative little hearts. It seemed like the information presented was all new (the investigation of a recent fire), so I’m confused as to how it could change the mind of a well-read denier. I did like the research (via you, I believe) suggesting that we pose questions to the denier (“what do you think causes the heat island effect?”) or ask them to teach the material to someone else.

    I believe it’s a misunderstanding of the limitations of science. It’s why physicist Lisa Randall suggested that nothing about Newton’s Laws was wrong (to paraphrase p. 15 of her popular book).

  4. Reblogged this on Diane Cleverley and commented:
    Great analysis that might explain why certain medical misinformation persists in the general public, despite the fact that it has been refuted – even rescinded by the very researchers who first presented it! A better tactic for education would be to present a more plausible cause that the public could readily understand.

  5. This perhaps explains why climate sceptics with more scientific education tend to be the most sceptical that humans are causing global warming.

    io9 linked to a new study about this and reported that the effect held only for Tea Party supporters but was the inverse for other parties. Of course, it was only one survey, and it was conducted in New Hampshire, which is really the Portland of the Northeast anyway.

    http://www.tandfonline.com/doi/full/10.1080/09644016.2014.976485
