The best way to win an argument

How do you change someone’s mind if you think you are right and they are wrong? Psychology reveals that the last thing you should do is the very tactic we usually resort to.

You are, I’m afraid to say, mistaken. The position you are taking makes no logical sense. Just listen up and I’ll be more than happy to elaborate on the many, many reasons why I’m right and you are wrong. Are you feeling ready to be convinced?

Whether the subject is climate change, the Middle East or forthcoming holiday plans, this is the approach many of us adopt when we try to convince others to change their minds. It’s also an approach that, more often than not, leads to the person on the receiving end hardening their existing position. Fortunately research suggests there is a better way – one that involves more listening, and less trying to bludgeon your opponent into submission.

A little over a decade ago Leonid Rozenblit and Frank Keil from Yale University suggested that in many instances people believe they understand how something works when in fact their understanding is superficial at best. They called this phenomenon “the illusion of explanatory depth”. They began by asking their study participants to rate how well they understood how things like flushing toilets, car speedometers and sewing machines worked, before asking them to explain what they understood and then answer questions on it. The effect they revealed was that, on average, people in the experiment rated their understanding as much worse after it had been put to the test.

What happens, argued the researchers, is that we mistake our familiarity with these things for a genuine, detailed understanding of how they work. Usually nobody tests us, and if we have any questions about them we can simply take a look. This tendency to take mental short cuts when making decisions or assessments is what psychologists call the “cognitive miser” theory.

Why would we bother expending the effort to really understand things when we can get by without doing so? The interesting thing is that we manage to hide from ourselves exactly how shallow our understanding is.

It’s a phenomenon that will be familiar to anyone who has ever had to teach something. Usually, it only takes the first moments when you start to rehearse what you’ll say to explain a topic, or worse, the first student question, for you to realise that you don’t truly understand it. All over the world, teachers say to each other “I didn’t really understand this until I had to teach it”. Or as researcher and inventor Mark Changizi quipped: “I find that no matter how badly I teach I still learn something”.

Explain yourself

Research published last year on this illusion of understanding shows how the effect might be used to convince others they are wrong. The research team, led by Philip Fernbach, of the University of Colorado, reasoned that the phenomenon might hold as much for political understanding as for things like how toilets work. Perhaps, they figured, people who have strong political opinions would be more open to other viewpoints, if asked to explain exactly how they thought the policy they were advocating would bring about the effects they claimed it would.

Recruiting a sample of Americans via the internet, they polled participants on a set of contentious US policy issues, such as imposing sanctions on Iran, healthcare and approaches to carbon emissions. One group was asked to give their opinion and then provide reasons for why they held that view. This group got the opportunity to put their side of the issue, in the same way anyone in an argument or debate has a chance to argue their case.

Those in the second group did something subtly different. Rather than provide reasons, they were asked to explain how the policy they were advocating would work. They were asked to trace, step by step and from start to finish, the causal path from the policy to the effects it was supposed to have.

The results were clear. People who provided reasons remained as convinced of their positions as they had been before the experiment. Those who were asked to provide explanations softened their views, and reported a correspondingly larger drop in how they rated their understanding of the issues. People who had previously been strongly for or against carbon emissions trading, for example, tended to become more moderate – ranking themselves as less certain in their support or opposition to the policy.

So this is something worth bearing in mind next time you’re trying to convince a friend that we should build more nuclear power stations, that the collapse of capitalism is inevitable, or that dinosaurs co-existed with humans 10,000 years ago. Just remember, however, there’s a chance you might need to be able to explain precisely why you think you are correct. Otherwise you might end up being the one who changes their mind.

This is my BBC Future column from last week. The original is here.

11 Comments


  2. Kapitano
    Posted May 26, 2014 at 2:10 pm | Permalink

    It sounds like the good old Socratic method.

    Get someone to explain what they think they understand, asking faux-naive questions about details they skate over…in the hope they’ll eventually realise they’re hopelessly confused.

    At which point they’ll get angry and make you drink hemlock.

  3. Posted May 26, 2014 at 2:18 pm | Permalink

    Most people’s perceptions are based on their personal interests, whether it is food, science or politics. “What’s in it for me?” is the deciding factor, not “What’s in it for us?”. It looks like what this article says is that most people lack objectivity.

  4. Posted May 26, 2014 at 2:57 pm | Permalink

    I can see this working nicely for most arguments. But climate change? You either have people denying it exists or questioning the computer models. I can’t imagine how it could be applied then. If the trend shifts towards claiming the rise in temperature will benefit plants it will be far easier to use this strategy, I suppose. But George Will is a smart devil and won’t give up his convictions. Some also think these science debates give credibility to the anti-science crowd.

  5. recruitinganimal
    Posted May 26, 2014 at 10:29 pm | Permalink

    Usually, when you ask someone for the reasons supporting her belief and she supplies them, the conversation does not end there.

    You normally challenge her reasons and make her justify them in turn.

    Doesn’t that lessen the difference between the two techniques?

  6. Posted May 27, 2014 at 2:48 pm | Permalink

    Confirms that we all know a lot less about things than we think we do. It is so easy, and so comforting, and much less mental effort, to jump to conclusions rather than actually understand the nuances of complex issues.

  7. Hohenheim
    Posted May 27, 2014 at 5:29 pm | Permalink

    The difference between the two techniques is that in the former you immediately make them defensive and it becomes an argument. The latter allows them to realize themselves that they don’t know as much as they think. We were taught this in peer mediation at school when I was little. It’s funny to use on stupid people because they will completely contradict themselves and still deny they might be wrong.

  8. Eric
    Posted May 30, 2014 at 3:34 pm | Permalink

    Pffft!

    That method is over 2,400 years old!

    How could THAT possibly work?

  9. Posted June 2, 2014 at 4:34 am | Permalink

    There seem to be two ways to improve discourse involving differences of opinion where people are invested in their view. One is to make ourselves and our adversaries more modest by being forced to explain in detail how what we believe actually works, or what its consequences are, as this post addresses. The other is to recognize our adversary’s value as a human being, so that he feels less threatened when we challenge his views. Our political discourse may be broken, but it doesn’t seem impossible to fix.

  10. timgrayuk
    Posted June 8, 2014 at 2:25 pm | Permalink

    I use this quite a lot within digital marketing – but the other way around. Everyone in my industry is an ‘expert’, it seems – and when you challenge the thing they are trying to convince you of, and ask them why they believe what they believe and to explain it in detail, they wobble and seem to change their opinion. A really nice article about a great influencer!

  11. Ri3m4nn
    Posted June 15, 2014 at 10:08 am | Permalink

    When an individual repeats “facts”, they remain confident.

    When an individual attempts to describe a process they have not fully engaged in, they develop doubt.

    There is one consideration not mentioned: an individual who has “known” (for example, a former teacher) and has made a judgement on the subject, but has forgotten the process due to lack of recent involvement.

    great article. thanks!

