Using rational argument to change minds

I have a longer piece in the latest issue of Contributoria: What’s the evidence on using rational argument to change people’s minds? Here are a few snips from the opening:

Are we, the human species, unreasonable? Do rational arguments have any power to sway us, or is it all intuition, hidden motivations, and various other forms of prejudice?

…the picture of human rationality painted by our profession can seem pretty bleak. Every week I hear about a new piece of research which shows up some quirk of our minds, like the finding that people given a heavy clipboard judge public issues as more important than people given a light one. Or that more attractive people are judged as more trustworthy, and the arguments they give as more intelligent.

…I set out to get to the bottom of the evidence on how we respond to rational arguments. Does rationality lose out every time to irrational motivations? Or is there any hope for those of us who want to persuade because we have good arguments, not because we are handsome, or popular, or offer heavy clipboards?

You can read the full thing here, and while you’re over there check out the rest of the Contributoria site – all of the articles on which are published under a CC license and commissioned by members. On which note, a massive thanks to everyone who backed my proposal and offered comments (see previous announcements). Special thanks to Josie and Dan for giving close readings to the piece before it was finished.

Edit: Contributoria didn’t last long, but I republished this essay and some others in an ebook, “For argument’s sake: evidence that reason can change minds” (amazon, smashwords).


Research Digest posts, #1: A self-fulfilling fallacy?

This week I will be blogging over at the BPS Research Digest. The Digest was written for over ten years by psychology-writer extraordinaire Christian Jarrett, and I’m one of a series of guest editors during the transition period to a new permanent editor.

My first piece is now up, and here is the opening:

Lady Luck is fickle, but many of us believe we can read her mood. A new study of one year’s worth of bets made via an online betting site shows that gamblers’ attempts to predict when their luck will turn have some unexpected consequences.

Read the rest over at the Digest; I’ll post about the other stories I’ve written as they go up.

What’s the evidence for the power of reason to change minds?

Last month I proposed an article for Contributoria, titled What’s the evidence on using rational argument to change people’s minds? Unfortunately, I had such fun reading about the topic that I missed the end-of-month deadline and now need to get backers for my proposal again.

So, here’s something from my proposal, please consider backing it so I can put my research to good use:

Is it true that “you can’t tell anybody anything”? From pub arguments to ideology-driven party political disputes it can sometimes seem like people have their minds all made up, that there’s no point trying to persuade anybody of anything. Popular psychology books reinforce the idea that we’re emotional, irrational creatures (Dan Ariely’s “Predictably Irrational”, David McRaney’s “You Are Not So Smart”). This piece will be 3000 words on the evidence from psychological science about persuasion by rational argument.

All you need to do to back proposals, currently, is sign up for the site. You can see all current proposals here. Written articles are Creative Commons licensed.

Back the proposal: What’s the evidence on using rational argument to change people’s minds?

Full disclosure: I’ll be paid by Contributoria if the proposal is backed

Update: Backed! Thanks all! Watch this space for the finished article. I promise I’ll make the deadline this time.

What’s the evidence on using rational argument to change people’s minds?

Contributoria is an experiment in community funded, collaborative journalism. What that means is that you can propose an article you’d like to write, and back proposals by others that you’d like to see written. There’s an article I’d like to write: What’s the evidence on using rational argument to change people’s minds? Here’s something from the proposal:

Is it true that “you can’t tell anybody anything”? From pub arguments to ideology-driven party political disputes it can sometimes seem like people have their minds all made up, that there’s no point trying to persuade anybody of anything. Popular psychology books reinforce the idea that we’re emotional, irrational creatures (Dan Ariely’s “Predictably Irrational”, David McRaney’s “You Are Not So Smart”). This piece will be 2000 words on the evidence from psychological science about persuasion by rational argument.

If the proposal is backed it will give me a chance to look at the evidence on things like whether political extremism is supported by an illusion of explanatory depth (and how that can be corrected), and how we should treat all those social psychology priming experiments which suggest that our opinions on things can be pushed about by irrelevant factors such as the weight of a clipboard we’re holding.

All you need to do to back proposals, currently, is sign up for the site. You can see all current proposals here. Written articles are Creative Commons licensed.

Back the proposal: What’s the evidence on using rational argument to change people’s minds?

Full disclosure: I’ll be paid by Contributoria if the proposal is backed

Update: Backed! That was quick! Many thanks, mindhacks.com readers! I’d better get reading and writing now…

Why Christmas rituals make tasty food

All of us carry out rituals in our daily lives, whether it is shaking hands or clinking glasses before we drink. At this time of year, the performance of customs and traditions is widespread – from sharing crackers, to pulling the wishbone on the turkey and lighting the Christmas pudding.

These rituals might seem like light-hearted traditions, but I’m going to try and persuade you that they are echoes of our evolutionary history, something which can tell us about how humans came to relate to each other before we had language. And the story starts by exploring how rituals can make our food much tastier.

In recent years, studies have suggested that performing small rituals can influence people’s enjoyment of what they eat. In one experiment, Kathleen Vohs from the University of Minnesota and colleagues explored how ritual affected people’s experience of eating a chocolate bar. Half of the people in the study were instructed to relax for a moment and then eat the chocolate bar as they normally would. The other half were given a simple ritual to perform, which involved breaking the chocolate bar in half while it was still inside its wrapper, and then unwrapping each half and eating it in turn.

Something about carefully following these instructions before eating the chocolate bar had a dramatic effect. People who had focused on the ritual said they enjoyed eating the chocolate more, rating the experience 15% higher than the control group. They also spent longer eating the chocolate, savouring the flavour for 50% longer than the control group. Perhaps most persuasively, they also said they would pay almost twice as much for such a chocolate.

This experiment shows that a small act can significantly increase the value we get from a simple food experience. Vohs and colleagues went on to test the next obvious question – how exactly do rituals work this magic? Repeating the experiment, they asked participants to describe and rate the act of eating the chocolate bar. Was it fun? Boring? Interesting? This seemed to be a critical variable – those participants who were made to perform the ritual rated the experience as more fun, less boring and more interesting. Statistical analysis showed that this was the reason they enjoyed the chocolate more, and were more willing to pay extra.
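For readers who like to see the shape of that statistical step, here is a minimal sketch of a mediation analysis of this kind in Python, using numpy and statsmodels. Everything in it is illustrative – the numbers are simulated and the variable names are mine, not the study’s data:

```python
# A toy mediation analysis in the spirit of the study: does "involvement"
# (how fun/interesting eating felt) explain the link between performing a
# ritual and enjoyment? All data below are simulated, not the real numbers.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100
ritual = rng.integers(0, 2, n)                            # 0 = control, 1 = ritual condition
involvement = 3 + 1.5 * ritual + rng.normal(0, 1, n)      # ritual boosts involvement
enjoyment = 2 + 1.0 * involvement + rng.normal(0, 1, n)   # involvement boosts enjoyment

def fit(y, X):
    return sm.OLS(y, sm.add_constant(X)).fit()

total = fit(enjoyment, ritual)                                    # total effect of ritual
direct = fit(enjoyment, np.column_stack([ritual, involvement]))   # effect controlling for involvement

print("total effect of ritual:", round(total.params[1], 2))
print("direct effect once involvement is controlled:", round(direct.params[1], 2))
# If the direct effect shrinks towards zero, involvement "mediates" the
# ritual -> enjoyment link, which is the pattern the paper reports.
```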

So, rituals appear to make people pay attention to what they are doing, allowing them to concentrate their minds on the positives of a simple pleasure. But could there be more to rituals? Given that they appear in many realms of life that have nothing to do with food – from religious services to presidential inaugurations – could their performance have deeper roots in our evolutionary history? Attempting to answer the question takes us beyond the research I’ve been discussing so far and into the complex and controversial debate about the evolution of human nature.

In his book, The Symbolic Species, Terrence Deacon claims that ritual played a special role in human evolution, in particular at the transition point where we began to acquire the building blocks of language. Deacon’s argument is that the very first “symbols” we used to communicate, the things that became the roots of human language, can’t have been anything like the words we use so easily and thoughtlessly today. He argues that these first symbols would have been made up of extended, effortful and complex sequences of behaviours performed in a group – in other words, rituals. These symbols were needed because of the way early humans arranged their family groups and, in particular, shared the products of hunting. Early humans needed a way to tell each other who had what responsibilities and which privileges; who was part of the family, and who could share the food, for instance. These ideas are particularly hard to refer to by pointing. Rituals, says Deacon, were the evolutionary answer to the conundrum of connecting human groups and checking they had a shared understanding of how the group worked.

If you buy this evolutionary story – and plenty don’t – it gives you a way to understand why exactly our minds might have a weakness for ritual. A small ritual makes food more enjoyable, but why does it have that effect? Deacon’s answer is that our love of rituals evolved with our need to share food. Early humans who enjoyed rituals had more offspring. I speculate that an easy shortcut for evolution to make us enjoy rituals was to wire our minds so that performing a ritual makes the food that follows it more enjoyable.

So, for those sitting down with family this holiday, don’t skip the traditional rituals – sing the songs, pull the crackers, clink the glasses and listen to Uncle Vinnie repeat his funny anecdotes for the hundredth time. The rituals will help you enjoy the food more, and carry with them an echo of our long history as a species, and all the feasts the tribe shared before there even was Christmas.

This is my latest column for BBC Future. You can see the original here. Merry Christmas y’all!

Why the stupid think they’re smart

Psychologists have shown that humans are poor judges of their own abilities, from sense of humour to grammar. Those who perform worst are the worst judges of all.

You’re pretty smart right? Clever, and funny too. Of course you are, just like me. But wouldn’t it be terrible if we were mistaken? Psychologists have shown that we are more likely to be blind to our own failings than perhaps we realise. This could explain why some incompetent people are so annoying, and also inject a healthy dose of humility into our own sense of self-regard.

In 1999, Justin Kruger and David Dunning, from Cornell University, New York, tested whether people who lack the skills or abilities for something are also more likely to lack awareness of their lack of ability. At the start of their research paper they cite the example of a Pittsburgh bank robber called McArthur Wheeler, who was arrested in 1995 shortly after robbing two banks in broad daylight without wearing a mask or any other kind of disguise. When police showed him the security camera footage, he protested “But I wore the juice”. The hapless criminal believed that if you rubbed your face with lemon juice you would be invisible to security cameras.

Kruger and Dunning were interested in testing another kind of laughing matter. They asked professional comedians to rate 30 jokes for funniness. Then, 65 undergraduates were asked to rate the jokes too, and were ranked according to how well their judgements matched those of the professionals. They were also asked how well they thought they had done compared to the average person.

As you might expect, most people thought their ability to tell what was funny was above average. The results were, however, most interesting when split according to how well participants performed. Those slightly above average in their ability to rate jokes were highly accurate in their self-assessment, while those who actually did the best tended to think they were only slightly above average. Participants who were least able to judge what was funny (at least according to the professional comics) were also least able to accurately assess their own ability.

This finding was not a quirk of trying to measure subjective sense of humour. The researchers repeated the experiment, only this time with tests of logical reasoning and grammar. These disciplines have defined answers, and in each case they found the same pattern: those people who performed the worst were also the worst in estimating their own aptitude. In all three studies, those whose performance put them in the lowest quarter massively overestimated their own abilities by rating themselves as above average.
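To make the method concrete, here is a toy version of that quartile analysis in Python. The data are simulated to show the reported pattern – none of these numbers come from the actual studies:

```python
# Toy version of the Kruger & Dunning quartile analysis.
# Simulated data: actual test percentiles, plus self-estimates that track
# ability only weakly -- the pattern the studies report.
import numpy as np

rng = np.random.default_rng(1)
n = 65
actual = rng.uniform(0, 100, n)                           # true percentile on the test
# self-estimates regress towards "above average" (roughly the 65th percentile)
estimate = np.clip(0.3 * actual + 50 + rng.normal(0, 10, n), 0, 100)

order = np.argsort(actual)
for i, name in enumerate(["bottom", "second", "third", "top"]):
    q = order[i * n // 4:(i + 1) * n // 4]
    print(f"{name} quartile: actual {actual[q].mean():5.1f}, "
          f"self-estimate {estimate[q].mean():5.1f}")
# The bottom quartile's self-estimate sits far above its actual score, while
# the top quartile underestimates -- the Dunning-Kruger pattern.
```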

It didn’t even help the poor performers to be given a benchmark. In a later study, the most incompetent participants still failed to realise they were bottom of the pack even when given feedback on the performance of others.

Kruger and Dunning’s interpretation is that accurately assessing skill level relies on some of the same core abilities as actually performing that skill, so the least competent suffer a double deficit. Not only are they incompetent, but they lack the mental tools to judge their own incompetence.

In a key final test, Kruger and Dunning trained a group of poor performers in logical reasoning tasks. This improved participants’ self-assessments, suggesting that ability levels really did influence self-awareness.

Other research has shown that this “unskilled and unaware of it” effect holds in real-life situations, not just in abstract laboratory tests. For example, hunters who know the least about firearms also have the most inaccurate view of their firearm knowledge, and doctors with the worst patient-interviewing skills are the least likely to recognise their inadequacies.

What has become known as the Dunning-Kruger effect is an example of what psychologists call metacognition – thinking about thinking. It’s also something that should give us all pause for thought. The effect might just explain the apparently baffling self-belief of some of your friends and colleagues. But before you start getting too smug, just remember one thing. As unlikely as you might think it is, you too could be walking around blissfully ignorant of your ignorance.

This is my BBC Future column from last week. The original is here.

Does studying economics make you more selfish?

When economics students learn about what makes fellow humans tick it affects the way they treat others. Not necessarily in a good way, as Tom Stafford explains.

Studying human behaviour can be like a dog trying to catch its own tail. As we learn more about ourselves, our new beliefs change how we behave. Research on economics students showed this in action: textbooks describing facts and theories about human behaviour can affect the people studying them.

Economic models are often based on an imaginary character called the rational actor, who, with no messy and complex inner world, relentlessly pursues a set of desires ranked according to the costs and benefits. Rational actors help create simple models of economies and societies. According to rational choice theory, some of the predictions governing these hypothetical worlds are common sense: people should prefer more to less, firms should only do things that make a profit and, if the price is right, you should be prepared to give up anything you own.

Another tool used to help us understand our motivations and actions is game theory, which examines how you make choices when their outcomes are affected by the choices of others. To determine which of a number of options to go for, you need a theory about what the other person will do (and your theory needs to encompass the other person’s theory about what you will do, and so on). Rational actor theory says other players in the game all want the best outcome for themselves, and that they will assume the same about you.

The most famous game in game theory is the “prisoner’s dilemma”, in which you are one of a pair of criminals arrested and held in separate cells. The police make you this offer: you can inform on your partner, in which case you either get off scot free (if your partner keeps quiet), or you both get a few years in prison (if he informs on you too). Alternatively you can keep quiet, in which case you either get a few years (if your partner also keeps quiet), or you get a long sentence (if he informs on you, leading to him getting off scot free). Your partner, of course, faces exactly the same choice.

If you’re a rational actor, it’s an easy decision. You should inform on your partner in crime: if he keeps quiet, you go free, and if he informs on you, you both go to prison, but your sentence will still be shorter than if you had kept quiet.
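One way to see why informing dominates is to write the payoffs out explicitly. Here is a minimal sketch in Python; the sentence lengths are invented for illustration, and only their ordering matters:

```python
# Prisoner's dilemma payoffs as years in prison (lower is better).
# The exact numbers are made up; only their ordering matters.
# Key: (my_choice, partner_choice) -> my sentence
SENTENCE = {
    ("inform", "quiet"):  0,   # I go free, my partner gets the long sentence
    ("inform", "inform"): 5,   # we both get a few years
    ("quiet",  "quiet"):  2,   # we both get a few years (the pact holds)
    ("quiet",  "inform"): 10,  # I get the long sentence
}

# Whatever my partner does, informing leaves me better off:
for partner in ("quiet", "inform"):
    assert SENTENCE[("inform", partner)] < SENTENCE[("quiet", partner)]
    print(f"partner {partner}: inform -> {SENTENCE[('inform', partner)]} years, "
          f"quiet -> {SENTENCE[('quiet', partner)]} years")
# Informing is the "dominant strategy" for a rational actor -- even though
# both staying quiet (2 years each) beats both informing (5 years each).
```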

Weirdly, and thankfully, this isn’t what happens if you ask real people to play the prisoner’s dilemma. Around the world, in most societies, most people maintain the criminals’ pact of silence. The exceptions who opt to act solely in their own interests are known in economics as “free riders” – individuals who take benefits without paying costs.

Self(ish)-selecting group

The prisoner’s dilemma is a theoretical tool, but there are plenty of parallel choices – and free riders – in the real world. People who are always late for appointments never have to hurry or wait for anyone else. Some use roads and hospitals without paying their taxes. There are lots of interesting reasons why most of us turn up on time and don’t avoid paying taxes, even though these might be the selfish “rational” choices according to most economic models.

Crucially, rational actor theory appears more useful for predicting the actions of certain groups of people. One group that has repeatedly been found to free ride more than others is people who have studied economics. In a study published in 1993, Robert Frank and colleagues from Cornell University, in Ithaca, New York State, tested this idea with a version of the prisoner’s dilemma game. Economics students “informed on” other players 60% of the time, while those studying other subjects did so 39% of the time. Men have previously been found to be more self-interested in such tests, and more men study economics than women. However, even after controlling for this sex difference, Frank found economics students were 17% more likely to take the selfish route when playing the prisoner’s dilemma.

In good news for educators everywhere, the team found that the longer students had been at university, the higher their rates of cooperation. In other words, higher education (or simply growing up) seemed to make people more likely to put their faith in human co-operation. The economists again proved to be the exception: for them, extra years of study did nothing to undermine their selfish rationality.

Frank’s group then went on to carry out surveys on whether students would return money they had found or report being undercharged, both at the start and end of their courses. Economics students were more likely to see themselves and others as more self-interested following their studies than a control group studying astronomy. This was especially true among those studying under a tutor who taught game theory and focused on notions of survival imperatives militating against co-operation.

Subsequent work has questioned these findings, suggesting that selfish people are just more likely to study economics, and that Frank’s surveys and games tell us little about real-world moral behaviour. It is true that what individuals do in the highly artificial situation of being presented with the prisoner’s dilemma doesn’t necessarily tell us how they will behave in more complex real-world situations.

In related work, Eric Schwitzgebel has shown that students and teachers of ethical philosophy don’t seem to behave more ethically when their behaviour is assessed using a range of real-world variables. Perhaps, says Schwitzgebel, we shouldn’t be surprised that economics students who have been taught about the prisoner’s dilemma act in line with what they’ve been taught when tested in a classroom. Again, this is a long way from showing any influence on real-world behaviour, some argue.

The lessons of what people do in tests and games are limited because of the additional complexities involved in real-world moral choices with real and important consequences. Yet I hesitate to dismiss the results of these experiments. We shouldn’t leap to conclusions based on the few simple experiments that have been done, but if we tell students that it makes sense to see the world through the eyes of the selfish rational actor, my suspicion is that they are more likely to do so.

Multiple factors influence our behaviour, of which formal education is just one. Economics and economic opinions are also prominent throughout the news media, for instance. But what the experiments above demonstrate, in one small way at least, is that what we are taught about human behaviour can alter it.

This is my column from BBC Future last week. You can see the original here. Thanks to Eric for some references and comments on this topic.