What gambling monkeys teach us about human rationality

We often make stupid choices when gambling, says Tom Stafford, but if you look at how monkeys act in the same situation, maybe there’s good reason.

When we gamble, something odd and seemingly irrational happens.

It’s called the ‘hot hand’ fallacy – a belief that your luck comes in streaks – and it can lose you a lot of money. Win on roulette and your chances of winning again aren’t any higher or lower – they stay exactly the same. But something in human psychology resists this fact, and people often place money on the premise that streaks of luck will continue – the so-called ‘hot hand’.

The opposite superstition is to bet that a streak has to end, in the false belief that independent events of chance must somehow even out. This is known as the gambler’s fallacy, and achieved notoriety at the Casino de Monte-Carlo on 18 August 1913. The ball fell on black 26 times in a row, and as the streak lengthened gamblers lost millions betting on red, believing that the chances changed with the length of the run of blacks.

Why do people act this way time and time again? We can discover intriguing insights, it seems, by recruiting monkeys and getting them to gamble too. If these animals make dumb choices like us, perhaps it could tell us more about ourselves.

First though, let’s look at what makes some games particularly likely to trigger these effects. Many outcomes in games involve an element of skill, so it makes reasonable sense to bet, for instance, that a top striker like Lionel Messi is more likely to score a goal than a low-scoring defender.

Yet plenty of games contain randomness. For truly random events like roulette or the lottery, there is no force which makes clumps more or less likely to continue. Consider coin tosses: if you have tossed 10 heads in a row, your chance of throwing another head is still 50:50 (although, of course, before you had thrown any, the overall odds of throwing 10 in a row were minuscule – about 1 in 1,000).
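The independence of each toss is easy to check by simulation. Here is a minimal sketch in Python (the sample size and the streak length examined are illustrative choices, not anything from the research discussed):

```python
import random

def longest_run(flips):
    """Length of the longest streak of identical outcomes."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

random.seed(1)
flips = [random.choice("HT") for _ in range(100_000)]

# Look only at outcomes that immediately follow a streak of three heads
after_streak = [flips[i] for i in range(3, len(flips))
                if flips[i - 3:i] == ["H", "H", "H"]]
p_heads = sum(f == "H" for f in after_streak) / len(after_streak)

print(round(p_heads, 2))   # stays close to 0.5: the streak changes nothing
print(longest_run(flips))  # yet long streaks still occur in genuinely random data
```

The two printed numbers make both fallacies vivid at once: conditioning on a hot streak doesn’t budge the odds, while impressive-looking streaks appear anyway, simply because randomness clumps.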

The hot hand and gambler’s fallacies both show that we tend to have an unreasonable faith in the non-randomness of the universe, as if we can’t quite believe that those coins (or roulette wheels, or playing cards) really are subject to the same chances on each flip, spin or deal.

It’s a result that sometimes makes us sneer at the irrationality of human psychology. But that conclusion may need revising.

Cross-species gambling

An experiment reported by Tommy Blanchard of the University of Rochester in New York State, and colleagues, shows that monkeys playing a gambling game are swayed by the same hot hand bias as humans. Their experiments involved three monkeys controlling a computer display with their eye-movements – indicating their choices by shifting their gaze left or right. In the experiment they were given two options, only one of which delivered a reward. When the correct option was random – the same 50:50 chance as a coin flip – the monkeys still had a tendency to select the previously winning option, as if luck should continue, clumping together in streaks.

The reason the result is so interesting is that monkeys aren’t taught probability theory at school. They never learn theories of randomness, or pick up complex ideas about chance events. The monkeys’ choices must be based on some more primitive instincts about how the world works – they can’t be displaying irrational beliefs about probability, because they cannot have false beliefs, in the way humans can, about how luck works. Yet they show the same bias.

What’s going on, the researchers argue, is that it’s usually beneficial to behave this way. In most of life, chains of success or failure are linked for good reason – some days your tennis serve really is on form, or everything goes wrong with your car at once because the mechanics of the parts are connected. In these cases, the events reflect an underlying reality, one you can take advantage of to predict what happens next. An example that works well for the monkeys is food. Finding high-value morsels like ripe fruit is a chance event, but also one where each instance isn’t independent. If you find one fruit on a tree, the chances are that you’ll find more.

The wider lesson for students of human nature is that we shouldn’t be quick to call behaviours irrational. Sure, belief in the hot hand might make you bet wrong on a series of coin flips, or worse, lose a pot of money. But it may be that, across evolutionary time, thinking that luck comes in clumps turned out to be useful more often than it was harmful.

This is my BBC Future article from last week. The original is here

Is public opinion rational?

There is no shortage of misconceptions. The British public believes that for every £100 spent on benefits, £24 is claimed fraudulently (the actual figure is £0.70). We think that 31% of the population are immigrants (actually it’s 13%). One recent headline summed it up: “British public wrong about nearly everything” – and I’d bet good money that it isn’t just the British who are this misinformed.

This looks like a problem for democracy, which supposes a rational and informed public opinion. But perhaps it isn’t, at least according to a body of political science research neatly summarised by Will Jennings in his chapter of a new book “Sex, lies & the ballot box: 50 things you need to know about British elections“. The book is a collection of accessible essays by British political scientists, and has a far wider scope than the book subtitle implies: there are important morals here for anyone interested in collective human behaviour, not just those interested in elections.

Will’s chapter discusses the “public opinion as thermostat” theory. This, briefly, is that the public can be misinformed about absolute statistics, but we can still change our strength of feeling in an appropriate way. So, for example, we may be misled about the absolute unemployment rate, but can still discern whether unemployment is getting better or worse. There’s evidence to support this view, and the chapter includes this striking graph (reproduced with permission), showing the percentage of people saying “unemployment” is the most important issue facing the country against the actual unemployment rate. As you can see, public opinion tracks reality with remarkable accuracy:

Unemployment rate (source: ONS) and share of voters rating unemployment as the most important issue facing the country (source: Ipsos MORI), from Will Jennings’ chapter in “Sex, lies & the ballot box” (p.35)

The topic of how a biased and misinformed public can make rational collective decisions is a fascinating one, which has received attention from disciplines ranging from psychology to political science. I’m looking forward to reading the rest of the book to get more evidence-based insights into how our psychological biases play out when decision making happens at the collective level of elections.

Full disclosure: Will is a friend of mine and sent me a free copy of the book.

Link: “Sex, lies & the ballot box” (Edited by Philip Cowley & Robert Ford).

Link: Guardian data blog Five things we can learn from Sex, Lies and the Ballot Box

Implicit racism in academia

Subtle racism is prevalent in US and UK universities, according to a new paper commissioned by the Leadership Foundation for Higher Education and released last week, reports The Times Higher Education.

Black professors surveyed for the paper said they were treated differently from white colleagues: they received less eye contact and fewer requests for their opinion, felt excluded in meetings, and experienced undermining of their work. “I have to downplay my achievements sometimes to be accepted,” said one academic, explaining that colleagues didn’t expect a black woman to be clever and articulate. Senior managers, the paper found, often dismiss racist incidents as personality clashes or believe them to be exaggerated.

And all this in institutions where almost all staff would say they are not just “not racist” but actively committed to fighting prejudice.

This seems like a clear case of the operation of implicit biases – where there is a contradiction between people’s egalitarian beliefs and their racist actions. Implicit biases are an industry in psychology, where tools such as the implicit association test (IAT) are used to measure them. The IAT is a fairly typical cognitive psychology-type study: individuals sit in front of a computer and the speed of their reactions to stimuli is measured (the stimuli are things like faces of people of different ethnicities, which is how a measure of implicit prejudice is derived).
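To make the logic of the measure concrete, here is a toy sketch of how a reaction-time difference becomes a bias score. The numbers are invented, and this deliberately simplifies the published IAT scoring procedure down to its core idea: the latency difference between ‘congruent’ and ‘incongruent’ category pairings, scaled by the variability of responding.

```python
from statistics import mean, stdev

def iat_effect(congruent_rts, incongruent_rts):
    """Toy IAT-style score: how much slower responses are, in pooled
    standard-deviation units, when the category pairings run against
    the association being measured. Positive = slower on incongruent."""
    pooled_sd = stdev(congruent_rts + incongruent_rts)
    return (mean(incongruent_rts) - mean(congruent_rts)) / pooled_sd

# Hypothetical reaction times in milliseconds
congruent = [620, 580, 650, 600, 610]
incongruent = [720, 690, 760, 700, 710]

print(round(iat_effect(congruent, incongruent), 2))  # positive score → implicit association
```

The point of dividing by the pooled standard deviation is that a 100ms slowdown means something different for a fast, consistent responder than for a slow, variable one – the score expresses the bias relative to each person’s own variability.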

The LFHE paper is a nice opportunity to connect this lab measure with the reality of implicit bias ‘in the wild’. In particular, along with some colleagues, I have been interested in exactly what an implicit bias is, psychologically.

Commonly, implicit biases are described as if they are unconscious, or somehow outside the awareness of those holding them. Unfortunately, this hasn’t been shown to be the case (in fact the opposite may be true – there’s some evidence that people can predict their IAT scores fairly accurately). Worse, the very idea of being unaware of a bias is badly specified. Does ‘unaware’ mean you aren’t aware of your racist feelings? Of your racist behaviour? That the feelings, in this case, have produced the behaviour?

The racist behaviours reported in the paper – avoiding eye-contact, assuming that discrimination is due to personalities and not race, etc – could all work at any or all of these levels of awareness. Although the behaviours are subtle, and contradict people’s expressed, anti-racist, opinions, the white academics could still be completely aware. They could know that black academics make them feel awkward or argumentative, and know that this is due to their race. Or they could be completely unaware. They could know that they don’t trust the opinions of certain academics, for example, but not realise that race is a factor in why they feel this way.

Just because the behaviour is subtle, or the psychological phenomenon is called ‘implicit’, doesn’t mean we can be certain about what people really know about it. The real value in the notion of implicit bias is that it reminds us that prejudice can exist in how we behave, not just in what we say and believe.

Full disclosure: I am funded by the Leverhulme Trust to work on a project looking at the philosophy and psychology of implicit bias. This post is cross-posted on the project blog. Run your own IAT with our open-source code: Open-IAT!

Why bad news dominates the headlines

Why are newspapers and TV broadcasts filled with disaster, corruption and incompetence? It may be because we’re drawn to depressing stories without realising, says psychologist Tom Stafford.

When you read the news, sometimes it can feel like the only things reported are terrible, depressing events. Why does the media concentrate on the bad things in life, rather than the good? And what might this depressing slant say about us, the audience?

It isn’t that these are the only things that happen. Perhaps journalists are drawn to reporting bad news because sudden disaster is more compelling than slow improvement. Or it could be that newsgatherers believe cynical reports of corrupt politicians or unfortunate events make for simpler stories. But another strong possibility is that we, the readers or viewers, have trained journalists to focus on these things. People often say that they would prefer good news: but is that actually true?

To explore this possibility, researchers Marc Trussler and Stuart Soroka set up an experiment, run at McGill University in Canada. They were dissatisfied with previous research on how people relate to the news – either the studies were uncontrolled (letting people browse news at home, for example, where you can’t even tell who is using the computer), or they were unrealistic (inviting them to select stories in the lab, where every participant knew their choices would be closely watched by the experimenter). So, the team decided to try a new strategy: deception.

 

Trick question

Trussler and Soroka invited participants from their university to come to the lab for “a study of eye tracking”. The volunteers were first asked to select some stories about politics to read from a news website so that a camera could make some baseline eye-tracking measures. It was important, they were told, that they actually read the articles, so the right measurements could be prepared, but it didn’t matter what they read.

After this ‘preparation’ phase, they watched a short video (the main purpose of the experiment as far as the subjects were concerned, but it was in fact just a filler task), and then they answered questions on the kind of political news they would like to read.

The results of the experiment – that is, the stories participants actually chose to read – were somewhat depressing. Participants often chose stories with a negative tone – corruption, set-backs, hypocrisy and so on – rather than neutral or positive stories. People who were more interested in current affairs and politics were particularly likely to choose the bad news.

And yet when asked, these people said they preferred good news. On average, they said that the media was too focussed on negative stories.

 

Danger reaction

The researchers present their experiment as solid evidence of a so-called “negativity bias”, psychologists’ term for our collective hunger to hear, and remember, bad news.

It isn’t just schadenfreude, the theory goes, but that we’ve evolved to react quickly to potential threats. Bad news could be a signal that we need to change what we’re doing to avoid danger.

As you’d expect from this theory, there’s some evidence that people respond quicker to negative words. In lab experiments, flash the word “cancer”, “bomb” or “war” up at someone and they can hit a button in response quicker than if that word is “baby”, “smile” or “fun” (despite these pleasant words being slightly more common). We are also able to recognise negative words faster than positive words, and even tell that a word is going to be unpleasant before we can tell exactly what the word is going to be.

So is our vigilance for threats the only way to explain our predilection for bad news? Perhaps not.

There’s another interpretation that Trussler and Soroka put on their evidence: we pay attention to bad news because, on the whole, we think the world is rosier than it actually is. When it comes to our own lives, most of us believe we’re better than average, and, like the cliché says, we expect things to turn out all right in the end. This pleasant view of the world makes bad news all the more surprising and salient. It is only against a light background that the dark spots are highlighted.

So our attraction to bad news may be more complex than just journalistic cynicism or a hunger springing from the darkness within.

And that, on another bad news day, gives me a little bit of hope for humanity.

Are women and men forever destined to think differently?

By Tom Stafford, University of Sheffield

The headlines

The Australian: Male and female brains still unequal

The International Institute for Applied Systems Analysis: Gender disparities in cognition will not diminish

The Economist: A variation in the cognitive abilities of the two sexes may be more about social development than gender stereotypes

The story

Everybody has an opinion on men, women and the difference (or not) between them. Now a new study has used a massive and long-running European survey to investigate how differences in cognitive ability are changing. This is super smart, because it offers us an escape from arguing about whether men and women are different in how they think, allowing us some insight into how any such differences might develop.

What they actually did

Researchers led by Daniela Weber at Austria’s International Institute for Applied Systems Analysis analysed data collected as part of the European Survey of Health, Ageing and Retirement. The study drew on data from approximately 31,000 adults, men and women, all older than 50. As well as answering demographic questions, the survey participants took short quizzes testing their memory, numeracy and verbal fluency (this last used a classic test which asks people to name as many animals as they can in 60 seconds). Alongside each test score, we have the year the participant was born, as well as measures of gender equality and economic development for the country where they grew up.

What they found

The results show that as a country develops economically, the differences in cognitive ability between men and women change. But the pattern isn’t straightforward. Differences in verbal fluency disappear (so that an advantage on this test for men born in the 1920s over women is not found for those born in the 1950s). Differences in numeracy diminish (so the male advantage is less) and differences in memory actually increase (so that a female advantage is accentuated).

Further analysis looked at how these differences in cognitive performance related to the amount of education men and women received. In all regions women tended to have fewer years of education, on average, than men. But, importantly, the size of this difference varied. This allowed the researchers to gauge how differences in education affected cognitive performance.

For all three abilities tested, there was a relationship between the size of the differences in the amount of education and the size of the difference in cognitive performance: fewer years of education for women was associated with worse scores for women, as you’d expect.

What varied across the three abilities was the researchers’ prediction for the scenario where men and women spent an equal amount of time in education: for memory, this scenario was associated with a distinct female advantage; for numeracy, a male advantage; and for verbal fluency, no difference.
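The ‘equal education’ scenario is, in effect, an extrapolation: fit the relationship between the education gap and the cognitive gap across cohorts, then read off the predicted cognitive gap when the education difference is zero. A sketch with invented numbers (these are not the study’s data, and the study used more elaborate regression models than this simple line fit):

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical cohorts: education gap in years (men minus women)
# against standardised memory-score gap (women minus men)
edu_gap = [3.0, 2.0, 1.0, 0.5]
memory_gap = [-0.2, 0.1, 0.4, 0.55]

slope, intercept = fit_line(edu_gap, memory_gap)
print(round(intercept, 2))  # predicted female memory advantage at equal education
```

The intercept is the quantity of interest: it estimates the gap you would see if the education difference were removed entirely, which is how a study can talk about gender differences under equality without ever observing a perfectly equal cohort.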

What this means

The thing that dogs studies of gender differences in cognition is the question of why these differences exist. People have such strong expectations that they often leap to the assumption that any observed difference must reflect something fundamental about men versus women. Consider the example of the Australian newspaper, whose headline framed this story as telling us something about “male and female brains”, the implication being that the inequality was a fundamental, biological difference. In fact, research often shows that gender differences in cognitive performance are small, and even then we don’t know why those differences exist.

The great thing about this study is that by looking at how gender differences evolve over time it promises insight into what drives those differences in the first place. The fact that the female memory advantage increases as women gain more access to education is, on the face of it, suggestive evidence that at least one cognitive difference between men and women may be unleashed by more equal societies, rather than removed by them.

Tom’s take

The most important thing to take from this research is – as the authors report – increasing gender equality disproportionately benefits women. This is because – no surprise! – gender inequality disproportionately disadvantages women. Even in the area of cognitive performance, this historical denial of opportunities, health and education to women means, at a population level, they have more potential to increase their scores on these tests.

Along with other research on things like IQ, this study found systematic improvements in cognitive performance across time for both men and women – as everyone’s opportunities and health increase, so does their cognitive function.

But the provocative suggestion of this study is that as societies develop we won’t necessarily see all gender differences go away. Some cognitive differences may actually increase when women are at less of a disadvantage.

You don’t leap to conclusions based on one study, but this is a neat contribution. One caveat is that even though indices such as “years in education” show diminished gender inequality in Europe, you’d be a fool to think that societies which educated men and women for an equal number of years treated them both equally and put equal expectations on them.

Even if you thought this was true for 2014, you wouldn’t think this was true for European societies of the 1950s (when the youngest of these study participants were growing up). There could be very strong societal influences on cognitive ability – such as expecting women to be good with words and bad with numbers – that simply aren’t captured by the data analysed here.

Personally, I find it interesting to observe how keen people are to seize on such evidence that “essential” gender differences definitely do exist (despite the known confounds of living in a sexist society). My preferred strategy would be to reserve judgement and focus on remaking the definitely sexist society. For certain, we’ll only get the truth when we have an account of how cognitive abilities develop within both biological and social contexts. Studies like this point the way, and suggest that whatever the truth is, it should hold some surprises for everyone.

Read more

The original research: The changing face of cognitive gender differences in Europe

My previous column on gender differences: Are men better wired to read maps or is it a tired cliché?

Cordelia Fine’s book, Delusions of gender: how our minds, society, and neuro-sexism create difference

The Conversation

This article was originally published on The Conversation.
Read the original article.

Do we really hate thinking so much we’d electrocute ourselves rather than do it?

By Tom Stafford, University of Sheffield

The headlines

The Guardian: Shocking but true: students prefer jolt of pain than being made to sit and think

Nature: We dislike being alone with our thoughts

Washington Post: Most men would rather shock themselves than be alone with their thoughts

 

The story

Quiet contemplation is so awful that when deprived of the distractions of noise, crowds or smart phones, a bunch of students would rather give themselves electric shocks than sit and think.

 

What they actually did

Psychologists from the universities of Virginia and Harvard in the US carried out a series of 11 studies in which participants – including students and non-students – were left in an unadorned room for six to 15 minutes and asked to “spend time entertaining themselves with their thoughts.” Both groups, and men and women equally, were unable to enjoy this task. Most said they found it difficult to concentrate and that their minds wandered.

In one of the studies, participants were given the option to give themselves an electric shock, for no given reason or reward. Many did, including the majority of male participants, despite the fact that the vast majority of participants had previously rated the shocks as unpleasant and said they would pay to avoid them.

 

How plausible is this?

This is a clever, provocative piece of research. The results are almost certainly reliable; the authors, some of whom are extremely distinguished, discovered in the 11 studies the same basic effect – namely, that being asked to sit and think wasn’t enjoyable. The data from the studies is also freely available, so there’s no chance of statistical jiggery-pokery. This is a real effect. The questions, then, are over what exactly the finding means.

 

Tom’s take

Contrary to what some reporters have implied, this result isn’t just about students – non-students also found being made to sit and think aversive, and there were no differences with age. And it isn’t just about men – women generally found the experience just as unpleasant. The key result is that being made to sit and think is unpleasant, so let’s look at this first before thinking about the shocks.

The results fit with research on sensory deprivation from 50 years ago. Paradoxically, when there are no distractions people find it hard to concentrate. It seems that for most of us, most of the time, our minds need to receive stimulus, interact with the environment, or at least have a task to function enjoyably. Thinking is an active process which involves the world – a far cry from some ideals of “pure thought”.

What the result certainly doesn’t mean, despite the interpretation given by some people – including one author of the study – is that people don’t like thinking. Rather, it’s fair to say that people don’t like being forced to do nothing but think.

It’s possible that there is a White Bear Effect here – also known as the ironic process theory. Famously, if you’re told to think of anything except a white bear, you can’t help but think about a white bear. If you imagine the circumstances of these studies, participants were told they had to sit in their chairs and just think. No singing, no exploring, no exercises. Wouldn’t that make you spend your time (unpleasantly) ruminating on what you couldn’t do?

In this context, are the shocks really so surprising? The shocks were very mild. The participants rated them as unpleasant when they were instructed to shock themselves, but we all know that there’s a big difference between having something done to you (or being told to do something) and choosing to do it yourself.

Although many participants chose to shock themselves, I wouldn’t say they were avoiding thinking – rather, they were thinking about what it would be like to get another shock. One participant shocked himself 190 times. Perhaps he was exploring how he could learn to cope with the discomfort. Curiosity and exploration are hallmarks of thinking. It is only to the very limited, internally directed, stimulus-free kind of thinking that we can apply the conclusion that it isn’t particularly enjoyable.

 

Read more

The original paper: Just think: The challenges of the disengaged mind.

You can see the data over at the Open Science Framework.

Daniel Wegner’s brilliant book on the White Bear problem.

The Conversation

This article was originally published on The Conversation.
Read the original article.

Brains in their feat

Footballers’ skills seem light years from our own. But, Tom Stafford argues, the jaw-dropping talents on the World Cup pitch have more in common with everyday life than you might think.

The first week of the 2014 World Cup has already given us a clutch of classic moments: Robin Van Persie’s perfect header to open the Dutch onslaught against the Spanish; Australian Tim Cahill’s breathtaking volley to equalise against Holland; and Mexican keeper Guillermo Ochoa defying an increasingly desperate Brazilian attack.

We can’t help but be dazzled by the skills on display. Whether it is a header lobbed over an open-mouthed goalie, or a keeper’s last-second leap to save the goal, it can seem as if the footballers have access to talents that are not just beyond description, but beyond conscious comprehension. But the players sprinting, diving and straining on Brazil’s football pitches have a lot more in common with everyday intelligence than you might think.

We often talk about astonishing athletic feats as if they are something completely different from everyday thought. When we say a footballer acts on instinct, out of habit or due to his training, we distance what they do from what we hear echoing within our own heads.

The idea of “muscle memory” encourages this – allowing us to cordon off feats of motor skill as a special kind of psychological phenomenon, something stored, like magic potion, in our muscles. But the truth, of course, is that so called muscle memories are stored in our brains, just like every other kind of memory. What is more, these examples of great skill are not so different from ordinary thought.

If you speak to world-class athletes, such as World Cup footballers, about what they do, they reveal that a lot of conscious reasoning goes into those moments of sublime skill. Here’s England’s Wayne Rooney, in 2012, describing what it feels like as a cross comes into the penalty box: “You’re asking yourself six questions in a split second. Maybe you’ve got time to bring it down on the chest and shoot, or you have to head it first-time. If the defender is there, you’ve obviously got to try and hit it first-time. If he’s farther back, you’ve got space to take a touch. You get the decision made. Then it’s obviously about the execution.”

All this in half a second! Rooney is obviously thinking more, not less, during these most crucial moments.

This is not an isolated example. Dennis Bergkamp delighted Dutch fans by scoring a beautiful winning goal from a long pass in the 1998 World Cup quarter final against Argentina (and if you watch a clip on YouTube, make sure it’s the one with the ecstatic commentary by Jack van Gelder). In a subsequent interview Bergkamp describes in minute detail all the factors leading up to the goal, from the moment he made eye contact with the defender who was about to pass the ball, to his calculations about how to control it. He even lets slip that part of his brain is keeping track of the wind conditions. Just as with Rooney, this isn’t just a moment of unconscious instinct, but of instinct combined with a whirlwind of conscious reasoning. And it all comes together.

Studies of the way the brain embeds new skills, until the movements become automatic, may help make sense of this picture. We know that athletes like those performing in the World Cup train with many years of deliberate, mindful practice. As they go through their drills, dedicated brain networks develop, allowing the movements to be deployed with less effort and more control. As well as the brain networks involved becoming more refined, the areas of the brain most active in controlling a movement change with increased skill – as we practice, areas deeper within the brain reorganise to take on more of the work, leaving the cortex, including areas associated with planning and reasoning, free to take on new tasks.

But this doesn’t mean we think less when we’re highly skilled. On the contrary, this process, called automatisation, means that we think differently. Bergkamp doesn’t have to think about his foot when he wants to control a ball, so he’s free to think about the wind, or the defender, or when exactly he wants to control the ball. For highly practiced movements we have to think less about controlling every action, but what we do is still ultimately in the service of our overall goals (like scoring, in the case of football). In line with this, and contrary to the idea of skills as robotic reflexes, experiments show that more flexibility develops alongside increased automaticity.

Maybe we like to think footballers are stupid because we want to feel good about ourselves, and many footballers aren’t as articulate as some of the eggheads we traditionally associate with intelligence (and aren’t trained in being articulate), but all the evidence suggests that the feats we see in the World Cup take an immense amount of thought.

Intelligence involves using conscious deliberation at the right level to optimally control your actions. Driving a car is easier because you don’t have to think about the physics of the combustion engine, and it’s also easier because you no longer have to think about the movements required to change gear or turn on the indicators. But just because driving a car relies on automatic skills like these, doesn’t mean that you’re mindless when driving a car. The better drivers, just like the better footballers, are making more choices each time they show off their talents, not fewer.

So footballers’ immense skills aren’t that different from many everyday things we do, like walking, talking or driving a car. We’ve practiced these things so much we don’t have to think about how we’re doing them. We may not even pay much attention to what we’re doing, or have much of a memory of it afterwards (ever reached the end of a journey and realised you don’t recall a single thing about the trip?), but that doesn’t mean that we aren’t thinking, or couldn’t. In fact, because we have practiced these skills we can deploy them at the same time as others (walking and chewing gum, talking while tying our shoelaces, etc). This doesn’t diminish their mystery, but it does align it with the central mystery of psychology – how we learn to do anything.

So while you may be unlikely to find yourself in the boots of Bergkamp and Rooney, preparing to drill one past a sprawling keeper, you can at least console yourself with the thought that you’re showing the skills of a World Cup legend every time you get behind the wheel of your car.

A bonus BBC Future column from last week. Here’s the original.