I am giving a talk on 28th October at Off the Shelf, Sheffield’s festival of words. Here is the blurb:
Is it true that “you can’t tell anybody anything”? From pub arguments to ideology-driven party political disputes, it can sometimes seem that people have their minds all made up, and that there’s no point trying to persuade anybody of anything. Popular psychology books reinforce the idea that we’re emotional, irrational creatures, but Tom Stafford argues against this bleak portrait of human irrationality. He has investigated the psychological science of persuasion by rational argument, interpreting old studies and reporting some new ones which should give hope to those with a faith in reason. Tom tells you how to most effectively change someone’s mind, when people are persuaded by evidence (and when they aren’t), and why evolution might have designed our thinking to work best in groups rather than on our own.
Mostly I’ll be picking up on ideas I outlined in my Contributoria piece: What’s the evidence on using rational argument to change people’s minds? Tickets are £7.50/£6 (concessions), the venue is the Showroom Cinema, Paternoster Row, S1, and we start at 7pm (I talk for 45 minutes, then there is time for questions). Book tickets by calling the Showroom on 0114 275 7727 or go to showroomworkstation.org.uk. The full festival programme is available as a PDF.
Subtle racism is prevalent in US and UK universities, according to a new paper commissioned by the Leadership Foundation for Higher Education and released last week, reports The Times Higher Education.
Black professors surveyed for the paper said they were treated differently from white colleagues: they received less eye contact and fewer requests for their opinion, felt excluded in meetings, and experienced undermining of their work. “I have to downplay my achievements sometimes to be accepted,” said one academic, explaining that colleagues didn’t expect a black woman to be clever and articulate. The paper also found that senior managers often dismiss racist incidents as conflicts of personalities, or believe them to be exaggerated.
And all this in institutions where almost all staff would say they are not racist, and where many would say they are actively committed to fighting prejudice.
This seems like a clear case of implicit biases in operation – where there is a contradiction between people’s egalitarian beliefs and their racist actions. Implicit biases are an industry in psychology, where tools such as the implicit association test (IAT) are used to measure them. The IAT is a fairly typical cognitive psychology-type study: individuals sit in front of a computer and the speed of their reactions to stimuli is measured (the stimuli are things like faces of people of different ethnicities, which is how we derive a measure of implicit prejudice).
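To make the logic of the measure concrete, here is a minimal sketch of how an IAT-style bias score could be computed from reaction times. The numbers and the simple scoring rule are illustrative assumptions of mine; the published scoring algorithm has more steps (error penalties, trial exclusions and so on).

```python
import statistics

def iat_d_score(congruent_rts, incongruent_rts):
    """Simplified IAT-style D score.

    congruent_rts:   reaction times (ms) when the pairing matches the
                     hypothesised association (e.g. white faces + 'good')
    incongruent_rts: reaction times (ms) for the reversed pairing

    A positive score means slower responses in the incongruent block,
    read as an implicit preference for the 'congruent' pairing.
    """
    pooled_sd = statistics.stdev(congruent_rts + incongruent_rts)
    return (statistics.mean(incongruent_rts) - statistics.mean(congruent_rts)) / pooled_sd

# Hypothetical reaction times, in milliseconds
congruent = [612, 580, 645, 598, 630]
incongruent = [702, 688, 731, 690, 710]
print(round(iat_d_score(congruent, incongruent), 2))
```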
The LFHE paper is a nice opportunity to connect this lab measure with the reality of implicit bias ‘in the wild’. In particular, along with some colleagues, I have been interested in exactly what an implicit bias is, psychologically.
Commonly, implicit biases are described as if they are unconscious, or somehow outside the awareness of those holding them. Unfortunately, this hasn’t been shown to be the case (in fact the opposite may be true – there’s some evidence that people can predict their IAT scores fairly accurately). Worse, the very idea of being unaware of a bias is badly specified. Does ‘unaware’ mean you aren’t aware of your racist feelings? Of your racist behaviour? Or that the feelings, in this case, have produced the behaviour?
The racist behaviours reported in the paper – avoiding eye-contact, assuming that discrimination is due to personalities and not race, etc – could all work at any or all of these levels of awareness. Although the behaviours are subtle, and contradict people’s expressed, anti-racist, opinions, the white academics could still be completely aware. They could know that black academics make them feel awkward or argumentative, and know that this is due to their race. Or they could be completely unaware. They could know that they don’t trust the opinions of certain academics, for example, but not realise that race is a factor in why they feel this way.
Just because the behaviour is subtle, or the psychological phenomenon is called ‘implicit’, doesn’t mean we can be certain about what people really know about it. The real value in the notion of implicit bias is that it reminds us that prejudice can exist in how we behave, not just in what we say and believe.
We are now beginning to crack the brain’s code, which allows us to answer such bizarre questions as “what is the speed of thought?”
When he was asked, as a joke, to explain how the mind works in five words, cognitive scientist Steven Pinker didn’t hesitate. “Brain cells fire in patterns”, he replied. It’s a good effort, but all it really does is replace one enigma with another mystery.
It’s long been known that brain cells communicate by firing electrical signals to each other, and we now have myriad technologies for recording their patterns of activity – from electrodes in the brain or on the scalp, to functional magnetic resonance scanners that can detect changes in blood oxygenation. But even once we have gathered these data, the meaning of the patterns remains an enduring mystery. They seem to dance to a tune we can’t hear, led by rules we don’t know.
Neuroscientists speak of the neural code, and have made some progress in cracking that code. They are figuring out some basic rules, such as when cells in specific parts of the brain are likely to light up depending on the task at hand. Progress has been slow, but in the last decade various research teams around the world have been pursuing a far more ambitious project. We may never be able to see the complete code book, they realised, but by trying to write our own entries, we can begin to pick apart the ways that different patterns correspond to different actions.
Albert Lee and Matthew Wilson, at the Massachusetts Institute of Technology (MIT), first helped to set out the principles in 2002. The approach goes like this. First, we record from the brain of a rat – one of our closer relatives in the grand tree of life – as it runs a maze. Studying the whole brain would be too ambitious, so we focus our recording on an area known as the hippocampus, known to be important for navigation and memory. If you’ve heard of this area before, it is probably because of a famous result which showed that London taxi drivers developed larger hippocampi the longer they had spent navigating the streets of England’s sprawling capital.
While the rat runs the maze we record where it is, and simultaneously how the cells in the hippocampus are firing. The cell firing patterns are thrown into a mathematical algorithm which finds the pattern that best matches each bit of the maze. The language of the cells is no less complex, but now we have a Rosetta Stone against which we can decode it. We then test the algorithm by feeding it freshly recorded patterns, to see if it correctly predicts where the rat was at the point that pattern was recorded.
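Here is a minimal sketch of that train-then-test decoding logic, using made-up data and a simple off-the-shelf classifier standing in for “the mathematical algorithm” (real studies typically fit Bayesian decoders to place-cell tuning curves):

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_cells, n_segments = 600, 40, 10

# Hypothetical recording: where the rat was at each time step...
segments = rng.integers(0, n_segments, size=n_samples)
# ...and a made-up firing profile for each segment (each bit of maze
# drives a different mix of hippocampal cells)
tuning = rng.random((n_segments, n_cells))
spikes = rng.poisson(5 * tuning[segments])

# "Rosetta Stone" step: learn which firing pattern goes with which bit of maze
train_X, test_X, train_y, test_y = train_test_split(
    spikes, segments, random_state=0)
decoder = GaussianNB().fit(train_X, train_y)

# Test step: feed freshly recorded patterns and ask where the rat was
print("decoding accuracy:", decoder.score(test_X, test_y))
```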
The technique doesn’t allow us to completely crack the code, because we still don’t know all the rules, and it can’t help us read patterns which come from other parts of the brain or which aren’t about maze running, but it is still a powerful tool. For instance, using this technique, the team was able to show that the specific sequence of cell firing was repeated in the brain of the rat when it slept after running the maze (and, as a crucial comparison, not in the sleep it had enjoyed before it had run the maze).
Fascinatingly, the sequence repeated faster during sleep – around 20 times faster. This meant that the rats could run the maze in their sleeping minds in a fraction of the time it took them in real life. This could be related to the mnemonic function of sleep; by replaying the memory, it might have helped the rat to consolidate its learning. And the fact that the replay was accelerated might give us a glimpse of the activity that lies behind sudden insights, or experiences where our life “flashes before our eyes”; when unconstrained, our thoughts really can retrace familiar paths in “fast forward”. Subsequent work has shown that these maze patterns can run backwards as well as forwards – suggesting that the rats can imagine a goal, like the end of the maze, and work their way back from it to the point where they are.
One application of techniques like these – which are equal parts highly specialised measurement systems and fiercely complicated algorithms – has been to decode the brain activity of patients who are locked in or in a vegetative state. These patients can’t move any of their muscles, yet they may still be mentally aware and able to hear people talking to them in the same room. First, the doctors ask the patients to imagine activities which are known to activate specific brain regions – such as the hippocampus. The data are then decoded, so that we know which pattern of brain activity corresponds to which imagined activity. During future brain scans, the patients can then re-imagine the same activities to answer basic questions. For instance, they might be told to imagine playing tennis to answer yes and walking around their house to answer no – the first form of communication since their injury.
There are other applications too, both in theoretical science, where these techniques let us probe the inner workings of our minds, and in practical domains such as brain-computer interfaces. If, in the future, a paraplegic person wants to control a robot arm – or even another person – via a brain interface, the system will rely on the same techniques to decode information and translate it into action. Now that the principles have been shown to work, the potential is staggering.
If you have an everyday psychological phenomenon you’d like to see written about in these columns please get in touch @tomstafford or ideas@idiolect.org.uk
This is my BBC Future column from Monday. The original is here
Why are newspapers and TV broadcasts filled with disaster, corruption and incompetence? It may be because we’re drawn to depressing stories without realising, says psychologist Tom Stafford.
When you read the news, sometimes it can feel like the only things reported are terrible, depressing events. Why does the media concentrate on the bad things in life, rather than the good? And what might this depressing slant say about us, the audience?
It isn’t that these are the only things that happen. Perhaps journalists are drawn to reporting bad news because sudden disaster is more compelling than slow improvements. Or it could be that newsgatherers believe that cynical reports of corrupt politicians or unfortunate events make for simpler stories. But another strong possibility is that we, the readers or viewers, have trained journalists to focus on these things. Many people often say that they would prefer good news: but is that actually true?
To explore this possibility, researchers Marc Trussler and Stuart Soroka set up an experiment, run at McGill University in Canada. They were dissatisfied with previous research on how people relate to the news – either the studies were uncontrolled (letting people browse news at home, for example, where you can’t even tell who is using the computer), or they were unrealistic (inviting them to select stories in the lab, where every participant knew their choices would be closely watched by the experimenter). So the team decided to try a new strategy: deception.
Trick question
Trussler and Soroka invited participants from their university to come to the lab for “a study of eye tracking”. The volunteers were first asked to select some stories about politics to read from a news website so that a camera could make some baseline eye-tracking measures. It was important, they were told, that they actually read the articles, so the right measurements could be prepared, but it didn’t matter what they read.
After this ‘preparation’ phase, they watched a short video (the main purpose of the experiment as far as the subjects were concerned, but it was in fact just a filler task), and then they answered questions on the kind of political news they would like to read.
The results of the experiment, as well as the stories that were read most, were somewhat depressing. Participants often chose stories with a negative tone – corruption, set-backs, hypocrisy and so on – rather than neutral or positive stories. People who were more interested in current affairs and politics were particularly likely to choose the bad news.
And yet when asked, these people said they preferred good news. On average, they said that the media was too focussed on negative stories.
Danger reaction
The researchers present their experiment as solid evidence of a so-called “negativity bias”, psychologists’ term for our collective hunger to hear, and remember, bad news.
It isn’t just schadenfreude, the theory goes, but that we’ve evolved to react quickly to potential threats. Bad news could be a signal that we need to change what we’re doing to avoid danger.
As you’d expect from this theory, there’s some evidence that people respond quicker to negative words. In lab experiments, flash the word “cancer”, “bomb” or “war” up at someone and they can hit a button in response quicker than if that word is “baby”, “smile” or “fun” (despite these pleasant words being slightly more common). We are also able to recognise negative words faster than positive words, and even tell that a word is going to be unpleasant before we can tell exactly what the word is going to be.
So is our vigilance for threats the only way to explain our predilection for bad news? Perhaps not.
There’s another interpretation that Trussler and Soroka put on their evidence: we pay attention to bad news because, on the whole, we think the world is rosier than it actually is. When it comes to our own lives, most of us believe we’re better than average, and, as the cliché has it, we expect things to turn out all right in the end. This pleasant view of the world makes bad news all the more surprising and salient. It is only against a light background that the dark spots are highlighted.
So our attraction to bad news may be more complex than just journalistic cynicism or a hunger springing from the darkness within.
And that, on another bad news day, gives me a little bit of hope for humanity.
Everybody has an opinion on men, women and the difference (or not) between them. Now a new study has used a massive and long-running European survey to investigate how differences in cognitive ability are changing. This is super smart, because it offers us an escape from arguing about whether men and women are different in how they think, allowing us some insight into how any such differences might develop.
What they actually did
Researchers led by Daniela Weber at Austria’s International Institute for Applied Systems Analysis analysed data collected as part of the European Survey of Health, Ageing and Retirement. The data analysed in this study come from approximately 31,000 adults – men and women, all aged over 50. As well as answering demographic questions, the survey participants took short quizzes which tested their memory, numeracy and verbal fluency (this last involved a classic test which asks people to name as many animals as they can in 60 seconds). Alongside each test score, we have the year the participant was born, as well as measures of gender equality and economic development for the country where they grew up.
What they found
The results show that as a country develops economically, the differences in cognitive ability between men and women change. But the pattern isn’t straightforward. Differences in verbal fluency disappear (the advantage on this test that men born in the 1920s had over women is not found for those born in the 1950s). Differences in numeracy diminish (the male advantage shrinks), and differences in memory actually increase (the female advantage is accentuated).
Further analysis looked at how these differences in cognitive performance related to the amount of education men and women got. In all regions women tended to have fewer years of education, on average, than men. But, importantly, the size of this difference varied. This allowed the researchers to gauge how differences in education affected cognitive performance.
For all three abilities tested, there was a relationship between the size of the differences in the amount of education and the size of the difference in cognitive performance: fewer years of education for women was associated with worse scores for women, as you’d expect.
What varied across the three abilities was the researchers’ prediction for the situation where men and women spent an equal amount of time in education: for memory this scenario was associated with a distinct female advantage, for numeracy a male advantage, and for verbal fluency no difference.
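A rough sketch of the kind of extrapolation this implies, with made-up numbers (the paper’s actual analysis is more elaborate): regress the male-female gap in test scores on the male-female gap in years of education across cohorts or regions, then read off the predicted score gap when the education gap is zero.

```python
import numpy as np

# Hypothetical rows, one per region/birth-cohort:
# the male-minus-female gap in years of education, and the
# corresponding male-minus-female gap in a memory score.
education_gap = np.array([2.0, 1.5, 1.0, 0.5, 0.2])
memory_gap = np.array([0.10, 0.02, -0.05, -0.12, -0.15])

# Fit a straight line: how does the cognitive gap change as the
# education gap closes?
slope, intercept = np.polyfit(education_gap, memory_gap, 1)

# The intercept is the predicted gap when education is equal (gap = 0);
# a negative value means a female advantage, as the study reports for memory.
print(round(intercept, 2))
```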
What this means
The thing that dogs studies on gender differences in cognition is the question of why these differences exist. People have such strong expectations that they often leap to the assumption that any observed difference must reflect something fundamental about men versus women. Consider, for example, the Australian newspaper which headlined its take on this story as telling us something about “male and female brains”, the implication being that the inequality was a fundamental, biological difference. In fact, research often shows that gender differences in cognitive performance are small, and even then we don’t know why these differences exist.
The great thing about this study is that by looking at how gender differences evolve over time it promises insight into what drives those differences in the first place. The fact that the female memory advantage increases as women are allowed more access to education is, on the face of it, suggestive evidence that at least one cognitive difference between men and women may be unleashed by more equal societies, rather than removed by them.
Tom’s take
The most important thing to take from this research is – as the authors report – increasing gender equality disproportionately benefits women. This is because – no surprise! – gender inequality disproportionately disadvantages women. Even in the area of cognitive performance, this historical denial of opportunities, health and education to women means, at a population level, they have more potential to increase their scores on these tests.
Along with other research on things like IQ, this study found systematic improvements in cognitive performance across time for both men and women – as everyone’s opportunities and health increase, so does their cognitive function.
But the provocative suggestion of this study is that as societies develop we won’t necessarily see all gender differences go away. Some cognitive differences may actually increase when women are at less of a disadvantage.
You don’t leap to conclusions based on one study, but this is a neat contribution. One caveat is that even though indices such as “years in education” show diminished gender inequality in Europe, you’d be a fool to think that societies which educated men and women for an equal number of years treated them both equally and put equal expectations on them.
Even if you thought this was true for 2014, you wouldn’t think this was true for European societies of the 1950s (when the youngest of these study participants were growing up). There could be very strong societal influences on cognitive ability – such as expecting women to be good with words and bad with numbers – that simply aren’t captured by the data analysed here.
Personally, I find it interesting to observe how keen people are to seize on such evidence that “essential” gender differences definitely do exist (despite the known confounds of living in a sexist society). My preferred strategy would be to withhold judgement and focus on remaking the definitely sexist society. For certain, we’ll only get the truth when we have an account of how cognitive abilities develop within both biological and social contexts. Studies like this point the way, and suggest that whatever the truth is, it should have some surprises for everyone.
It can ruin the appearance of your hands, could be unhygienic and can hurt if you take it too far. So why do people do it? Biter Tom Stafford investigates
What do ex-British prime minister Gordon Brown, Jackie Onassis, Britney Spears and I all have in common? We all are (or were) nail biters.
It’s not a habit I’m proud of. It’s pretty disgusting for other people to watch, ruins the appearance of my hands, is probably unhygienic and sometimes hurts if I take it too far. I’ve tried to quit many times, but have never managed to keep it up.
Lately I’ve been wondering what makes someone an inveterate nail-biter like me. Are we weaker willed? More neurotic? Hungrier? Perhaps, somewhere in the annals of psychological research there could be an answer to my question, and maybe even hints about how to cure myself of this unsavoury habit.
My first dip into the literature shows up the medical name for excessive nail biting: ‘onychophagia’. Psychiatrists classify it as an impulse control problem, alongside things like obsessive compulsive disorder. But this is for extreme cases, where psychiatric help is beneficial, as with other excessive grooming habits like skin picking or hair pulling. I’m not at that stage, falling instead among the majority of nail biters who carry on the habit without serious side effects. Up to 45% of teenagers bite their nails, for example; teenagers may be a handful but you wouldn’t argue that nearly half of them need medical intervention. I want to understand the ‘subclinical’ side of the phenomenon – nail biting that isn’t a major problem, but still enough of an issue for me to want to be rid of it.
It’s mother’s fault
Psychotherapists have had some theories about nail biting, of course. Sigmund Freud blamed it on arrested psycho-sexual development, at the oral stage (of course). As is typical of Freudian theories, oral fixation is linked to myriad causes, such as under-feeding or over-feeding, breast-feeding for too long, or a problematic relationship with your mother. It also has a grab-bag of resulting symptoms: nail biting, of course, but also a sarcastic personality, smoking, alcoholism and a love of oral sex. Other therapists have suggested nail-biting may be due to inward hostility – it is a form of self-mutilation, after all – or nervous anxiety.
Like most psychodynamic theories these explanations could be true, but there’s no particular reason to believe they should be true. Most importantly for me, they don’t have any strong suggestions on how to cure myself of the habit. I’ve kind of missed the boat as far as extent of breast-feeding goes, and I bite my nails even when I’m at my most relaxed, so there doesn’t seem to be an easy fix there either. Needless to say, there’s no evidence that treatments based on these theories have any special success.
Unfortunately, after these speculations, the trail goes cold. A search of the scientific literature reveals only a handful of studies on the treatment of nail-biting. One reports that any treatment which made people more aware of the habit seemed to help, but beyond that there is little evidence to report. Indeed, several of the few articles on nail-biting open by commenting on the surprising lack of literature on the topic.
Creature of habit
Given this lack of prior scientific treatment, I feel free to speculate for myself. So, here is my theory on why people bite their nails, and how to treat it.
Let’s call it the ‘anti-theory’ theory. I propose that there is no special cause of nail biting – not breastfeeding, chronic anxiety or a lack of motherly love. The advantage of this move is that we don’t need to find a particular connection between me, Gordon, Jackie and Britney. Rather, I suggest, nail biting is just the result of a number of factors which – due to random variation – combine in some people to create a bad habit.
First off, there is the fact that putting your fingers in your mouth is an easy thing to do. It is one of the basic functions for feeding and grooming, and so it is controlled by some pretty fundamental brain circuitry, meaning it can quickly develop into an automatic reaction. Added to this, there is a ‘tidying up’ element to nail biting – keeping them short – which means in the short term at least it can be pleasurable, even if the bigger picture is that you end up tearing your fingers to shreds. This reward element, combined with the ease with which the behaviour can be carried out, means that it is easy for a habit to develop; apart from touching yourself in the genitals it is hard to think of a more immediate way to give yourself a small moment of pleasure, and biting your nails has the advantage of being OK at school. Once established, the habit can become routine – there are many situations in everyone’s daily life where you have both your hands and your mouth available to use.
Understanding nail-biting as a habit has a bleak message for a cure, unfortunately, since we know how hard bad habits can be to break. Most people, at least once per day, will lose concentration on not biting their nails.
Nail-biting, in my view, isn’t some revealing personality characteristic, nor a maladaptive echo of some useful evolutionary behaviour. It is the product of the shape of our bodies, how hand-to-mouth behaviour is built into (and rewarded in) our brains and the psychology of habit.
And, yes, I did bite my nails while writing this column. Sometimes even a good theory doesn’t help.
Who could possibly be against replication of research results? Jason Mitchell of Harvard University is, under some conditions, for reasons described in his essay On the emptiness of failed replications.
I wrote something for the Center for Open Science which tries to draw out the sensible points in Mitchell’s essay – something I thought worth doing, since for many people being against replication in science is like being against motherhood and apple pie. It’s worth noting that I was invited to do this by Brian Nosek, who is co-founder of the Center for Open Science and instrumental in the Many Labs projects. As such, Brian is implicitly one of the targets of Mitchell’s criticisms, so kudos to him for encouraging this discussion.
Quiet contemplation is so awful that when deprived of the distractions of noise, crowds or smart phones, a bunch of students would rather give themselves electric shocks than sit and think.
What they actually did
Psychologists from the universities of Virginia and Harvard in the US carried out a series of 11 studies in which participants – including students and non-students – were left in an unadorned room for six to 15 minutes and asked to “spend time entertaining themselves with their thoughts.” Both groups, and men and women equally, were unable to enjoy this task. Most said they found it difficult to concentrate and that their minds wandered.
In one of the studies, participants were given the option to give themselves an electric shock, for no given reason or reward. Many did, including the majority of male participants, despite the fact that the vast majority of participants had previously rated the shocks as unpleasant and said they would pay to avoid them.
How plausible is this?
This is a clever, provocative piece of research. The results are almost certainly reliable; the authors, some of whom are extremely distinguished, found the same basic effect across all 11 studies – namely, that being asked to sit and think wasn’t enjoyable. The data from the studies are also freely available, so there’s no chance of statistical jiggery-pokery. This is a real effect. The questions, then, are over what exactly the finding means.
Tom’s take
Contrary to what some reporters have implied, this result isn’t just about students – non-students also found being made to sit and think aversive, and there were no differences in this with age. And it isn’t just about men – women found the experience just as unpleasant. The key result is that being made to sit and think is unpleasant, so let’s look at this first, before thinking about the shocks.
The results fit with research on sensory deprivation from 50 years ago. Paradoxically, when there are no distractions people find it hard to concentrate. It seems that for most of us, most of the time, our minds need stimulation, interaction with the environment, or at least a task, to function enjoyably. Thinking is an active process which involves the world – a far cry from some ideals of “pure thought”.
What the result certainly doesn’t mean, despite the interpretation given by some people – including one author of the study – is that people don’t like thinking. Rather, it’s fair to say that people don’t like being forced to do nothing but think.
It’s possible that there is a White Bear Effect here – also known as the ironic process theory. Famously, if you’re told to think of anything except a white bear, you can’t help but think about a white bear. If you imagine the circumstances of these studies, participants were told they had to sit in their chairs and just think. No singing, no exploring, no exercises. Wouldn’t that make you spend your time (unpleasantly) ruminating on what you couldn’t do?
In this context, are the shocks really so surprising? The shocks were very mild. The participants rated them as unpleasant when they were instructed to shock themselves, but we all know that there’s a big difference between having something done to you (or being told to do something) and choosing to do it yourself.
Although many participants chose to shock themselves, I wouldn’t say they were avoiding thinking – rather, they were thinking about what it would be like to get another shock. One participant shocked himself 190 times. Perhaps he was exploring how he could learn to cope with the discomfort. Curiosity and exploration are hallmarks of thinking. It is only to the very limited, internally directed, stimulus-free kind of thinking that we can apply the conclusion that it isn’t particularly enjoyable.
Footballers’ skills seem light years from our own. But, Tom Stafford argues, the jaw-dropping talents on the World Cup pitch have more in common with everyday life than you might think.
The first week of the 2014 World Cup has already given us a clutch of classic moments: Robin Van Persie’s perfect header to open the Dutch onslaught against the Spanish; Australian Tim Cahill’s breathtaking volley to equalise against Holland; and Mexican keeper Guillermo Ochoa defying an increasingly desperate Brazilian attack.
We can’t help but be dazzled by the skills on display. Whether it is a header lobbed over an open-mouthed goalie, or a keeper’s last-second leap to save the goal, it can seem as if the footballers have access to talents that are not just beyond description, but beyond conscious comprehension. But the players sprinting, diving and straining on Brazil’s football pitches have a lot more in common with everyday intelligence than you might think.
We often talk about astonishing athletic feats as if they are something completely different from everyday thought. When we say a footballer acts on instinct, out of habit or due to his training, we distance what they do from what we hear echoing within our own heads.
The idea of “muscle memory” encourages this – allowing us to cordon off feats of motor skill as a special kind of psychological phenomenon, something stored, like magic potion, in our muscles. But the truth, of course, is that so called muscle memories are stored in our brains, just like every other kind of memory. What is more, these examples of great skill are not so different from ordinary thought.
If you speak to world-class athletes, such as World Cup footballers, about what they do, they reveal that a lot of conscious reasoning goes into those moments of sublime skill. Here’s England’s Wayne Rooney, in 2012, describing what it feels like as a cross comes into the penalty box: “You’re asking yourself six questions in a split second. Maybe you’ve got time to bring it down on the chest and shoot, or you have to head it first-time. If the defender is there, you’ve obviously got to try and hit it first-time. If he’s farther back, you’ve got space to take a touch. You get the decision made. Then it’s obviously about the execution.”
All this in half a second! Rooney is obviously thinking more, not less, during these most crucial moments.
This is not an isolated example. Dennis Bergkamp delighted Dutch fans by scoring a beautiful winning goal from a long pass in the 1998 World Cup quarter final against Argentina (and if you watch a clip on YouTube, make sure it’s the one with the ecstatic commentary by Jack van Gelder). In a subsequent interview Bergkamp describes in minute detail all the factors leading up to the goal, from the moment he made eye contact with the defender who was about to pass the ball, to his calculations about how to control it. He even lets slip that part of his brain is keeping track of the wind conditions. Just as with Rooney, this isn’t just a moment of unconscious instinct, but of instinct combined with a whirlwind of conscious reasoning. And it all comes together.
Studies of the way the brain embeds new skills, until the movements become automatic, may help make sense of this picture. We know that athletes like those performing in the World Cup train with many years of deliberate, mindful practice. As they go through their drills, dedicated brain networks develop, allowing the movements to be deployed with less effort and more control. As well as the brain networks involved becoming more refined, the areas of the brain most active in controlling a movement change with increased skill – as we practice, areas deeper within the brain reorganise to take on more of the work, leaving the cortex, including areas associated with planning and reasoning, free to take on new tasks.
But this doesn’t mean we think less when we’re highly skilled. On the contrary, this process called automatisation means that we think differently. Bergkamp doesn’t have to think about his foot when he wants to control a ball, so he’s free to think about the wind, or the defender, or when exactly he wants to control the ball. For highly practiced movements we have to think less about controlling every action but what we do is still ultimately in the service of our overall targets (like scoring a goal in the case of football). In line with this, and contrary to the idea of skills as robotic-reflexes, experiments show that more flexibility develops alongside increased automaticity.
Maybe we like to think footballers are stupid because we want to feel good about ourselves, and many footballers aren’t as articulate as some of the eggheads we traditionally associate with intelligence (and aren’t trained in being articulate), but all the evidence suggests that the feats we see in the World Cup take an immense amount of thought.
Intelligence involves using conscious deliberation at the right level to optimally control your actions. Driving a car is easier because you don’t have to think about the physics of the combustion engine, and it’s also easier because you no longer have to think about the movements required to change gear or turn on the indicators. But just because driving a car relies on automatic skills like these, doesn’t mean that you’re mindless when driving a car. The better drivers, just like the better footballers, are making more choices each time they show off their talents, not fewer.
So footballers’ immense skills aren’t that different from many everyday things we do, like walking, talking or driving a car. We’ve practiced these things so much we don’t have to think about how we’re doing them. We may not even pay much attention to what we’re doing, or have much of a memory for it (ever reached the end of a journey and realised you don’t recall a single thing about the trip?), but that doesn’t mean that we aren’t paying attention, or that we couldn’t. In fact, because we have practiced these skills we can deploy them at the same time as other things (walking and chewing gum, talking while tying our shoe laces, etc). This doesn’t diminish their mystery, but it does align it with the central mystery of psychology – how we learn to do anything.
So while you may be unlikely to find yourself in the boots of Bergkamp and Rooney, preparing to drill one past a sprawling keeper, you can at least console yourself with the thought that you’re showing the skills of a World Cup legend every time you get behind the wheel of your car.
Every seven seconds? Probably not. But rather than wonder about whether this is true, Tom Stafford asks how on earth you can actually prove it or not.
We’ve all been told that men think about you-know-what far too often – every seven seconds, by some accounts. Most of us have entertained this idea for long enough to be sceptical. However, rather than merely wonder about whether this is true, stop for a moment to consider how you could – or could not – prove it.
If we believe the stats, thinking about sex every seven seconds adds up to 514 times an hour, or approximately 7,200 times during each waking day. Is that a lot? It sounds like a big number to me – I’d imagine it’s bigger than the number of thoughts I have about anything in a day. So here’s an interesting question: how is it possible to count the number of my thoughts, or anyone else’s (sexual or otherwise), over the course of a day?
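For the record, here is where those figures come from, assuming a waking day of roughly 14 hours (the 14 is my assumption, not part of the original claim):

```python
# One thought every 7 seconds, over an assumed 14-hour waking day
seconds_per_hour = 60 * 60
thoughts_per_hour = seconds_per_hour / 7      # ~514 per hour
thoughts_per_day = thoughts_per_hour * 14     # ~7,200 per waking day
print(round(thoughts_per_hour), round(thoughts_per_day))  # 514 7200
```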
The scientific attempt to measure thoughts is known to psychologists as “experience sampling”. It involves interrupting people as they go about their daily lives and asking them to record the thoughts they are having right at that moment, in that place.
Terri Fisher and her research team at Ohio State University did this using ‘clickers’. They gave these to 283 college students, divided into three groups, and asked them to press and record each time they thought about sex, or food, or sleep.
Using this method they found that the average man in their study had 19 thoughts about sex a day. This was more than the women in their study – who had about 10 thoughts a day. However, the men also had more thoughts about food and sleep, suggesting perhaps that men are more prone to indulgent impulses in general. Or they are more likely to decide to count any vague feeling as a thought. Or some combination of both.
The interesting thing about the study was the large variation in number of thoughts. Some people said they thought about sex only once per day, whereas the top respondent recorded 388 clicks, which is a sexual thought about every two minutes.
However, the big confounding factor with this study is “ironic processes”, more commonly known as the “white bear problem”. If you want to have cruel fun with a child, tell them to put their hand in the air and only put it down when they’ve stopped thinking about a white bear. Once you start thinking about something, trying to forget it just brings it back to mind.
These are exactly the circumstances the participants in Fisher’s study found themselves in. They were given a clicker by the researchers and asked to record when they thought about sex (or food or sleep). Imagine them walking away from the psychology department, holding the clicker in their hand, trying hard not to think about sex all the time, yet also trying hard to remember to press the clicker every time they did think about it. My bet is that the poor man who clicked 388 times was as much a victim of the experimental design as he was of his impulses.
Always on my mind
Another approach, used by Wilhelm Hoffman and colleagues, involved issuing German adult volunteers with smartphones, which were set to notify them seven times a day at random intervals for a week. They were asked to record what featured in their most recent thoughts when they received the random alert, the idea being that putting the responsibility for remembering onto a device left participants’ minds more free to wander.
The results aren’t directly comparable to the Fisher study, as the most anyone could record thinking about sex was seven times a day. But what is clear is that people thought about it far less often than the seven-second myth suggests. They recorded a sexual thought in the last half hour on approximately 4% of occasions, which works out as about once per day, compared with the 19 times per day reported in the Fisher study.
The real shock from Hoffman’s study is the relative unimportance of sex in the participants’ thoughts. People said they thought more about food, sleep, personal hygiene, social contact, time off, and (until about 5pm) coffee. Watching TV, checking email and other forms of media use also won out over sex for the entire day. In fact, sex only became a predominant thought towards the end of the day (around midnight), and even then it was firmly in second place, behind sleep.
Hoffman’s method is also contaminated by a white bear effect, though, because participants knew that at some point during the day they’d be asked to record what they had been thinking about. This could lead to overestimates of some thoughts. Alternatively, people may have felt embarrassed about admitting to sexual thoughts throughout the day, and therefore underreported them.
So, although we can confidently dismiss the story that the average male thinks about sex every seven seconds, we can’t know with much certainty what the true frequency actually is. Probably it varies wildly between people, and within the same person depending on their circumstances, and this is further confounded by the fact that any effort to measure the number of someone’s thoughts risks changing those thoughts.
There’s also the tricky issue that thoughts have no natural unit of measurement. Thoughts aren’t like distances we can measure in centimetres, metres and kilometres. So what constitutes a thought, anyway? How big does it need to be to count? Have you had none, one or many while reading this? Plenty of things to think about!
This is a BBC Future column from last week. The original is here.
The past is not just a foreign country, but also one we are all exiled from. Like all exiles, we sometimes long to return. That longing is called nostalgia.
Whether it is triggered by a photograph, a first kiss or a treasured possession, nostalgia evokes a particular sense of time or place. We all know the feeling: a sweet sadness for what is gone, in colours that are invariably sepia-toned, rose-tinted, or stained with evening sunlight.
The term “nostalgia” was coined by Swiss physicians in the late 1600s to signify a certain kind of homesickness among soldiers. Nowadays we know it encompasses more than just homesickness (or indeed Swiss soldiers), and if we take nostalgia too far it becomes mawkish or indulgent.
But, perhaps, it has some function beyond mere sentimentality. A series of investigations by psychologist Constantine Sedikides suggests nostalgia may act as a resource that we can draw on to connect to other people and events, so that we can move forward with less fear and greater purpose.
Sedikides was inspired by something called Terror Management Theory (TMT), which is approximately 8,000 times sexier than most theories in psychology, and posits that a primary psychological need for humans is to deal with the inevitability of our own deaths. The roots of this theory are in the psychoanalytic tradition of Sigmund Freud, making the theory a bit different from many modern psychological theories, which draw on more mundane inspirations, such as considering the mind as a computer.
Experiments published in 2008 used a standard way to test Terror Management Theory: asking participants to think about their own deaths, answering questions such as: “Briefly describe the emotions that the thought of your own death arouses in you.” (A control group was asked to think about dental pain, something unpleasant, but not existentially threatening.)
TMT suggests that one response to thinking about death is to cling more strongly to the view that life has some wider meaning, so after their intervention they asked participants to indicate their agreement with statements such as: “Life has no meaning or purpose”, or “All strivings in life are futile and absurd”. From the answers they positioned participants on a scale of how strongly they felt life had meaning.
The responses were influenced by how prone people were to nostalgia. The researchers found that reminding participants of their own deaths was likely to increase feelings of meaninglessness, but only in those who reported that they were less likely to indulge in nostalgia. Participants who rated themselves as more likely than average to have nostalgic thoughts weren’t affected by negative thoughts about their mortality (they rated life as highly meaningful, just like the control group).
Follow-up experiments suggest that people prone to nostalgia are less likely to have lingering thoughts about death, as well as less likely to be vulnerable to feelings of loneliness. Nostalgia, according to this view, is very different from a weakness or an indulgence. The researchers call it a “meaning-providing resource”, a vital part of mental health. Nostalgia acts as a store of positive emotions in memory, something we can access consciously, and perhaps also draw on continuously during our daily lives to bolster our feelings. It’s these strong feelings for our past that help us cope better with our future.
Thanks to Jules Hall for suggesting the topic of nostalgia. If you have an everyday psychological phenomenon you’d like to see written about in these columns please get in touch @tomstafford or ideas@idiolect.org.uk
This was my BBC Future column from last week. The original is here.
Released on 6th of June 1984, Tetris is 30 years old today. Here’s a video where I try and explain something of the psychology of Tetris:
All credit for the graphics to Andrew Twist. What I say in the video is based on an article I wrote a while back for BBC Future.
As well as hijacking the minds and twitchy fingers of puzzle-gamers for 30 years, Tetris has also been involved in some important psychological research.
My favourite is Kirsh and Maglio’s work on “epistemic action”, which showed how Tetris players prefer to rotate the blocks in the game world rather than mentally. This use of the world in synchrony with your mental representations is part of what makes the game so immersive, I argue.
How do you change someone’s mind if you think you are right and they are wrong? Psychology reveals the last thing to do is the tactic we usually resort to.
You are, I’m afraid to say, mistaken. The position you are taking makes no logical sense. Just listen up and I’ll be more than happy to elaborate on the many, many reasons why I’m right and you are wrong. Are you feeling ready to be convinced?
Whether the subject is climate change, the Middle East or forthcoming holiday plans, this is the approach many of us adopt when we try to convince others to change their minds. It’s also an approach that, more often than not, leads to the person on the receiving end hardening their existing position. Fortunately research suggests there is a better way – one that involves more listening, and less trying to bludgeon your opponent into submission.
A little over a decade ago Leonid Rozenblit and Frank Keil from Yale University suggested that in many instances people believe they understand how something works when in fact their understanding is superficial at best. They called this phenomenon “the illusion of explanatory depth”. They began by asking their study participants to rate how well they understood how things like flushing toilets, car speedometers and sewing machines worked, before asking them to explain what they understood and then answer questions on it. The effect they revealed was that, on average, people in the experiment rated their understanding as much worse after it had been put to the test.
What happens, argued the researchers, is that we mistake our familiarity with these things for the belief that we have a detailed understanding of how they work. Usually, nobody tests us and if we have any questions about them we can just take a look. Psychologists call this idea that humans have a tendency to take mental short cuts when making decisions or assessments the “cognitive miser” theory.
Why would we bother expending the effort to really understand things when we can get by without doing so? The interesting thing is that we manage to hide from ourselves exactly how shallow our understanding is.
It’s a phenomenon that will be familiar to anyone who has ever had to teach something. Usually, it only takes the first moments when you start to rehearse what you’ll say to explain a topic, or worse, the first student question, for you to realise that you don’t truly understand it. All over the world, teachers say to each other “I didn’t really understand this until I had to teach it”. Or as researcher and inventor Mark Changizi quipped: “I find that no matter how badly I teach I still learn something”.
Explain yourself
Research published last year on this illusion of understanding shows how the effect might be used to convince others they are wrong. The research team, led by Philip Fernbach, of the University of Colorado, reasoned that the phenomenon might hold as much for political understanding as for things like how toilets work. Perhaps, they figured, people who have strong political opinions would be more open to other viewpoints, if asked to explain exactly how they thought the policy they were advocating would bring about the effects they claimed it would.
Recruiting a sample of Americans via the internet, they polled participants on a set of contentious US policy issues, such as imposing sanctions on Iran, healthcare and approaches to carbon emissions. One group was asked to give their opinion and then provide reasons for why they held that view. This group got the opportunity to put their side of the issue, in the same way anyone in an argument or debate has a chance to argue their case.
Those in the second group did something subtly different. Rather than provide reasons, they were asked to explain how the policy they were advocating would work. They were asked to trace, step by step, from start to finish, the causal path from the policy to the effects it was supposed to have.
The results were clear. People who provided reasons remained as convinced of their positions as they had been before the experiment. Those who were asked to provide explanations softened their views, and reported a correspondingly larger drop in how they rated their understanding of the issues. People who had previously been strongly for or against carbon emissions trading, for example, tended to become more moderate – ranking themselves as less certain in their support of, or opposition to, the policy.
So this is something worth bearing in mind next time you’re trying to convince a friend that we should build more nuclear power stations, that the collapse of capitalism is inevitable, or that dinosaurs co-existed with humans 10,000 years ago. Just remember, however, there’s a chance you might need to be able to explain precisely why you think you are correct. Otherwise you might end up being the one who changes their mind.
This is my BBC Future column from last week. The original is here.
Are we, the human species, unreasonable? Do rational arguments have any power to sway us, or is it all intuition, hidden motivations, and various other forms of prejudice?
…the picture of human rationality painted by our profession can seem pretty bleak. Every week I hear about a new piece of research which shows up some quirk of our minds, like the finding that people given a heavy clipboard judge public issues as more important than people given a light clipboard. Or that more attractive people are judged as more trustworthy, and the arguments they give as more intelligent.
…I set out to get to the bottom of the evidence on how we respond to rational arguments. Does rationality lose out every time to irrational motivations? Or is there any hope for those of us who want to persuade because we have good arguments, not because we are handsome, or popular, or offer heavy clipboards?
You can read the full thing here, and while you’re over there check out the rest of the Contributoria site – all of the articles on which are published under a CC license and commissioned by members. On which note, a massive thanks to everyone who backed my proposal and offered comments (see previous announcements). Special thanks to Josie and Dan for giving close readings to the piece before it was finished.
Edit: Contributoria didn’t last long, but I republished this essay and some others in an ebook “For argument’s sake: evidence that reason can change minds” (amazon, smashwords)
Implicit attitudes are one of the hottest topics in social psychology. Now a massive new study directly compares methods for changing them. The results are both good and bad news for those who believe that some part of prejudice lies in our automatic, uncontrollable reactions to different social groups.
All three studies I covered (#1, #2, #3) use large behavioural datasets, something I’m particularly keen on in my own work.
My time in the BPS Research Digest hotseat continues. Today’s post is about a lovely study by Stuart Ritchie and colleagues which uses a unique dataset to look at the effect of alcohol on cognitive function across the lifespan. Here’s the intro:
The cognitive cost or benefit of booze depends on your genes, suggests a new study which uses a unique longitudinal data set.
Inside the laboratory psychologists use a control group to isolate the effects of specific variables. But many important real world problems can’t be captured in the lab. Ageing is a good example: if we want to know what predicts a healthy old age, running experiments is difficult, even if only for the reason that they take a lifetime to get the results. Questions about potentially harmful substances are another good example: if we suspect something may be harmful we can hardly give it to half of a group of volunteer participants. The question of the long-term effects of alcohol consumption on cognitive ability combines both of these difficulties.