A cultural view of agony

New Statesman has a fascinating article on the ‘cultural history of pain’ that tracks how our ideas about pain and suffering have radically changed through the years.

One of the most interesting, and worrying, themes is how there have been lots of cultural beliefs about whether certain groups are more or less sensitive to pain.

Needless to say, these beliefs tended to justify existing prejudices rather than stem from any sound evidence.

Some speculated whether the availability of anaesthetics and analgesics had an effect on people’s ability (as well as willingness) to cope with acute affliction. Writing in the 1930s, the distinguished pain surgeon René Leriche argued fervently that Europeans had become more sensitive to pain. Unlike earlier in the century, he claimed, modern patients “would not have allowed us to cut even a centimetre . . . without administering an anaesthetic”. This was not due to any decline of moral fibre, Leriche added: rather, it was a sign of a “nervous system differently developed, and more sensitive”.

Other physicians and scientists of the 19th and early 20th centuries wanted to complicate the picture by making a distinction between pain perception and pain reaction. But this distinction was used to denigrate “outsider” groups even further. Their alleged insensitivity to pain was proof of their humble status – yet when they did exhibit pain reactions, their sensitivity was called “exaggerated” or “hysterical” and therefore seen as more evidence of their inferiority.

 

Link to New Statesman article (via @SarahRoseCrook)

Do we really hate thinking so much we’d electrocute ourselves rather than do it?

By Tom Stafford, University of Sheffield

The headlines

The Guardian: Shocking but true: students prefer jolt of pain than being made to sit and think

Nature: We dislike being alone with our thoughts

Washington Post: Most men would rather shock themselves than be alone with their thoughts

 

The story

Quiet contemplation is so awful that when deprived of the distractions of noise, crowds or smart phones, a bunch of students would rather give themselves electric shocks than sit and think.

 

What they actually did

Psychologists from the universities of Virginia and Harvard in the US carried out a series of 11 studies in which participants – including students and non-students – were left in an unadorned room for six to 15 minutes and asked to “spend time entertaining themselves with their thoughts.” Both groups, and men and women equally, were unable to enjoy this task. Most said they found it difficult to concentrate and that their minds wandered.

In one of the studies, participants were given the option to give themselves an electric shock, for no given reason or reward. Many did, including the majority of male participants, despite the fact that the vast majority of participants had previously rated the shocks as unpleasant and said they would pay to avoid them.

 

How plausible is this?

This is a clever, provocative piece of research. The results are almost certainly reliable; the authors, some of whom are extremely distinguished, discovered in the 11 studies the same basic effect – namely, that being asked to sit and think wasn’t enjoyable. The data from the studies is also freely available, so there’s no chance of statistical jiggery-pokery. This is a real effect. The questions, then, are over what exactly the finding means.

 

Tom’s take

Contrary to what some reporters have implied, this result isn’t just about students – non-students also found being made to sit and think aversive, and there were no differences with age. And it isn’t just about men – women generally found the experience just as unpleasant. The key result is that being made to sit and think is unpleasant, so let’s look at this first before thinking about the shocks.

The results fit with research on sensory deprivation from 50 years ago. Paradoxically, when there are no distractions people find it hard to concentrate. It seems that for most of us, most of the time, our minds need to receive stimulus, interact with the environment, or at least have a task in order to function enjoyably. Thinking is an active process which involves the world – a far cry from some ideals of “pure thought”.

What the result certainly doesn’t mean, despite the interpretation given by some people – including one author of the study – is that people don’t like thinking. Rather, it’s fair to say that people don’t like being forced to do nothing but think.

It’s possible that there is a White Bear Effect here – also known as the ironic process theory. Famously, if you’re told to think of anything except a white bear, you can’t help but think about a white bear. If you imagine the circumstances of these studies, participants were told they had to sit in their chairs and just think. No singing, no exploring, no exercises. Wouldn’t that make you spend your time (unpleasantly) ruminating on what you couldn’t do?

In this context, are the shocks really so surprising? The shocks were very mild. The participants rated them as unpleasant when they were instructed to shock themselves, but we all know that there’s a big difference between having something done to you (or being told to do something) and choosing to do it yourself.

Although many participants chose to shock themselves, I wouldn’t say they were avoiding thinking – rather, they were thinking about what it would be like to get another shock. One participant shocked himself 190 times. Perhaps he was exploring how he could learn to cope with the discomfort. Curiosity and exploration are both hallmarks of thinking. It is only the very limited, internally directed, stimulus-free kind of thinking to which we can apply the conclusion that it isn’t particularly enjoyable.

 

Read more

The original paper: Just think: The challenges of the disengaged mind.

You can see the data over at the Open Science Framework.

Daniel Wegner’s brilliant book on the White Bear problem.

The Conversation

This article was originally published on The Conversation.
Read the original article.

Spike activity 27-06-2014

Quick links from the past week in mind and brain news:

Slate has a piece on developmental psychology’s WEIRD problem. Most kids in child psychology studies are from very restricted social groups – rich, educated families.

Facebook manipulated stories in users’ newsfeeds to conduct experiments on emotional contagion. Don’t remember signing the consent form for the study that appeared in PNAS?

Time covers the massive prevalence of PTSD among US veterans. The Pentagon’s PTSD treatments “appear to be local, ad hoc, incremental, and crisis-driven” with no effective evaluation.

Excellent analysis of a new study: the FDA’s antidepressant warning didn’t actually backfire and cause more suicides. Neuroskeptic on the case.

Time magazine has an interesting piece on the under-reported problem of violence in women.

Interesting National Geographic piece about how new finds of human skull bones show even more complexity in the evolution of human and hominid species.

Slate has a piece on how a lot of zoo animals are on antipsychotics because they become mentally ill when enclosed.

A spook’s guide to the psychology of deception

Last February, a file from the Edward Snowden leaks was released: a 2012 GCHQ presentation called ‘The Art of Deception: Training for Online Covert Operations’. It describes the ‘Online Covert Action Accreditation’ course, which draws heavily on the psychology of influence and persuasion. This post will look at how they’re piecing together the science that forms the basis for these online operations.

The work seems to have been put together by GCHQ’s Human Science Operations Cell which presumably exists as an internal consultancy to allow the relevant cognitive and social sciences to be applied to practical covert operations.

One of the early slides lists the subjects the HSOC draws on, which stretch from psychology to political science to neuroscience. At the current time, neuroscience has nothing practical to contribute, so they’re clearly blowing their neurological trumpets to sound a bit more high-tech. But it’s worth noting the breadth of disciplines they draw on: it gives them a wide and comprehensive vision of human behaviour, from the micro to the macro.

However, one of the key slides has a road map of how everything fits together. It’s shown below and is quite dense, so click the image if you want a larger version.

One of the first things that stands out is the ad hoc nature of their approach. They’ve appropriated a patchwork of relevant theories as a guide to practice, with nothing drawn from their own data.

You can see the main areas they’re drawing from – these include profiling cultures and personality, research on persuasion, cognitive biases and scams, research on the psychology of stage magic, and organisational psychology or management science more generally.
 

Perhaps the weakest elements here are the cultural and personality profiling, using Hofstede’s cultural dimensions and the Big Five personality traits. The trouble is that while these are statistically reliable at the group level, they predict very little at the individual level, because the effects are swamped by individual variation.

This means it may be more useful in the domain of PSYOPS, which attempts to influence groups, rather than targeting individuals.
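A small simulation can make that group-versus-individual gap concrete. This is purely illustrative, with invented numbers – a trait that correlates only weakly with behaviour (r = 0.2) separates group averages clearly while explaining almost nothing about any one person:

```python
# Illustrative simulation (invented data, not from the presentation):
# a trait with a weak correlation to behaviour reliably separates
# group means, yet explains only r^2 = 4% of individual variation.

import math
import random
import statistics

random.seed(0)
r = 0.2       # assumed trait-behaviour correlation
n = 10_000    # simulated individuals

trait = [random.gauss(0, 1) for _ in range(n)]
behaviour = [r * t + math.sqrt(1 - r ** 2) * random.gauss(0, 1) for t in trait]

# Group-level comparison: high-trait vs low-trait halves
high = statistics.mean(b for t, b in zip(trait, behaviour) if t > 0)
low = statistics.mean(b for t, b in zip(trait, behaviour) if t <= 0)

print(f"group means: {high:.2f} vs {low:.2f}")          # clearly separated
print(f"variance explained per person: {r ** 2:.0%}")   # only 4%
```

With ten thousand simulated people the group means come apart cleanly, which is why the profiling looks “statistically reliable” in aggregate even though predicting any single individual from the trait score is barely better than guessing.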

The slide below details the general psychological framework for deception. As far as I can tell, this is the only original piece of psychological theory in the presentation.
 

It’s more a useful way of organising different approaches to deception rather than a theory in itself. It’s what clinical psychologists would call a ‘formulation’. It’s a way of organising evidence-based effects that may not be thoroughly tested itself but works well enough to aid understanding.

Perhaps the key thing to note is the sensemaking component. Sensemaking is a key concept in management science that just describes the different ways in which people come to conclusions about the meaning and significance of things.

It should be a well-known concept in intelligence circles because it is used both in military people management and military intelligence analysis. Interestingly, they treat individuals as naive intelligence analysts who are trying to piece together their own understanding of the world, and aim to exploit some of the weaknesses in this process. The big messy ‘concept map’ slide mentions ‘destructive organisational psychology’, which presumably refers to using the understanding of what keeps organisations together to break them apart.

However, in terms of the psychological science which underlies their approach, the next slide is key.
 

You can see several influences here. The techniques listed under ‘attention’ are all taken from research on the psychology of magic tricks, particularly from Susana Martinez-Conde’s work on how sleight-of-hand artists manipulate attention. Most of it is reviewed in a paper she wrote with a series of co-authors including pickpocket Apollo Robbins.

The HSOC spooks clearly love the idea of the psychology of magic and they refer to it a lot in their presentation. One slide just says ‘We want to build Cyber Magicians’, but it’s really not clear how it applies online. The whole point of sleight-of-hand is that it is dynamic and takes advantage of how you pay attention. Online, however, users’ attention doesn’t necessarily flow in a predictable pattern, because you can wander off from the screen, pause, grab screenshots and so on. In other words, individuals have better control over the flow of information, because online interaction is designed for information control and therefore for partial, staggered attention.

The ‘perception’ techniques listed on the slide are largely taken from Stefano Grazioli and Sirkka Jarvenpaa’s classic paper [pdf] on online deception, entitled ‘Deceived: Under Target Online’. The paper looks at how internet scammers rip people off and, assuming that successful online con artists have found useful techniques by natural selection, HSOC just borrow them.

The techniques to exploit sensemaking are largely based on theories of sensemaking itself, although the story fragments component seems to be drawn from research on relational agents that are designed ‘to form long-term, social-emotional relationships with their users’. Rather than actually deploying autonomous relational agents, I suspect it’s simply a case of using research insights from the area which suggest, for example, that presenting fragments of the agent’s backstory and letting the other person piece them together makes the character seem more believable.

The techniques in the ‘affect’ section are some general points taken from a vast experimental literature on the psychology of marketing and persuasion that describes how emotion modulates the heuristics (judgement processes) involved in persuasion.

The ‘behaviour’ section is the only part I don’t recognise as coming from the psychological literature. This makes me suspect it comes from PSYOPS or IO practice, but if you recognise it, leave a comment below.
 

The ’10 Principles of Influence’ is perhaps one of the most interesting slides in terms of illustrating the empirical basis for their approach as they use research both on the strategies of honest persuasion and dishonest scammers.

‘Principles of influence’ are largely associated with the work of consumer psychologist Robert Cialdini, but the list actually consists of three of his six principles (Reciprocity, Social Compliance / Authority, Consistency).

Another six are taken from Stajano and Wilson’s classic study ‘Understanding scam victims: seven principles for systems security’, which describes methods used by con artists. One item overlaps with the Cialdini principles, and additionally they’ve included flattery (known to be an effective persuasive tool) and time – although it’s not clear whether they’re referring to giving people time or putting people under time pressure.
 

This section seems to be about gaining people’s trust to encourage disclosure. The slide you see above refers to social penetration theory, which describes how relationships progress to increased levels of intimate connection through self-disclosure. The slide that follows gives some basic advice about encouraging this: mirroring communication cues, adjusting speech patterns and so on – the sort of thing you get taught in the first week of a psychotherapy course.

So here’s what the ‘Online Covert Action Accreditation’ course looks like: as though a PhD psychologist was given the task of coming up with a plausible psychological framework for practical deception and influence online. It draws on a mix of persuasion psychology from marketing, studies of scammers and con-men, the social psychology of trust and disclosure, studies of how stage magic works psychologically, and work on what makes organisations work effectively and what degrades their performance.

This is a comprehensive approach to the problem, but the trouble is that it probably translates only approximately, and rather poorly, into practical effects.

Instead, HSOC would be better off doing research, and lots of it. They could run lots of informal RCTs online and gather a large amount of data quite quickly to test which techniques actually increase influence or lead to successful deception. What behaviours on the part of the actor lead to increased self-disclosure the quickest? Does a laggy internet connection mean people’s increased frustration affects their evaluation of honesty? And so on.
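To give a flavour of how simple such an informal online RCT analysis could be, here’s a minimal sketch. Everything in it is invented for illustration – the ‘mirroring’ condition, the sample sizes and the disclosure counts are hypothetical, and the test is just the standard two-proportion z-test:

```python
# Minimal sketch of analysing an informal online RCT: did technique B
# (say, mirroring communication cues) produce more self-disclosure
# than technique A? All numbers below are invented for illustration.

import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test with a normal-approximation p-value."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided
    return z, p_value

# Hypothetical data: 90/400 disclosures without mirroring,
# 120/400 with mirroring
z, p = two_proportion_z(90, 400, 120, 400)
print(f"z = {z:.2f}, p = {p:.3f}")
```

Run at the scale an intelligence agency could manage online, even crude comparisons like this would tell them far more about what actually works than a patchwork of borrowed theories.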

I suspect, however, that the Human Science Operations Cell were, and maybe still are, quite a small outfit and so they’re restricted to a consultancy role which will ultimately limit their effectiveness.

We tend to think that the secret services are super efficient experts with an infinite budget, but they probably just work like any other organisation. HSOC were probably told to deliver an Online Covert Action Accreditation course with few resources and not enough time and came up with the most sensible thing in the time allowed.

Oh, and by the way, hello spooks, and welcome to Mind Hacks.
 

Link to copy of slides.
Link to coverage from The Intercept.

Brains in their feat

Footballers’ skills seem light years from our own. But, Tom Stafford argues, the jaw-dropping talents on the World Cup pitch have more in common with everyday life than you might think.

The first week of the 2014 World Cup has already given us a clutch of classic moments: Robin Van Persie’s perfect header to open the Dutch onslaught against the Spanish; Australian Tim Cahill’s breathtaking volley to equalise against Holland; and Mexican keeper Guillermo Ochoa defying an increasingly desperate Brazilian attack.

We can’t help but be dazzled by the skills on display. Whether it is a header lobbed over an open-mouthed goalie, or a keeper’s last-second leap to save the goal, it can seem as if the footballers have access to talents that are not just beyond description, but beyond conscious comprehension. But the players sprinting, diving and straining on Brazil’s football pitches have a lot more in common with everyday intelligence than you might think.

We often talk about astonishing athletic feats as if they are something completely different from everyday thought. When we say a footballer acts on instinct, out of habit or due to his training, we distance what they do from what we hear echoing within our own heads.

The idea of “muscle memory” encourages this – allowing us to cordon off feats of motor skill as a special kind of psychological phenomenon, something stored, like magic potion, in our muscles. But the truth, of course, is that so-called muscle memories are stored in our brains, just like every other kind of memory. What is more, these examples of great skill are not so different from ordinary thought.

If you speak to world-class athletes, such as World Cup footballers, about what they do, they reveal that a lot of conscious reasoning goes into those moments of sublime skill. Here’s England’s Wayne Rooney, in 2012, describing what it feels like as a cross comes into the penalty box: “You’re asking yourself six questions in a split second. Maybe you’ve got time to bring it down on the chest and shoot, or you have to head it first-time. If the defender is there, you’ve obviously got to try and hit it first-time. If he’s farther back, you’ve got space to take a touch. You get the decision made. Then it’s obviously about the execution.”

All this in half a second! Rooney is obviously thinking more, not less, during these most crucial moments.

This is not an isolated example. Dennis Bergkamp delighted Dutch fans by scoring a beautiful winning goal from a long pass in the 1998 World Cup quarter final against Argentina (and if you watch a clip on YouTube, make sure it’s the one with the ecstatic commentary by Jack van Gelder). In a subsequent interview Bergkamp describes in minute detail all the factors leading up to the goal, from the moment he made eye contact with the defender who was about to pass the ball, to his calculations about how to control the ball. He even lets slip that part of his brain is keeping track of the wind conditions. Just as with Rooney, this isn’t just a moment of unconscious instinct, but of instinct combined with a whirlwind of conscious reasoning. And it all comes together.

Studies of the way the brain embeds new skills, until the movements become automatic, may help make sense of this picture. We know that athletes like those performing in the World Cup train with many years of deliberate, mindful practice. As they go through their drills, dedicated brain networks develop, allowing the movements to be deployed with less effort and more control. As well as the brain networks involved becoming more refined, the areas of the brain most active in controlling a movement change with increased skill – as we practice, areas deeper within the brain reorganise to take on more of the work, leaving the cortex, including areas associated with planning and reasoning, free to take on new tasks.

But this doesn’t mean we think less when we’re highly skilled. On the contrary, this process, called automatisation, means that we think differently. Bergkamp doesn’t have to think about his foot when he wants to control a ball, so he’s free to think about the wind, or the defender, or when exactly he wants to control the ball. For highly practiced movements we have to think less about controlling every action, but what we do is still ultimately in the service of our overall targets (like scoring a goal in the case of football). In line with this, and contrary to the idea of skills as robotic reflexes, experiments show that more flexibility develops alongside increased automaticity.

Maybe we like to think footballers are stupid because we want to feel good about ourselves, and many footballers aren’t as articulate as some of the eggheads we traditionally associate with intelligence (and aren’t trained in being articulate), but all the evidence suggests that the feats we see in the World Cup take an immense amount of thought.

Intelligence involves using conscious deliberation at the right level to optimally control your actions. Driving a car is easier because you don’t have to think about the physics of the combustion engine, and it’s also easier because you no longer have to think about the movements required to change gear or turn on the indicators. But just because driving a car relies on automatic skills like these, doesn’t mean that you’re mindless when driving a car. The better drivers, just like the better footballers, are making more choices each time they show off their talents, not fewer.

So footballers’ immense skills aren’t that different from many everyday things we do, like walking, talking or driving a car. We’ve practiced these things so much we don’t have to think about how we’re doing them. We may not even pay much attention to what we’re doing, or have much memory of it (ever reached the end of a journey and realised you don’t recall a single thing about the trip?), but that doesn’t mean that we aren’t thinking, or couldn’t. In fact, because we have practiced these skills we can deploy them at the same time as other things (walking and chewing gum, talking while tying our shoelaces, etc). This doesn’t diminish their mystery, but it does align it with the central mystery of psychology – how we learn to do anything.

So while you may be unlikely to find yourself in the boots of Bergkamp and Rooney, preparing to drill one past a sprawling keeper, you can at least console yourself with the thought that you’re showing the skills of a World Cup legend every time you get behind the wheel of your car.

A bonus BBC Future column from last week. Here’s the original.

The normality trap

I remember taking a bus to London Bridge when, after a few stops, a woman got on who seemed to move with a subtle but twitchy disregard for her surroundings. She found herself a seat among the Saturday shoppers and divided her time between looking out the window and responding to invisible companions, occasionally shouting at her unseen persecutors.

By East Street, the bus was empty.

You’ve probably encountered fellow travellers who are strikingly out of the ordinary, sometimes quite distressed, scattered among the urban landscape where they seem to have a social forcefield around them that makes crowds part in their presence.

If you’ve ever worked in a hospital or support service for people with psychological or neurological difficulties, you’ve probably met lots of people who are markedly out of step with the mundane rules of social engagement.

They seem to talk too loud, or too fast, or too much. They can be full of fantastical things or fantasies. They may be afraid or angry, difficult or disengaged or intent on rewind-replay behaviours. Their dress can be notable for its eccentricity or decay.

So why don’t we see people like these in anti-stigma campaigns?

Don’t get me wrong, I’m a massive fan of the great work anti-stigma campaigns do. Everybody is susceptible to mental health problems, and the reason these campaigns are necessary is that these problems often go unrecognised by other people and, instead of help, too often people receive misunderstanding and ignorance.

But there’s more to mental health than normality.

That woman on the bus shouting at her voices deserves respect too. That guy who posts those leaflets about Masons and thought-stealing all over town deserves your time. The guy who speaks in a clumsy monotone and doesn’t look you in the eye is also worthy of compassion.

Disability charities don’t base their campaigns solely on ‘nice people in wheelchairs’. They’re happy to show people who represent the full range of appearance and presentation. So why not mental health?

Step up mental health organisations, you’ve got nothing to lose except your conformity.

How often do men really think about sex?

Every seven seconds? Probably not. But rather than wonder about whether this is true, Tom Stafford asks how on earth you can actually prove it or not.

We’ve all been told that men think about you-know-what far too often – every seven seconds, by some accounts. Most of us have entertained this idea for long enough to be sceptical. However, rather than merely wonder about whether this is true, stop for a moment to consider how you could – or could not – prove it.

If we believe the stats, thinking about sex every seven seconds adds up to 514 times an hour. Or approximately 7,200 times during each waking day. Is that a lot? It sounds like a big number to me; I’d imagine it’s bigger than the number of thoughts I have about anything in a day. So, here’s an interesting question: how is it possible to count the number of my thoughts – or anyone else’s – sexual or otherwise, over the course of a day?
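The arithmetic behind those figures is easy to check; here’s a back-of-the-envelope sketch, assuming the roughly 14-hour waking day the quoted numbers imply (my assumption, not stated in the original claim):

```python
# Back-of-the-envelope check of the "every seven seconds" figures.
# The 14-hour waking day is an assumption chosen to match the quoted
# numbers (7,200 / 514 is about 14).

SECONDS_PER_HOUR = 60 * 60
WAKING_HOURS = 14

thoughts_per_hour = SECONDS_PER_HOUR / 7       # one thought every 7 seconds
thoughts_per_day = thoughts_per_hour * WAKING_HOURS

print(round(thoughts_per_hour))  # 514
print(round(thoughts_per_day))   # 7200
```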

The scientific attempt to measure thoughts is known to psychologists as “experience sampling“. It involves interrupting people as they go about their daily lives and asking them to record the thoughts they are having right at that moment, in that place.

Terri Fisher and her research team at Ohio State University did this using ‘clickers’. They gave these to 283 college students, divided into three groups, and asked them to press and record each time they thought about sex, or food, or sleep.

Using this method they found that the average man in their study had 19 thoughts about sex a day. This was more than the women in their study – who had about 10 thoughts a day. However, the men also had more thoughts about food and sleep, suggesting perhaps that men are more prone to indulgent impulses in general. Or they are more likely to decide to count any vague feeling as a thought. Or some combination of both.

The interesting thing about the study was the large variation in number of thoughts. Some people said they thought about sex only once per day, whereas the top respondent recorded 388 clicks, which is a sexual thought about every two minutes.

However, the big confounding factor with this study is “ironic processes”, more commonly known as the “white bear problem“. If you want to have cruel fun with a child, tell them to put their hand in the air and only put it down when they’ve stopped thinking about a white bear. Once you start thinking about something, trying to forget it just brings it back to mind.

These are exactly the circumstances the participants in Fisher’s study found themselves in. They were given a clicker by the researchers and asked to record when they thought about sex (or food or sleep). Imagine them walking away from the psychology department, holding the clicker in their hand, trying hard not to think about sex all the time, yet also trying hard to remember to press the clicker every time they did think about it. My bet is that the poor man who clicked 388 times was as much a victim of the experimental design as he was of his impulses.

Always on my mind

Another approach, used by Wilhelm Hoffman and colleagues, involved issuing German adult volunteers with smartphones, which were set to notify them seven times a day at random intervals for a week. They were asked to record what featured in their most recent thoughts when they received the random alert, the idea being that putting the responsibility for remembering onto a device left participants’ minds more free to wander.

The results aren’t directly comparable to the Fisher study, as the most anyone could record thinking about sex was seven times a day. But what is clear is that people thought about it far less often than the seven-second myth suggests. They recorded a sexual thought in the last half hour on approximately 4% of occasions, which works out as about once per day, compared with 19 reported in the Fisher study.
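You can roughly reconstruct that “once per day” figure yourself. The sketch below assumes a 16-hour waking day divided into half-hour windows – my assumption for illustration, not a number from the paper:

```python
# Rough reconstruction of the "about once per day" figure from the
# Hoffman study. The 16-hour waking day is an assumption for
# illustration, not a figure taken from the paper.

waking_hours = 16
half_hour_windows = waking_hours * 2    # 32 half-hour windows per day
rate_per_window = 0.04                  # sexual thought in ~4% of windows

thoughts_per_day = half_hour_windows * rate_per_window
print(round(thoughts_per_day, 2))  # 1.28, i.e. roughly once per day
```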

The real shock from Hoffman’s study is the relative unimportance of sex in the participants’ thoughts. People said they thought more about food, sleep, personal hygiene, social contact, time off, and (until about 5pm) coffee. Watching TV, checking email and other forms of media use also won out over sex for the entire day. In fact, sex only became a predominant thought towards the end of the day (around midnight), and even then it was firmly in second place, behind sleep.

Hoffman’s method is also contaminated by a white bear effect, though, because participants knew at some point during the day they’d be asked to record what they had been thinking about. This could lead to overestimating some thoughts. Alternatively, people may have felt embarrassed about admitting to having sexual thoughts throughout the day, and therefore underreported them.

So, although we can confidently dismiss the story that the average male thinks about sex every seven seconds, we can’t know with much certainty what the true frequency actually is. Probably it varies wildly between people, and within the same person depending on their circumstances, and this is further confounded by the fact that any effort to measure the number of someone’s thoughts risks changing those thoughts.

There’s also the tricky issue that thoughts have no natural unit of measurement. Thoughts aren’t like distances we can measure in centimetres, metres and kilometres. So what constitutes a thought, anyway? How big does it need to be to count? Have you had none, one or many while reading this? Plenty of things to think about!

This is a BBC Future column from last week. The original is here.

Spike activity 20-06-2014

Quick links from the past week in mind and brain news:

OK Go’s new music video is like standing naked under a waterfall of optical illusions while wearing hipster spectacles.

The mighty Neurocritic looks at advances in physical brain tweaking and the possible rebirth of paradise engineering.

The Dana Foundation has an excellent piece on how to make sense of those ‘gene for’ behavioural genetics stories in the media.

Slow news day: The New York Times reports my killer robot opinions. Sadly the key quote (“To the bunkers if you want any chance of saving yourselves from the coming robotocalypse. RUN, RUN FOR YOUR LIVES!”) was omitted.

PsyPost reports on a new study finding that the ‘trophy wife’ stereotype is largely a myth, because not even good looks can break the class barrier.

Watching porn won’t shrink your brain. Just makes you a bit sore. Brain Watch comments on a widely misreported recent study.

The Atlantic has a great piece on five neurology patients who changed the way we think about the brain.

There’s an excellent article about maternal mental health in The New York Times.

Simon says Psychosis! is an excellent new mini-documentary on the first experience of psychosis and early intervention services.

The ever-interesting neuroscientist Molly Crockett is featured in this Wellcome Trust focus on scientists’ working days.

A peek inside The Skeleton Cupboard

You’ll get more out of The Skeleton Cupboard, Tanya Byron’s account of her training as a clinical psychologist, if you read the epilogue first.

It tells you that the patients described in the book are fictional, to preserve confidentiality, but indicates that the stories were representative of real situations.

This is a common device in clinical memoirs, from Irvin Yalom’s existential tales of psychotherapy to Philippa Perry’s Couch Fiction, but I’m never quite sure what to make of these clinical quasi-biographies.

They are usually realistic, insightful and wonderful to read (Byron’s book is no exception), but the smudged line between truth and necessary fiction is sometimes hard to navigate.

In Byron’s case, her book is perhaps the most deliberately autobiographical in the genre, where she intends to reflect the role of the psychologist’s own psychology in working with distressed, impaired, and sometimes difficult individuals.

This is part of what clinical psychologists aim to do – understand how your own reactions are colouring your approach to the patient – but when the patients are literary collages of real people, it is perhaps the process rather than the content of those reflections that is most informative.

From this perspective, The Skeleton Cupboard is best understood as an illustrated history of ‘how my thinking evolved as a clinician’ rather than a journal of patients past, although we assume the non-clinical parts are factual: the hard-boiled supervisor, the misjudged snogging of a psychiatrist, the friends through good times and bad.

Byron is Britain’s best ambassador for clinical psychology and a very good writer to boot, and I’m sure The Skeleton Cupboard will prompt many to take up the profession or inspire them during their training. It’s also a good account of how thinking and practice evolve through first contact with patients.

It has some artistic licence, maybe even melodrama in places, but it has some points of emotional truth that are hard to deny.

Link to more details of The Skeleton Cupboard.

Nostalgia: Why it is good for you

The past is not just a foreign country, but also one we are all exiled from. Like all exiles, we sometimes long to return. That longing is called nostalgia.

Whether it is triggered by a photograph, a first kiss or a treasured possession, nostalgia evokes a particular sense of time or place. We all know the feeling: a sweet sadness for what is gone, in colours that are invariably sepia-toned, rose-tinted, or stained with evening sunlight.

The term “nostalgia” was coined by Swiss physicians in the late 1600s to signify a certain kind of homesickness among soldiers. Nowadays we know it encompasses more than just homesickness (or indeed Swiss soldiers), and if we take nostalgia too far it becomes mawkish or indulgent.

But, perhaps, it has some function beyond mere sentimentality. A series of investigations by psychologist Constantine Sedikides suggest nostalgia may act as a resource that we can draw on to connect to other people and events, so that we can move forward with less fear and greater purpose.

Sedikides was inspired by something called Terror Management Theory (TMT), which is approximately 8,000 times sexier than most theories in psychology, and posits that a primary psychological need for humans is to deal with the inevitability of our own deaths. The roots of this theory are in the psychoanalytic tradition of Sigmund Freud, making the theory a bit different from many modern psychological theories, which draw on more mundane inspirations, such as considering the mind as a computer.

Experiments published in 2008 used a standard way to test Terror Management Theory: asking participants to think about their own deaths, answering questions such as: “Briefly describe the emotions that the thought of your own death arouses in you.” (A control group was asked to think about dental pain, something unpleasant, but not existentially threatening.)

TMT suggests that one response to thinking about death is to cling more strongly to the view that life has some wider meaning, so after their intervention they asked participants to indicate their agreement with statements such as: “Life has no meaning or purpose”, or “All strivings in life are futile and absurd”. From the answers they positioned participants on a scale of how strongly they felt life had meaning.

The responses were influenced by how prone people were to nostalgia. The researchers found that reminding participants of their own deaths was likely to increase feelings of meaninglessness, but only in those who reported that they were less likely to indulge in nostalgia. Participants who rated themselves as more likely than average to have nostalgic thoughts weren’t affected by negative thoughts about their mortality (they rated life as highly meaningful, just like the control group).

Follow-up experiments suggest that people prone to nostalgia were less likely to have lingering thoughts about death, as well as less likely to be vulnerable to feelings of loneliness. Nostalgia, according to this view, is very different from a weakness or indulgence. The researchers call it a “meaning providing resource”, a vital part of mental health. Nostalgia acts as a store of positive emotions in memory, something we can access consciously, and perhaps also draw on continuously during our daily lives to bolster our feelings. It’s these strong feelings for our past that help us cope better with our future.

Thanks to Jules Hall for suggesting the topic of nostalgia. If you have an everyday psychological phenomenon you’d like to see written about in these columns please get in touch @tomstafford or ideas@idiolect.org.uk

This was my BBC Future column from last week. The original is here.

Spike activity 06-06-2014

Quick links from the past week in mind and brain news:

Psychedelic chemist, godfather of Ecstasy, and lover of phenethylamines, Alexander Shulgin, has left the building. PhysOrg has an obituary.

New Republic looks back at 50 years of the landmark account of psychosis ‘I Never Promised You a Rose Garden’.

The US Secret Service wants a sarcasm detection tool for Twitter reports The Telegraph. Their irony detection tool is apparently still switched off.

Aeon Magazine has a piece on how artificial intelligence is being used to develop the first generation of sex robots. Voight-Kampff plugin for Tinder coming soon.

British folk: Now that BBC Future is available to people in the country it is based in, do check out its large cache of excellent psychology and neuroscience articles.

Mosaic has an extensive article on the US Military’s interest in boosting the brain by passing small electrical currents through it.

Go check out this excellent piece on ‘mirror neurons’ and what they’re likely to be actually doing from Nautilus magazine.

Advances in the History of Psychology blog has an interesting piece on how Little Albert may not have been correctly identified after all.

How to Criticize with Kindness: Philosopher of Mind Daniel Dennett brings some wisdom and describes the four steps to arguing intelligently over at Brain Pickings.

The Economist has a great interview with risk psychologist Gerd Gigerenzer.

A festival of anxious art

If you’re in London during June, the Anxiety Arts Festival is a surprisingly diverse and interesting series of events that looks at anxiety through film, theatre and visual arts.

The festival is being curated by the Mental Health Foundation who have put together a genuinely exciting programme that avoids the curse of constant niceness and goes into some quite challenging areas.

Highlights include the darkly comic play Non-stop Exotic Anxiety, Ian Curtis and Joy Division biopic Control, South London Gallery exhibition The Military Industrial Complex on consensual reality, the irrepressible CoolTan Arts event Mad Hatters Tea Party, and Hearing Things – a theatre production of improvised scenes with mental health service users, professionals, and professional actors.

There are masses more events and it’s one not to miss.

Link to Anxiety Festival.

Happy Birthday Tetris!

Released on 6th of June 1984, Tetris is 30 years old today. Here’s a video where I try and explain something of the psychology of Tetris:

All credit for the graphics to Andrew Twist. What I say in the video is based on an article I wrote a while back for BBC Future.

As well as hijacking the minds and twitchy fingers of puzzle-gamers for 30 years, Tetris has also been involved in some important psychological research.

My favourite is Kirsh and Maglio’s work on “epistemic action”, which showed how Tetris players prefer to rotate the blocks in the game world rather than mentally. This use of the world in synchrony with your mental representations is, I argue, part of what makes the game so immersive.

Other research has looked at whether Tetris’s hook on our visual imagery can be used to help people with PTSD flashbacks.

And don’t forget that Tetris was the control condition in Green and Bavelier’s now famous studies of how action video games can train visual attention.

In my own research I’ve used simple games to explore skill learning. John Lindstedt and Wayne Gray at Rensselaer Polytechnic Institute have been pursuing a parallel line looking at expertise in Tetris players.

I’m sure there are more examples – if you know of any research using Tetris, let me know. Happy Birthday Tetris!

Spike activity 30-05-2014

Quick links from the past week in mind and brain news:

If you’ve not been keeping up with the internet, there’s been a replication crisis hoedown and everyone’s had a go on the violin.

Political Science Replication had a good summary. Schnall’s reply, the rise of ‘negative psychology’ and a pointed response.

Military Plans To Test Brain Implants To Fight Mental Disorders reports NPR. If only there was some way to avoid traumatising people…

The BPS Research Digest has been hosting some amazing guest mind and brain writers and here’s an index to all their articles.

The Myth of Einstein’s Brain. Neuroskeptic has an excellent piece about how studies of his kidnapped brain don’t actually tell us much.

The Best Illusion of the Year contest has just announced its 2014 winners.

Spacetimemind is a new podcast with some good philosophy of mind material.

Neuroscientists win the 2014 Kavli Prize in neuroscience: Brenda Milner, John O’Keefe, and Marcus Raichle.

The Blind Woman Who Sees Rain, But Not Her Daughter’s Smile. Another fascinating piece from NPR.

Brain Watch asks ‘what happens if you apply electricity to the brain of a corpse?’ Don’t try this at home.

Philosopher fight in the New York Review of Books: Patricia Churchland and Colin McGinn on brains and minds and retorts like only philosophers can manage.

The best way to win an argument

How do you change someone’s mind if you think you are right and they are wrong? Psychology reveals the last thing to do is the tactic we usually resort to.

You are, I’m afraid to say, mistaken. The position you are taking makes no logical sense. Just listen up and I’ll be more than happy to elaborate on the many, many reasons why I’m right and you are wrong. Are you feeling ready to be convinced?

Whether the subject is climate change, the Middle East or forthcoming holiday plans, this is the approach many of us adopt when we try to convince others to change their minds. It’s also an approach that, more often than not, leads to the person on the receiving end hardening their existing position. Fortunately research suggests there is a better way – one that involves more listening, and less trying to bludgeon your opponent into submission.

A little over a decade ago Leonid Rozenblit and Frank Keil from Yale University suggested that in many instances people believe they understand how something works when in fact their understanding is superficial at best. They called this phenomenon “the illusion of explanatory depth”. They began by asking their study participants to rate how well they understood how things like flushing toilets, car speedometers and sewing machines worked, before asking them to explain what they understood and then answer questions on it. The effect they revealed was that, on average, people in the experiment rated their understanding as much worse after it had been put to the test.

What happens, argued the researchers, is that we mistake our familiarity with these things for the belief that we have a detailed understanding of how they work. Usually, nobody tests us and if we have any questions about them we can just take a look. Psychologists call this idea that humans have a tendency to take mental short cuts when making decisions or assessments the “cognitive miser” theory.

Why would we bother expending the effort to really understand things when we can get by without doing so? The interesting thing is that we manage to hide from ourselves exactly how shallow our understanding is.

It’s a phenomenon that will be familiar to anyone who has ever had to teach something. Usually, it only takes the first moments when you start to rehearse what you’ll say to explain a topic, or worse, the first student question, for you to realise that you don’t truly understand it. All over the world, teachers say to each other “I didn’t really understand this until I had to teach it”. Or as researcher and inventor Mark Changizi quipped: “I find that no matter how badly I teach I still learn something”.

Explain yourself

Research published last year on this illusion of understanding shows how the effect might be used to convince others they are wrong. The research team, led by Philip Fernbach, of the University of Colorado, reasoned that the phenomenon might hold as much for political understanding as for things like how toilets work. Perhaps, they figured, people who have strong political opinions would be more open to other viewpoints, if asked to explain exactly how they thought the policy they were advocating would bring about the effects they claimed it would.

Recruiting a sample of Americans via the internet, they polled participants on a set of contentious US policy issues, such as imposing sanctions on Iran, healthcare and approaches to carbon emissions. One group was asked to give their opinion and then provide reasons for why they held that view. This group got the opportunity to put their side of the issue, in the same way anyone in an argument or debate has a chance to argue their case.

Those in the second group did something subtly different. Rather than provide reasons, they were asked to explain how the policy they were advocating would work. They were asked to trace, step by step, from start to finish, the causal path from the policy to the effects it was supposed to have.

The results were clear. People who provided reasons remained as convinced of their positions as they had been before the experiment. Those who were asked to provide explanations softened their views, and reported a correspondingly larger drop in how they rated their understanding of the issues. People who had previously been strongly for or against carbon emissions trading, for example, tended to become more moderate – ranking themselves as less certain in their support or opposition to the policy.

So this is something worth bearing in mind next time you’re trying to convince a friend that we should build more nuclear power stations, that the collapse of capitalism is inevitable, or that dinosaurs co-existed with humans 10,000 years ago. Just remember, however, there’s a chance you might need to be able to explain precisely why you think you are correct. Otherwise you might end up being the one who changes their mind.

This is my BBC Future column from last week. The original is here.

The day video games ate my school child

The BBC is reporting that a UK teachers union “is calling for urgent action over the impact of modern technology on children’s ability to learn” and that “some pupils were unable to concentrate or socialise properly” due to what they perceive as ‘over-use’ of digital technology.

Thanks to evidence reviewed by neuroscientist Kathryn Mills in a recent paper (pdf), we know that we’ve really got no reason to worry about technology having adverse effects on kids’ brains.

It may not be that the teachers’ union is completely mistaken, however. They may be on to something – just not what they think they’re on to.

To make sense of the confusion, you need to check out an elegant study completed by psychologists Robert Weis and Brittany Cerankosky who decided to test the psychological effects of giving young boys video game consoles.

They asked for families to take part who did not have a video-game system already in their home, had a parent interested in purchasing a system for their use, and where the kid had no history of developmental, behavioural, medical, or learning problems.

They ran a randomised controlled trial, or RCT, where 6 to 9-year-old boys were first given neuropsychological tests to measure their cognitive abilities (memory, concentration and problem-solving) and then randomly assigned to get a video game console.

The families in the control group were promised a console at the end of the study, by the way, so they didn’t think ‘oh sod it’ and go and buy one anyway.

So, we have half the kids with a spanking new console, and, as part of the trial, the amount of time kids spent gaming and doing their school work was measured throughout, as was reporting of any behavioural problems. At the end of the study their academic progress was measured and their cognitive abilities were tested again.

The results were clear: kids who got video game consoles were worse off academically compared to their non-console-owning peers – their progress in reading and writing had suffered.

But this wasn’t due to an impact on their concentration, memory, problem-solving or behaviour – their neuropsychological and social performance was completely unaffected.

By looking at how much time the kids spent on the consoles, they found that reduced academic performance was due to the fact that kids in the console-owning families started spending less time doing their homework.

In other words, if your kids play a lot of computer games instead of doing homework they may well appear worse off, and from the teachers’ point-of-view, might seem a little slowed-down compared to their peers, but this is not due to cognitive changes.

Interestingly, teachers may not be in the best position to see this distinction very well because they tend, like the rest of us, to measure ability by performance in the tasks they set and not in comparison to neuropsychological test performance.

The solution is not to panic about technology, as the same conclusion probably applies to anything that displaces homework (too many piano lessons would have the same effect), but good parental management of out-of-school time is clearly important.

Link to locked study on the effects of video games.
