Why bad news dominates the headlines

Why are newspapers and TV broadcasts filled with disaster, corruption and incompetence? It may be because we’re drawn to depressing stories without realising, says psychologist Tom Stafford.

When you read the news, sometimes it can feel like the only things reported are terrible, depressing events. Why does the media concentrate on the bad things in life, rather than the good? And what might this depressing slant say about us, the audience?

It isn’t that these are the only things that happen. Perhaps journalists are drawn to reporting bad news because sudden disaster is more compelling than slow improvements. Or it could be that newsgatherers believe that cynical reports of corrupt politicians or unfortunate events make for simpler stories. But another strong possibility is that we, the readers or viewers, have trained journalists to focus on these things. People often say that they would prefer good news: but is that actually true?

To explore this possibility, researchers Marc Trussler and Stuart Soroka set up an experiment, run at McGill University in Canada. They were dissatisfied with previous research on how people relate to the news – either the studies were uncontrolled (letting people browse news at home, for example, where you can’t even tell who is using the computer), or they were unrealistic (inviting them to select stories in the lab, where every participant knew their choices would be closely watched by the experimenter). So, the team decided to try a new strategy: deception.

 

Trick question

Trussler and Soroka invited participants from their university to come to the lab for “a study of eye tracking”. The volunteers were first asked to select some stories about politics to read from a news website so that a camera could make some baseline eye-tracking measures. It was important, they were told, that they actually read the articles, so the right measurements could be prepared, but it didn’t matter what they read.

After this ‘preparation’ phase, they watched a short video (the main purpose of the experiment as far as the subjects were concerned, but it was in fact just a filler task), and then they answered questions on the kind of political news they would like to read.

The results of the experiment, as well as the stories that were read most, were somewhat depressing. Participants often chose stories with a negative tone – corruption, set-backs, hypocrisy and so on – rather than neutral or positive stories. People who were more interested in current affairs and politics were particularly likely to choose the bad news.

And yet when asked, these people said they preferred good news. On average, they said that the media was too focussed on negative stories.

 

Danger reaction

The researchers present their experiment as solid evidence of a so-called “negativity bias”, psychologists’ term for our collective hunger to hear and remember bad news.

It isn’t just schadenfreude, the theory goes, but that we’ve evolved to react quickly to potential threats. Bad news could be a signal that we need to change what we’re doing to avoid danger.

As you’d expect from this theory, there’s some evidence that people respond quicker to negative words. In lab experiments, flash the word “cancer”, “bomb” or “war” up at someone and they can hit a button in response quicker than if that word is “baby”, “smile” or “fun” (despite these pleasant words being slightly more common). We are also able to recognise negative words faster than positive words, and even tell that a word is going to be unpleasant before we can tell exactly what the word is going to be.

So is our vigilance for threats the only way to explain our predilection for bad news? Perhaps not.

There’s another interpretation that Trussler and Soroka put on their evidence: we pay attention to bad news, because on the whole, we think the world is rosier than it actually is. When it comes to our own lives, most of us believe we’re better than average, and, like the cliché says, we expect things to be all right in the end. This pleasant view of the world makes bad news all the more surprising and salient. It is only against a light background that the dark spots are highlighted.

So our attraction to bad news may be more complex than just journalistic cynicism or a hunger springing from the darkness within.

And that, on another bad news day, gives me a little bit of hope for humanity.

Are women and men forever destined to think differently?

By Tom Stafford, University of Sheffield

The headlines

The Australian: Male and female brains still unequal

The International Institute for Applied Systems Analysis: Gender disparities in cognition will not diminish

The Economist: A variation in the cognitive abilities of the two sexes may be more about social development than gender stereotypes

The story

Everybody has an opinion on men, women and the difference (or not) between them. Now a new study has used a massive and long-running European survey to investigate how differences in cognitive ability are changing. This is super smart, because it offers us an escape from arguing about whether men and women are different in how they think, allowing us some insight into how any such differences might develop.

What they actually did

Researchers led by Daniela Weber at Austria’s International Institute for Applied Systems Analysis analysed data collected as part of the European Survey of Health, Ageing and Retirement. The data analysed in this study came from approximately 31,000 adults, men and women, all aged over 50. As well as answering demographic questions, the survey participants took short quizzes which tested their memory, numeracy and verbal fluency (this last item involved a classic test which asks people to name as many animals as they can in 60 seconds). Alongside each test score, we have the year the participant was born, as well as measures of gender equality and economic development for the country where they grew up.

What they found

The results show that as a country develops economically, the differences in cognitive ability between men and women change. But the pattern isn’t straightforward. Differences in verbal fluency disappear (so that an advantage on this test for men born in the 1920s over women is not found for those born in the 1950s). Differences in numeracy diminish (so the male advantage is less) and differences in memory actually increase (so that a female advantage is accentuated).

Further analysis looked at how these differences in cognitive performance related to the amount of education men and women got. In all regions women tended to have fewer years of education, on average, than men. But, importantly, the size of this difference varied. This allowed the researchers to gauge how differences in education affected cognitive performance.

For all three abilities tested, there was a relationship between the size of the differences in the amount of education and the size of the difference in cognitive performance: fewer years of education for women was associated with worse scores for women, as you’d expect.

What varied across the three abilities was the researchers’ prediction for the scenario in which men and women spent an equal amount of time in education: for memory this scenario was associated with a distinct female advantage, for numeracy a male advantage, and for verbal fluency there was no difference.

What this means

The thing that dogs studies on gender differences in cognition is the question of why these differences exist. People have such strong expectations that they often leap to the assumption that any observed difference must reflect something fundamental about men versus women. Consider the example of the Australian newspaper which headlined its take on this story as telling us something about “male and female brains”, the implication being that the inequality was a fundamental, biological difference. In fact, research often shows that gender differences in cognitive performance are small, and even then we don’t know why these differences exist.

The great thing about this study is that by looking at how gender differences evolve over time it promises insight into what drives those differences in the first place. The fact that the female memory advantage increases as women are allowed more access to education is, on the face of it, suggestive evidence that at least one cognitive difference between men and women may be unleashed by more equal societies, rather than removed by them.

Tom’s take

The most important thing to take from this research is – as the authors report – increasing gender equality disproportionately benefits women. This is because – no surprise! – gender inequality disproportionately disadvantages women. Even in the area of cognitive performance, this historical denial of opportunities, health and education to women means, at a population level, they have more potential to increase their scores on these tests.

Along with other research on things like IQ, this study found systematic improvements in cognitive performance across time for both men and women – as everyone’s opportunities and health improve, so does their cognitive function.

But the provocative suggestion of this study is that as societies develop we won’t necessarily see all gender differences go away. Some cognitive differences may actually increase when women are at less of a disadvantage.

You shouldn’t leap to conclusions based on one study, but this is a neat contribution. One caveat is that even though indices such as “years in education” show diminished gender inequality in Europe, you’d be a fool to think that societies which educated men and women for an equal number of years treated them both equally and put equal expectations on them.

Even if you thought this was true for 2014, you wouldn’t think this was true for European societies of the 1950s (when the youngest of these study participants were growing up). There could be very strong societal influences on cognitive ability – such as expecting women to be good with words and bad with numbers – that simply aren’t captured by the data analysed here.

Personally, I find it interesting to observe how keen people are to seize on such evidence that “essential” gender differences definitely do exist (despite the known confounds of living in a sexist society). My preferred strategy would be to withhold judgement and focus on remaking the definitely sexist society. For certain, we’ll only get the truth when we have an account of how cognitive abilities develop within both biological and social contexts. Studies like this point the way, and suggest that whatever the truth is, it should have some surprises for everyone.

Read more

The original research: The changing face of cognitive gender differences in Europe

My previous column on gender differences: Are men better wired to read maps or is it a tired cliché?

Cordelia Fine’s book, Delusions of gender: how our minds, society, and neuro-sexism create difference

The Conversation

This article was originally published on The Conversation.
Read the original article.

Shuffle Your Mind: Short Film Screenings

If you’re around in London Saturday 2nd August I’m curating a showing of short films about psychosis, hallucinations and mental health as part of the fantastic Shuffle Festival.

The films include everything from a first-person view of voice hearing, to out-of-step behaviour in the urban sprawl, to a free-diver’s deep sea hallucinations.

There will be a discussion after the showing with film-makers and first-person visionaries about the challenges of depicting altered minds, other inner worlds and the limits of mental health.

Tickets are free but you have to book as there are only 40 seats.

If you want to join us, find the event on this page (which doesn’t list all the films, so prepare for some surprises) and click to book.

Seeing ourselves through the eyes of the machine

I’ve got an article in The Observer about how our inventions have profoundly shaped how we view ourselves because we’ve traditionally looked to technology for metaphors of human nature.

We tend to think that we understand ourselves and then create technologies to take advantage of that new knowledge but it usually happens the other way round – we invent something new and then use that as a metaphor to explain the mind and brain.

As history has moved on, the mind has been variously explained in terms of wax tablets, a house with many rooms, pressures and fluids, phonograph recordings, telegraph signalling, and computing.

The idea that these are metaphors sometimes gets lost which, in some ways, is quite worrying.

It could be that we’ve reached “the end of history” as far as neuroscience goes and that everything we’ll ever say about the brain will be based on our current “brain as calculation” metaphors. But if this is not the case, there is a danger that we’ll sideline aspects of human nature that don’t easily fit the concept. Our subjective experience, emotions and the constantly varying awareness of our own minds have traditionally been much harder to understand as forms of “information processing”. Importantly, these aspects of mental life are exactly where things tend to go awry in mental illness, and it may be that our main approach for understanding the mind and brain is insufficient for tackling problems such as depression and psychosis. It could be we simply need more time with our current concepts, but history might show us that our destiny lies in another metaphor, perhaps from a future technology.

I mention Douwe Draaisma’s book Metaphors of Memory in the article but I also really recommend Alison Winter’s book Memory: Fragments of a Modern History which also covers the fascinating interaction between technological developments and how we understand ourselves.

You can read my full article at the link below.
 

Link to article in The Observer.

Awaiting a theory of neural weather

In a recent New York Times editorial, psychologist Gary Marcus noted that neuroscience is still awaiting a ‘bridging’ theory that elegantly connects neuroscience with psychology.

This reflects a common belief in cognitive science that there is a ‘missing law’ to be discovered that will tell us how mind and brain are linked – but it is quite possible there just isn’t one to be discovered.

Marcus, not arguing for the theory himself, describes it when he writes:

What we are really looking for is a bridge, some way of connecting two separate scientific languages — those of neuroscience and psychology.

Such bridges don’t come easily or often, maybe once in a generation, but when they do arrive, they can change everything. An example is the discovery of DNA, which allowed us to understand how genetic information could be represented and replicated in a physical structure. In one stroke, this bridge transformed biology from a mystery — in which the physical basis of life was almost entirely unknown — into a tractable if challenging set of problems, such as sequencing genes, working out the proteins that they encode and discerning the circumstances that govern their distribution in the body.

Neuroscience awaits a similar breakthrough. We know that there must be some lawful relation between assemblies of neurons and the elements of thought, but we are currently at a loss to describe those laws.

The idea of a DNA-like missing component that will allow us to connect theories of psychology and neuroscience is an attractive one, but it is equally likely that the connection between mind and brain is more like the relationship between molecular interactions and the weather.

In this case, there is no ‘special theory’ that connects weather to molecules because different atmospheric phenomena are understood in multiple ways and across multiple models, each of which has a differing relationship to the scale at which the physical data is understood – as fluid flows, statistical models, atomic interactions and so on.

In explanatory terms, ‘psychology’ is probably a lot like the weather. The idea of there being a ‘psychological level’ is a human concept and its conceptual components won’t neatly relate to neural function in a uniform way.

Some functions will have much more direct relationships – like basic sensory information and its representation in the brain’s ‘sensotopic maps’. A good example might be how visual information in space is represented in an equivalent retinotopic map in the brain.

Other functions will have much more indirect relationships, in great part because of how we define ‘functions’. Some have very empirical definitions – take iconic memory – whereas others will be cultural or folk concepts – think vicarious embarrassment or nostalgia.

So it’s unlikely we’re going to find an all-purpose theoretical bridge to connect psychology and neuroscience. Instead, we’ll probably end up with what Kenneth Kendler calls ‘patchy reductionism’ – making pragmatic links between mind and brain where possible using a variety of theories and descriptions.

A search for a general ‘bridging theory’ may be a fruitless one.
 

Link to NYT piece ‘The Trouble With Brain Science’.

Out on a limb too many

Two neuropsychologists have written a fascinating review article about the desire to amputate a perfectly healthy limb, known variously as apotemnophilia, xenomelia or body integrity identity disorder.

The article is published in the journal Neuropsychiatric Disease and Treatment, although some who have these desires would probably disagree that it is a disease or disorder and are more likely to compare it to something akin to being transgender.

The article also discusses the two main themes in the research literature: an association with a sexual fetish for limb amputation (most associated with the use of the name apotemnophilia) and an alteration in body image linked to differences in the function of the parietal lobe in the brain (most associated with the use of the name xenomelia).

It’s a fascinating review of what we know about this under-recognised form of human experience but it also has an interesting snippet about how this desire first came to light not in the scientific literature, but in the letters page of Penthouse magazine:

A first description of this condition traces back to a series of letters published in 1972 in the magazine Penthouse. These letters were from erotically-obsessed persons who wanted to become amputees themselves. However, the first scientific report of this desire only appeared in 1977: Money et al described two cases who had intense desire toward amputation of a healthy limb. Another milestone was a 2005 study by Michael First, an American psychiatrist, who published the first systematic attempt to describe individuals who desire amputation of a healthy limb. Thanks to this survey, which included 52 volunteers, a number of key features of the condition are identified: gender prevalence (most individuals are men), side preference (left-sided amputations are most frequently desired), and finally, a preference toward amputation of the leg versus the arm.

The review also discusses a potentially related experience which has recently been reported – the desire to be paralysed.

If you want a more journalistic account, Matter published an extensive piece on the condition last year.
 

Link to scientific review article on apotemnophilia / xenomelia.
Link to Matter article.

Towards a scientifically unified therapy

Today’s edition of Nature has an excellent article on the need to apply cognitive science to understanding how psychological therapies work.

Psychological therapies are often called ‘talking treatments’ but this is a misleading name. Talking is essential, but it’s not where most of the change happens.

Like seeing a personal trainer in the gym, communication is key, but it’s the exercise which accounts for the changes.

In the same way, psychological therapy is only as effective as the experience of putting changes into practice, but we still know relatively little about the cognitive science behind this process.

Unfortunately, there is a traditional but unhelpful divide in psychology where some don’t see any sort of emotional problem as biological in any way, and the contrasting divide in psychiatry where biology is considered the only explanation in town.

The article in Nature argues that this is pointless and counter-productive:

It is time to use science to advance the psychological, not just the pharmaceutical, treatment of those with mental-health problems. Great strides can and must be made by focusing on concerns that are common to fields from psychology, psychiatry and pharmacology to genetics and molecular biology, neurology, neuroscience, cognitive and social sciences, computer science, and mathematics. Molecular and theoretical scientists need to engage with the challenges that face the clinical scientists who develop and deliver psychological treatments, and who evaluate their outcomes. And clinicians need to get involved in experimental science. Patients, mental-health-care providers and researchers of all stripes stand to benefit.

The piece tackles many good examples of why this is the case and sets out three steps for bridging the divide.

Essential reading.
 

Link to ‘Psychological treatments: A call for mental-health science’.

Why do we bite our nails?

It can ruin the appearance of your hands, could be unhygienic and can hurt if you take it too far. So why do people do it? Biter Tom Stafford investigates

What do ex-British prime minister Gordon Brown, Jackie Onassis, Britney Spears and I all have in common? We all are (or were) nail biters.

It’s not a habit I’m proud of. It’s pretty disgusting for other people to watch, ruins the appearance of my hands, is probably unhygienic and sometimes hurts if I take it too far. I’ve tried to quit many times, but have never managed to keep it up.

Lately I’ve been wondering what makes someone an inveterate nail-biter like me. Are we weaker willed? More neurotic? Hungrier? Perhaps, somewhere in the annals of psychological research there could be an answer to my question, and maybe even hints about how to cure myself of this unsavoury habit.

My first dip into the literature shows up the medical name for excessive nail biting: ‘onychophagia’. Psychiatrists classify it as an impulse control problem, alongside things like obsessive compulsive disorder. But this is for extreme cases, where psychiatric help is beneficial, as with other excessive grooming habits like skin picking or hair pulling. I’m not at that stage, falling instead among the majority of nail biters who carry on the habit without serious side effects. Up to 45% of teenagers bite their nails, for example; teenagers may be a handful but you wouldn’t argue that nearly half of them need medical intervention. I want to understand the ‘subclinical’ side of the phenomenon – nail biting that isn’t a major problem, but still enough of an issue for me to want to be rid of it.

It’s mother’s fault

Psychotherapists have had some theories about nail biting, of course. Sigmund Freud blamed it on arrested psycho-sexual development, at the oral stage (of course). Typical to Freudian theories, oral fixation is linked to myriad causes, such as under-feeding or over-feeding, breast-feeding too long, or problematic relationship with your mother. It also has a grab-bag of resulting symptoms: nail biting, of course, but also a sarcastic personality, smoking, alcoholism and love of oral sex. Other therapists have suggested nail-biting may be due to inward hostility – it is a form of self-mutilation after all – or nervous anxiety.

Like most psychodynamic theories these explanations could be true, but there’s no particular reason to believe they should be true. Most importantly for me, they don’t have any strong suggestions on how to cure myself of the habit. I’ve kind of missed the boat as far as extent of breast-feeding goes, and I bite my nails even when I’m at my most relaxed, so there doesn’t seem to be an easy fix there either. Needless to say, there’s no evidence that treatments based on these theories have any special success.

Unfortunately, after these speculations, the trail goes cold. A search of the scientific literature reveals only a handful of studies on the treatment of nail-biting. One reports that any treatment which made people more aware of the habit seemed to help, but beyond that there is little evidence to report on the habit. Indeed, several of the few articles on nail-biting open by commenting on the surprising lack of literature on the topic.

Creature of habit

Given this lack of prior scientific treatment, I feel free to speculate for myself. So, here is my theory on why people bite their nails, and how to treat it.

Let’s call it the ‘anti-theory’ theory. I propose that there is no special cause of nail biting – not breastfeeding, chronic anxiety or a lack of motherly love. The advantage of this move is that we don’t need to find a particular connection between me, Gordon, Jackie and Britney. Rather, I suggest, nail biting is just the result of a number of factors which – due to random variation – combine in some people to create a bad habit.

First off, there is the fact that putting your fingers in your mouth is an easy thing to do. It is one of the basic functions for feeding and grooming, and so it is controlled by some pretty fundamental brain circuitry, meaning it can quickly develop into an automatic reaction. Added to this, there is a ‘tidying up’ element to nail biting – keeping them short – which means in the short term at least it can be pleasurable, even if the bigger picture is that you end up tearing your fingers to shreds. This reward element, combined with the ease with which the behaviour can be carried out, means that it is easy for a habit to develop; apart from touching yourself in the genitals it is hard to think of a more immediate way to give yourself a small moment of pleasure, and biting your nails has the advantage of being OK at school. Once established, the habit can become routine – there are many situations in everyone’s daily life where you have both your hands and your mouth available to use.

Understanding nail-biting as a habit has a bleak message for a cure, unfortunately, since we know how hard bad habits can be to break. Most people, at least once per day, will lose concentration on not biting their nails.

Nail-biting, in my view, isn’t some revealing personality characteristic, nor a maladaptive echo of some useful evolutionary behaviour. It is the product of the shape of our bodies, how hand-to-mouth behaviour is built into (and rewarded in) our brains and the psychology of habit.

And, yes, I did bite my nails while writing this column. Sometimes even a good theory doesn’t help.

 

This was my BBC Future column from last week

The concept of stress, sponsored by Big Tobacco

NPR has an excellent piece on how the scientific concept of stress was massively promoted by tobacco companies who wanted an angle to market ‘relaxing’ cigarettes and a way for them to argue that it was stress, not cigarettes, that was to blame for heart disease and cancer.

They did this by funding, guiding and editing the work of renowned physiologist Hans Selye who essentially founded the modern concept of stress and whose links with Big Tobacco have been largely unknown.

For the past decade or so, [Public Health Professor Mark] Petticrew and a group of colleagues in London have been searching through millions of documents from the tobacco industry that were archived online in the late ’90s as part of a legal settlement with tobacco companies.

What they’ve discovered is that both Selye’s work and much of the work around Type A personality were profoundly influenced by cigarette manufacturers. They were interested in promoting the concept of stress because it allowed them to argue that it was stress — not cigarettes — that was to blame for heart disease and cancer.

“In the case of Selye they vetted … the content of the paper, they agreed the wording of papers,” says Petticrew, “tobacco industry lawyers actually influenced the content of his writings, they suggested to him things that he should comment on.”

They also, Petticrew says, spent a huge amount of money funding his research. All of this is significant, Petticrew says, because Selye’s influence over our ideas about stress are hard to overstate. It wasn’t just that Selye came up with the concept, but in his time he was a tremendously respected figure.

Despite the success of the campaign to associate smoking with stress relief, the idea that smoking alleviates anxiety is almost certainly wrong. It tends to just relieve anxiety-provoking withdrawal and quitting smoking reduces overall anxiety levels.

Although the NPR article focuses on Selye and his work on stress, another big name was recruited by Big Tobacco to promote their theories.

It’s still little known that psychologist Hans Eysenck took significant sums of cash from tobacco companies.

They paid for a lot of Eysenck’s research that tried to show that the relationship between lung cancer and smoking was not direct but was mediated by personality differences. There was also lots of other research arguing that a range of smoking related health problems were only present in certain personality types.

Tobacco companies wanted to fund this research to cite it in court cases where they were defending themselves against lung cancer sufferers. It was their personalities, rather than their 20-a-day habit, that was a key cause behind their imminent demise, they wanted to argue in court, and they needed ‘hard science’ to back it up. So they bought some.

However, the link between ‘father of stress’ Hans Selye and psychologist Hans Eysenck was not just that they were funded by the same people.

A study by Petticrew uncovered documents showing that both Selye and Eysenck appeared together in a 1977 tobacco industry promotional film where “the film’s message is quite clear without being obvious about it — a controversy exists concerning the etiologic role of cigarette smoking in cancer.”

The ‘false controversy’ PR tactic has since become solidified as a science-denier standard.
 

Link to The Secret History Behind The Science Of Stress from NPR.
Link to paper ‘Hans Selye and the Tobacco Industry’.

Spike activity 11-07-2014

Quick links from the past week in mind and brain news:

Your Brain Is On the Brink of Chaos. Nautilus has an interesting piece on chaos and the brain.

Neuroskeptic has a good Q&A with Zach Mainen, one of the originators of the NeuroFuture open letter demanding reform of the Human Brain Project.

There’s an open-access special issue on epilepsy in the latest edition of Nature.

The New York Times has a good piece on developments towards brain implants for cognitive enhancement.

Phantom limb pain tortures amputees and puzzles scientists. A man in Cambodia cycles round the country and treats it with mirrors. Excellent Mosaic Science piece.

Practical Ethics has an excellent piece on ‘tidying up psychiatry’.

Searching for the “Free Will” Neuron. Interesting piece from MIT Tech Review.

PLOS has launched a neuroscience channel.

Adults, like children, have a tendency to think vision is more informative than it is. Interesting piece from the BPS Research Digest on our understanding of what we understand through looking.

The Toast has what seems to be the first ever first-person account of Cotard’s delusion, the belief that you’re dead, in someone who experienced intense psychosis.

A thought lab in the sun

Neuroscientist Karl Friston, being an absolute champ, in an interview in The Lancet Psychiatry

“I get up very late, I go and smoke my pipe in the conservatory, hopefully in the sunshine with a nice cup of coffee, and have thoughts until I can raise the energy to have a bath. I don’t normally get to work until mid day.”

I have to say, I have a very similar approach which is getting up very early, drinking Red Bull, not having any thoughts, and raising the energy to catch a bus to an inpatient ward.

The man clearly doesn’t know the good life when he sees it.

The Lancet Psychiatry is one of the new speciality journals from the big names in medical publishing.

It seems to be publishing material from the correspondence and ‘insight’ sections (essays and the like) without a paywall, so there’s often plenty for the general reader to catch up on. It also has a podcast which is aimed at mental health professionals.
 

Link to interview with Karl Friston.

Motherhood, apple pie and replication

Who could possibly be against replication of research results? Jason Mitchell of Harvard University is, under some conditions, for reasons described in his essay On the emptiness of failed replications.

I wrote something for the Centre for Open Science which tries to draw out the sensible points in Mitchell’s essay – something I thought worth doing since for many people being against replication in science is like being against motherhood and apple pie. It’s worth noting that I was invited to do this by Brian Nosek, who is co-founder of the Center for Open Science and instrumental in the Many Labs projects. As such, Brian is implicitly one of the targets of Mitchell’s criticisms, so kudos to him for encouraging this discussion.

Here’s my commentary: What Jason Mitchell’s ‘On the emptiness of failed replications’ gets right

Memories of ‘hands on’ sex therapy

There’s an amusing passage in Andrew Solomon’s book Far From the Tree where he recounts his own experience of a curious attempt at surrogate partner therapy – a type of sex therapy where a ‘stand in’ partner engages in sexual activity with the client to help overcome sexual difficulties.

In Solomon’s case, he was a young gay man still confused about his sexuality who signed himself up to a cut-price clinic to try and awaken any possibility of ‘hidden heterosexual urges’.

It’s a curious historical snapshot, presumably from the early 1980s, but also quite funny as Solomon dryly recounts the futile experience.

When I was nineteen, I read an ad in the back of New York magazine that offered surrogate therapy for people who had issues with sex. I still believed the problem of whom I wanted was subsidiary to the problem of whom I didn’t want. I knew the back of a magazine was not a good place to find treatment, but my condition was too embarrassing to reveal to anyone who knew me.

Taking my savings to a walk-up office in Hell’s Kitchen, I subjected myself to long conversations about my sexual anxieties, unable to admit to myself or the so-called therapist that I was actually just not interested in women. I didn’t mention the busy sexual life I had by this time with men. I began “counselling” with people I was encouraged to call “doctors,” who would prescribe “exercises” with my “surrogates” – women who were not exactly prostitutes but who were also not exactly anything else.

In one protocol, I had to crawl around naked on all fours pretending to be a dog while the surrogate pretended to be a cat; the metaphor of enacting intimacy between mutually averse species is more loaded than I noticed at the time. I became curiously fond of these women, one of whom, an attractive blonde from the Deep South, eventually told me she was a necrophiliac and had taken this job after she got into trouble down the morgue.

You were supposed to keep switching girls so your ease was not limited to one sexual partner; I remember the first time a Puerto Rican woman climbed on top of me and began to bounce up and down, crying ecstatically, “You’re in me! You’re in me!” and how I lay there wondering with anxious boredom whether I had finally achieved the prize and become a qualified heterosexual.

Surrogate partner therapy is still used for a variety of sexual difficulties, although only fringe clinics now use it for pointless ‘gay conversion therapy’.

Although it is clearly in line with good psychological principles of experiential therapy, it has been quite controversial because of fears about surrogates being, as Solomon says, “not exactly prostitutes”, along with some well-founded ethical concerns.

In the UK, the first bona fide clinic that used surrogate partner therapy was started in the 1970s and run by the sexologist Martin Cole – who was best known to the British public by his actually rather wonderful tabloid nickname Sex King Cole.

He spent several decades scandalising the establishment with his campaign for open and direct sex education and unstigmatised treatment of sexual dysfunction.

You can see the extent to which he rattled the self-appointed defenders of English morality from his mentions in parliamentary speeches made by concerned MPs, who retold second-hand tales of scandal supposedly from Cole’s clinics.

This 1972 speech by MP Jill Knight veers from the melodramatic to the farcical as she describes how a sex surrogate “was with a client when a thunderous knocking occurred on the door and the glass panels in the door revealed a blue-clad figure topped by a policeman’s helmet. She knew at once that it was her fiance, who happened to be a policeman.”

If you want an up-to-date and level-headed discussion of surrogate partner therapy, an article by sex researcher Petra Boynton is a good place to start, and it’s something we’ve covered previously on Mind Hacks.

As for Cole, The Independent tracked him down, still working, in 1993, and wrote a somewhat wry profile of him.

A cultural view of agony

New Statesman has a fascinating article on the ‘cultural history of pain’ that tracks how our ideas about pain and suffering have radically changed through the years.

One of the most interesting, and worrying, themes is how there have been lots of cultural beliefs about whether certain groups are more or less sensitive to pain.

Needless to say, these beliefs tended to justify existing prejudices rather than stem from any sound evidence.

Some speculated whether the availability of anaesthetics and analgesics had an effect on people’s ability (as well as willingness) to cope with acute affliction. Writing in the 1930s, the distinguished pain surgeon René Leriche argued fervently that Europeans had become more sensitive to pain. Unlike earlier in the century, he claimed, modern patients “would not have allowed us to cut even a centimetre . . . without administering an anaesthetic”. This was not due to any decline of moral fibre, Leriche added: rather, it was a sign of a “nervous system differently developed, and more sensitive”.

Other physicians and scientists of the 19th and early 20th centuries wanted to complicate the picture by making a distinction between pain perception and pain reaction. But this distinction was used to denigrate “outsider” groups even further. Their alleged insensitivity to pain was proof of their humble status – yet when they did exhibit pain reactions, their sensitivity was called “exaggerated” or “hysterical” and therefore seen as more evidence of their inferiority.

 

Link to New Statesman article (via @SarahRoseCrook)

Do we really hate thinking so much we’d electrocute ourselves rather than do it?

By Tom Stafford, University of Sheffield

The headlines

The Guardian: Shocking but true: students prefer jolt of pain than being made to sit and think

Nature: We dislike being alone with our thoughts

Washington Post: Most men would rather shock themselves than be alone with their thoughts

 

The story

Quiet contemplation is so awful that when deprived of the distractions of noise, crowds or smart phones, a bunch of students would rather give themselves electric shocks than sit and think.

 

What they actually did

Psychologists from the universities of Virginia and Harvard in the US carried out a series of 11 studies in which participants – including students and non-students – were left in an unadorned room for six to 15 minutes and asked to “spend time entertaining themselves with their thoughts.” Both groups, and men and women equally, were unable to enjoy this task. Most said they found it difficult to concentrate and that their minds wandered.

In one of the studies, participants were given the option to give themselves an electric shock, for no given reason or reward. Many did, including the majority of male participants, despite the fact that the vast majority of participants had previously rated the shocks as unpleasant and said they would pay to avoid them.

 

How plausible is this?

This is a clever, provocative piece of research. The results are almost certainly reliable; the authors, some of whom are extremely distinguished, found the same basic effect across all 11 studies – namely, that being asked to sit and think wasn’t enjoyable. The data from the studies are also freely available, so there’s no chance of statistical jiggery-pokery. This is a real effect. The questions, then, are over what exactly the finding means.

 

Tom’s take

Contrary to what some reporters have implied, this result isn’t just about students – non-students also found being made to sit and think aversive, and this didn’t differ with age. And it isn’t just about men – women generally found the experience just as unpleasant. The key result is that being made to sit and think is unpleasant, so let’s look at this first before thinking about the shocks.

The results fit with research on sensory deprivation from 50 years ago. Paradoxically, when there are no distractions people find it hard to concentrate. It seems that for most of us, most of the time, our minds need stimulation, interaction with the environment, or at least a task in order to function enjoyably. Thinking is an active process which involves the world – a far cry from some ideals of “pure thought”.

What the result certainly doesn’t mean, despite the interpretation given by some people – including one author of the study – is that people don’t like thinking. Rather, it’s fair to say that people don’t like being forced to do nothing but think.

It’s possible that there is a White Bear Effect here – also known as the ironic process theory. Famously, if you’re told to think of anything except a white bear, you can’t help but think about a white bear. If you imagine the circumstances of these studies, participants were told they had to sit in their chairs and just think. No singing, no exploring, no exercises. Wouldn’t that make you spend your time (unpleasantly) ruminating on what you couldn’t do?

In this context, are the shocks really so surprising? The shocks were very mild. The participants rated them as unpleasant when they were instructed to shock themselves, but we all know that there’s a big difference between having something done to you (or being told to do something) and choosing to do it yourself.

Although many participants chose to shock themselves, I wouldn’t say they were avoiding thinking – rather they were thinking about what it would be like to get another shock. One participant shocked himself 190 times. Perhaps he was exploring how he could learn to cope with the discomfort. Curiosity and exploration are hallmarks of thinking. It is only the very limited, internally directed, stimulus-free kind of thinking to which we can apply the conclusion that it isn’t particularly enjoyable.

 

Read more

The original paper: Just think: The challenges of the disengaged mind.

You can see the data over at the Open Science Framework.

Daniel Wegner’s brilliant book on the White Bear problem.


This article was originally published on The Conversation.
Read the original article.

Spike activity 27-06-2014

Quick links from the past week in mind and brain news:

Slate has a piece on developmental psychology’s WEIRD problem. Most kids in child psychology studies are from very restricted social groups – rich, educated families.

Facebook manipulated stories in users’ newsfeeds to conduct experiments on emotional contagion. Don’t remember signing the consent form for the study that appeared in PNAS?

Time covers the massive prevalence of PTSD among US veterans. The Pentagon’s PTSD treatments “appear to be local, ad hoc, incremental, and crisis-driven” with no effective evaluation.

Excellent analysis of a new study: the FDA’s antidepressant warning didn’t actually backfire and cause more suicides. Neuroskeptic on the case.

Time magazine has an interesting piece on the under-reported problem of violence in women.

Interesting National Geographic piece about how new finds of human skull bones show even more complexity in the evolution of human and hominid species.

Slate has a piece on how a lot of zoo animals are on antipsychotics because they become mentally ill when enclosed.
