Seeing ourselves through the eyes of the machine

I’ve got an article in The Observer about how our inventions have profoundly shaped how we view ourselves because we’ve traditionally looked to technology for metaphors of human nature.

We tend to think that we understand ourselves and then create technologies to take advantage of that new knowledge but it usually happens the other way round – we invent something new and then use that as a metaphor to explain the mind and brain.

As history has moved on, the mind has been variously explained in terms of wax tablets, a house with many rooms, pressures and fluids, phonograph recordings, telegraph signalling, and computing.

The idea that these are metaphors sometimes gets lost, which, in some ways, is quite worrying.

It could be that we’ve reached “the end of history” as far as neuroscience goes and that everything we’ll ever say about the brain will be based on our current “brain as calculation” metaphors. But if this is not the case, there is a danger that we’ll sideline aspects of human nature that don’t easily fit the concept. Our subjective experience, emotions and the constantly varying awareness of our own minds have traditionally been much harder to understand as forms of “information processing”. Importantly, these aspects of mental life are exactly where things tend to go awry in mental illness, and it may be that our main approach for understanding the mind and brain is insufficient for tackling problems such as depression and psychosis. It could be we simply need more time with our current concepts, but history might show us that our destiny lies in another metaphor, perhaps from a future technology.

I mention Douwe Draaisma’s book Metaphors of Memory in the article but I also really recommend Alison Winter’s book Memory: Fragments of a Modern History which also covers the fascinating interaction between technological developments and how we understand ourselves.

You can read my full article at the link below.

Link to article in The Observer.

Awaiting a theory of neural weather

In a recent New York Times editorial, psychologist Gary Marcus noted that neuroscience is still awaiting a ‘bridging’ theory that elegantly connects neuroscience with psychology.

This reflects a common belief in cognitive science that there is a ‘missing law’ to be discovered that will tell us how mind and brain are linked – but it is quite possible there just isn’t one to be discovered.

Marcus writes:

What we are really looking for is a bridge, some way of connecting two separate scientific languages — those of neuroscience and psychology.

Such bridges don’t come easily or often, maybe once in a generation, but when they do arrive, they can change everything. An example is the discovery of DNA, which allowed us to understand how genetic information could be represented and replicated in a physical structure. In one stroke, this bridge transformed biology from a mystery — in which the physical basis of life was almost entirely unknown — into a tractable if challenging set of problems, such as sequencing genes, working out the proteins that they encode and discerning the circumstances that govern their distribution in the body.

Neuroscience awaits a similar breakthrough. We know that there must be some lawful relation between assemblies of neurons and the elements of thought, but we are currently at a loss to describe those laws.

The idea of a DNA-like missing component that will allow us to connect theories of psychology and neuroscience is an attractive one, but it is equally likely that the connection between mind and brain is more like the relationship between molecular interactions and the weather.

In this case, there is no ‘special theory’ that connects weather to molecules because different atmospheric phenomena are understood in multiple ways and across multiple models, each of which has a differing relationship to the scale at which the physical data is understood – fluid flows, statistical models, atomic interactions and so on.

In explanatory terms, ‘psychology’ is probably a lot like the weather. The idea of there being a ‘psychological level’ is a human concept, and its conceptual components won’t neatly relate to neural function in a uniform way.

Some functions will have much more direct relationships – like basic sensory information and its representation in the brain’s ‘sensotopic maps’. A good example might be how visual information in space is represented in an equivalent retinotopic map in the brain.

Other functions will have more indirect relationships, in great part because of how we define ‘functions’. Some have very empirical definitions – take iconic memory – whereas others will be cultural or folk concepts – think vicarious embarrassment or nostalgia.

So it’s unlikely we’re going to find an all-purpose theoretical bridge to connect psychology and neuroscience. Instead, we’ll probably end up with what Kenneth Kendler calls ‘patchy reductionism’ – making pragmatic links between mind and brain where possible using a variety of theories and descriptions.

A search for a general ‘bridging theory’ may be a fruitless one.

Link to NYT piece ‘The Trouble With Brain Science’.

Out on a limb too many

Two neuropsychologists have written a fascinating review article about the desire to amputate a perfectly healthy limb, known variously as apotemnophilia, xenomelia or body integrity identity disorder.

The article is published in the journal Neuropsychiatric Disease and Treatment, although some who have these desires would probably disagree that it is a disease or disorder and are more likely to compare it to transsexualism.

The article also discusses the two main themes in the research literature: an association with a sexual fetish for limb amputation (most associated with the use of the name apotemnophilia) and an alteration in body image linked to differences in the function of the parietal lobe in the brain (most associated with the use of the name xenomelia).

It’s a fascinating review of what we know about this under-recognised form of human experience but it also has an interesting snippet about how this desire first came to light not in the scientific literature, but in the letters page of Penthouse magazine:

A first description of this condition traces back to a series of letters published in 1972 in the magazine Penthouse. These letters were from erotically-obsessed persons who wanted to become amputees themselves. However, the first scientific report of this desire only appeared in 1977: Money et al described two cases who had intense desire toward amputation of a healthy limb. Another milestone was a 2005 study by Michael First, an American psychiatrist, who published the first systematic attempt to describe individuals who desire amputation of a healthy limb. Thanks to this survey, which included 52 volunteers, a number of key features of the condition are identified: gender prevalence (most individuals are men), side preference (left-sided amputations are most frequently desired), and finally, a preference toward amputation of the leg versus the arm.

The review also discusses a potentially related experience which has recently been reported – the desire to be paralysed.

If you want a more journalistic account, Matter published an extensive piece on the condition last year.

Link to scientific review article on apotemnophilia / xenomelia.
Link to Matter article.

Towards a scientifically unified therapy

Today’s edition of Nature has an excellent article on the need to apply cognitive science to understanding how psychological therapies work.

Psychological therapies are often called ‘talking treatments’ but this is a somewhat misleading name. Talking is essential, but it’s not where most of the change happens.

Like seeing a personal trainer in the gym, communication is key, but it’s the exercise which accounts for the changes.

In the same way, psychological therapy is only as effective as the experience of putting changes into practice, but we still know relatively little about the cognitive science behind this process.

Unfortunately, there is a traditional but unhelpful divide in psychology where some don’t see any sort of emotional problem as biological in any way, and the contrasting divide in psychiatry where biology is considered the only explanation in town.

The article in Nature argues that this is pointless and counter-productive:

It is time to use science to advance the psychological, not just the pharmaceutical, treatment of those with mental-health problems. Great strides can and must be made by focusing on concerns that are common to fields from psychology, psychiatry and pharmacology to genetics and molecular biology, neurology, neuroscience, cognitive and social sciences, computer science, and mathematics. Molecular and theoretical scientists need to engage with the challenges that face the clinical scientists who develop and deliver psychological treatments, and who evaluate their outcomes. And clinicians need to get involved in experimental science. Patients, mental-health-care providers and researchers of all stripes stand to benefit.

The piece tackles many good examples of why this is the case and sets out three steps for bridging the divide.

Essential reading.

Link to ‘Psychological treatments: A call for mental-health science’.

Why do we bite our nails?

It can ruin the appearance of your hands, could be unhygienic and can hurt if you take it too far. So why do people do it? Biter Tom Stafford investigates

What do ex-British prime minister Gordon Brown, Jackie Onassis, Britney Spears and I all have in common? We all are (or were) nail biters.

It’s not a habit I’m proud of. It’s pretty disgusting for other people to watch, ruins the appearance of my hands, is probably unhygienic and sometimes hurts if I take it too far. I’ve tried to quit many times, but have never managed to keep it up.

Lately I’ve been wondering what makes someone an inveterate nail-biter like me. Are we weaker willed? More neurotic? Hungrier? Perhaps, somewhere in the annals of psychological research there could be an answer to my question, and maybe even hints about how to cure myself of this unsavoury habit.

My first dip into the literature shows up the medical name for excessive nail biting: ‘onychophagia’. Psychiatrists classify it as an impulse control problem, alongside things like obsessive compulsive disorder. But this is for extreme cases, where psychiatric help is beneficial, as with other excessive grooming habits like skin picking or hair pulling. I’m not at that stage, falling instead among the majority of nail biters who carry on the habit without serious side effects. Up to 45% of teenagers bite their nails, for example; teenagers may be a handful but you wouldn’t argue that nearly half of them need medical intervention. I want to understand the ‘subclinical’ side of the phenomenon – nail biting that isn’t a major problem, but still enough of an issue for me to want to be rid of it.

It’s mother’s fault

Psychotherapists have had some theories about nail biting, of course. Sigmund Freud blamed it on arrested psycho-sexual development, at the oral stage (of course). As is typical of Freudian theories, oral fixation is linked to myriad causes, such as under-feeding or over-feeding, breast-feeding too long, or a problematic relationship with your mother. It also has a grab-bag of resulting symptoms: nail biting, of course, but also a sarcastic personality, smoking, alcoholism and love of oral sex. Other therapists have suggested nail-biting may be due to inward hostility – it is a form of self-mutilation, after all – or nervous anxiety.

Like most psychodynamic theories these explanations could be true, but there’s no particular reason to believe they should be true. Most importantly for me, they don’t have any strong suggestions on how to cure myself of the habit. I’ve kind of missed the boat as far as extent of breast-feeding goes, and I bite my nails even when I’m at my most relaxed, so there doesn’t seem to be an easy fix there either. Needless to say, there’s no evidence that treatments based on these theories have any special success.

Unfortunately, after these speculations, the trail goes cold. A search of the scientific literature reveals only a handful of studies on the treatment of nail-biting. One reports that any treatment which made people more aware of the habit seemed to help, but beyond that there is little evidence to report. Indeed, several of the few articles on nail-biting open by commenting on the surprising lack of literature on the topic.

Creature of habit

Given this lack of prior scientific treatment, I feel free to speculate for myself. So, here is my theory on why people bite their nails, and how to treat it.

Let’s call it the ‘anti-theory’ theory. I propose that there is no special cause of nail biting – not breastfeeding, chronic anxiety or a lack of motherly love. The advantage of this move is that we don’t need to find a particular connection between me, Gordon, Jackie and Britney. Rather, I suggest, nail biting is just the result of a number of factors which – due to random variation – combine in some people to create a bad habit.

First off, there is the fact that putting your fingers in your mouth is an easy thing to do. It is one of the basic functions for feeding and grooming, and so it is controlled by some pretty fundamental brain circuitry, meaning it can quickly develop into an automatic reaction. Added to this, there is a ‘tidying up’ element to nail biting – keeping them short – which means in the short term at least it can be pleasurable, even if the bigger picture is that you end up tearing your fingers to shreds. This reward element, combined with the ease with which the behaviour can be carried out, means that it is easy for a habit to develop; apart from touching yourself in the genitals it is hard to think of a more immediate way to give yourself a small moment of pleasure, and biting your nails has the advantage of being OK at school. Once established, the habit can become routine – there are many situations in everyone’s daily life where you have both your hands and your mouth available to use.

Understanding nail-biting as a habit has a bleak message for a cure, unfortunately, since we know how hard bad habits can be to break. Most people, at least once per day, will lose concentration on not biting their nails.

Nail-biting, in my view, isn’t some revealing personality characteristic, nor a maladaptive echo of some useful evolutionary behaviour. It is the product of the shape of our bodies, how hand-to-mouth behaviour is built into (and rewarded in) our brains and the psychology of habit.

And, yes, I did bite my nails while writing this column. Sometimes even a good theory doesn’t help.


This was my BBC Future column from last week.

The concept of stress, sponsored by Big Tobacco

NPR has an excellent piece on how the scientific concept of stress was massively promoted by tobacco companies who wanted an angle to market ‘relaxing’ cigarettes and a way for them to argue that it was stress, not cigarettes, that was to blame for heart disease and cancer.

They did this by funding, guiding and editing the work of renowned physiologist Hans Selye who essentially founded the modern concept of stress and whose links with Big Tobacco have been largely unknown.

For the past decade or so, [Public Health Professor Mark] Petticrew and a group of colleagues in London have been searching through millions of documents from the tobacco industry that were archived online in the late ’90s as part of a legal settlement with tobacco companies.

What they’ve discovered is that both Selye’s work and much of the work around Type A personality were profoundly influenced by cigarette manufacturers. They were interested in promoting the concept of stress because it allowed them to argue that it was stress — not cigarettes — that was to blame for heart disease and cancer.

“In the case of Selye they vetted … the content of the paper, they agreed the wording of papers,” says Petticrew, “tobacco industry lawyers actually influenced the content of his writings, they suggested to him things that he should comment on.”

They also, Petticrew says, spent a huge amount of money funding his research. All of this is significant, Petticrew says, because Selye’s influence over our ideas about stress are hard to overstate. It wasn’t just that Selye came up with the concept, but in his time he was a tremendously respected figure.

Despite the success of the campaign to associate smoking with stress relief, the idea that smoking alleviates anxiety is almost certainly wrong. It tends to just relieve anxiety-provoking withdrawal and quitting smoking reduces overall anxiety levels.

Although the NPR article focuses on Selye and his work on stress, another big name was recruited by Big Tobacco to promote their theories.

It’s still little known that psychologist Hans Eysenck took significant sums of cash from tobacco companies.

They paid for a lot of Eysenck’s research that tried to show that the relationship between lung cancer and smoking was not direct but was mediated by personality differences. There was also lots of other research arguing that a range of smoking related health problems were only present in certain personality types.

Tobacco companies wanted to fund this research to cite it in court cases where they were defending themselves against lung cancer sufferers. It was their personalities, rather than their 20-a-day habit, that was a key cause behind their imminent demise, they wanted to argue in court, and they needed ‘hard science’ to back it up. So they bought some.

However, the link between ‘father of stress’ Hans Selye and psychologist Hans Eysenck was not just that they were funded by the same people.

A study by Petticrew uncovered documents showing that both Selye and Eysenck appeared together in a 1977 tobacco industry promotional film where “the film’s message is quite clear without being obvious about it — a controversy exists concerning the etiologic role of cigarette smoking in cancer.”

The ‘false controversy’ PR tactic has since become solidified as a science-denier standard.

Link to The Secret History Behind The Science Of Stress from NPR.
Link to paper ‘Hans Selye and the Tobacco Industry’.

Spike activity 11-07-2014

Quick links from the past week in mind and brain news:

Your Brain Is On the Brink of Chaos. Nautilus has an interesting piece on chaos and the brain.

Neuroskeptic has a good Q&A with Zach Mainen, one of the originators of the NeuroFuture open letter demanding reform of the Human Brain Project.

There’s an open-access special issue on epilepsy in the latest edition of Nature.

The New York Times has a good piece on developments towards brain implants for cognitive enhancement.

Phantom limb pain tortures amputees and puzzles scientists. A man in Cambodia cycles round the country and treats it with mirrors. Excellent Mosaic Science piece.

Practical Ethics has an excellent piece on ‘tidying up psychiatry’.

Searching for the “Free Will” Neuron. Interesting piece from MIT Tech Review.

PLOS has launched a neuroscience channel.

Adults, like children, have a tendency to think vision is more informative than it is. Interesting piece on our understanding of what we understand through looking, from the BPS Research Digest.

The Toast has what seems to be the first ever first-person account of Cotard’s delusion, the belief that you’re dead, in someone who experienced intense psychosis.

A thought lab in the sun

Neuroscientist Karl Friston, being an absolute champ, in an interview in The Lancet Psychiatry

“I get up very late, I go and smoke my pipe in the conservatory, hopefully in the sunshine with a nice cup of coffee, and have thoughts until I can raise the energy to have a bath. I don’t normally get to work until mid day.”

I have to say, I have a very similar approach which is getting up very early, drinking Red Bull, not having any thoughts, and raising the energy to catch a bus to an inpatient ward.

The man clearly doesn’t know the good life when he sees it.

The Lancet Psychiatry is one of the new speciality journals from the big names in medical publishing.

It seems to be publishing material from the correspondence and ‘insight’ sections (essays and the like) without a paywall, so there’s often plenty for the general reader to catch up on. It also has a podcast which is aimed at mental health professionals.

Link to interview with Karl Friston.

Motherhood, apple pie and replication

Who could possibly be against replication of research results? Jason Mitchell of Harvard University is, under some conditions, for reasons described in his essay On the emptiness of failed replications.

I wrote something for the Centre for Open Science which tries to draw out the sensible points in Mitchell’s essay – something I thought worth doing since for many people being against replication in science is like being against motherhood and apple pie. It’s worth noting that I was invited to do this by Brian Nosek, who is co-founder of the Center for Open Science and instrumental in the Many Labs projects. As such, Brian is implicitly one of the targets of Mitchell’s criticisms, so kudos to him for encouraging this discussion.

Here’s my commentary: What Jason Mitchell’s ‘On the emptiness of failed replications’ gets right

Memories of ‘hands on’ sex therapy

There’s an amusing passage in Andrew Solomon’s book Far From the Tree where he recounts his own experience of a curious attempt at surrogate partner therapy – a type of sex therapy where a ‘stand in’ partner engages with sexual activity with the client to help overcome sexual difficulties.

In Solomon’s case, he was a young gay man still confused about his sexuality who signed himself up to a cut-price clinic to try and awaken any possibility of ‘hidden heterosexual urges’.

It’s a curious historical snapshot, presumably from the early 1980s, but also quite funny as Solomon dryly recounts the futile experience.

When I was nineteen, I read an ad in the back of New York magazine that offered surrogate therapy for people who had issues with sex. I still believed the problem of whom I wanted was subsidiary to the problem of whom I didn’t want. I knew the back of a magazine was not a good place to find treatment, but my condition was too embarrassing to reveal to anyone who knew me.

Taking my savings to a walk-up office in Hell’s Kitchen, I subjected myself to long conversations about my sexual anxieties, unable to admit to myself or the so-called therapist that I was actually just not interested in women. I didn’t mention the busy sexual life I had by this time with men. I began “counselling” with people I was encouraged to call “doctors,” who would prescribe “exercises” with my “surrogates” – women who were not exactly prostitutes but who were also not exactly anything else.

In one protocol, I had to crawl around naked on all fours pretending to be a dog while the surrogate pretended to be a cat; the metaphor of enacting intimacy between mutually averse species is more loaded than I noticed at the time. I became curiously fond of these women, one of whom, an attractive blonde from the Deep South, eventually told me she was a necrophiliac and had taken this job after she got into trouble down the morgue.

You were supposed to keep switching girls so your ease was not limited to one sexual partner; I remember the first time a Puerto Rican woman climbed on top of me and began to bounce up and down, crying ecstatically, “You’re in me! You’re in me!” and how I lay there wondering with anxious boredom whether I had finally achieved the prize and become a qualified heterosexual.

Surrogate partner therapy is still used for a variety of sexual difficulties, although only fringe clinics now use it for pointless ‘gay conversion therapy’.

Although it is clearly in line with good psychological principles of experiential therapy, it has been quite controversial because of fears about being, as Solomon says, “not exactly prostitutes” along with some well-founded ethical concerns.

In the UK, the first bona fide clinic that used surrogate partner therapy was started in the 1970s and run by the sexologist Martin Cole – who was best known to the British public by his actually rather wonderful tabloid nickname Sex King Cole.

He spent several decades scandalising the establishment with his campaign for open and direct sex education and unstigmatised treatment of sexual dysfunction.

You can see the extent to which he rattled the self-appointed defenders of English morality by his mentions in parliamentary speeches made by concerned MPs who retold second-hand tales of scandal supposedly from Cole’s clinics.

This 1972 speech by MP Jill Knight veers from the melodramatic to the farcical as she describes how a sex surrogate “was with a client when a thunderous knocking occurred on the door and the glass panels in the door revealed a blue-clad figure topped by a policeman’s helmet. She knew at once that it was her fiance, who happened to be a policeman.”

If you want an up-to-date and level-headed discussion of surrogate partner therapy, an article by sex researcher Petra Boynton is a good place to start, and it’s something we’ve covered previously on Mind Hacks.

As for Cole, The Independent tracked him down, still working, in 1993, and wrote a somewhat wry profile of him.

A cultural view of agony

New Statesman has a fascinating article on the ‘cultural history of pain’ that tracks how our ideas about pain and suffering have radically changed through the years.

One of the most interesting, and worrying, themes is how there have been lots of cultural beliefs about whether certain groups are more or less sensitive to pain.

Needless to say, these beliefs tended to justify existing prejudices rather than stem from any sound evidence.

Some speculated whether the availability of anaesthetics and analgesics had an effect on people’s ability (as well as willingness) to cope with acute affliction. Writing in the 1930s, the distinguished pain surgeon René Leriche argued fervently that Europeans had become more sensitive to pain. Unlike earlier in the century, he claimed, modern patients “would not have allowed us to cut even a centimetre . . . without administering an anaesthetic”. This was not due to any decline of moral fibre, Leriche added: rather, it was a sign of a “nervous system differently developed, and more sensitive”.

Other physicians and scientists of the 19th and early 20th centuries wanted to complicate the picture by making a distinction between pain perception and pain reaction. But this distinction was used to denigrate “outsider” groups even further. Their alleged insensitivity to pain was proof of their humble status – yet when they did exhibit pain reactions, their sensitivity was called “exaggerated” or “hysterical” and therefore seen as more evidence of their inferiority.


Link to New Statesman article (via @SarahRoseCrook)

Do we really hate thinking so much we’d electrocute ourselves rather than do it?

By Tom Stafford, University of Sheffield

The headlines

The Guardian: Shocking but true: students prefer jolt of pain than being made to sit and think

Nature: We dislike being alone with our thoughts

Washington Post: Most men would rather shock themselves than be alone with their thoughts


The story

Quiet contemplation is so awful that when deprived of the distractions of noise, crowds or smart phones, a bunch of students would rather give themselves electric shocks than sit and think.


What they actually did

Psychologists from the universities of Virginia and Harvard in the US carried out a series of 11 studies in which participants – including students and non-students – were left in an unadorned room for six to 15 minutes and asked to “spend time entertaining themselves with their thoughts.” Both groups, and men and women equally, were unable to enjoy this task. Most said they found it difficult to concentrate and that their minds wandered.

In one of the studies, participants were given the option to give themselves an electric shock, for no given reason or reward. Many did, including the majority of male participants, despite the fact that the vast majority of participants had previously rated the shocks as unpleasant and said they would pay to avoid them.


How plausible is this?

This is a clever, provocative piece of research. The results are almost certainly reliable; the authors, some of whom are extremely distinguished, discovered in the 11 studies the same basic effect – namely, that being asked to sit and think wasn’t enjoyable. The data from the studies is also freely available, so there’s no chance of statistical jiggery-pokery. This is a real effect. The questions, then, are over what exactly the finding means.


Tom’s take

Contrary to what some reporters have implied, this result isn’t just about students – non-students also found being made to sit and think aversive, and there were no differences in this with age. And it isn’t just about men – women generally found the experience just as unpleasant. The key result is that being made to sit and think is unpleasant, so let’s look at this first before thinking about the shocks.

The results fit with research on sensory deprivation from 50 years ago. Paradoxically, when there are no distractions people find it hard to concentrate. It seems that for most of us, most of the time, our minds need to receive stimulus, interact with the environment, or at least have a task to function enjoyably. Thinking is an active process which involves the world – a far cry from some ideals of “pure thought”.

What the result certainly doesn’t mean, despite the interpretation given by some people – including one author of the study – is that people don’t like thinking. Rather, it’s fair to say that people don’t like being forced to do nothing but think.

It’s possible that there is a White Bear Effect here – also known as the ironic process theory. Famously, if you’re told to think of anything except a white bear, you can’t help but think about a white bear. If you imagine the circumstances of these studies, participants were told they had to sit in their chairs and just think. No singing, no exploring, no exercises. Wouldn’t that make you spend your time (unpleasantly) ruminating on what you couldn’t do?

In this context, are the shocks really so surprising? The shocks were very mild. The participants rated them as unpleasant when they were instructed to shock themselves, but we all know that there’s a big difference between having something done to you (or being told to do something) and choosing to do it yourself.

Although many participants chose to shock themselves, I wouldn’t say they were avoiding thinking – rather they were thinking about what it would be like to get another shock. One participant shocked himself 190 times. Perhaps he was exploring how he could learn to cope with the discomfort. Curiosity and exploration are all hallmarks of thinking. It is only the very limited, internally directed, stimulus-free kind of thinking to which we can apply the conclusion that it isn’t particularly enjoyable.


Read more

The original paper: Just think: The challenges of the disengaged mind.

You can see the data over at the Open Science Framework.

Daniel Wegner’s brilliant book on the White Bear problem.

The Conversation

This article was originally published on The Conversation.
Read the original article.

Spike activity 27-06-2014

Quick links from the past week in mind and brain news:

Slate has a piece on developmental psychology’s WEIRD problem. Most kids in child psychology studies are from very restricted social groups – rich, educated families.

Facebook manipulated stories in users’ newsfeeds to conduct experiments on emotional contagion. Don’t remember signing the consent form for the study that appeared in PNAS?

Time covers the massive prevalence of PTSD among US veterans. The Pentagon’s PTSD treatments “appear to be local, ad hoc, incremental, and crisis-driven” with no effective evaluation.

Excellent analysis of a new study: the FDA’s antidepressant warning didn’t actually backfire and cause more suicides. Neuroskeptic on the case.

Time magazine has an interesting piece on the under-reported problem of violence in women.

Interesting National Geographic piece about how new finds of human skull bones show even more complexity in the evolution of human and hominid species.

Slate has a piece on how a lot of zoo animals are on antipsychotics because they become mentally ill when enclosed.

A spook’s guide to the psychology of deception

Last February, a file from the Edward Snowden leaks was released: a 2012 GCHQ presentation called ‘The Art of Deception: Training for Online Covert Operations’. It describes the ‘Online Covert Action Accreditation’ course, which draws heavily on the psychology of influence and persuasion. This post will look at how they’re piecing together the science that forms the basis for these online operations.

The work seems to have been put together by GCHQ’s Human Science Operations Cell which presumably exists as an internal consultancy to allow the relevant cognitive and social sciences to be applied to practical covert operations.

One of the early slides lists the subjects the HSOC draws on, which stretch from psychology to political science to neuroscience. At the current time, neuroscience has nothing practical to contribute, so they’re clearly blowing their neurological trumpets to sound a bit more high-tech. But it’s worth noting the breadth of disciplines they draw on, meaning they’ve got a wide and comprehensive vision of human behaviour from the micro to the macro.

However, one of the key slides has a road map of how everything fits together. It’s shown below, and it’s quite dense, so you can click the image if you want a larger version.

One of the first things that stands out is the ad-hoc-ness of their approach. They’ve appropriated a patchwork of relevant theories as a guide to practice, with nothing being drawn from their own data.

You can see the main areas they’re drawing from – which includes profiling cultures and personality, research on persuasion, cognitive biases and scams, research on the psychology of stage magic, and organisational psychology or management science more generally.
[slide image]

Perhaps the weakest elements here are the cultural and personality profiling using Hofstede’s cultural dimensions and the Big Five personality traits. The trouble is that while these are statistically reliable at the group level, they predict very little at the individual level because the effects are swamped by individual variation.

This means the approach may be more useful in the domain of PSYOPS, which attempts to influence groups, than in targeting individuals.

The slide below details the general psychological framework for deception. As far as I can tell, this is the only original piece of psychological theory in the presentation.
[slide image]

It’s more a useful way of organising different approaches to deception than a theory in itself. It’s what clinical psychologists would call a ‘formulation’: a way of organising evidence-based effects that may not be thoroughly tested itself but works well enough to aid understanding.

Perhaps the key thing to note is the sensemaking component. Sensemaking is a key concept in management science that just describes the different ways in which people come to conclusions about the meaning and significance of things.

It should be a well-known concept in intelligence circles because it is used in both military people management and military intelligence analysis. Interestingly, they treat individuals as naive intelligence analysts who are trying to piece together their own understanding of the world, and aim to exploit some of the weaknesses in this process. The big messy ‘concept map’ slide mentions ‘destructive organisational psychology’, which presumably refers to using an understanding of what keeps organisations together to break them apart.

However, in terms of the psychological science which underlies their approach, the next slide is key.
[slide image]

You can see several influences here. The techniques listed under ‘attention’ are all taken from research on the psychology of magic tricks, particularly from Susana Martinez-Conde’s work on how sleight-of-hand artists manipulate attention. Most of it is reviewed in a paper she wrote with a series of co-authors including pickpocket Apollo Robbins.

The HSOC spooks clearly love the idea of the psychology of magic and they refer to it a lot in their presentation. One slide just says ‘We want to build Cyber Magicians’, but it’s really not clear how this applies online. The whole point of sleight-of-hand is that it is dynamic and takes advantage of how you pay attention. Online, however, users’ attention doesn’t necessarily flow in a predictable pattern, because you can wander off from the screen, pause, grab screenshots and so on. In other words, individuals have better control over the flow of information, because online interaction is designed for information control and therefore for partial, staggered attention.

The ‘perception’ techniques listed on the slide are largely taken from Stefano Grazioli and Sirkka Jarvenpaa’s classic paper [pdf] on online deception entitled ‘Deceived: Under Target Online’. The paper looks at how internet scammers rip people off; assuming that successful online con artists have arrived at useful techniques by a kind of natural selection, HSOC simply borrow them.

The techniques to exploit sensemaking are largely based on theories of sensemaking itself, although the story fragments component seems to be drawn from research on relational agents that are designed ‘to form long-term, social-emotional relationships with their users’. Rather than actually deploying autonomous relational agents, I suspect it’s simply a case of using research insights from the area suggesting, for example, that presenting fragments of the agent’s backstory and letting the other person piece them together makes the person seem more believable.

The techniques in the ‘affect’ section are some general points taken from a vast experimental literature on the psychology of marketing and persuasion that describes how emotion modulates the heuristics (judgement processes) involved in persuasion.

The ‘behaviour’ section is the only part I don’t recognise as coming from the psychological literature. This makes me suspect it comes from PSYOPS or IO practice, but if you recognise it, leave a comment below.
[slide image]

The ’10 Principles of Influence’ is perhaps one of the most interesting slides in terms of illustrating the empirical basis for their approach as they use research both on the strategies of honest persuasion and dishonest scammers.

‘Principles of influence’ are largely associated with the work of consumer psychologist Robert Cialdini, but the list actually consists of three of his six principles (Reciprocity, Social Compliance / Authority, Consistency).

Another six are taken from Stajano and Wilson’s classic study ‘Understanding scam victims: seven principles for systems security’, which describes principles used by con artists. One item overlaps with the Cialdini principles, and additionally they’ve included flattery (known to be an effective persuasive tool) and time – although it’s not clear whether they’re referring to giving people time or to putting people under time pressure.
[slide image]

This section seems to be about gaining people’s trust to encourage disclosure, and the slide you see above refers to social penetration theory, which describes how relationships progress to increased levels of intimate connection through self-disclosure. The slide that follows gives some basic advice about encouraging this: mirroring communication cues, adjusting speech patterns and so on – the sort of things you get taught in the first week of a psychotherapy course.

So here’s what the ‘Online Covert Action Accreditation’ course looks like: as if a PhD psychologist was given the task of coming up with a plausible psychological framework for practical deception and influence online. It draws on a mix of persuasion psychology from marketing, studies of scammers and con-men, the social psychology of trust and disclosure, studies of how stage magic works psychologically, and work on what makes organisations work effectively and what degrades their performance.

This is a comprehensive approach to the problem, but the trouble is that it probably translates only approximately, and rather poorly, into practical effects.

Instead, HSOC would be better off doing research, and lots of it. They could run lots of informal RCTs online and gather a large amount of data quite quickly to test which techniques actually increase influence or lead to successful deception. What behaviours on the part of the actor lead to increased self-disclosure the quickest? Does a laggy internet connection mean people’s increased frustration affects their evaluation of honesty? And so on.
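To give a flavour of how simple such informal online RCTs can be to analyse: a minimal sketch of a two-proportion z-test comparing two persuasion techniques. The counts and the technique labels here are entirely invented for illustration; in practice you’d randomly assign users to one condition or the other and count how many were persuaded (or disclosed, or clicked).

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Pooled two-proportion z-test: did technique A and technique B
    produce different success rates? Returns the z statistic."""
    p_a = success_a / n_a
    p_b = success_b / n_b
    # Pool the two samples to estimate the common proportion under
    # the null hypothesis of no difference between techniques.
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical results: technique A persuaded 120 of 1000 users,
# technique B persuaded 90 of 1000.
z = two_proportion_z(120, 1000, 90, 1000)
print(round(z, 2))  # |z| > 1.96 suggests a real difference at p < 0.05
```

With thousands of online interactions a day, even small effects of this kind would become detectable quickly, which is exactly why data beats borrowed theory here.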

I suspect, however, that the Human Science Operations Cell were, and maybe still are, quite a small outfit and so they’re restricted to a consultancy role which will ultimately limit their effectiveness.

We tend to think that the secret services are super efficient experts with an infinite budget, but they probably just work like any other organisation. HSOC were probably told to deliver an Online Covert Action Accreditation course with few resources and not enough time and came up with the most sensible thing in the time allowed.

Oh, and by the way, hello spooks, and welcome to Mind Hacks.

Link to copy of slides.
Link to coverage from The Intercept.

Brains in their feat

Footballers’ skills seem light years from our own. But, Tom Stafford argues, the jaw-dropping talents on the World Cup pitch have more in common with everyday life than you might think.

The first week of the 2014 World Cup has already given us a clutch of classic moments: Robin Van Persie’s perfect header to open the Dutch onslaught against the Spanish; Australian Tim Cahill’s breathtaking volley to equalise against Holland; and Mexican keeper Guillermo Ochoa defying an increasingly desperate Brazilian attack.

We can’t help but be dazzled by the skills on display. Whether it is a header lobbed over an open-mouthed goalie, or a keeper’s last-second leap to save the goal, it can seem as if the footballers have access to talents that are not just beyond description, but beyond conscious comprehension. But the players sprinting, diving and straining on Brazil’s football pitches have a lot more in common with everyday intelligence than you might think.

We often talk about astonishing athletic feats as if they are something completely different from everyday thought. When we say a footballer acts on instinct, out of habit or due to his training, we distance what they do from what we hear echoing within our own heads.

The idea of “muscle memory” encourages this – allowing us to cordon off feats of motor skill as a special kind of psychological phenomenon, something stored, like magic potion, in our muscles. But the truth, of course, is that so called muscle memories are stored in our brains, just like every other kind of memory. What is more, these examples of great skill are not so different from ordinary thought.

If you speak to world-class athletes, such as World Cup footballers, about what they do, they reveal that a lot of conscious reasoning goes into those moments of sublime skill. Here’s England’s Wayne Rooney, in 2012, describing what it feels like as a cross comes into the penalty box: “You’re asking yourself six questions in a split second. Maybe you’ve got time to bring it down on the chest and shoot, or you have to head it first-time. If the defender is there, you’ve obviously got to try and hit it first-time. If he’s farther back, you’ve got space to take a touch. You get the decision made. Then it’s obviously about the execution.”

All this in half a second! Rooney is obviously thinking more, not less, during these most crucial moments.

This is not an isolated example. Dennis Bergkamp delighted Dutch fans by scoring a beautiful winning goal from a long pass in the 1998 World Cup quarter final against Argentina (and if you watch a clip on YouTube, make sure it’s the one with the ecstatic commentary by Jack van Gelder). In a subsequent interview Bergkamp describes in minute detail all the factors leading up to the goal, from the moment he made eye contact with the defender who was about to pass the ball, to his calculations about how to control it. He even lets slip that part of his brain was keeping track of the wind conditions. Just as with Rooney, this isn’t just a moment of unconscious instinct, but of instinct combined with a whirlwind of conscious reasoning. And it all comes together.

Studies of the way the brain embeds new skills, until the movements become automatic, may help make sense of this picture. We know that athletes like those performing in the World Cup train with many years of deliberate, mindful practice. As they go through their drills, dedicated brain networks develop, allowing the movements to be deployed with less effort and more control. As well as the brain networks involved becoming more refined, the areas of the brain most active in controlling a movement change with increased skill – as we practice, areas deeper within the brain reorganise to take on more of the work, leaving the cortex, including areas associated with planning and reasoning, free to take on new tasks.

But this doesn’t mean we think less when we’re highly skilled. On the contrary, this process, called automatisation, means that we think differently. Bergkamp doesn’t have to think about his foot when he wants to control a ball, so he’s free to think about the wind, or the defender, or when exactly he wants to control the ball. For highly practiced movements we have to think less about controlling every action, but what we do is still ultimately in the service of our overall targets (like scoring a goal in the case of football). In line with this, and contrary to the idea of skills as robotic reflexes, experiments show that more flexibility develops alongside increased automaticity.

Maybe we like to think footballers are stupid because we want to feel good about ourselves, and many footballers aren’t as articulate as some of the eggheads we traditionally associate with intelligence (and aren’t trained in being articulate), but all the evidence suggests that the feats we see in the World Cup take an immense amount of thought.

Intelligence involves using conscious deliberation at the right level to optimally control your actions. Driving a car is easier because you don’t have to think about the physics of the combustion engine, and it’s also easier because you no longer have to think about the movements required to change gear or turn on the indicators. But just because driving a car relies on automatic skills like these, doesn’t mean that you’re mindless when driving a car. The better drivers, just like the better footballers, are making more choices each time they show off their talents, not fewer.

So footballers’ immense skills aren’t that different from many everyday things we do, like walking, talking or driving a car. We’ve practiced these things so much we don’t have to think about how we’re doing them. We may not even pay much attention to what we’re doing, or have much of a memory of it (ever reached the end of a journey and realised you don’t recall a single thing about the trip?), but that doesn’t mean that we aren’t paying attention, or that we couldn’t. In fact, because we have practiced these skills we can deploy them at the same time as other things (walking and chewing gum, talking while tying our shoe laces, etc). This doesn’t diminish their mystery, but it does align it with the central mystery of psychology – how we learn to do anything.

So while you may be unlikely to find yourself in the boots of Bergkamp and Rooney, preparing to drill one past a sprawling keeper, you can at least console yourself with the thought that you’re showing the skills of a World Cup legend every time you get behind the wheel of your car.

A bonus BBC Future column from last week. Here’s the original.

The normality trap

I remember taking a bus to London Bridge when, after a few stops, a woman got on who seemed to move with a subtle but twitchy disregard for her surroundings. She found herself a seat among the Saturday shoppers and divided her time between looking out the window and responding to invisible companions, occasionally shouting at her unseen persecutors.

By East Street, the bus was empty.

You’ve probably encountered fellow travellers who are strikingly out of the ordinary, sometimes quite distressed, scattered among the urban landscape where they seem to have a social forcefield around them that makes crowds part in their presence.

If you’ve ever worked in a hospital or support service for people with psychological or neurological difficulties, you’ve probably met lots of people who are markedly out of step with the mundane rules of social engagement.

They seem to talk too loud, or too fast, or too much. They can be full of fantastical things or fantasies. They may be afraid or angry, difficult or disengaged or intent on rewind-replay behaviours. Their dress can be notable for its eccentricity or decay.

So why don’t we see people like these in anti-stigma campaigns?

Don’t get me wrong, I’m a massive fan of the great work anti-stigma campaigns do. Everybody is susceptible to mental health problems, and the reason these campaigns are necessary is that such problems often go unrecognised by other people; instead of help, too often people receive misunderstanding and ignorance.

But there’s more to mental health than normality.

That woman on the bus shouting at her voices deserves respect too. That guy who posts those leaflets about Masons and thought-stealing all over town deserves your time. The guy who speaks in a clumsy monotone and doesn’t look you in the eye is also worthy of compassion.

Disability charities don’t base their campaigns solely on ‘nice people in wheelchairs’. They’re happy to show people who represent the full range of appearance and presentation. So why not mental health?

Step up mental health organisations, you’ve got nothing to lose except your conformity.
