Synaesthesia in Frankenstein

One of the newer ideas in synaesthesia research is that affected people don't develop mixed senses as their brains mature; rather, they fail to lose them. The suggestion is that most children may start out with naturally mixed senses, before perception becomes segregated as the fuzzy neural pathways are pruned away.

I've just come across an interesting article in Cognitive Neuropsychology on how this idea actually has long historical roots, and even influenced Mary Shelley's gothic classic Frankenstein.

Although Mary Shelley was only 19 when she wrote her timeless novel, Frankenstein (1818), she combined contemporary philosophical and moral issues with a vision of the danger of emerging sciences that still has relevance today. The specific idea of early unity of the senses, very likely inspired by Rousseau, was articulated by Frankenstein’s creation in his first-person account of his early experiences:

“It is with considerable difficulty that I remember the original era of my being: all the events of that period appear confused and indistinct. A strange multiplicity of sensations seized me, and I saw, felt, heard, and smelt, at the same time; and it was, indeed, a long time before I learned to distinguish between the operations of my various senses. [Mary Shelley, Frankenstein (1818), chapter 11]”

Shelley goes on to present the creature as very humanlike, and it appears here that she wished to show that this extended to the earliest moments of his mental life. With the publication of Frankenstein, the unified-senses idea was thus brought into the popular culture, and Shelley’s words were probably read by some cognitive neuropsychologists in elementary school, even if they paid little heed to the sentiment. The idea also lived on within philosophy and, later, in the science of psychology.

Over the course of their professional careers, most cognitive neuropsychologists become acquainted with William James, and indeed the majority should recognize the phrase “one great blooming, buzzing confusion”. Most also recognize this as referring to the world of the infant, but few are probably aware that James was writing about his view that information from the different senses is first fused in the infant and only later becomes segregated.

Link to article.
Link to PubMed entry for same.

Mad honey

I've just discovered there's a form of neurotoxic honey, genuinely known as "mad honey", created by bees taking nectar from the beautiful Rhododendron ponticum flower.

The nectar from these plants, prevalent around the Black Sea region of Turkey, occasionally contains grayanotoxins, a class of neurotoxin that interferes with the action potential (electrical signalling) of nerve cells by binding to sodium channels in the cell membrane and preventing them from inactivating. This leads to problems with the muscles, peripheral nerves, and the central nervous system.

Mad honey apparently causes "a sharp burning sensation in the throat", and poisoning leads to dizziness, weakness, excessive sweating, hypersalivation, nausea, vomiting and 'pins and needles', although severe intoxication can cause dangerous heart problems.

Luckily, most cases aren’t fatal and resolve after 24 hours.

Mad honey was known to the Romans, and was specifically discussed by Pliny the Elder.

Link to brief review article on mad honey.
Link to PubMed entry for same.

The demon drink

Oh dear. It looks like psychologist Glenn Wilson has fallen off the wagon again. From the man who brought you the ’email hurts IQ more than cannabis’ PR stunt before repenting, comes the ‘the way you hold your drink reveals personality’ PR stunt.

This time it’s to promote a British pub chain and God bless those drink sodden journos who have gone and given it pride of place in the science section of today’s papers.

Even the BBC (who should know better but rarely do) have put it in their health section:

Dr Glenn Wilson, a consultant psychologist, observed the body language of 500 drinkers and divided them into eight personality types.

These were the flirt, the gossip, fun lover, wallflower, the ice-queen, the playboy, Jack-the-lad and browbeater.

Dr Wilson, who carried out the work for the [get free advertising somewhere else] bar chain, said glass hold “reflected the person you are”.

I would point out that it's not published, or even sensical, but is there really any point when the whole premise is so ridiculous that you'd have to be virtually paralytic to take it seriously?

Wilson has actually done a great deal of serious research and is well known for his work on personality but occasionally seems to go on inexplicable media binges on the tab of corporate advertising.

Sadly, we’re the ones left with the hangover.

2009-05-29 Spike activity

Quick links from the past week in mind and brain news:

HBO launches the awesome Alzheimer’s Project online. Video, documentary, facts, stories. Very nicely put together.

Teen mental health and mindfulness are the focus of a recent ABC Radio National Health Report.

The LA Times has more on the ongoing revision of the psychiatrists' diagnostic manual, the DSM.

God bless ’em. The British Journal of Psychiatry publish a letter (scroll down) in which I complain about people ignoring research when talking about ‘internet addiction’ and other fictional monstrosities. The original authors write a lovely reply and I feel a bit sheepish.

The BPS Research Digest has a great post on simulating déjà vu in the lab.

If you haven’t seen it somewhere else, the excellent Mary Roach does a fascinating TED talk on ’10 things you didn’t know about orgasm‘ (although she doesn’t mention that the case of toothbrushing triggered orgasm was due to epilepsy).

People are universally optimistic according to a survey of over 140 countries reported in Science Daily. “At the country level, optimism is highest in Ireland, Brazil, Denmark, and New Zealand and lowest in Zimbabwe, Egypt, Haiti and Bulgaria.”

New Scientist has an interesting ‘science of the female orgasm’ series but drops the ball (if you’ll excuse the pun) with a ‘brain shuts down during female orgasm because I can’t critically evaluate the results of brain imaging studies’ piece.

There's an interesting discussion on differing conceptions of the self, Jekyll and Hyde, and the modern and historical concepts of criminal responsibility on ABC Radio National's The Philosopher's Zone.

New Scientist has an excellent article on eight ancient writing systems that still haven’t been cracked. Where’s Fairlight when you need them?

An article on how meditation alters brain activity and structure appears in Scientific American.

Frontier Psychiatrist has an excellent piece on the concept of a rational suicide.

It’s raining fantastic essays on mind, brain and culture over at Neuroanthropology!

The New York Times has an article on the recent ‘super-recogniser‘ research on people who have spectacularly good memory for faces.

Graph theory slinging, network mongering, sociologically inclined mathematician Steven Strogatz has an excellent short piece in The New York Times on the mathematics of love.

New Scientist reports on a twin study that suggests intellectual confidence is inherited, predicts grades, and is independent from IQ.

The better the trust and communication between a father and daughter, the better they are likely to be between the daughter and her partner, according to research reported by the new-to-me but seemingly excellent Child Psychology Research Blog.

The Times Higher Education Supplement notes concerns over the falling numbers of UK medical students who start training to be psychiatrists.

A big-budget TV drama series about psychiatrists called 'Mental' has just launched and you can watch the first episode online. Apparently being filmed in Bogotá, Colombia.

Scientific American has another Jesse Bering column, this time on adolescent girl social aggression, or, in more colloquial terms, bitchiness.

Women are more likely than men to suffer feelings of inadequacy at home and at work and have perfectionist tendencies, according to a US study reported by BBC News.

Cerebrum, Dana’s excellent neuroscience magazine, has a great piece on the limits of neuroimaging.

Replicant Ray Kurzweil furiously responds to a recent Newsweek article that apparently contained inaccuracies over his predictions, opinions, incept date.

Advances in the History of Psychology discovers that Harvard psychologist Dan Wegner has posted an electrogroove mashup that incorporates sampled snippets of the recordings of Stanley Milgram's famous obedience experiments of the early 1960s. Like a disturbing social psychology '70s porn soundtrack.

Valuing the unusual illness debate

One of the particular joys of psychiatry is the regular ritual where a small but determined group of researchers try and get their idea for a new diagnosis accepted into the DSM. The most recent outbreak has hit the LA Times where a short article notes the proposal for ‘posttraumatic embitterment disorder’.

The idea for the disorder, where people are impaired by feelings of bitterness after “a severe and negative life event”, is not new. A small group of German researchers have been proposing the disorder in the medical literature since 2003 and have recently released a psychometric scale which they argue can diagnose the condition.

The last incarnation of this debate to hit the mainstream press was discussion over whether extreme racism could or should be diagnosed as ‘racist personality disorder’.

The discussions are interesting because they cut to the heart of how we define an illness. This is usually discussed as if it is a problem specific to psychiatry, as if diagnoses in other areas of medicine are more obvious, but this is not the case.

Implicit in medical diagnoses is the concept that the change or difference in the person has a negative impact.

Importantly, the biological ‘facts’ have little to do with this, because whether something has a ‘negative impact’ is largely a value judgement.

An infectious disease is not defined solely by the presence of a bacterium or virus, as we have many bacteria and viruses in our bodies that cause no problems. It's only when they cause us distress or impairment that they're classified as an illness.

In fact, there are some bacteria and viruses that are completely harmless in certain areas of the body but cause problems in others, as in cases of viral encephalitis, where otherwise benign viruses can cause damage when they get into brain tissue.

In some cases the definition is partly based on a comparison to what’s average for a person of this type. Differences in brain structure, such as some white matter lesions, may be considered medical problems in young people but normal in older people.

But there are many human characteristics that we could equally classify as being ‘not normal’ and ‘negative’ but we don’t currently accept as illnesses.

Being left-handed is clearly a statistical deviation from the average, has been associated with a greater risk of breast cancer, an increase in accidental injuries, and has been genetically linked to schizophrenia. But left-handedness is not considered an illness.

In other words, there is no definition of an illness which is divorced from a subjective interpretation of what counts as ‘negative’.

We also have some subjective and fairly fuzzy cultural ideas about just what sort of things count as medical conditions and require attention from doctors. Someone born with a missing thumb – yes; someone born left-handed – no.

Many of these assumptions are not about the properties of the ‘illness’ but about what we think doctors should be doing and what we feel the place of medicine in society should be.

Psychiatric disorders are just another instance of this. So when you hear proposals for seemingly wacky mental illnesses, think to yourself, why is this not an illness?

Importantly, we should do the same for widely accepted mental illnesses, such as schizophrenia or depression. Ask yourself, on what basis is this an illness?

It’s not that all new diagnoses are useful or all existing ones are nonsense, it’s just that the process of questioning highlights our assumptions regarding the relationship between normality, human distress, impairment and the role of medicine in society.

Link to LA Times piece on bitterness as a mental illness.
Link to brilliant Stanford Philosophy Encyclopaedia entry on mental illness.

Winning the vaccine wars

PLoS Biology has an excellent article on the social factors behind how the recent vaccination scares were sparked off and why they continue, despite having no scientific basis and having been repeatedly proved incorrect.

I'm morbidly fascinated by the autism scares because they are a meeting of two very different systems of thinking about knowledge.

Broadly, scientists judge how well a belief is supported by looking at its justifying evidence, whereas the antivaxxers often settle on the conclusion first, based on what they believe about their children, and then bend or reject any evidence to fit.

The piece focuses on the American antivaxxers and looks at how the US media amplified the scare story by focusing on personal stories and presenting them as if they carried the same weight as scientific evidence.

Rachel Casiday, a medical anthropologist at the Centre for Integrated Health Care Research at Durham University, UK, who studied British parents' attitudes toward MMR, says scientists should not underestimate the importance of narrative. People relate much more to a dramatic story – "he got his vaccination, he stopped interacting, and he hasn't been the same since" – than they do to facts, risk analyses, and statistical studies.

"If you discount these stories, people think you have an ulterior motive or you're not taking them seriously," she explains. Casiday suggests providing an alternative, science-based explanation or relating emotionally compelling tales about counter-risk – such as helplessly watching a young child die of a vaccine-preventable disease – in the same narrative format.

While scientists have been presenting the facts to people for years now, it has really made very little difference, and this is the first article I know of that suggests science should use the power of the narrative to get its vaccine safety message across.

UPDATE: I really recommend a post on the Providentia blog where psychologist Romeo Vitelli describes how the first life-saving smallpox vaccinations were opposed by a fledgling anti-vaccination movement that bears remarkable similarities to its modern-day counterparts. The series on the historical antivaccination theme will continue, so look out for further posts on the same blog.

Link to PLoS Biology article (via @bengoldacre).

The phantom from the battlefield

The Lancet recently published a fantastic article on one of the earliest cases of phantom limb. It was written by American Civil War surgeon Silas Weir Mitchell, not as a study in a medical journal but as a short story in a popular magazine.

The story was titled The Case of George Dedlow, in which Mitchell gives a careful medical description of sensations coming from a recently amputated limb, a portrait of how the amputation affected the soldier, and some musings on what it means about our relation to reality.

At this stage in the story, Mitchell uses his fictional character to muse on the neurological phenomenon of phantom limbs. Phantom limbs had been described in the mid-16th century by French military surgeon Ambroise Paré, but very little was known about what caused stump neuralgia (in the 1860s, the only treatments were electrotherapy, leeching, irritation of the surface of the stump, and re-amputation, none of which were very successful).

In The Case of George Dedlow, Mitchell speculates freely about what caused absent limbs to itch and feel pain. According to him, sensory impressions were transmitted through nerves to spinal nerve-cells and then to the brain. When a limb was removed, and until the stump healed, nerves continued to accept sensory impressions and to convey these impressions to the brain. If the stump never fully recovered, the result was constant irritation or a burning neuralgia. As Mitchell later explained in his famous textbook, Injuries of the Nerves and Their Consequences (1872), phantom limbs made “the strongest man…scarcely less nervous than the most hysterical girl”.

Somewhat poignantly, it seems Mitchell was haunted by his own phantoms from the war. In his later years he was troubled by ‘ghosts’ and intrusive memories from his gruesome years as a military surgeon.

It’s a fantastic short article that really conjures up the feel of the time as well as giving an insight into this important point in medical history.

Link to Lancet article.
Link to PubMed entry for same.
Link to text of short story The Case of George Dedlow.

Evolving causal belief

There's an interesting letter in this week's edition of Nature from biologist Lewis Wolpert making the speculative but interesting claim that the development of causal belief may have been a key turning point in human evolution.

Wolpert is responding to a recent Nature essay critiquing the idea that closely related species will have evolved similar psychological processes, suggesting instead that it is shared selection pressures, rather than genetic similarity, that have the greater influence on mental make-up.

He responds by saying that we should focus on some of the things that have uniquely evolved in humans rather than on shared processes. He cites the ability to understand cause as a key example.

The feature that is peculiar to humans is their understanding about the causal interactions between physical objects (see, for example, L. Wolpert Six Impossible Things Before Breakfast; Faber, 2006). For example, children realize from an early age that one moving object can make another move on impact. It is this primitive concept of mechanics that is a crucial feature of causal belief, and that conferred an advantage in tool-making and the use of tools — which, in turn, drove human evolution.

Animals, by contrast, have very limited causal beliefs, although they can learn to carry out complex tasks. According to Michael Tomasello (The Cultural Origins of Human Cognition; Harvard Univ. Press, 1999), only human primates understand the causal and intentional relations that hold among external entities. Tomasello illustrates this point for non-human primates with the claim that even though they might watch the wind shaking a branch until its fruit falls, they would never shake the branch themselves to obtain the fruit. Some primates are, nevertheless, at the edge of having causal understanding.

Once causal belief evolved in relation to tools and language, it was inevitable that people would want to understand the causes of all the events that might affect their lives — such as illness, changes in climate and death itself. Once there was a concept of cause and effect, ignorance was no longer bliss, and this could have led to the development of religious beliefs.

Link to Wolpert’s letter in Nature.

All smoke and mirror neurons?

New Scientist has a tantalising snippet reporting on a shortly to be released and potentially important new study challenging the idea of 'mirror neurons'.

Mirror neurons fire both when we perform an action and when we see someone else doing it. The theory is that by simulating action even when watching an act, the neurons allow us to recognise and understand other people’s actions and intentions…

However, Alfonso Caramazza at Harvard University and colleagues say their research suggests this theory is flawed.

Neurons that encounter repeated stimulus reduce their successive response, a process called adaptation. If mirror neurons existed in the activated part of the brain, reasoned Caramazza, adaptation should be triggered by both observation and performance.

To test the theory, his team asked 12 volunteers to watch videos of hand gestures and, when instructed, to mimic the action. However, fMRI scans of the participants’ brains showed that the neurons only adapted when gestures were observed then enacted, but not the other way around.

Caramazza says the finding overturns the core theory of mirror neurons that activation is a precursor to recognition and understanding of an action. If after executing an act, “you need to activate the same neurons to recognise the act, then those neurons should have adapted,” he says.
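To make the logic of that adaptation argument concrete, here's a toy sketch of what a strict mirror-neuron account would predict: if the same units code an action whether it is observed or executed, repeating the action across modalities should suppress the response in either order. This is my illustration only, not the study's analysis; the function, the 'grasp' example and the suppression value are all invented.

```python
# Toy sketch of the cross-modal adaptation logic described above; not the study's
# actual analysis, and all values here are invented for illustration.

def response(event, previous, mirror_coding=True, suppression=0.4):
    """Notional response to an (action, modality) event, reduced if the units
    assumed to code it were also driven by the immediately preceding event."""
    if previous is None:
        return 1.0
    action, modality = event
    prev_action, prev_modality = previous
    if mirror_coding:
        # mirror account: the same units code the action regardless of modality
        same_units = action == prev_action
    else:
        # alternative: units are specific to the action-modality pairing
        same_units = (action, modality) == (prev_action, prev_modality)
    return 1.0 - suppression if same_units else 1.0

pairs = {
    "observe then execute": [("grasp", "observe"), ("grasp", "execute")],
    "execute then observe": [("grasp", "execute"), ("grasp", "observe")],
}
for label, (first, second) in pairs.items():
    print(label, "-> second response:", response(second, first, mirror_coding=True))
# A strict mirror account predicts suppression (0.6) in both orders; reported
# adaptation in only one direction is what challenges that prediction.
```

On this simple symmetric prediction, adaptation in only one direction is exactly the kind of result that counts against the standard account, which is the argument Caramazza is making.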

The study is to appear in the Proceedings of the National Academy of Sciences and apparently is embargoed so the full text is not yet available, although it should appear here when it is.

The announcement is interesting because using adaptation is a novel way of testing ‘mirror neurons’ and the lead researcher, Alfonso Caramazza, is known for a long series of influential neuropsychology studies.

He has a reputation for being a sober and considered scientist so it will be interesting to see if the final study is really the challenge to mirror neurons as it seems.

Although the hype has subsided a little, the years following the initial reports saw these now famous neurons being used to explain everything from language, to empathy, to why we love art.

We’re now in a period where we’re taking, if you’ll excuse the pun, a somewhat more reflective look at the topic and developing more nuanced theories about how this brain system functions.

UPDATE: Grabbed from the comments. Looks like this paper might have the potential to cause a ruckus. A comment from mirror neuron researcher Marco Iacoboni:

Caramazza's paper is seriously flawed. The technique of fMRI adaptation seemed very promising ten years ago, but careful studies on its neurophysiological correlates have demonstrated that its findings are uninterpretable. Indeed, Caramazza's manuscript has been around for many years and nobody wanted to publish it. Caramazza managed to publish with an old trick that only PNAS allows: he handed it personally to a friend of his. The paper is basically unrefereed (this is what it means 'Edited by…' under its title).

Link to NewSci on ‘Role of mirror neurons may need a rethink’.

Changes to psychiatrists’ diagnostic ‘bible’ hinted at

PsychCentral reports on the likely changes to appear in the DSM-V, the new version of the psychiatrists' diagnostic manual, due out in 2012 and discussed in a presentation at last week's American Psychiatric Association annual conference.

The most significant change proposed has to do with the inclusion of dimensional assessments for depression, anxiety, cognitive impairment and reality distortion that span across many major mental disorders. So a clinician might diagnose schizophrenia, but then also rate these four dimensions for the patient to characterize the schizophrenia in a more detailed and descriptive manner.

Despite the PR spin that “no limits” were placed on this revision of the DSM, the reality is that there will be very few significant changes from the existing edition of the DSM-IV. While virtually all disorders will be revised, the revisions will, for the most part, be incremental and small. Why? Because the APA recognizes that you can’t retrain 300,000 mental health professionals (not to mention the 500,000 general physicians) in the field to completely relearn their way of diagnosing common mental disorders such as depression, bipolar disorder, ADHD and schizophrenia. Changes are always incremental and tweak the existing system, nothing more.

The inclusion of dimensional ratings owes much to the role of psychometrics in the assessment of mental illness, but it remains to be seen how extensively this is implemented, as it could end up being just a fancy label for sub-categories of degree (slight, moderate, severe, etc.) rather than a reliance on statistically sound measurements.
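To make the categorical-versus-dimensional distinction concrete, here's a purely illustrative sketch of how such a record might be structured. The four dimension names come from the quote above, but the 0-4 scale, the field names and the mapping back to coarse severity labels are assumptions of mine, not anything the APA has specified.

```python
# Purely illustrative: a categorical diagnosis carrying dimensional ratings.
# The 0-4 scale and field names are assumptions, not the APA's actual scheme.
from dataclasses import dataclass, field

@dataclass
class Assessment:
    diagnosis: str                               # categorical label, e.g. "schizophrenia"
    ratings: dict = field(default_factory=dict)  # dimension name -> 0 (absent) to 4 (severe)

    def severity_label(self, dimension: str) -> str:
        """Collapse a dimensional rating back into a coarse category label."""
        labels = ["absent", "slight", "moderate", "marked", "severe"]
        return labels[self.ratings.get(dimension, 0)]

patient = Assessment(
    diagnosis="schizophrenia",
    ratings={"depression": 2, "anxiety": 1,
             "cognitive impairment": 3, "reality distortion": 4},
)
print(patient.severity_label("reality distortion"))  # -> "severe"
```

The point of the dimensional version is that the underlying scores can come from psychometrically validated scales, whereas the coarse labels on their own are just named bins.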

The post also mentions that there may be some moving of the diagnostic furniture with some additions and retractions but no major shakeups.

There's more coverage on MedPage, but bear in mind that we're still three years away from publication, so some of the final decisions have yet to be made.

Link to PsychCentral post ‘Update: DSM-V Major Changes’.
Link to MedPage coverage.

Russian roulette in the medical literature

I've just discovered there's a small medical literature on deaths by Russian roulette, where people put one bullet in a revolver, spin the chamber, put the gun to their head and pull the trigger.

A recent article from The American Journal of Forensic Medicine and Pathology has a 10-year case review covering 24 deaths (wow) from the US state of Kentucky alone and serves as a summary of the research into this fate-tempting and most suicidal of games.

It’s a curious set of studies for which the most reliable finding is that people who die by Russian roulette are mostly young men who were drunk or had taken drugs.

On the more unusual side, one study found a link between participation in Russian roulette and “the types and number of tattoos and body piercing”.

The article also briefly describes a number of previous case reports from the literature, including this one which is remarkable for both mathematical and ultimately tragic reasons:

Playing a variation of traditional Russian roulette with his brother and 2 friends, the victim placed 5 live rounds in the cylinder, leaving one empty chamber, of a .357 Taurus revolver. He spun the cylinder, put the gun to his right temple, and pulled the trigger. Postmortem blood toxicology revealed an ethanol level of 0.01% and the presence of diazepam and nordiazepam. The decedent had played Russian roulette on 2 occasions in the previous several weeks, each time placing only one live round in the cylinder.
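The maths behind that detail is easy to work through. The sketch below just computes the survival odds, assuming a fair six-chamber revolver, a random spin and a single trigger pull per game.

```python
# Illustrative only: survival odds for the Russian roulette variants described
# in the case report, assuming a fair six-chamber revolver and one pull per game.

def survival_probability(live_rounds, chambers=6):
    """Chance that a single random spin-and-pull lands on an empty chamber."""
    return (chambers - live_rounds) / chambers

one_round = survival_probability(1)    # the 'traditional' game: 5/6, about 83%
five_rounds = survival_probability(5)  # the variant above: 1/6, about 17%

print(f"One live round:   {one_round:.0%} chance of surviving a single pull")
print(f"Five live rounds: {five_rounds:.0%} chance of surviving a single pull")

# He had already survived the one-round game twice before switching to five
# rounds; the chance of surviving all three pulls:
print(f"All three games:  {one_round ** 2 * five_rounds:.1%}")
```

In other words, by loading five rounds instead of one he inverted the odds, from a five-in-six chance of surviving each pull to one in six.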

Link to study on Russian roulette and risk-taking behaviour.
Link to DOI entry for same.

Freestyle Lehrer

Edge has an excellent interview with science writer Jonah Lehrer who riffs on consciousness, the joy of discovery, the importance of the marshmallows in psychology and how he fell in love with science.

It’s interesting because rarely do science writers get the opportunity to give their own opinions on the big questions in neuroscience, despite the fact that, as Lehrer mentions, they have a distinct way of looking at the field as a whole.

Writers have a massive influence on politics, economics, business and the arts, to the point where they are actively courted and coerced by those wanting to control the agenda, but there is much less of a tradition of writers influencing science outside the political sphere.

In fact, it'd be interesting to directly ask science writers for their own theories one day, but in the meantime here's a rare opportunity to see one 'in action' on the big issues.

The questions I'm asking myself right now are on a couple different levels. For a long time there's been this necessary drive towards reductionism; towards looking at the brain, these three pounds of gelatinous flesh, as nothing but a loop of kinase enzymes. You're a trillion synaptic connections. Of course, that's a necessary foundation for trying to understand the mind and the brain, simply trying to decode the wet stuff.

And that’s essential, and we’ve made astonishing progress thanks to the work of people like Eric Kandel, who has helped outline the chemistry behind memory and all these other fundamental mental processes. Yet now we’re beginning to know enough about the wet stuff, about these three pounds, to see that that’s at best only a partial glance, a glimpse of human nature; that we’re not just these brains in a vat, but these brains that interact with other brains and we are starting to realize that the fundamental approach we’ve taken to the mind and the brain, looking at it as this system of ingredients, chemical ingredients, enzymatic pathways, is actually profoundly limited.

The question now is, how do you then extrapolate it upwards? How do you take this organ, this piece of meat that runs on 10 watts of electricity, and how do you study it in its actual context, which is that it’s not a brain in a vat. It’s a brain interacting with other brains. How do you study things like social networks and human interactions?

Link to Jonah Lehrer interview on Edge.

Encephalon 71 welcomes new diners

The 71st edition of the Encephalon psychology and neuroscience writing carnival has just been served in the welcoming surroundings of the stylish Neuroanthropology blog.

A couple of my favourites include a podcast interview with neuropsychologist Chris Frith from the Brain Science Podcast blog, and a post on the development of early language from Babel’s Dawn.

There's plenty more on the menu, so you should find something to suit every taste. Bon appétit!

Link to Encephalon 71.

A hostage to hallucination

I've just found a morbidly fascinating 1984 study on hallucinations in hostages and kidnap victims.

The paper is from the Journal of Nervous and Mental Disease and contains case studies of people who have been held captive by terrorists, kidnappers, rapists, robbers, enemy troops and, er… UFOs.

The reasoning behind including two ‘alien abductees’ was to compare hallucinations in verified versus unverified hostage situations. Cases of people who were hostages but did not hallucinate are also included.

The study found that one in four hostages had intense hallucinations, and these were invariably people who were in life-threatening situations. Isolation, visual deprivation, physical restraint, violence and death threats also seemed to contribute to the chance of having a hallucinatory experience.

Case 14

A 23-year-old member of a street gang was taken hostage by a rival gang. He was kept in a warehouse, blindfolded and tied to a chair, for 32 hours. He was severely beaten and forced to record ransom demands on a tape recorder. During captivity he became dissociated – “even when they were hitting on me I just tripped out, got out of my body… it was like I was high on Sherms (phencyclidine).” At one point he felt detached from his body and “floated” to the ceiling where he observed himself being beaten and burned with cigarettes but denied having any pain. He saw colorful geometric patterns in the air and flashes of past memories “like a dream, only I kept seeing devils and cops and monsters… nightmares I guess”. Eventually he was released when his gang paid the ransom.

Some of the case studies are a little disturbing, but it’s worth reading the paper in full if you can, or at least from the beginning of the case studies, as it’s a rarely discussed but remarkably striking aspect of human experience.

Link to article on ‘Hostage Hallucinations’.
Link to PubMed entry for same.

On the information alarmageddon

New York Magazine has an article arguing that the concerns about digital technology drastically affecting our minds are just hype. I really wanted to like it but it’s just another poorly researched piece on the psychology of digital technology.

Research has shown that distraction can improve exactly the sorts of skills that the digital doomsayers say will be broken by the high-tech world, but I’ve never seen it mentioned in any of the recent high-profile articles on the predicted digital meltdown.

In fact, there is a fairly sizeable scientific literature on how interruption affects the ability to complete a task, and instant messaging has been specifically studied.

But despite getting lots of opinions from everyone from attention researcher David Meyer to lifehacker Merlin Mann, only a single 'study' on the distracting effect of technology is mentioned in the New York Magazine article: “people who frequently check their email has tested less intelligent than people who are actually high on marijuana”.

This is quite amazing because not only was the 'study' in question not an actual scientific study (it was a PR stunt for Hewlett-Packard), but this isn't even an accurate description of it (users were interrupted with email during an IQ test and scored worse, big surprise).

The issue actually breaks down into two parts. One is a scientific question: what is the psychological effect of distraction? The other is a cultural one: have we become a society where high levels of distraction are more acceptable?

As I mentioned, the first question has been very well researched and the general conclusion is that distraction reduces our ability to complete tasks. Essentially, it’s saying that distraction is distracting, which is hardly headline news.

But it also turns out that distraction is most disruptive to stimulus based search tasks, when we are flicking our attention around scanning for bits of information. Perhaps unsurprisingly, when we’re on alert for new and different things, something salient like an instant message grabs our attention and knocks us off course.

More thoughtful tasks involving processing meaning are the least affected. This is interesting because most of the digital doomsayers suggest it is exactly this sort of deep thought that is being affected by communication technology.

The other line of argument is that all this distraction makes us less creative because creativity needs focus to flourish.

Although not as well studied, it seems this is unlikely. We tend to assume that distraction reduces creativity, but lab studies tend to show the reverse.

Distraction has also been found to improve decision making, especially for complex fuzzy decisions – again exactly the sort that the doomsayers say will most be at peril.

These studies find that too much concentration reduces our creative thinking because we’re stuck in one mind-set, deliberately filtering out what we’ve already decided is irrelevant, thereby already discarding counter-intuitive ideas (actually this is something the article does touch on). We can speculate that this may be why a preliminary study found that amphetamine-based concentration drug Adderall reduced creativity.

The cultural issue is perhaps more important, but on an individual level is more easily addressed.

You have control over the technology of distraction. If you can't concentrate, switch it off. If it is your job to be distracted and it is affecting other essential parts of your role, that is something to take up with your employers.

It’s no different than if you’re being distracted by the sound of traffic and can’t do your job. Maybe you need an office away from the street? If you or your employers can’t do anything about it, maybe that’s just one of the downsides of the job.

What research hasn’t yet shown is that digital technology is having a significant negative influence on our minds or brains. In some cases, it’s showing the reverse.

History has taught us that we tend to worry about new technology as it becomes widespread, and this worry is usually expressed in terms of its negative impact on our minds and social relationships.

If you’re really concerned about cognitive abilities, look after your cardiovascular health (eat well and exercise), cherish your relationships, stay mentally active and experience diverse and interesting things. All of which have been shown to maintain mental function, especially as we age.

Technology has an impact on the mind but it’s a drop in the ocean compared to the influence of your health and your relationships.

I’m constantly surprised that the impact of technology is clearly of such widespread interest to merit headline grabbing articles in international publications, but apparently not interesting enough that journalists will actually use the internet to find the research.

It’s like writing a travel guide without ever visiting the country. I’m just guessing the editors have yet to catch on to the scam.

Link to NYMag article ‘In Defense of Distraction’.

Can’t put the thought genie back into the bottle

PsyBlog has an excellent piece on the counter-intuitive psychology of thought suppression – the deliberate attempt to not think of something that almost invariably backfires.

The article is both fascinating from a scientific point of view and important as a personal mental health resource if you're one of the many people who intuitively think that the best way of dealing with 'bad' thoughts is to try and push them out of the mind.

What psychology research has shown us is that not trying to think of something makes us think of it more frequently (the “don’t think of a pink elephant” phenomenon), and that this counter-productive effect is enhanced for emotion-heavy thoughts and in people with mental illnesses where intrusive thoughts are a problem.

Psychologists often use the metaphor of noisy trains passing through the station. Thought suppression is like standing in the middle of the tracks trying to push the train back. You’re just going to get run over. Instead, people are encouraged to just wait on the platform, observe the train of thought and wait for it to pass.

The ability to act as a 'detached observer' to the mind's distressing thoughts is a useful cognitive skill and one that is cultivated by mindfulness meditation, something that has increasing evidence as a useful treatment for mental health problems.

There's lots of good research on thought suppression, much of which is covered in the PsyBlog article, but this study struck me as particularly inventive:

Wegner and Gold (1995) examined emotional suppression by delving into people’s romantic pasts using a neat comparison between ‘hot flames’ and ‘cold flames’. A ‘hot flame’ is a previous partner who still fires the imagination, while a ‘cold flame’ is a previous partner for whom the thrill is gone. In theory the ‘hot flame’ should produce more intrusive thoughts so people should have more practice suppressing them. Meanwhile because the cold flame doesn’t produce intrusive thoughts, people should have less practice suppressing them.

The results revealed exactly the expected pattern: people found it harder to suppress thoughts about cold flames, presumably because they had less practice.

Link to PsyBlog on ‘Why Thought Suppression is Counter-Productive’.