Fraudian slip

Today’s BPS Research Digest has a wonderfully ironic and recursive Freudian slip in a post about the misdiagnosis of women with mental illness in Victorian Britain.

It highlights how misdiagnosis could get the doctor in hot water, and makes a link with Freud’s later ideas about hysteria – symptoms that appear to be neurological, such as paralysis, but aren’t accounted for by damage to the nervous system.

I hope Christian won’t mind me pointing out that the misspelling of Freud is brilliantly paradoxical:

Remember this is some decades before Fraud started applying the diagnosis of conversion disorder or hysteria to so many women, many of whom probably had organic illnesses.

Freud argued that the ‘Freudian slip’, or parapraxis, is an example of the unconscious mind slipping past our conscious editing of speech and action, potentially revealing the true beliefs or desires of the person in question.

I wonder whether he’d feel vindicated over the sentence above, or would just despair that such talented psychologists think he was talking bunk on this occasion.

With regards to the question over the reliability of diagnosing hysteria, now reclassified as ‘conversion disorder’, the psychiatrist Eliot Slater completed a famous 1965 study in which he followed up patients who had been diagnosed with hysteria to see if they later showed definite signs of neurological illness.

He found that over 60% later showed signs of genuine neurological illness and dryly stated that “The only thing that ‘hysterical’ patients have in common is that they are all patients”.

Although influential at the time, the study has subsequently been discredited as lacking rigorous methods (taking family doctor notes as follow-up data, for example).

The most comprehensive study was published in 2005 and looked at patients diagnosed with hysteria over many decades and found that misdiagnosis rates were one third in the 1950s, but have been at 4% since the 1970s – probably due to the emergence of reliable brain imaging technologies.

Incidentally, the image on the left is a slightly edited panel from a six-page comic called The New Adventures of Sigmund Freud where an Uzi-toting Sigmund takes on Osama Bin Laden in his secret lair.

Link to BPSRD on ‘The Suspicions of Mr Whicher’.
Link to 2005 hysteria follow-up study with full text link.
Link to The New Adventures of Sigmund Freud.

Drug-fuelled shooting as a spectator sport

The Atlantic has a provocative article arguing that drug-fuelled shootings would make competitive sport more interesting, although probably not in the way you’re thinking.

The piece discusses beta blockers such as propranolol, drugs that have their major effect on the peripheral part of the autonomic nervous system.

They don’t actually make the user feel less psychologically anxious, but just reduce the normal ‘fight or flight’ pumped feeling, so the bodily effects of anxiety such as shaking, sweating, heart pounding and muscle tension are reduced.

These drugs are used widely by professional musicians to stop performance jitters and the Atlantic article argues that they should be allowed in sports like shooting and archery so competitors aren’t disadvantaged by performance anxiety.

From a competitive standpoint, this is what makes beta blockers so interesting: they seem to level the playing field for anxious and non-anxious performers, helping nervous performers much more than they help performers who are naturally relaxed. In the British study, for example, the musician who experienced the greatest benefit was the one with the worst nervous tremor. This player’s score increased by a whopping 73%, whereas the musicians who were not nervous saw hardly any effect at all.

One of the most compelling arguments against performance enhancing drugs is that they produce an arms race among competitors, who feel compelled to use the drugs even when they would prefer not to, simply to stay competitive. But this argument falls away if the effects of the drug are distributed so unequally. If it’s only the nervous performers who are helped by beta blockers, there’s no reason for anyone other than nervous performers to use them.

Link to ‘In Defense of the Beta Blocker’ (via 3QD).

NeuroPod on altruism, imprinting, eating and magic

The August edition of the Nature Neuroscience podcast, NeuroPod, arrived online after a summer break with some fascinating discussions on everything from altruism to magic.

Perhaps the most interesting bit is on genomic imprinting – a curious effect where the same gene may be expressed differently depending on whether you inherited it from your mother or your father.

The most widely known examples are the Prader-Willi and Angelman syndromes, both of which are genetic disorders linked to learning disabilities and neurological problems.

Both are caused by a partial deletion of genes from chromosome 15. When this is inherited from the mother, it causes Angelman syndrome; when inherited from the father, it causes Prader-Willi syndrome.

A recent opinion piece published in Nature, written by sociologist Christopher Badcock and biologist Bernard Crespi, argued that genetic imprinting may be key to a much wider range of conditions – including many of the more common psychiatric disorders such as depression or schizophrenia.

We believe that psychiatric illness may be less to do with the genes a mother and father pass down, and more to do with which genes they program for expression. By our hypothesis, a hidden battle of the sexes — where a mother’s egg and a father’s sperm engage in an evolutionary struggle to turn gene expression up or down — could play a crucial part in determining the balance or imbalance of an offspring’s brain. If this proves true, it would greatly clarify the diagnosis of mental disorders. It might even make it possible to reset the mind’s balance with targeted drugs.

The article then goes on to propose the idea (presumably related to a similar Chris Frith theory) that autism and psychosis might be ‘diametric opposites’, echoing an argument they expanded on more fully in a larger article earlier this year.

I’ve not read the bigger piece, but my first thought is how they manage to account for the fact that people with Asperger’s or autism can become psychotic. I shall look forward to seeing what they have to say in more detail.

Anyway, the podcast discusses the main points, as well as getting some comments from some more sceptical scientists.

Link to NeuroPod homepage (now with flash streaming).
mp3 of August Neuropod.
Link to piece on genetic imprinting and psychiatric disorder.
Link to PubMed entry for same.

Encephalon 53 hails from a big continent

The 53rd edition of the Encephalon psychology and neuroscience writing carnival comes to us from the beautiful continent of Africa and has all the latest from the last fortnight in mind and brain news.

A couple of my favourites include an article from the appropriately named Brain Stimulant on the experience of a person with Asperger’s who took part in a TMS experiment, and another from Neuronism on the expert perceptual judgements of players vs wannabes in basketball.

This fortnight’s Encephalon is hosted by Ionian Enchantment, a blog which I’d not discovered before but which looks very good and is updated remarkably frequently.

Link to Encephalon 53.

It’s all gone scare shaped

The Guardian is currently running a series of extracts from Ben Goldacre’s new book, Bad Science. The first two are witty, acerbic and address how implausible vaccine scare stories get picked up by a scandal-hungry media, and how pharmaceutical companies attempt to persuade us that every discomfort is a medical disorder.

Actually, I’m still waiting for the copy I’ve ordered to arrive so haven’t seen the whole thing yet, but if you’re a fan of the Bad Science column then the extracts suggest that the book will be just as insightful.

Times have changed. The pharmaceutical industry is in trouble: the golden age of medicine has creaked to a halt, the low-hanging fruit of medical research has all been harvested, and the industry is rapidly running out of new drugs. Fifty “novel molecular entities” a year were registered in the 1990s, but now it’s down to 20, and many of those are just copies of other companies’ products, changed only enough to justify a new patent. So the story of “disease mongering” goes like this: because they cannot find new treatments for the diseases we already have, the pill companies have instead had to invent new diseases for the treatments they already have.

Recent favourites include social anxiety disorder (a new use for SSRI antidepressant drugs), female sexual dysfunction (a new use for Viagra in women), the widening diagnostic boundaries of “restless leg syndrome”, and of course “night eating syndrome” (another attempt to sell SSRI medication, bordering on self-parody) to name just a few: all problems, in a very real sense, but perhaps not necessarily the stuff of pills, and perhaps not all best viewed in reductionist biomedical terms. In fact, you might consider that reframing intelligence, loss of libido, shyness and tiredness as medical pill problems is a crass, exploitative, and frankly disempowering act.

Night eating syndrome? No wonder those Goths look so pale.

Link to ‘The media’s MMR hoax’.
Link to ‘The Medicalisation of Everyday Life’.
Link to book details.

A vision of a daydream, or a fragment of reality

The Boston Globe has an interesting piece on daydreaming, touching on the link between daydreaming and creativity and discussing the possible brain networks that might support our pleasant mental wanderings.

The article discusses some of the recent work on the default brain network and how this might be related to daydreaming:

Every time we slip effortlessly into a daydream, a distinct pattern of brain areas is activated, which is known as the default network. Studies show that this network is most engaged when people are performing tasks that require little conscious attention, such as routine driving on the highway or reading a tedious text. Although such mental trances are often seen as a sign of lethargy – we are staring haplessly into space – the cortex is actually very active during this default state, as numerous brain regions interact. Instead of responding to the outside world, the brain starts to contemplate its internal landscape. This is when new and creative connections are made between seemingly unrelated ideas.

“When you don’t use a muscle, that muscle really isn’t doing much of anything,” says Dr. Marcus Raichle, a neurologist and radiologist at Washington University who was one of the first scientists to locate the default network in the brain. “But when your brain is supposedly doing nothing and daydreaming, it’s really doing a tremendous amount. We call it the ‘resting state,’ but the brain isn’t resting at all.”

It’s worth bearing in mind that the connection between this network and daydreaming is only one theory, and other researchers think of it quite differently.

The ‘default network’ was proposed on the basis of measurements of how the brain uses energy at rest, and when brain imaging researchers noted that certain parts of the brain (mainly midline areas) were more active when participants didn’t seem to be doing very much, but showed reduced activity when participants were most engaged in attention-demanding tasks.

Neurologist Marcus Raichle has been most vocal in proposing that the network is linked to what we might broadly call daydreaming, most notably on the basis of a study that found that default network activity was related to what they called ‘stimulus independent thought’.

They determined this by training people on a memory task until they could do it so easily their minds wandered. They then put people in a scanner and compared brain activation in this condition to brain activation in a similar memory task where the material was new, so participants had to concentrate and weren’t able to think about other stuff.

They found that the practised condition was associated with activity in the default network, and, therefore, they linked it to daydreaming.

The trouble is that this only confirmed that participants were doing more off-topic thinking, not what they were thinking about.

We might think of daydreaming as having thoughts about being the lead singer of an all-girl skiffle band, fighting a dragon if it happened to burst through the lab door, or screwing the research assistant who took us through the consent form, but it could be that the participants were just focused on the other stuff that was happening around them at the time.

Like the horrendous noise of the fMRI scanner, as some commentators suggested. Or perhaps, they were just being more aware of their wider environment.

And in fact, one theory suggests that the default network is not concerned with daydreaming, but maintains a background level of watchful attention to detect potentially dangerous external events (real dragons, for example), or perhaps processes memories – essentially doing our mental filekeeping.

One big problem with this area is that it attempts to study a network which is supposedly most active when people are not doing deliberate mental tasks, by extrapolating from data that involves the participants doing deliberate mental tasks.

This makes it difficult to tie it specifically to daydreaming, which is a subjective mental state that has a tendency of dancing away whenever we try and catch it.

Link to Globe article ‘Daydream achiever’ (via Frontal Cortex).

Monty Python’s fluent aphasia

Thripshaw’s Disease was a fictional medical condition shown in a sketch from the classic comedy series Monty Python’s Flying Circus that bears a remarkable similarity to fluent aphasia, a speech impairment that can occur after brain injury.

Mind Hacks reader Patricio sent in this fascinating observation, and we can see from the sketch that the man can understand what is said to him (intact comprehension), but produces fluent but jumbled sentences.

Speech problems after (usually left-sided) brain injury are called aphasia and the concept reflects the various ways speech can be impaired.

Sometimes aphasia affects speech production, so people can hardly seem to get a word out, while other people can produce fluent speech although it can be full of misplaced words, odd word order or nonwords. Often in fluent aphasia, people can also have difficulties in understanding what is said, but it’s not always the case.

Of course, there can be a mix of all sorts of problems, but the type of speech disorder depicted in the Monty Python sketch is called paragrammatism and was tackled in a classic study by Butterworth and Howard.

Most interestingly, the researchers found that these errors are identical to the grammatical errors people without brain injury tend to make on a day-to-day basis, but just happen much more frequently.

Here’s one of the examples from the study:

My father, he is the biggest envelope ever worked in Ipswich. He strikes every competition and constitution that’s going. He’s got everybody situated and they’ve got to talk to him.

And there’s also a lovely example from this book:

I’ll tell you, not like before, I must say that once the beginning happened in the beginning, as I arrived and naturally it was, of course, quite decisive.

The gentleman in the Monty Python sketch also shows paraphasias (saying the wrong word where you intended to say another) and neologisms (creating instant nonsense words).

Interestingly, the interviewer on the TV chat show slightly later in the sketch shows a classic transcortical motor aphasia – a slow halting speech with inappropriate word stress – typically caused by damage to areas of the mid part of the left frontal lobe.

This character is played by Graham Chapman, who studied medicine and qualified as a doctor although apparently never practised owing to the success of Monty Python.

I wonder if he was inspired by some of the usual speech patterns of aphasia, or whether this was just an interesting coincidence.

Link to video of Monty Python sketch (thanks Patricio!).
Link to Butterworth and Howard study.
Link to PubMed entry for study.

Through a lab darkly

Cognitive scientists should be explorers of the mind, forging a path through the chaotic world of everyday life before even thinking of retreating to the lab, according to a critical article in the latest edition of the British Journal of Psychology.

Cognitive science often works like this: researchers notice something interesting in the world, they create a lab-based experiment in an attempt to control everything except what they think is the core mental process, they then test the data to see if it predicts real-world performance.

A new approach, proposed by psychologist Alan Kingstone and colleagues, suggests this is fundamentally wrong-headed and we need to completely rethink how we study the human mind to make it relevant to the real world.

The authors suggest that the standard approach relies on a flawed assumption – that mental processes are like off-the-shelf tools that do the same job, but are just assembled by the mind in different ways depending on the situation.

But imagine if this isn’t the case and mental processes are, in fact, much more fluid and adapt to fit the environment and situation. Not only would we have to change our psychological theories, we would have to change how we study the mind itself because the assumption that we can isolate and test the same mental process in different environments justifies the whole tradition of lab-based research.

The authors suggest an alternative they call ‘cognitive ethology’ and it focuses the efforts of cognitive scientists on a different part of the research process.

Let’s just revisit our potted example of what most cognitive scientists do: they notice something in the world, they create a lab-based experiment, they test to see if it predicts real-world performance.

The first part of this process (noticing -> lab-experiment) is often based on subjective judgements and rough descriptions and isn’t validated until the lab-based experiment is tested.

Kingstone and his colleagues argue that scientists should be applying the techniques of science to the first stage – measuring and describing behaviour as it happens in the real world – and only then taking to the lab to see what happens when conditions change.

They give an example of this approach in an interesting driving study:

A Nature publication by Land and Lee (1994) provides a good illustration of a research approach that is grounded in the principle of first examining performance as it naturally occurs. These investigators were interested in understanding where people look when they are steering a car around a corner. This simple issue had obvious implications for human attention and action, as well as for matters as diverse as human performance modelling, vehicle engineering, and road design.

To study this issue, Land and Lee monitored eye, head, steering wheel position, and car speed, as drivers navigated a particularly tortuous section of road. Their study revealed the new and important finding that drivers rely on a ‘tangent point’ on the inside of each curve, seeking out this point 1–2 seconds before each bend and returning to it reliably.

Later, other researchers used a lab-based driving simulator study to systematically alter how much of this ‘tangent point’ was available to see what caused abnormal driving.

The authors also make the point that this approach is much better at helping us understand why something happens the way it does, because it ties it to the real world and helps us integrate it with our knowledge of personal meaning.

It’s an interesting approach and meshes nicely with a recent article on cultural cognitive neuroscience in Nature Reviews Neuroscience, which looked at a number of fascinating studies on cultural influences on mind and brain function and discussed how we can go about understanding the interaction between culture and the brain.

If you want to skip the theoretical parts, Box 1 is worth looking at just for a brief summary of some intriguing cultural differences in the way we think.

The piece was also rather expertly covered by Neuroanthropology, who cover the main punchlines and discuss some of the claims.

Link to ‘cognitive ethology’ article.
Link to PubMed entry for ‘cognitive ethology’ article.
Link to ‘cultural neuroscience’ article.
Link to PubMed entry for ‘cultural neuroscience’ article.

Computers cause abnormal brain growth – proof!

I have discovered shocking evidence that computers are affecting the brain. After extensive research, I have discovered the problem is remarkably specific and I have isolated it to an individual brain area affected by one particular application. Microsoft Word is causing abnormal growth in the frontal lobes.

The cingulate cortex is a part of the frontal lobe that is known to be involved with conflict monitoring, pain and emotion, while Microsoft Word is a clumsy but ubiquitous word processing package that has an annoying habit of auto-correcting things you don’t want to be auto-corrected.

For example, try typing the words ‘cingulate cortex’ into Word and see what happens. It changes it to ‘cingulated cortex’, adding an annoying ‘d’ onto the end of the first word.

Whenever I’m writing a neuropsychology article, I now have the habit of doing a search and replace before I finish to sweep up any of these auto-errors. So I was wondering whether anyone else had suffered the same problem and searched the scientific literature.
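As a minimal sketch of that sweep (assuming a plain-text draft and Python to hand – the function name is mine, not from any library), the fix is a one-line regex replace that drops the rogue trailing ‘d’:

```python
import re

def sweep_auto_errors(text):
    # Word's autocorrect turns 'cingulate' into 'cingulated';
    # strip the rogue trailing 'd' while preserving capitalisation.
    return re.sub(r"\b([Cc]ingulated)\b", lambda m: m.group(1)[:-1], text)

print(sweep_auto_errors("activity in the anterior cingulated cortex"))
# -> activity in the anterior cingulate cortex
```

One caveat: on the rare occasion you genuinely meant to write ‘cingulated’, this will clobber it too, so it’s worth eyeballing the replacements rather than trusting the sweep blindly.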

Now, it could be that people have just been making standard typos throughout history, as adding a rogue ‘d’ is not uncommon, even when we’re writing with a pen, but this doesn’t seem to be the case.

While the use of the term ‘cingulate cortex’ stretches back to at least the beginning of the 20th century, the term ‘cingulated cortex’ barely appears, until Microsoft Word’s autocorrection tool arrives on the scene.

There are 15 uses of the phrase “cingulated cortex” from 1900 to 2000. There are 1,740 uses from 2000 to now.

Microsoft Word, it seems, is slowly changing the brain.

Without further ado, I have named the disorder Bell’s Frontal Nomenclature Hypertrophy Syndrome and demand that it be included in the diagnostic manuals.

Thousands of disturbed people will not get the help they need without this essential recognition, although in the mean time I will be offering private treatment at special rates.

Of course, I strongly encourage further research and welcome offers of interviews from the press, radio or television.

I am also available for weddings, funerals and Bar Mitzvahs.

Minds and myths

The September issue of The Psychologist has two excellent and freely available articles that smash the popular myths of scientific psychology.

The first examines the widely mythologised story of hole-in-the head celebrity Phineas Gage, and the other tackles commonly repeated stories of famous studies that don’t stand up to scrutiny.

Gage, whose skull is pictured on the front cover, is legendary, but, as the article makes clear, there’s actually a great deal we don’t know about his life and the information that typically accompanies his story is based on only a very few sources.

The article on other myths in psychology focuses on some of the most widely cited incidents and studies in the field: the murder of Kitty Genovese, Asch’s conformity experiments, Little Albert and the Hawthorne Effect.

Particularly interesting is a discussion of the role of myths in science and what benefit they bring to the study of the human mind:

Other sciences certainly do have their own myths – just think of the story of Newton and the falling apple or Archimedes leaping out of the bath following his Eureka insight. Perhaps myths just seem more prominent in psychology because we tend to talk and write about our science in terms of studies rather than facts. Certainly the work of Mary Smyth at Lancaster University would appear to be consistent with this view – she has compared psychology and biology textbooks and found that psychology appears to have comparatively few taken-for-granted facts. Instead, numerous experiments are described in detail, lending scientific credence to any factual claims being made.

Related to this, there’s no doubt that the actual subject matter of psychology plays a part too – there’s that ever-present pressure to demonstrate that psychological findings are more than mere common sense. Benjamin Harris says that historians have described psychology as putting a scientific gloss on the accepted social wisdom of the day. ‘Psychology is always going to have a strong social component,’ he explains. ‘With psychological theories speaking to the human condition, there’s always going to be an appeal to myths that resonate more with experience than something coming out of the lab that’s sterile and ultra scientific.’

Another role that myths play is to reinforce the empirical legitimacy of psychology and to create a sense of a shared knowledge base. ‘In this way, tales such as of Kitty Genovese or Little Albert are rather like origin myths, pushing the creation of psychology, or a particular approach within psychology back in time, thus giving an air of greater authority,’ says Harris. Hobbs agrees: ‘It’s nice to have something that you can take for granted,’ he says. ‘In the case of the Hawthorne effect and other myths, you shouldn’t take it for granted, but it’s comforting to be able to say “Oh, this could be the Hawthorne effect” and for others to nod and say “Ah yes, that’s right”.’

Link to article ‘Phineas Gage – Unravelling the myth’.
Link to article ‘Foundations of sand?’.

Full disclosure: I’m an unpaid associate editor for The Psychologist.

2008-08-29 Spike activity

Quick links from the past week in mind and brain news:

Choreography and Cognition is a project examining the cognitive science of dance. Try this for some experimental data. Get down.

The myth of undecided voters is tackled head on by Frontal Cortex.

Gin, Television and Cognitive Surplus. No, not a traditional English weekend, an Edge article by Clay Shirky on the internet and mental aggregators.

PsychCentral’s Sandra lists her Top 10 online psychology experiments.

ABC Radio National’s Life Matters explores our relationship to colour.

Corpus Callosum has an interesting role reversal art project where a psychiatrist has painted his emotional impression of patients.

Epigenetics or the ‘Ghost in Your Genes’ is a new TV programme and is linked to and discussed by Neuroanthropology.

The Smart Set reviews a book on loneliness.

The Guardian’s examination of the supposedly mandatory but widely ignored drug company gift registers for UK doctors shows (can we guess?) widespread soul selling.

Be sure to check ABC Radio National’s All in the Mind blog for extended comments and extra audio from the recent programme on the mind, markets and morality.

Wired Science on why early stone tools suggest Neanderthals were just as intelligent as early humans, contrary to popular belief. Researchers now exploring lack of style, poor personal hygiene as reason for extinction.

The rubber hand illusion is accompanied by a drop in temperature of the ‘displaced’ hand. Another from Wired Science.

The BPS Research Digest reports an interesting study that finds we tend to overestimate the size of our own heads, but not those of others.

The three critical techniques for stage magic discussed in the recent paper on the cognitive science of magic are summarised by PsyBlog.

Harvard Magazine has an article on ‘A Work in Progress: The Teen Brain‘. Due to be completed shortly after Duke Nukem Forever.

July’s Neuropod appeared and we didn’t even notice. Still, the programme has been eerily quiet since then.

The Times reports that more sex by braver soldiers suggests an evolutionary explanation for rhubarb, hat stands, pink elephants, blah blah blah…

Why Are ‘Mama’ and ‘Dada’ a Baby’s First Words? Sounds obvious but it’s actually an interesting study into developmental phonetics.

BBC News reports that the drug rasagiline may actually slow down Parkinson’s disease according to an early study.

Cool photo on Flickr appropriately called ‘applied radiology‘.

Cannabis use went down in the UK after it was reclassified as a ‘softer’ drug, reports The Guardian. Buckets of urine at the ready to be flung into the wind when government shortly re-reclassifies it as a ‘harder’ drug.

Interesting experimental philosophy paper makes it into the top 10 philosophy papers of the year.

Furious Seasons catches two interesting antipsychotic news nuggets: Nature Neuroscience editorial says credibility lacking in child psychiatry after recent payments scandal / BMJ reports antipsychotics really, really bad in older folks.

Count ’em

Wikipedia has a short but fascinating page listing animals by the number of neurons they have. There’s only about a dozen entries on there, but most interesting is that there is an animal with no nerve cells at all.

It’s called Trichoplax and is apparently “a simple balloon-like marine animal with a body cavity filled with pressurized fluid”.

Apparently humans don’t come top of the pile, as both elephants and whales have more neurons.

However, it’s not the best referenced article in the world, to say the least, so I’m taking this last claim with a pinch of salt for the time being.

If you know better, do update the article with some more reliable sources.

Link to ‘List of animals by number of neurons’.

Wilder Penfield – charting the brain’s unknown territory

Neurophilosophy has a stimulating article on Wilder Penfield, the legendary Canadian neurosurgeon who pioneered neuropsychological studies on the awake patient during brain surgery.

Penfield is most famous for his experiments where he electrically stimulated the brain of patients who had part of their skull removed during surgery to record what thoughts, behaviours and sensations arose from the excitation of specific parts of the cortex.

This research is still being done in modern times. My favourite is a 1991 study on electrical stimulation of the supplementary motor area (SMA) by (no laughing now) Fried and colleagues.

What is most fascinating is that they found electrical stimulation could trigger the urge to move, or the expectation that a movement might occur, without triggering any movement itself. This stretched from quite vague feelings such as the “need to do something with right hand” to very specific movement intentions such as the “urge to move right thumb and index finger”.

The gripping and typically well-researched Neurophilosophy article takes us right into the middle of one of these experiments performed by Penfield, and goes on to explain how his work became so influential in science and medicine.

Penfield was a pupil of Harvey Cushing, considered the founder of scientific neurosurgery, who was featured only last week on the same excellent blog.

Unlike Cushing though, who was renowned for being a bit spiky, Penfield was widely considered to be a warm and friendly individual.

It’s probably the best article on Penfield you’re likely to find on the net, so well worth taking the opportunity of learning more about this key figure in our understanding of the brain.

Link to article ‘Wilder Penfield, Neural Cartographer’.
Link to previous Mind Hacks post on Wilder’s operation on his sister.

Unreality TV and the culture of delusions

Today’s New York Times has an interesting article on the tug-of-war over the cultural influence on paranoid delusions and whether contemporary-themed psychosis is a new form of mental illness or just a modern colouring of an old disorder.

The article focuses on the recent interest in the ‘Truman Show delusion’, splashed over the media by two Canadian psychiatrists.

It’s quite hard to judge what they’re aiming to do as they’ve not published a scientific paper, and the article suggests they’re writing a book (is that the sound of alarm bells I hear?), so I’m solely going on secondary sources.

But if they’re saying that delusions specifically about being in the Truman Show are somehow new and interesting, then they’re right in a way. Popular culture often turns up in paranoid beliefs – I worked with a gentleman once who believed he was in The Matrix – but it’s not earth-shattering. It happens all the time.

If they’re saying that the general experience of The Truman Show – feeling that the world is being controlled, has been inexplicably altered, or is uncannily mysterious – is somehow new, then they’re wrong by a good 100 years.

This was described by the German psychiatrist Karl Jaspers in the early part of the 20th century who called it Wahnstimmung, which is translated in the modern English literature as delusional mood or delusional atmosphere.

This is the description from Andrew Sims’ book on descriptive psychopathology Symptoms in the Mind:

“For the patient experiencing delusional atmosphere, his world has been subtly altered: ‘Something funny is going on’; ‘I have been offered a whole new world of meaning’. He experiences everything around him as sinister, portentous, uncanny, peculiar in an undefinable way. He knows that he is personally involved but cannot tell how. He has the feeling of anticipation, sometimes even of excitement, that soon all the separate parts of his experience will reveal something immensely significant.”

Actually, the article has a quote from me, although it miscasts my view a little. I’m quoted as saying:

“Cultural influences don’t tell us anything fundamental about delusion,” said Vaughan Bell, a psychologist at the Institute of Psychiatry at King’s College in London, who has studied Internet delusion.

“We can look at the influence of television, computer games, rock ’n’ roll, but these things don’t tell us about new forms of being mentally ill,” said Dr. Bell, who said he had also treated patients who believed they were part of a reality television show.

Actually, I do think that cultural influences are fundamental in understanding delusions, but not in themselves. [Squiggly sound of tape rewinding] It seems the crucial qualification “in themselves” was missed off the quote.

In fact, in the paper I wrote on delusions about the internet I concluded by saying “The extent of influence may not be equal for all aspects of society and culture, although the fact that there is an influence at all suggests that psychosis is only fully understandable in light of the wider social context.”

To quote John Donne, “no man is an island”, and we can only fully understand our thoughts and behaviour, whether everyday or pathological, with reference to the cultures we live in. But this doesn’t mean that each aspect of culture influences us equally on all levels.

Link to NYT article ‘Look Closely, Doctor: See the Camera?’.

The music’s too loud and you can’t hear the lyrics

Today’s Nature has a teeth-grittingly bitchy review of psychologist Daniel Levitin’s new music and psychology book The World In Six Songs that would be entertaining were it not so surprisingly vitriolic.

I’ve not read the book, but when someone criticises the author’s musical taste as immature, not once but twice, in the world’s leading science publication, you know the review has gone beyond the point of healthy knock-about into the zone of below-the-belt punches.

What is it about Nature book reviews? We covered one in 2007 where the reviewer got stuck in despite not seeming to have read the book.

Actually, no one does a good book barney like the philosophers, who at least have the good grace to wrap their barbs in dry wit and satire rather than just spitting venom at each other (although they do that too).

If you want to get an idea of Levitin’s basic premise, New Scientist has an online article on the book. It seems to be applying the ‘basic plots’ idea to music.

This is widely discussed in literature, where many people have claimed to have identified the seven, eight, twenty, thirty-six (you get the idea) basic plots in stories, literature and plays throughout history.

Link to hatchet job in Nature.
Link to NewSci on The World In Six Songs.

Who needs sleep? The evolutionary slumber party

PLoS Biology has a cozy essay entitled “Is Sleep Essential?” that addresses the mystery of the purpose of sleep.

The article looks at sleep across the whole of the animal kingdom to examine how different species sleep and whether there are any animals that don’t sleep at all.

There are no convincing cases of sleepless animals it seems, and the authors, neuroscientists Chiara Cirelli and Giulio Tononi, argue that sleep is therefore likely to be an essential function of living creatures.

The three corollaries of the null hypothesis [‘sleep is not required’] do not seem to square well with the available evidence: there is no convincing case of a species that does not sleep, no clear instance of an animal that forgoes sleep without some compensatory mechanism, and no indication that one can truly go without sleep without paying a high price. What many concluded long ago still seems to hold: the case is strong for sleep serving one or more essential functions. But which ones?

The article goes on to examine the hypotheses that sleep is important for regulating the body’s core functions, the brain and individual cells, and argues that because sleep is common to all species it must provide something that cannot be provided by quiet wakefulness.

More interesting is the question of whether all animals dream – and perhaps most intriguing, if so, how they might dream.

Indeed, it would be interesting to discover whether dreaming is a necessary function of sleep, or whether it is specifically linked to certain neurocognitive processes or even particular creatures.

Link to PLoS Biology article ‘Is Sleep Essential?’ (via Wired Science).