Coma alarm dreams

Intensive Care Medicine has published a wonderfully written and vivid account from a teenager who spent time brain injured and hallucinating in an intensive care unit.

The writer describes how he was admitted to intensive care at the age of 15 after suffering a head injury and had intense and bizarre hallucinations which, as we now know, are surprisingly common in critical care patients.

My experience of the time under sedation can be split into two. There was what I could perceive of the real world around me, and then there was my dream world.

In the real world, the most constant feature was sound. I could hear the nurses talking, understanding everything they said. They always spoke their names. They were always kind, conscious I think that I might hear them. They helped me to relax. I could hear the noises of the ward, tones of voices and alarms. The alarms made me tense. I can remember Mum talking to me a lot and Dad reading me ‘The Hobbit’, although I still can’t remember the names of all the dwarves. Mum and Dad’s voices always came from the left.

My other senses were not wholly switched off either. Things were put in my mouth: tubes, sucky things, wet watery pads and a toothbrush. Someone moved my hair about. I felt furry and silky toys placed under my fingers. My brother and sisters had brought a knitted tortoise and a horse for me. My feet were moved about and stretched, which felt really good. I remember that the rolled-up bed sheets were uncomfortable.

Other sensations were less good. The constant, repetitive shining of a bright light in my remaining eye really annoyed me – I am sure I can remember every single time.

Then there was my dreaming. I lived in the dream world nearly all the time and it went on and on. The dreams were vivid, terrifying and very disturbing. There were some good ones but unfortunately for me a lot of really bad ones. I can still remember most of them even now, more than a year since.

At the sound of an alarm, a giant monster appeared with a meat cleaver and pursued me around the sports hall. I had to protect a girl and prevent an army from crossing a river. The whole river and hall were aflame. I was burning from the heat.

In another I had to stop an alarm-driven colossal centipede from crossing a bridge. I could see the shadow of monsters looming towards me behind a curtain. I knew the monsters were there and about to consume me, but I lay transfixed, unable to move, and I remember feeling myself sweating with excruciating fear. I was then on the bridge of a nuclear submarine with maniacs trying to blow up the world, there was a huge explosion. Then it ended.

I was aboard a flying craft. I was there to stop green-coated aliens from creating human missiles. The aliens were forcing people into missile tubes. They were going to drop the human bombs from the aircraft.

Then there was a shape-shifter leopard beast chasing me and my friends. We were working in a fast-food place on a ship. It cornered us, and the Kentucky Fried Chicken sign burst into red lightning.

But I knew when something really nasty was going to happen. I could always hear the same alarm going off. It was a signal for the monsters to appear, for the centipede to attack, for bombs to be dropped, I would be sacrificed…I was very afraid. Tension would build to some hideous climax. Looking back, I suspect the pressure in my brain was causing both the nightmares and the alarm to go off.

I have made a great recovery from my injuries due in large part to the excellent care that was taken of my brain in intensive care. I have been in to see the team a few times but I never stay too long. Those alarms still make me feel nervous!

As I noted in a recent article, these sorts of hallucinations were thought to be a distressing but ultimately irrelevant part of recovery, but more recent studies suggest they have a longer-term psychological impact that can be problematic in its own right.
 

Link to locked article ‘Coma alarm dreams on paediatric intensive care’

Circumstances of the life and brain

Neurosurgeon Henry Marsh has written a philosophical, incisive and exasperated book about brain surgery called Do No Harm.

It’s a hugely entertaining read as Marsh takes us through the practical and emotional process of operating, or not operating, on patients with neurological disorders.

He does a lot of moaning – about hospital management, computerisation, administration – sometimes quite enjoyably it must be said, but in some ways he does reflect the stereotype of the bellowing “I’ve got lives to save!” surgeon that stalks hospital corridors.

Most strikingly though, Marsh is clearly aware of his faults and he is a tough critic of himself and his decisions, often to the point of guilt. But it is in the many battles won and lost that you see his wisdom shine through.

It is a brilliant insight, more than anything, into the decision-making involved in neurosurgery and the emotional impact these professional choices have on patients and professionals alike.

It’s interesting to compare in tone with Katrina Firlik’s neurosurgical memoir Another Day in the Frontal Lobe, which is equally candid about the fog of surgery but relentlessly optimistic in its conclusions.

In contrast, Marsh is a man trying his best in difficult circumstances. Some of those circumstances just happen to be several centimetres deep in the brain.

The book is also wonderfully written by the way. One not to miss.
 

Link to details of book Do No Harm.

This is how stigma works

Sussex Police issue a statement about ‘Concern for missing Chichester man’, ITN News report it as ‘Police warn public over missing mental health patient’.

Sussex police:

Police are appealing for information about missing 43-year old Jason Merriman, who left The Oaklands Centre for Acute Care in Chichester on unescorted leave at 12.45pm on Friday 11 April. He was due back the same afternoon but has so far failed to return.
There are concerns for Jason’s welfare as he has mental health problems, and police advise that he is not approached by members of the public.

ITN News:

A mental health patient who has been missing from a care unit in Chichester for more than a day should not be approached by the public, police have warned.

Amazing really – (via @Sectioned_)

Is there creative accounting in behavioural economics?

The Financial Times has an excellent article on the future of behavioural economics.

Despite the fact that it is an incisive piece on a form of applied psychology that won Daniel Kahneman the Nobel Prize and has revolutionised political thinking, the FT has entitled the article ‘Behavioural economics and public policy’, to ensure it doesn’t arouse any passions which could bias your understanding of the text.

Ignore the title though, and it’s a fascinating and astutely critical piece on how the promises of behavioural economics haven’t always delivered and where it needs to go next.

So popular is the field that behavioural economics is now often misapplied as a catch-all term to refer to almost anything that’s cool in popular social science, from the storycraft of Malcolm Gladwell, author of The Tipping Point (2000), to the empirical investigations of Steven Levitt, co-author of Freakonomics (2005).

Yet, as with any success story, the backlash has begun. Critics argue that the field is overhyped, trivial, unreliable, a smokescreen for bad policy, an intellectual dead-end – or possibly all of the above. Is behavioural economics doomed to reflect the limitations of its intellectual parents, psychology and economics? Or can it build on their strengths and offer a powerful set of tools for policy makers and academics alike?

It’s by economist Tim Harford who also does good things on the Twitter.
 

Link to FT article ‘Behavioural economics and public policy’.
Link to alternate copy on Tim Harford’s blog.

Does the unconscious know when you’re being lied to?

The headlines
BBC: Truth or lie – trust your instinct, says research

British Psychological Society: Our subconscious mind may detect liars

Daily Mail: Why you SHOULD go with your gut: Instinct is better at detecting lies than our conscious mind

The Story
Researchers at the University of California, Berkeley, have shown that we have the ability to unconsciously detect lies, even when we’re not able to explicitly say who is lying and who is telling the truth.

What they actually did
The team, led by Leanne ten Brinke of the Haas School of Business, created a set of videos using a “mock high-stakes crime scenario”. This involved filming 12 volunteers while they were interrogated about whether they had taken US$100 from the testing room. Half the volunteers had been asked to take the $100, and had been told they could keep it if they persuaded the experimenter that they hadn’t. In this way the researchers generated videos of both sincere denials and people who were trying hard to deceive.

They then showed these videos to experimental participants who had to judge if the people in the videos were lying or telling the truth. As well as this measure of conscious lie detection, the participants also completed a task designed to measure their automatic feelings towards the people in the videos.

In experiment one this was a so-called Implicit Association Test, which works by comparing the ease with which participants associated the faces of the people in the videos with the words TRUTH or LIE. Experiment two used a priming test, in which seeing the faces of the people in the videos changed the speed at which participants subsequently judged words related to truth-telling and deception.
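
To make the comparison concrete, here is a minimal, purely illustrative sketch of the reaction-time arithmetic behind an IAT-style measure. The trial structure, numbers and function names are invented for illustration and are not taken from the paper.

```python
# Hypothetical sketch of the logic behind an IAT-style score: if observers
# implicitly register a liar's face as "LIE", they should be faster on
# face+LIE pairings than on face+TRUTH pairings. The data below are invented
# purely for illustration; they are not the study's stimuli or results.

from statistics import mean

# Each trial: (face_is_actually_lying, paired_word, reaction_time_ms)
trials = [
    (True,  "LIE",   620), (True,  "TRUTH", 710),
    (True,  "LIE",   640), (True,  "TRUTH", 690),
    (False, "TRUTH", 610), (False, "LIE",   705),
    (False, "TRUTH", 630), (False, "LIE",   680),
]

def mean_rt(is_lying, word):
    """Mean reaction time for one face-type/word pairing."""
    return mean(rt for lying, w, rt in trials if lying == is_lying and w == word)

# A positive score means the "congruent" pairing (liar+LIE, truth-teller+TRUTH)
# was answered faster than the incongruent one - the pattern the authors
# interpret as implicit lie detection.
liar_score = mean_rt(True, "TRUTH") - mean_rt(True, "LIE")
truth_teller_score = mean_rt(False, "LIE") - mean_rt(False, "TRUTH")
print(f"Liar faces: congruency advantage = {liar_score:.0f} ms")
print(f"Truth-teller faces: congruency advantage = {truth_teller_score:.0f} ms")
```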

The results of the study showed that people were no better than chance in their explicit judgements of who was telling the truth and who was lying, but the measurements of their other behaviours showed significant differences. Specifically, for people who were actually lying, observers were slower to associate their faces with the word TRUTH or quicker to associate them with the word LIE. The second experiment showed that after seeing someone who was actually telling the truth, people made faster judgements about words related to truth-telling and slower judgements about words related to deception (and vice versa after a video of someone who was actually lying).

How plausible is this?
The result that people aren’t good at detecting lies is very well established. Even professionals, such as police officers, perform poorly when formally tested on their ability to discriminate lying from truth telling.

It’s also very plausible that the way in which you measure someone’s judgement can reveal different things. For example, people are in general notoriously bad at reasoning about risk when they are asked to give estimates verbally, but measurements of behaviour show that we are able to make very accurate estimates of risk in the right circumstances.

It also fits with other results in psychological research which show that overthinking certain judgements can reduce their accuracy.

Tom’s take
The researchers are trying to have it both ways. The surprise of the result rests on the fact that people don’t score well when asked to make a simple truth vs lie judgement, but their behavioural measures suggest people would be able to make this judgement if asked differently. Claiming the unconscious mind knows what the conscious mind doesn’t is going too far – it could be that the simple truth vs lie judgement isn’t sensitive enough, or is subject to some bias (participants afraid of being wrong for example).

Alternatively, it could be that the researchers’ measures of the unconscious are only sensitive to one aspect of the unconscious – and it happens to be an aspect that can distinguish lies from an honest report. How much can we infer about the unconscious mind as a whole from these behavioural measures?

When reports of this study say “trust your instincts” they ignore the fact that the participants in this study did have the opportunity to trust their instincts – they made a judgement about whether each individual was lying, presumably drawing on all the instincts they had, including those that produced the unconscious measures the researchers tested. Despite this, they couldn’t correctly guess whether someone was lying or not.

If the unconscious is anything it will be made up of all the automatic processes that run under the surface of our conscious minds. For any particular judgement – in this case detecting truth telling – some process may be accurate at above chance levels, but that doesn’t mean the unconscious mind as a whole knows who is lying or not.

It doesn’t even mean there is such a thing as the unconscious mind, just that there are aspects of what we think that aren’t reported when you ask people directly. We can’t say that people “knew” who was lying, when the evidence shows that they didn’t, or couldn’t, use this information to make correct judgements.

Read more
The original paper: ‘Some evidence for unconscious lie detection’

The data and stimuli for this experiment are freely available – a wonderful example of “open science.”

A short piece I wrote about how articulating your feelings can get in the way of realising them.

The Conversation

This article was originally published on The Conversation.
Read the original article.