Let there be light, finally

A documentary on the trauma of war, banned by the US government for more than 30 years, has found its way onto YouTube as a freely viewable video.

During World War Two, legendary director John Huston, then a fresh face in Hollywood, was commissioned to make three propaganda films for the US Army.

The third film, Let There Be Light, was made in 1946 – just as the war ended – and focussed on the psychiatric treatment of soldiers traumatised in combat.

This is a description from the fantastic book The Empire of Trauma:

With no political agenda, and anxious to keep scrupulously to the task he had been given, Huston applied to the letter the principle of objectivity he had followed in the two previous documentaries. For more than three months, he filmed the daily life of former combatants hospitalized at Mason General, a military hospital on Long Island. The courage and sense of sacrifice of these men was clearly portrayed, as the Pentagon had clearly requested. But equally apparent was the fact that some of them were utterly destroyed: their fear, their shame, and their tears showed clearly, as did their contempt for military authorities. The film also documented the arrogance and harshness of the psychiatrists and brutality of some of their therapeutic methods. Remarkably, when the film received its world premiere at the Cannes Film Festival in 1981, the emotional response of the viewers and critics was muted, for the film did not meet the expectations of an audience seeking revelations about the military and medical practices of the time.

What made the film so controversial in 1946 made it commonplace in 1981. But this had nothing to do with film-making; it concerned the way the film portrayed the effects of trauma.

Let There Be Light portrays the “emotionally damaged” soldier as an everyday person “forced beyond the limit of human endurance”. “Every man”, it says, “has his breaking point”.

This is the modern view of trauma, widely accepted in psychiatry and in today’s media narratives, and is itself something of a simplification of what we actually know about how people react to extreme events.

But in 1946, and especially in military psychiatry, the most widely accepted view was that soldiers who became mentally ill were psychologically weak or malingering.

The fact that the film showed US soldiers not as the glorified heroes the public wanted, but as disabled veterans, meant it would be a huge propaganda disaster – likely compounded by the fact that most people saw these conditions as character flaws or shameful faking.

The idea that these were ordinary men who had been through extraordinary circumstances was just too far ahead of its time to seem realistic.

And this is why it was censored, for 35 years, until its first public showing in 1981, when it seemed nothing more than a passé propaganda film, reflecting what we all assumed had always been the accepted view – but which, in fact, never was.
 

Link to film on YouTube
Link to downloadable version on Internet Archive.

BBC Column: stopped clocks and dead phones

My column for BBC Future from last week. It’s another example of how consciousness isn’t just constructed, but is a construction for which the signs of artifice are hidden. The original is here.

 

Ever stared at a second hand and thought that time stands still for a moment? It’s not just you.

Sometimes, when I look at a clock, time seems to stand still. Maybe you’ve noticed this to your bemusement or horror as well. You’ll be in the middle of something, and flick your eyes up to an analogue clock on the wall to see what the time is. The second hand of the clock seems to hang in space, as if you’ve just caught the clock in a moment of laziness. After this pause, time seems to restart and the clock ticks on as normal.

It gives us the disconcerting idea that even something as undeniable as time can be a bit less reliable than we think.

This happened to me for years, but I never spoke about it. Secretly I thought it was either evidence of my special insight into reality, or final proof that I was a little unhinged (or both). But then I found out that it’s a normal experience. Psychologists even have a name for it – they call it the “stopped clock illusion”. Thanks psychologists, you really nailed that one.

An ingenious experiment from a team at University College London recreated the experience in the lab and managed to connect the experience of the stopped clock to the action of the person experiencing it. They asked volunteers to look away and then suddenly shift their gaze to a digital counter. When the subjects tried to judge how long they had been looking at the digit that first appeared, they systematically assumed it had been on for longer than it had.

 

Filling gaps

Moving our eyes from one point to another is so quick and automatic that most of us probably don’t even think about what we are doing. But when you move your eyes rapidly there is a momentary break in visual experience. You can get a feel for this now by stretching your arms out and moving your eyes between your two index fingers. (If you are reading this in a public place, feel free to pretend you are having a good stretch.) As you flick your eyes from left to right you should be able to detect an almost imperceptibly brief “flash” of darkness as input from your eyes is cut off.

It is this interruption in consciousness that leads to the illusion of the stopped clock. The theory is that our brains attempt to build a seamless story about the world from the ongoing input of our senses. Rapid eye movements create a break in information, which needs to be covered up. Always keen to hide its tracks, the brain fills in this gap with whatever comes after the break.

Normally this subterfuge is undetectable, but if you happen to move your eyes to something that is moving with precise regularity – like a clock – you will spot this pause in the form of an extra long “second”. Fitting with this theory, the UCL team also showed that longer eye-movements lead to longer pauses in the stopped clock.
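To put rough numbers on the backfilling account (mine, not the UCL team’s), the theory makes a simple quantitative prediction: the first “second” you see should appear stretched by roughly the duration of the eye movement that preceded it.

```python
def perceived_first_tick_ms(saccade_ms, tick_ms=1000):
    """If the brain backdates the post-saccade image to the moment the
    eyes started moving, the first tick appears longer by roughly the
    saccade duration. An illustrative model only."""
    return tick_ms + saccade_ms

# Longer eye movements should produce longer illusory pauses:
for saccade in (20, 50, 120):  # plausible saccade durations in ms
    print(saccade, perceived_first_tick_ms(saccade))
```

On this toy model a 120 ms saccade makes the first “second” feel like 1120 ms – a stretch of over 10%, which is well within what people can notice.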

It doesn’t have to be an eye movement that generates the stopped clock – all that appears to be important is that you shift your attention. (Although moving our eyes is the most obvious way we shift our attention, I’m guessing that the “inner eye” has gaps in processing in the same way our outer eyes do, and these are what cause the stopped clock illusion.) This accounts for a sister illusion we experience with our hearing – the so-called “dead phone illusion”, which is when you pick up an old-fashioned phone and catch an initial pause between the beeps of the dial tone that seems to last longer than the others.

These, and other, illusions show that something as basic as the experience of time passing is constructed by our brains – and that this is based on what we experience and what seems the most likely explanation for those experiences, rather than some reliable internal signal. Like everything else, what we experience is our brain’s best guess about the world. We don’t ever get to know time directly. In this sense we are all time travellers.

An in-brain stimulation grid

Implanted electrode grids are used to record brain activity in people who need neurosurgery – a technique known as electrocorticography.

But rather than just ‘reading’ from the brain, neuroscientists are starting to use them to ‘write’ to the brain, to the point of being able to temporarily simulate specific brain disorders for experimental studies.

This is the subject of my latest Observer column which looks at the history of open-brain stimulation studies and covers recent research by a joint British–Japanese team which has been using the grids to temporarily simulate a form of brain disorder called ‘semantic dementia’ in live volunteers.

The precision is such that the Lambon Ralph team and a team at Kyoto University Medical School, led by Riki Matsumoto, have used an implanted grid to temporarily simulate characteristics of a brain disease called semantic dementia. Like Alzheimer’s, semantic dementia is a degenerative disorder, but one in which brain cells that specifically support our understanding of meaning rapidly decline. Studies of patients with semantic dementia have taught us a great deal about how memory is organised in the brain but the disorder is swift and unpredictable, and a method that can mimic the effects while recording directly from the cortex is a powerful tool.

To be clear, the grids are not installed for this purpose. They’re installed because they are part of brain surgery to treat otherwise untreatable epilepsy. The grids allow neurosurgeons to locate the exact bit of the brain that triggers seizures so it can be removed.

The article is in part a coverage of the amazing neuroscience, from 1886 to the present day, and in part a tribute to the neurosurgery patients who have volunteered to help us understand the brain.
 

Link to Observer article.

The kings of Kingsley Hall

The Observer has an article on some of the residents of R.D. Laing’s chaos-as-therapy residential centre at Kingsley Hall, five decades on.

The idea was that people with psychosis and therapists would live together in a therapeutic environment and effect change without the use of medical drugs. Residents could ‘live out’ their delusions and come to terms with the early traumas which R.D. Laing saw as the root of their difficulties.

But as the documentary Asylum shows, the place was more chaos than freedom, and the residence became a stop-in for hippies, lost souls and acid dealers.

Most accounts of the place have focussed on Laing but photographer Dominic Harris decided to track down the residents for a portrait project.

The Observer article has some of their stories:

One patient had been in a mental hospital: John Woods, I think. His label in orthodox psychiatry was paranoid schizophrenic. He had some fantasy about some young woman and he couldn’t write letters to her himself so he dictated them to me. When it turned out this woman wasn’t interested, he assumed wrongly that I was preventing her from coming to visit him. He thought I was a black magician and was controlling her. Then living in there became quite scary. There was a chapel in the building, with a huge crucifix, and he burst into my room early one morning holding it. I thought he was going to attack me with it but he wanted to exorcise me. Eventually, I did something that was against the whole ideology of the place: I tried to have him sectioned.

There are many more fascinating, if troubling, insights into the heart of the chaos.
 

Link to Observer article.
Link to project, book and exhibition on the residents by Dominic Harris

A pain to describe

RadioLab has an excellent mini-episode on the difficulties of communicating the subjective feeling of pain.

As you might expect, it is both wonderfully put together and unexpectedly beautiful in places, but for such an uncomfortable subject, it is also very funny.

Particularly wonderful is a segment on the originator of the Schmidt index that rates the intensity of insect sting pain from “Light, ephemeral, almost fruity” to “Pure, intense, brilliant pain”.
 

Link to RadioLab mini-episode on pain.

A traditional IRA welcome to the sociologist

An amazing description of how sociologists who wanted to do field studies in Belfast during the height of The Troubles were put through some seemingly routine but terrifying vetting by the IRA to check they were up to the job.

The piece is from an article by Lorraine Dowler, who starts by recounting a tale from legendary social scientist Frank Burton.

Burton worked extensively amid the violence of Belfast and woke up one morning to find someone pointing a sub-machine gun in his face and suggesting he was a “Four Square Laundry job” – an allusion to his being an army spy.

Dowler continues:

Thanks to his dangerous and frightening experiences in West Belfast, Frank Burton’s ethnographic research on Northern Ireland is considered legendary. At first glance the incident Burton describes would seem mad to anyone who has not spent time living and working in the Catholic ghettos of Belfast. However, as alarming as this event may seem, it speaks more to the rapport Burton established with his respondents than to the perils of fieldwork. In actuality this was a prank brought about by one of his Irish Republican Army (IRA) informants.

The hazing of researchers is a common practice in Belfast, and anyone who conducts inquiries of this nature is bound to collect a few such “war stories”. The obvious reason for such a vetting is that the IRA feared that a British undercover operative disguised as an academic would infiltrate the organization. Having said that, I believe that researchers are not only checked out as potential spies but also tested to see whether they have the “salt” to stick it out when the political atmosphere makes day-to-day life difficult. In other words, the researcher has to prove that, when placed in a life-threatening situation, even for just a moment, she or he won’t simply pack up and go home.

How weird that amongst all of the violence and subterfuge, the IRA was actively managing its policy on collaborating with ethnography researchers.

Dowler herself also worked as a sociologist amid The Troubles and has more than a few stories of her own to tell – not least having to flee an assassination attempt on one of her interviewees.

However, she wisely notes that the greatest risk was not to her, but to her participants, who were giving sensitive information to her in the name of impartial research.

Despite the fact that the hazing was extreme, you can understand why trust was considered important.
 

Link to locked version of article.
pdf of article, freely available.

A dark and complex past

In a story that could be the plot for a film, one of the world’s pioneering anthropologists has been found to have been a member of both the Nazi SS and the French resistance during the Second World War.

Gerardo Reichel-Dolmatoff retains legendary status in anthropology and particularly in Colombia, where he first lived with many of the country’s remote indigenous people during the 1950s and 60s and founded the first department of anthropology. He died in 1994 but his legend has only grown since his passing.

In many ways, the classic image of the anthropologist was shaped by Reichel-Dolmatoff. He lived with remote communities to learn the language and worldviews of previously unknown societies. He trekked through jungles and participated in the hallucinatory ceremonies of local religions. He pioneered the archaeology not of the giant civilization, but of the lost peoples of specific valleys and mountain ridges.

He was actually born in Austria but talked little about his past. This is not surprising in the light of new revelations.

Augusto Oyuela-Caycedo, an anthropologist at the University of Florida, has been researching the background of this legendary figure but found far more than the echo of myth.

If you speak Spanish you can watch his recent conference presentation. But even if you don’t, you can see it has a power absent from most academic talks.

Oyuela-Caycedo began his investigation as a tribute to his friend and mentor only to discover a grim past well documented in the Nazi archives. At one point in the presentation, he is brought to tears as he reads a description of how the Austrian – not yet an anthropologist – murdered an old man with a pistol.

It turns out that Reichel-Dolmatoff was a member of both the Nazi Party and the SS, in the personal guard of Hitler himself and a participant in Gestapo death squads. He later trained guards in the Dachau Concentration Camp.

In light of his subsequent life in Colombia, it would be easy to chalk this up as another bitter tale of a Nazi who escaped justice to the anonymity of Latin America, but Reichel-Dolmatoff did not seem to make the typical Nazi exit from Europe. He had what is vaguely described as a ‘mental crisis’ in 1936 and was declared unfit for the SS and publicly expelled from the Nazi party.

Curiously, he turned up immediately afterwards working for the anti-Hitler resistance in France and continued to support the French resistance after he arrived in Colombia in 1939, to the point where he was eventually awarded the National Order of Merit by the French president.

Reichel-Dolmatoff’s subsequent anthropological work is completely devoid of Nazi overtones – no hints of eugenics or ‘racial hygiene’ – and throughout his life he attempted to demonstrate the amazing diversity of the native peoples of Colombia, the Amazon and the Sierra Nevada mountains.

The case raises a number of difficult questions. The nature of Reichel-Dolmatoff’s ‘mental crisis’ remains completely obscure. As the Spanish-language magazine Arcadia asks – how did a young Nazi end up working in Colombia for a Hitler resistance movement? Was it a crisis of conscience or something more opportunistic?

But perhaps more important is the question of whether Reichel-Dolmatoff can ever redeem himself. Is his life and his work now forever tainted? Does his good work drown under the tide of his dark and vicious past?

It may have been a question he asked himself many times.
 

Link to English-language coverage of discovery.
Link to Oyuela-Caycedo’s Spanish-language presentation.
Link to Spanish-language coverage from Arcadia magazine.

The neurology of Psalm 137

I’ve just found a short but interesting study on Psalm 137 and how it likely has one of the first descriptions of brain damage after stroke.

The Psalm is still widely sung but it has some particular lines which made the researchers take notice. Here they are in modern English from the New International Version of the Bible:

If I forget you, Jerusalem,
  may my right hand forget its skill.
May my tongue cling to the roof of my mouth
  if I do not remember you,
if I do not consider Jerusalem
  my highest joy.

This seems to describe some clear physical symptoms, but the Bible exists in numerous versions with different translations, and even the early versions do not always agree on the exact wording.

As a result the researchers looked at Spanish, English, German, Dutch, Russian, Greek, and Hebrew versions to examine the consistency of the text and the variations in description of these curious physical effects. The combined description includes:

If I forget of you, oh Jerusalem, my right hand (my right side) shall dry, be paralyzed, loose its ability, its dexterity… That my tongue shall stick (shall be weakened, arrested) to my palate (in my throat), if I remember you, if I do not permit Jerusalem to be my greatest joy (if I do not sing of Jerusalem as my greatest joy)

Both right-sided paralysis and loss of expressive speech are clear symptoms of a stroke of the left middle cerebral artery, in which blood flow is blocked, leading to the death of the surrounding brain tissue. This suggests that the Psalm may be wishing these effects on people who forget the importance of Jerusalem.

The powerful nature of the wish is perhaps explained by the fact that the Psalm is widely believed to have been written as a lament by Jewish people exiled after the conquest of Jerusalem by the Babylonians in 586 BCE.

But why these specific symptoms are mentioned may have more to do with ancient beliefs about stroke itself.

The reason the condition is still called stroke is because people originally believed that it was a result of being ‘struck down’ by God.

The Psalm still remains popular and the opening lines “By the rivers of Babylon…” have spawned a cottage industry in bad pop songs, most of which miss out the lines concerning stroke.

However, the track Jerusalem by Jewish reggae hip-hop maestro Matisyahu does focus on this part of the Psalm, which he mashes up with lyrics from Matthew Wilder’s Break My Stride.
 

Link to study on the neurology of Psalm 137.
Link to Jerusalem by Matisyahu.

A guided tour of bad neuroscience

Oxford neuropsychologist Dorothy Bishop has given a fantastic video lecture about how neuroscience can be misinterpreted and how it can be misleading.

If you check out nothing else, do read the summary on the Neurobonkers blog, which highlights Bishop’s four main criticisms of how neuroscience is misused.

But if you have the time, sit back and see the lecture in full.

The key is that these are not slip-ups restricted to the popular press and self-help books – they are exactly the sort of poor reasoning about neuroscience that affects many scientists as well.

Essentially, if you get Bishop’s four main points on how ‘neurosciency stuff leads to a loss of critical faculties’, you’re in fine form to separate the wheat from the chaff in the world of cognitive neuroscience.

Excellent stuff.
 

Link to coverage on the Neurobonkers blog.
Link to streamed video of the lecture.

Consciousness after decapitation

How long is a severed head conscious for? The question has troubled students of the human body for centuries and generated countless, possibly mythical stories. History of medicine blog The Chirurgeon’s Apprentice has finally looked through the records to find out which of the accounts are based in blood-curdling fact.

A common tale involves someone trying to test the idea during the French Revolution by taking a severed head directly after it has fallen from the guillotine and asking questions, with the unfortunate victim communicating via blinks until consciousness is lost.

We’ve covered exactly such a story previously on Mind Hacks, but historian of medicine Lindsey Fitzharris thought it sounded a bit too much like a tall tale and decided to find out if anything like this had ever actually happened.

She ended up on a wonderfully macabre journey through the science of consciousness after decapitation, involving everything from electrocuting severed heads to grimacing dead people:

The first to reportedly do so was a Dr Séguret, who subjected a number of guillotined heads to a series of experiments during the French Revolution. In several instances, he exposed their eyes to the sun and observed that they ‘promptly closed, of their own accord, and with an aliveness that was both abrupt and startling’. He also pricked one of the severed head’s tongue with a lancet, noting that the tongue immediately retracted and ‘the facial features grimaced as if in pain’. Was this my urban legend?

Right century, wrong story.

Fitzharris eventually finds the source of the story, but I wouldn’t want to spoil the, er, fun for you.
 

Link to ‘Losing One’s Head’ (via @TheNeuroTimes)

Animals conscious, say leading neuroscientists

A group of leading neuroscientists has used a conference at Cambridge University to make an official declaration recognising consciousness in animals.

The declaration was made at the Francis Crick Memorial Conference and signed by some of the leading lights in consciousness research, including Christof Koch and David Edelman.

You can read the full text as a pdf file; the main part of the declaration reads:

We declare the following: “The absence of a neocortex does not appear to preclude an organism from experiencing affective states. Convergent evidence indicates that non-human animals have the neuroanatomical, neurochemical, and neurophysiological substrates of conscious states along with the capacity to exhibit intentional behaviors. Consequently, the weight of evidence indicates that humans are not unique in possessing the neurological substrates that generate consciousness. Non-human animals, including all mammals and birds, and many other creatures, including octopuses, also possess these neurological substrates.”

You can also see all of the talks on the conference’s webpage. Curiously, physicist Stephen Hawking was there and the declaration was signed in his presence.
 

Link to conference website.
pdf of full declaration.
Link to coverage from Janet Kwasniak.

Hacking the brain for fun and profit

A study presented at the recent Usenix conference demonstrated how it is possible to get private information from the brains of people who use commercial brain-computer interfaces – like NeuroSky and Emotiv.

These headsets are designed for gamers and are cheaper, less accurate versions of EEG devices – used by scientists to read the electrical activity of the brain by attaching electrodes to the surface of the scalp.

The new study, titled ‘On the Feasibility of Side-Channel Attacks with Brain-Computer Interfaces’ (available online as a pdf), took advantage of a reliable brain signal called the P300.

The P300 reflects the brain’s categorisation of something as relevant, important or meaningful. If you’re shown a series of photo portraits, for example, the P300 will kick in for photos of people you recognise but not for strangers.

One form of the not-very-reliable EEG ‘lie detector’ is based on this principle. Called the Guilty Knowledge Test, the idea is that the police would show you photos of the crime scene, and if you had actually been there, your P300 would kick in.

This new study was based on a similar principle. The researchers ran various experiments based on the same idea: they’d ask a question to make sure the key information was at the forefront of the study participant’s mind, and then they’d fire a bunch of information at the volunteer to pick out which was most associated with the P300.

For example, in one experiment participants were told they would have to type the first digit of their newly acquired PIN into the computer, but before this happened, the volunteers were shown a series of single digits, while the software recorded which numerals were most associated with the P300.

In another, the P300 was recorded while participants were shown pictures of branded credit cards and bank machines. Another experiment asked participants to think of their month of birth before showing them all the options, while another flashed up maps of the local area to determine their approximate home address.

You can see how the researchers were angling to get the equivalent of essential account details out of the volunteers.
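The attack logic can be sketched in a few lines (my own toy simulation – the study’s actual EEG signal processing is far more involved, and the names here are illustrative): flash each candidate item repeatedly, average the recorded response for each, and guess whichever one evokes the largest mean P300-like signal.

```python
import random

def guess_secret(candidates, record_p300, trials=20):
    """Rank candidate stimuli by their average recorded response.

    record_p300(stimulus) stands in for one noisy EEG reading taken
    while the stimulus is flashed; the candidate with the highest
    mean response is the attacker's best guess at the secret.
    """
    totals = {c: 0.0 for c in candidates}
    for _ in range(trials):
        for c in candidates:
            totals[c] += record_p300(c)
    return max(candidates, key=lambda c: totals[c] / trials)

# Toy simulation: the "secret" first PIN digit evokes a larger response.
secret = 7
def fake_eeg(stimulus):
    base = 2.0 if stimulus == secret else 1.0  # P300-like boost for the secret
    return base + random.gauss(0, 0.5)         # plus measurement noise

print(guess_secret(range(10), fake_eeg))
```

Averaging over repeated presentations is what makes the attack work at all: any single reading is swamped by noise, but the noise cancels out over trials while the secret’s extra response does not.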

Although the set-up was a little artificial, the researchers note that this sort of unconscious personal detail dredging could be incorporated into a game-like activity, so people would be unaware of what was really happening.

The test was a success scientifically, in that the key information was identified more often than chance, but fraudsters are unlikely to be eschewing email hacking for NeuroSky pwning anytime soon. The hit rate was about 10-20%.

Nevertheless, as a demonstration of hacking brain-wave data from commercial gaming equipment to get personal information, you have to take your hat off to the research team.

Even more interesting, perhaps, is the increasing trend for security technology to move towards the interface between mind and machine.

Another study presented at the same conference showed how people could input ‘passwords’ into a system without any conscious knowledge of the password.

The idea relies on implicit learning – which is where you learn connections between things without having any conscious knowledge of doing so.

For example, when playing a computer game like Guitar Hero or Dance Dance Revolution, the same short sequence of moves might come up several times but you might not be aware of it, because they would be embedded within a larger sequence.

However, simply by having encountered the sequence before you will do better the second time – because you have practised the response – even if you have no conscious memory of it.

For each user, this new study embedded a newly generated ‘password of moves’ several times into a longer sequence and made sure they were well practised. Later, the software could identify each user by spitting out those moves again and checking the performance to see if they’d been encountered before. The participants were unaware of anything except that they were playing a game.
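The authentication step can be sketched like this (the names and threshold are mine, not the study’s): replay the trained sequence alongside fresh filler sequences and check whether the user performs reliably better on the one they unknowingly practised.

```python
def authenticate(trained_errors, novel_errors, margin=0.1):
    """Decide whether performance reveals implicit knowledge of the
    trained 'password of moves'.

    trained_errors / novel_errors are per-trial error rates (0..1) on
    the embedded secret sequence vs freshly generated filler sequences.
    The margin is an illustrative threshold, not the study's statistic.
    """
    mean = lambda xs: sum(xs) / len(xs)
    return mean(novel_errors) - mean(trained_errors) > margin

# A user who has practised the secret sequence makes fewer slips on it:
print(authenticate(trained_errors=[0.10, 0.15, 0.12],
                   novel_errors=[0.30, 0.35, 0.28]))  # prints True
```

Note the asymmetry that makes the scheme attractive: the user can demonstrate the “password” through performance but cannot recite it, so it cannot simply be coerced out of them.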

Looking at the bigger picture, the fact that computer security could rely on the fine detail of how the brain works could open up a whole new arena of security vulnerabilities.

Perhaps you could be covertly trained to enter someone else’s security details, or perhaps that last game you played actually trained you to leak your login details in another activity – all of which may be completely unnoticeable to your conscious mind.

Black hat neuroscientists may suddenly become very concerned with how these automatic effects could be influenced in very specific, and of course, very lucrative, ways.
 

Link to study on brain-based personal details hacking (via BoingBoing)
Link to unconscious password study.

A country on the couch

The New York Times discusses Argentina’s love affair with psychoanalysis – a country that has more psychologists per capita, the majority Freudian, than any other nation on Earth.

Argentina is genuinely unique with regard to psychology. Even in Latin America, where Freudian ideas remain relatively strong, Argentina remains a stronghold of the undiluted classic schools of psychoanalysis.

It is also unique in terms of the access people have to the practice. In the majority of the world, psychoanalysis is the preserve of the upper middle classes and aristocracy – both in terms of the analysts and the patients.

While the watered-down (some would say made sensible) psychodynamic psychotherapy is more widely available, psychoanalytic training and therapy are extremely expensive. You could easily spend a couple of thousand US dollars a month on therapy alone.

As trainees have to be taught, supervised and be in constant treatment themselves (although the latter usually at a discounted rate) it remains a practice by and for a very narrow group from society. If you want to see this for yourself, training institutes often have open evenings, which I highly recommend as an interesting anthropological field trip.

This elitism is much less the case in Argentina, however, meaning that people from all walks of life see psychoanalysts and Freudian-inspired commentary is an integral part of popular culture.

The NYT article is a little puzzled as to why psychoanalysis has gained such a foothold in the country. Of course, it received a great many psychoanalyst émigrés in the years surrounding the Second World War, as many were Jewish, but in covering similar ground myself, I wondered whether there are good psychological reasons for its continued popularity.
 

Link to NYT piece on psychoanalysis in Argentina.
Link to earlier piece by me on the same.

Communicating at the speed of thought

Your humble hosts, Tom and Vaughan, have written an article for Trends in Cognitive Sciences about how social media is changing mind and brain research.

The piece is both a brief introduction to blogs and Twitter, as well as an overview of how scientific debate happens online and how it is affecting the traditional approach to cognitive science.

Although we focus on cognitive science, it actually applies to science and science communication in general:

Fundamentally, there are important similarities between principles of traditional scientific culture and on-line culture: both prioritise access to information, citation (whether to journals or via links to other online sources), and kudos for whoever does good work. Academia aspires to openness, engagement, and respect for the principles of rational discussion. Social media facilitate these. The online community is free-flowing, somewhat chaotic, and information-rich – much the same as science has ever been.

In the same spirit, Trends in Cognitive Sciences have made the article freely available online, so you can read it at the link below.
 

Link to ‘Brain network: social media and the cognitive scientist’.

BBC Future column: What a silver medal teaches us about regret

Here’s my column from last week for BBC Future. The original is here.

The London 2012 Olympic Games are almost over now, and those Olympians with medals are able to relax and rest on the laurels of victory. Or so you might think. Spare a thought for the likes of Yohan Blake, McKayla Maroney, or Emily Seebohm – those people who are taking home silver.

Yes, that’s right, I’m asking you to feel sorry for silver medallists, not for the bronze medallists or for those who didn’t get the chance to stand on the podium at all.

Research has shown that silver medallists feel worse, on average, than bronze medallists. (Gold medallists, obviously, feel best of all.) The effect is written all over their faces, as psychologists led by Thomas Gilovich of Cornell University found out when they collected footage of the medallists at the 1992 Olympic Games in Barcelona. Gilovich’s team looked at images of medal winners either at the end of events – that is, when they had just discovered their medal position – or as they collected their medals on the podium. They then asked volunteers who were ignorant of the athletes’ medal positions to rate their facial expressions. Sure enough, the volunteers rated bronze medallists as consistently and significantly happier than silver medallists, both immediately after competing and on the podium.

The reason is all to do with how bronze and silver medallists differ in the way they think events could have turned out – what psychologists call “counterfactual thinking”. In a follow-up study, the team went to the 1994 Empire State Games and interviewed athletes immediately after they had competed. Silver medallists were more likely to use phrases like “I almost…”, concentrating their responses on what they missed out on. Bronze medallists, on the other hand, tended to contemplate the idea of missing out on a medal altogether. These differences in counterfactual thinking make silver medallists feel unlucky, in comparison to a possible world where they could have won gold, and make bronze medallists feel lucky, in comparison to a possible world where they could have returned home with nothing.

So the research seems to add a bit of scientific meat to Hamlet’s famous line “there is nothing either good or bad, but thinking makes it so”, as well as revealing something about the psychology of regret. Even though we must deal with the world as it is, a vital part of life is imagining the world as it could be – thinking about a job you should have applied for (or said “no” to), or someone you should (or shouldn’t) have asked out on a date, for instance.

Haunted by the past

Different possible worlds compete in our minds, some seeming closer than others, and this is what drives regret. This is illustrated by a study that asked volunteers to read a story about a plane crash survivor who walked through the wilderness for days, collapsing and dying before reaching civilisation. They were then asked how much compensation the victim’s family should receive. People who read a version where the survivor collapsed 75 miles (120 kilometres) from safety awarded less compensation than those who read that the survivor collapsed just a quarter of a mile from safety.

Both scenarios ended the same, but the second version seems more tragic to us because the person seemed so much closer to safety. Remember that the next time you see a Hollywood film that plays with your emotions in this manner.

Understanding the psychology of regret also helps to put our own thoughts and emotions into context. We’re all haunted by things we could have done, or shouldn’t have done. What’s the point in dwelling on such matters, we may ask, when we can’t change the past? But the study of the Olympic medallists gives us two thoughts that might help us deal with regret.

The first is that regret, like imagination generally, exists for a reason – this amazing cognitive ability is what allows us to plan for the future and, with luck, change things based on how we imagine they might turn out. Medallists who feel more regret may well go on to train harder, and smarter, and so be better able to win gold at the next Olympics. Regret, like so many of the territories of the mind, can hurt. It hurts whether we can change how things have worked out, or not, but the feeling is built into our brains for a good reason (however little comfort that provides).

The second thought that might help us deal with regret is to realise that there are many possible worlds we could compare events to. It’s natural for many silver medallists to feel that they’ve missed out on gold, and to the extent we can choose what we compare ourselves to, we can choose how we feel about our regrets. We can use them to drive us to future success, but also to appreciate what we do have.

So maybe it isn’t all bad for Blake, Maroney or Seebohm after all?