Stroop: an unrecognised legacy

The man who discovered the Stroop effect and created the Stroop test, now a keystone of cognitive science research, never realised the massive impact he had on psychology.

A short but fascinating news item from Vanderbilt University discusses its creator, the psychologist and preacher J. Ridley Stroop.

J. Ridley Stroop was born on a farm 40 miles from Nashville and was the only person in his family to attend college. He began preaching the gospel when he was 20 years old and continued to do so throughout his life. He spent nearly 40 years as a teacher and administrator at David Lipscomb College, now Lipscomb University, in Nashville….

According to his son, Stroop was unaware of the growing importance of his discovery when he died in 1973. Toward the end of his life, he had largely abandoned the field of psychology and immersed himself in Biblical studies. “He would say that Christ was the world’s greatest psychologist,” Faye Stroop recalled.

The task is very simple and relies on the fact that we automatically process word meaning when we see words. We don’t have to recognise each letter, consciously string them together and ‘work out’ what word it is; it just happens straight away.

Stroop’s insight was to wonder what would happen if he asked people to do something that directly conflicted with this automatic processing.

So if I ask you to name the ink colour of a word that spells out a different colour (the word ‘blue’ printed in red ink, say), you do it a little more slowly than when the word and its ink colour match (the word ‘blue’ printed in blue ink).

This is because you have to inhibit or consciously ‘get round’ the word’s automatically recognised meaning.

This inhibition of automatic responses turns out to be a key function of attention and is heavily linked to the workings of the pre-frontal cortex.

There are many variations, all based on the fact that word meanings can relate to many different forms of psychological process, bias or experience.

For example, the ‘emotional Stroop’ asks people to name the ‘ink colour’ of either emotionally neutral words (like ‘apple’ or ‘soap’) or more emotionally intense words (like ‘violence’ or ‘torture’).

People who have been traumatised will be more affected by these sorts of emotionally intense words, and so will identify the ‘ink colour’ of trauma-related words more slowly than non-traumatised people do.

The same happens for people with spider phobia when they read spider-related words, and so on.

And because it allows experimenters to measure the interaction between attention and meaning, it has become a massively useful and popular tool.
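
If it helps to make the mechanics concrete, here is a minimal sketch of a Stroop-style trial in Python for an ANSI-capable terminal. The word list, number of trials and blocked (rather than interleaved) design are my own simplifications, not Stroop's original procedure; the point is only that incongruent trials, where the word's meaning conflicts with its ink colour, should produce slower responses.

```python
import random
import time

# Map colour names to ANSI escape codes so the word can be printed in a
# given "ink colour" in the terminal (an assumption of this sketch).
COLOURS = {"red": "\033[31m", "green": "\033[32m", "yellow": "\033[33m", "blue": "\033[34m"}
RESET = "\033[0m"

def run_trial(congruent: bool) -> float:
    """Show a colour word in some ink colour and time how long naming the ink takes."""
    word = random.choice(list(COLOURS))
    ink = word if congruent else random.choice([c for c in COLOURS if c != word])
    start = time.monotonic()
    answer = input(f"Type the INK colour of this word: {COLOURS[ink]}{word.upper()}{RESET} ")
    elapsed = time.monotonic() - start
    print("correct" if answer.strip().lower() == ink else f"incorrect (the ink was {ink})")
    return elapsed

if __name__ == "__main__":
    # Five congruent trials followed by five incongruent ones; a real
    # experiment would randomise the order and use many more trials.
    congruent_rts = [run_trial(True) for _ in range(5)]
    incongruent_rts = [run_trial(False) for _ in range(5)]
    print(f"Mean RT, congruent trials:   {sum(congruent_rts) / len(congruent_rts):.2f} s")
    print(f"Mean RT, incongruent trials: {sum(incongruent_rts) / len(incongruent_rts):.2f} s")
```

The difference between the two mean reaction times is the Stroop interference effect described above.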
 

Link to piece on the history of the Stroop task.

The Society of Mutual Autopsy

The Society of Mutual Autopsy was an organisation formed in the late 1800s to advance neuroscience by examining dead members’ brains and to promote atheism by breaking sacred taboos.

It included some of the great French intellectuals and radicals of the time and became remarkably fashionable – publishing the results in journals and showing plaster casts of deceased members’ brains at world fairs.

In October 1876, twenty Parisian men joined together as the Society of Mutual Autopsy and pledged to dissect one another’s brains in the hopes of advancing science. The society acquired over a hundred members in its first few years, including many notable political figures of the left and far left. While its heyday was unquestionably the last two decades of the century, the society continued to attract members until the First World War. It continued its operations until just before World War II, effectuating many detailed encephalic autopsies, the results of which were periodically published in scientific journals.

The quote is from a fascinating but locked academic article by historian Jennifer Michael Hecht, which notes that the society was partly motivated by self-nominated ‘great minds’ who wanted to better understand how brain structure related to personal characteristics.

It was no backwater project and attracted significant thinkers and scientists. Most notably, Paul Broca dissected brains for the society and had his brain dissected by them, despite apparently never joining officially.

Part of the motivation for the society was that, at the time, most autopsies were carried out on poor people (often grave robbed) and criminals (often executed). The intellectual elite – not without a touch of snobbery – didn’t think this was a good basis on which to understand human nature.

Also, these bodies usually turned up in the dead of night, no questions asked, and no one knew much about the person or their personality.

In response to this, the Society of Mutual Autopsy functioned as a respectable source of body parts and also requested that members write an essay describing their life, character and preferences, so that it could all be related to the shape and size of their brain when autopsied by the other members.

There was also another motive: they were atheists in early secular France and they wanted to demonstrate that they could use their remains for science without consideration of religious dogma.

As with most revolutionary societies, it seems to have fallen apart for the usual reasons: petty disagreements.

One person took exception to a slightly less than flattering analysis of his father’s brain and character traits. Another started flirting with religion, causing a leading member to storm off in a huff.

In a sense though, the society lives on. You can donate your body to science in many ways after death:

To medical schools to teach students. To forensic science labs to help improve body identification. To brain banks to help cure neurological disorders.

But it’s no longer a revolutionary act. Your dead body will no longer reshape society or fight religion like it did in 1870s France. The politics are dead. But neither will you gradually fade away into dust and memories.

Jennifer Michael Hecht finishes her article with some insightful words about The Society of Mutual Autopsy which could still apply to modern body donation.

It’s “both mundane – offering eternity in the guise of a brief report and a collection of specimens – and wildly exotic – allowing the individual to climb up onto the altar of science and suggesting that this act might change the world”.
 

Link to locked and buried article on The Society of Mutual Autopsy.

The pull for lobotomy

The Psychologist has a fascinating article by historian Mical Raz on what patients and families thought about the effects of lobotomy.

Raz looks at the letters sent between arch-lobotomist Walter Freeman and the many families he affected through his use of the procedure.

Contrary to the image of the ‘evil surgeon who didn’t care about the harm he was doing’, many patients and families gave warm and favourable feedback on the effects of the operation.

Even some very worrying details about the post-operative results are recounted in glowing terms. Freeman had every reason to suspend his disbelief.

What it does illustrate is how a damaging and useless treatment could be perceived as helpful and compassionate by Freeman and, presumably, other doctors, because docility and, in some cases, genuinely reduced distress were valued above the person’s self-integrity and autonomy.

An interesting and challenging article.
 

Link to ‘Interpreting lobotomy – the patients’ stories’.

Whatever happened to Hans Eysenck?

Psychologist Hans Eysenck was once one of the most cited and controversial scientists on the planet and a major force in the development of psychology but he now barely merits a mention. Whatever happened to Hans Eysenck?

To start off, it’s probably worth noting that Eysenck did a lot to ensure his legacy would be difficult to maintain. He specifically discouraged an ‘Eysenck school’ of psychology and encouraged people to question all his ideas – an important and humble move considering that history favours the arrogant.

But he also argued for a lot of rubbish and that is what he’s become most remembered for.

He did a lot of work on IQ but took a hard line on its significance. Rather than thinking of it as simply a broad-based psychological test that is useful as a clinical outcome measure, he persistently championed it as a measure of ‘intelligence’ – a fuzzy social idea that implies someone’s value.

Without any insight into the cultural specificity of these tests, Eysenck argued that racial differences in IQ were likely based in genetics, and signed the notorious ‘Mainstream Science on Intelligence’ statement, which reads like your drunk grandpa trying to justify why there are no black Nobel science winners.

Eysenck was apparently not racist himself but, believing that science was ‘value free’, he was also incredibly politically naive and took money from clearly racist organisations or published in their journals, thinking that the data would speak for itself.

He also doubted that smoking caused lung cancer and took money from tobacco giant Philip Morris to try to show that the link was mediated by personality, and at one point argued that there was some statistical basis to astrology.

Some of his other main interests have not been rejected, but have just become less popular – not least the psychology of personality and personality tests.

This area is still important but has become a minority sport in contemporary psychology, whereas previously it was central to a field that was still battling fairytale Freudian theories as a way of understanding personal tendencies.

But perhaps his most important contributions to psychology are now so widely accepted that no-one really thinks about their origin.

When he was asked to create the UK’s first training course for clinical psychology, he built it around a scientifically informed approach to understanding which treatments work, and extended this philosophy into a hypothesis-testing approach to working with individuals. This is now a core aspect of practice across the world.

His belief that psychologists should consistently look to make links between thoughts, experience, behaviour and biology is something that has been widely taken up by researchers, even if clinical psychologists remain a little neurophobic as a profession.

Because Eysenck loved an academic dust-up, he is most remembered for the IQ debate, on which he took a rigid position which history has, justifiably, not looked kindly on. But as someone who influenced the practice of psychology, his legacy remains important, if largely unappreciated.

A quarter century of All in the Mind

A new series of BBC Radio 4’s All in the Mind has just kicked off, and to celebrate 25 years of broadcasting they’ve had three great episodes looking back on the last quarter century of psychology, neuroscience and mental health.

Each makes for an interesting discussion of how science and attitudes have changed.

As per BBC usual, you can access the streamed versions at the links above, but you have to go to an entirely separate page for the podcasts.

And because there are no separate podcast pages for specific episodes, I’ve linked them directly below. Here’s hoping that in the next 25 years, the BBC can fix their website.
 

mp3 of ’25 years of understanding the brain’
mp3 of ‘What has psychology research taught us in the last 25 years?’
mp3 of ‘How have attitudes to mental health changed in the last 25 years?’

Look before you tweak: a history of amphetamine

I’ve just found a fascinating article in the American Journal of Public Health on ‘America’s First Amphetamine Epidemic’ and how it compares to the current boom in meth and Ritalin use.

The first amphetamine epidemic ran from 1929–1971 and was largely based on easily available over-the-counter speed in the form of ‘pep pills’, widely abused decongestant inhalers and amphetamine-based ‘anti-depressants’.

The idea of giving speed to depressed people seems quite amazing now, especially considering its tendency to cause anxiety, addiction and psychosis in the doses prescribed at the time, but it was widely promoted for this purpose.

The following is a 1945 advert for Benzedrine showing a gentleman who has just been treated for depression and is now a proud and dynamic member of society. Thanks pharmaceutical grade crank!

[Image: 1945 Benzedrine advert]

As an aside, when patients complained about the agitation associated with amphetamine treatment, the drug companies brought out new medications which were speed mixed with barbiturates, a class of sedatives.

Not mentioned by the article is the fact that one particular brand of this upper-downer mix called Dexamyl has had a remarkable effect on history – but you’ll have to check the Wikipedia page for the details.

As it happens, America is in the midst of another phase of massive stimulant popularity – in the form of street methamphetamine and prescribed Ritalin. In fact, use is at virtually the same levels as when you could buy speed over the counter.

By the way, the author of the article also wrote the excellent book On Speed if you want a more in-depth look at the history of the drug.
 

Link to article in American Journal of Public Health (via @medskep)

Period architecture, majestic views, history of madness

Regular readers will know of my ongoing fascination with the fate of the old psychiatric asylums and how they’re often turned into luxury apartments with not a whisper of their previous life.

It turns out that a 2003 article in The Psychiatrist looked at exactly this in 71 former asylum hospitals.

It’s cheekily called ‘The Executives Have Taken Over the Asylum’ and notes how almost all have been turned into luxury developments. Have a look at Table 1 for a summary.

The authors also had a look at the marketing material for these new developments and wrote a cutting commentary on how the glossy brochures deal with the institutions’ mixed legacies.

The estate agents want to play on the often genuinely beautiful architecture and, more oddly, the security of the sites, while papering over the fact that the buildings ever had anything to do with mental illness.

Examples of the language employed by property developers in sales brochures advertising old hospital buildings included ‘sanctuary’ and ‘seclusion’ in ‘grade II listed buildings’, ‘tastefully converted period buildings’ and ‘luxury penthouses’. There was a strong emphasis on security, with ‘a secure and private environment’, ‘24 hour security guards’, ‘security gates’ and ‘CCTV surveillance’. Original asylum architecture is even imitated in modern buildings: ‘the classic facades that emulate the original architecture’, and the clock tower of one former hospital was used as a symbol to represent the whole development.

Residents at the redeveloped site of Nethern Hospital will be greeted by ‘the gentle bounce of tennis balls on private courts’ and ‘the distant voices of children’. They will, however, remain unaware of the 1976 inquiry into high levels of suicides that found serious understaffing and unsatisfactory conditions on the wards.

At St George’s Park in Oxfordshire [previously Littlemore Hospital], prospective buyers were informed of the ‘original 19th century elegance’ and ‘original features including high ceilings’. They are not informed that the original psychiatric hospital has been newly built over the road.

In total, reference was made to the former psychiatric hospitals in only four of the 12 promotional brochures and web sites. This was in the general reference to a former hospital or by euphemistic language, such as ‘society’s less able’, referring to people with learning disability at Earlswood Hospital.

Since the article was written in 2003, many more have gone the same way.
 

Link to ‘The Executives Have Taken Over the Asylum’.

A technoculture of psychosis

Aeon Magazine has an amazing article on the history of technology in paranoid delusions and how cultural developments are starting to mirror the accidental inventions of psychosis.

It’s by the fantastic Mike Jay, who wrote The Air Loom Gang, an essential book that looks at one of the most famous cases of ‘influencing machine’ psychosis.

In his article, Jay applies the same keen eye for history and culture and explores how the delusions of psychosis are closely intertwined with culture.

Persecutory delusions, for example, can be found throughout history and across cultures; but within this category a desert nomad is more likely to believe that he is being buried alive in sand by a djinn, and an urban American that he has been implanted with a microchip and is being monitored by the CIA. ‘For an illness that is often characterised as a break with reality,’ they observe, ‘psychosis keeps remarkably up to date.’ Rather than being estranged from the culture around them, psychotic subjects can be seen as consumed by it: unable to establish the boundaries of the self, they are at the mercy of their often heightened sensitivity to social threats.

The article covers everything from Victorian delusions of electrical control, to the breakdown of novelist Evelyn Waugh, to the fiction of Philip K Dick.

It’s an excellent piece, and even those who have a special interest in the history of psychosis will find it full of fascinating gems.

By the way, it looks like Jay’s book The Air Loom Gang is about to be re-released in a newly updated version, under a new title The Influencing Machine.
 

Link to ‘The Reality Show’ in Aeon Magazine.

A notorious song

A song was banned by the BBC until 2002 because of worries that it might cause a suicide epidemic. The piece is titled Gloomy Sunday and was written by the Hungarian composer Rezső Seress.

The following abstract tip-toes around the point that there is no evidence it ever caused suicides but the history and hand-wringing about the song are interesting in themselves.

Gloomy Sunday: did the “Hungarian suicide song” really create a suicide epidemic?

Omega (Westport). 2007-2008;56(4):349-58.

Stack S, Krysinska K, Lester D.

The effect of art on suicide risk has been a neglected topic in suicidology. The present article focuses on what is probably the best known song concerning suicide, Gloomy Sunday, the “Hungarian suicide song.” An analysis of historical sources suggests that the song was believed to trigger suicides. It was, for example, banned by the BBC in England until 2002. The alleged increase in suicides in the 1930s associated with the playing of the song may be attributed to audience mood, especially the presence of a large number of depressed persons as a result of the Great Depression.

The influence of music on suicide may be contingent on societal, social, and individual conditions, such as economic recessions, membership in musical subcultures, and psychiatric disturbance. Further research is needed on art forms, such as feature films, paintings, novels, and music that portray suicides in order to identify the conditions under which the triggering of suicides occurs.

There are lots of versions of the song, including the original, available on YouTube. As you might expect, the best is a version by Billie Holiday.

It is indeed kinda gloomy, but it’s hardly likely to spark a wave of suicidal thinking.

There is, however, a minor history concerning how works of art affect real-world suicide practices.

Most famously, the Aokigahara forest in Japan at the base of Mount Fuji has become a common suicide destination after the characters in Seichō Matsumoto’s 1961 novel Kuroi Jukai end their lives there.
 

Link to abstract of article about ‘Gloomy Sunday’ on PubMed.

Crystal history

Spiegel Online has an excellent article that traces the history of methamphetamine from its early days as synthetic soldier fuel in Nazi Germany to its recent history as street crank.

There is one curious bit though:

Pervitin remained easy to obtain even after the war, on the black market or as a prescription drug from pharmacies. Doctors didn’t hesitate to prescribe it to patients as an appetite suppressant or to improve the mood of those struggling with depression. Students, especially medical students, turned to the stimulant to help them cram through the night and finish their studies faster.

Numerous athletes found Pervitin decreased their sensitivity to pain, while simultaneously increasing performance and endurance. In 1968, boxer Joseph “Jupp” Elze, 28, failed to wake again after a knockout in the ring following some 150 blows to the head. Without methamphetamine, he would have collapsed much sooner and might not have died. Elze became Germany’s first known victim of doping. Yet the drug remained on the market.

This was probably not mainly due to increased pain tolerance. In fact, studies on the pain-killing effects of amphetamine show only quite modest reductions in discomfort.

Being knocked out basically happens when the brain has sustained so much damage that it cannot maintain sufficient arousal to support consciousness.

Amphetamine artificially increases arousal, so you’re likely able to sustain much more brain damage before passing out.

Or to put it another way, after dropping speed, the point at which you sustain enough brain damage to pass out becomes much closer to the point at which you’re likely to die.

There is also a chronic effect of amphetamine raising blood pressure, which increases the chance of stroke, so getting repeatedly punched in the head while on speed is probably not a good idea. I suspect this was the more likely route to the death of boxer Joseph “Jupp” Elze.

If you want a background on the science and history of stimulants, I never miss the opportunity to recommend the brilliant book Speed, Ecstasy, Ritalin: The Science of Amphetamines.

However, if you want a quick primer (no, not that sort) the Spiegel article is a great place to start.
 

Link to Spiegel article ‘The German Granddaddy of Crystal Meth’.

Why you might prefer more pain

When is the best treatment for pain more pain? When you’re taking part in an experiment published by a Nobel prize winner and one of the leading lights in behavioural psychology, that is.

The psychologist in question is Daniel Kahneman; the experiment is described by its self-explanatory title: When More Pain Is Preferred to Less: Adding a Better End. In the study, Kahneman and colleagues looked at the pain participants felt by asking them to put their hands in ice-cold water twice (one trial for each hand). In one trial, the water was at 14C (59F) for 60 seconds. In the other trial the water was at 14C for 60 seconds, but then rose slightly and gradually to about 15C over an additional 30-second period.

Both trials were equally painful for the first sixty seconds, as indicated by a dial participants had to adjust to show how they were feeling. On average, participants’ discomfort started out at the low end of the pain scale and steadily increased. When people experienced an additional thirty seconds of slightly less cold water, discomfort ratings tended to level off or drop.

Next, the experimenters asked participants which kind of trial they would choose to repeat if they had to. You’ve guessed the answer: nearly 70% of participants chose to repeat the 90-second trial, even though it involved 30 extra seconds of pain. Participants also said that the longer trial was less painful overall, less cold, and easier to cope with. Some even reported that it took less time.

In case you think this is a freakish outcome of some artificial lab scenario, Kahneman saw a similar result when he interviewed patients who had undergone a colonoscopy examination – a procedure universally described as being decidedly unpleasant. Patients in Kahneman’s study group had colonoscopies that lasted from four to 69 minutes, but the duration of the procedure did not predict how they felt about it afterwards. Instead, it was the strength of their discomfort at its most intense, and the level of discomfort they felt towards the end of the procedure.

These studies support what Kahneman called the Peak-End rule – that our perceptions about an experience are determined by how it feels at its most intense, and how it feels at the end. The actual duration is irrelevant. It appears we don’t rationally calculate each moment of pleasure or pain using some kind of mental ledger. Instead, our memories filter how we feel about the things we’ve done and experienced, and our memories are defined more by the moments that seem most characteristic – the peaks and the finish – than by how we actually felt most of the time during the experience.
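
To make the rule concrete, here is a toy Python sketch contrasting a ‘rational ledger’ of total pain with a simple peak-end evaluation. The per-second discomfort ratings are invented for illustration, and the formula used (averaging the peak and the final rating) is just one common way of stating the rule, not Kahneman's exact model.

```python
def peak_end_score(ratings):
    """Retrospective evaluation under the Peak-End rule: mean of the worst and the final moment."""
    return (max(ratings) + ratings[-1]) / 2

def total_pain(ratings):
    """What a 'rational ledger' would track: discomfort summed over every moment."""
    return sum(ratings)

# 60 seconds of cold water, with discomfort climbing steadily from 4 to 10 (invented numbers).
short_trial = [4 + 6 * t / 59 for t in range(60)]

# The same 60 seconds plus 30 extra seconds of slightly warmer water,
# during which discomfort eases back from 10 to 7.
long_trial = short_trial + [10 - 3 * t / 29 for t in range(30)]

for name, trial in [("60-second trial", short_trial), ("90-second trial", long_trial)]:
    print(f"{name}: total pain = {total_pain(trial):.0f}, peak-end score = {peak_end_score(trial):.1f}")
```

Run this and the 90-second trial comes out with more total pain but a lower (better) peak-end score, which is the pattern behind participants' preference for repeating it.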

Kahneman wondered whether this finding meant that surgeons should extend painful operations needlessly to leave patients with happier memories, even though it would mean inflicting more pain overall. Others have asked whether this means that the most important thing about a holiday is that it includes some great times, rather than the length of time you are away for. (It certainly makes you think it would be worth doing if you could avoid the typical end to a holiday – queues, lumping heavy luggage around and jetlag.)

But I think the most important lesson of the Peak-End experiments is something else. Rather than saying that the duration isn’t important, the rule tells me that it is just as important to control how we mentally package our time. What defines an “experience” is somewhat arbitrary. If a weekend break where you forget everything can be as refreshing as a two-week holiday then maybe a secret to a happy life is to organise your time so it is broken up into as many distinct (and enjoyable) experiences as possible, rather than being just an unbroken succession of events which bleed into one another in memory.

All I need to do now is find the time to take a holiday and test my theory.

This is my BBC Future column, originally published last week. The original is here.

Prescribe it again, Sam

We tend to think of Prozac as the first ‘fashionable’ psychiatric drug, but it turns out popular memory is short: a tranquilizer called Miltown hit the big time thirty years earlier.

This is from a wonderful book called The Age of Anxiety: A History of America’s Turbulent Affair with Tranquilizers by Andrea Tone and it describes how the drug became a Hollywood favourite and even inspired its own cocktails.

Miltown was frequently handed out at parties and premieres, a kind of pharmaceutical appetizer for jittery celebrities. Frances Kaye, a publicity agent, described a movie party she attended at a Palm Springs resort. A live orchestra entertained a thousand-odd guests while a fountain spouted champagne against the backdrop of a desert sky. As partiers circulated, a doctor made rounds like a waiter, dispensing drugs to guests from a bulging sack. On offer were amphetamines and barbiturates, standard Hollywood party fare, but guests wanted Miltown. The little white pills “were passed around like peanuts,” Kaye remembered. What she observed about party pill popping was not unique. “They all used to go for ‘up pills’ or ‘down pills,'” one Hollywood regular noted. “But now it’s the ‘don’t-give-a-darn-pills.'”

The Hollywood entertainment culture transformed a pharmaceutical concoction into a celebrity fetish, a coveted commodity of the fad-prone glamour set. Female entertainers toted theirs in chic pill boxes designed especially for tranquilizers, which became, according to one celebrity, as ubiquitous at Hollywood parties as the climatically unnecessary mink coat…

Miltown even inspired a barrage of new alcoholic temptations, in which the pill was the new defining ingredient. The Miltown Cocktail was a Bloody Mary (vodka and tomato juice) spiked with a single pill, and a Guided Missile, popular among the late night crowd on the Sunset Strip, consisted of a double shot of vodka and two Miltowns. More popular still was the Miltini, a dry martini in which Miltown replaced the customary olive.

Andrea Tone’s book is full of surprising snippets about how tranquilisers and anti-anxiety drugs have affected our understanding of ourselves and our culture.

It’s very well researched and manages to hit that niche of being gripping for the non-specialist while being extensive enough that professionals will learn a lot.
 

Link to details for The Age of Anxiety book.

A stiff moment in scientific history

In 1983 psychiatrist Giles Brindley demonstrated the first drug treatment for erectile dysfunction in a rather unique way. He took the drug and demonstrated his stiff wicket to the audience mid-way through his talk.

Scientific journal BJU International has a pant-wettingly hilarious account of the events of that day which made both scientific and presentation history.

Professor Brindley, still in his blue track suit, was introduced as a psychiatrist with broad research interests. He began his lecture without aplomb. He had, he indicated, hypothesized that injection with vasoactive agents into the corporal bodies of the penis might induce an erection. Lacking ready access to an appropriate animal model, and cognisant of the long medical tradition of using oneself as a research subject, he began a series of experiments on self-injection of his penis with various vasoactive agents, including papaverine, phentolamine, and several others. (While this is now commonplace, at the time it was unheard of). His slide-based talk consisted of a large series of photographs of his penis in various states of tumescence after injection with a variety of doses of phentolamine and papaverine. After viewing about 30 of these slides, there was no doubt in my mind that, at least in Professor Brindley’s case, the therapy was effective. Of course, one could not exclude the possibility that erotic stimulation had played a role in acquiring these erections, and Professor Brindley acknowledged this.

The Professor wanted to make his case in the most convincing style possible. He indicated that, in his view, no normal person would find the experience of giving a lecture to a large audience to be erotically stimulating or erection-inducing. He had, he said, therefore injected himself with papaverine in his hotel room before coming to give the lecture, and deliberately wore loose clothes (hence the track-suit) to make it possible to exhibit the results. He stepped around the podium, and pulled his loose pants tight up around his genitalia in an attempt to demonstrate his erection.

At this point, I, and I believe everyone else in the room, was agog. I could scarcely believe what was occurring on stage. But Prof. Brindley was not satisfied. He looked down sceptically at his pants and shook his head with dismay. ‘Unfortunately, this doesn’t display the results clearly enough’. He then summarily dropped his trousers and shorts, revealing a long, thin, clearly erect penis. There was not a sound in the room. Everyone had stopped breathing.

But the mere public showing of his erection from the podium was not sufficient. He paused, and seemed to ponder his next move. The sense of drama in the room was palpable. He then said, with gravity, ‘I’d like to give some of the audience the opportunity to confirm the degree of tumescence’. With his pants at his knees, he waddled down the stairs, approaching (to their horror) the urologists and their partners in the front row. As he approached them, erection waggling before him, four or five of the women in the front rows threw their arms up in the air, seemingly in unison, and screamed loudly. The scientific merits of the presentation had been overwhelmed, for them, by the novel and unusual mode of demonstrating the results.

The screams seemed to shock Professor Brindley, who rapidly pulled up his trousers, returned to the podium, and terminated the lecture. The crowd dispersed in a state of flabbergasted disarray. I imagine that the urologists who attended with their partners had a lot of explaining to do. The rest is history. Prof Brindley’s single-author paper reporting these results was published about 6 months later.

 

Link to full account of that fateful day (via @DrPetra)

A cuckoo’s nest museum

The New York Times reports that the psychiatric hospital used as the backdrop for the 1975 film One Flew Over the Cuckoo’s Nest has been turned into a museum of mental health.

In real life the institution was Oregon State Hospital and the article is accompanied by a slide show of images from the hospital and museum.

The piece also mentions some fascinating facts about the film – not least that some of the actors were actually genuine employees and patients in the hospital.

But the melding of real life and art went far beyond the film set. Take the character of John Spivey, a doctor who ministers to Jack Nicholson’s doomed insurrectionist character, Randle McMurphy. Dr. Spivey was played by Dr. Dean Brooks, the real hospital’s superintendent at the time.

Dr. Brooks read for the role, he said, and threw the script to the floor, calling it unrealistic — a tirade that apparently impressed the director, Milos Forman. Mr. Forman ultimately offered him the part, Dr. Brooks said, and told the doctor-turned-actor to rewrite his lines to make them medically correct. Other hospital staff members and patients had walk-on roles.

 

Link to NYT article ‘Once a ‘Cuckoo’s Nest,’ Now a Museum’.

The postmortem portraits of Phineas Gage

A new artform has emerged – the post-mortem neuroportrait. Its finest subject, Phineas Gage.

Gage was a worker extending the tracks of the great railways until he suffered the most spectacular injury. As he was setting a gunpowder charge in a rock with a large tamping iron, the powder was lit by an accidental spark. The iron was launched through his skull.

He became famous in neuroscience because he lived – rare for the time – and had psychological changes as a result of his neurological damage.

His story has been better told elsewhere but the interest has not died – studies on Gage’s injury have continued to the present day.

There is a scientific veneer, of course, but it’s clear that the fascination with the freak Phineas has its own morbid undercurrents.

The image is key.

The first such picture was constructed with nothing more than pen and ink. Gage’s doctor, John Harlow, sketched the skull, which Harlow had acquired after the patient’s death.

This Gage is forever fleshless, the iron stuck mid-flight, the shattered skull frozen as it fragments.

Harlow’s sketch is the original and the originator. The first impression of Gage’s immortal soul.

Gage rested as this rough sketch for over 100 years but he would rise again.

In 1994, a team led by neuroscientist Hannah Damasio used measurements of Gage’s skull to trace the path of the tamping iron and reconstruct its probable effect on the brain.

Gage’s disembodied skull appears as a strobe lit danse macabre, the tamping iron turned into a bolt of pure digital red and Gage’s brain, a deep shadowy grey.

It made Gage a superstar but it sealed his fate.

Every outing needed a more freaky Phineas. As with a low-rent celebrity, every new exposure demanded something more shocking.

A 2004 study by Peter Ratiu and Ion-Florin Talos depicted Gage alongside his actual cranium – his digital skull screaming as a perfect blue iron pushed through his brain and shattered his face – the disfigurement now a gory new twist to the portrait.

In contrast, his human remains are peaceful – unmoved by the horrors inflicted on their virtual twin.

But the most recent Gage is the most otherworldly. A study by John Darrell Van Horn and colleagues examined how the path of the tamping iron would have affected the strands of white matter – the “brain’s wiring” – that connect cortical areas.

[Image from Van Horn et al. (2012), PLoS ONE 7(5): e37454]

A slack-jawed Gage is now pierced by a ghostly iron bar that passes almost silently through his skull.

Gage himself is equally supernatural.

Blank white eyes float lifelessly in his eye sockets – staring into the digital blackness.

His white matter tracts appear within his cranium but are digitally dyed and seem to resemble multi-coloured hair standing on end like the electrified mop of a fairground ghoul.

But as the immortal Gage has become more horrifying over time, living portraits of the railwayman have been discovered. They show an entirely different side to the shattered skull celebrity.

To date, two portraits have been identified. They both show a ruggedly handsome, well-dressed man.

He has gentle flesh. Rather than staring into blackness, he looks at us.

Like a 19th century auto-whaler holding his self-harpoon, he grips the tamping iron, proud and defiant.

I prefer this living Phineas.

He does not become more alien with every new image.

He is at peace with a brutal, chaotic world.

He knows what he has lived through.

Fuck the freak flag, he says.

I’m a survivor.

A brief history of narcoanalysis

The judge in the case of ‘Colorado shooter’ James Holmes has made the baffling decision that a ‘narcoanalytic interview’ and ‘polygraph examination’ can be used in an attempt to support an insanity plea.

While polygraph ‘lie detectors’ are known to be seriously flawed, some US states still allow evidence from them to be admitted in court, although the fact that they’re being considered in such a key case is frankly odd.

But the ‘narcoanalytic interview’ is so left-field as to leave some people scratching their heads as to whether the judge has been at the narcotics himself.

The ‘narcoanalytic interview’ is sometimes described as the application of a ‘truth drug’ but the actual practice is far more interesting.

It has been variously called ‘narcoanalysis’, ‘narcosynthesis’ and the ‘amytal interview’ and involves, as you might expect, interviewing the person under the influence of some sort of narcotic.

Its roots lie in the very early days of 1890s pre-psychoanalysis, when Freud used hypnosis to relax patients to help them discuss emotionally difficult matters.

The idea that being relaxed overcame the mind’s natural resistance to entertaining difficult thoughts and helped get access to the unconscious became the foundation of Freud’s work. Narcoanalysis is still essentially based on this idea.

But, of course, the concept had to wait until the discovery of the first suitable drugs – the barbiturates.

Psychiatrist William Bleckwenn found that giving barbital to patients with catatonic schizophrenia led to a “lucid interval” where they seemed to be able to discuss their own mental state in a way previously impossible.

You can see the parallels between the first ever use of ‘narcoanalysis’ and the current case, but through the rest of the century the concept merged with the idea of creating a “truth drug”.

This was born in the 1920s, when the gynaecologist Robert House noticed that women who were given scopolamine to ease the birth process seemed to go into a ‘twilight state’ and were more pliant and talkative.

House decided to test this on criminals and went about putting prisoners under the influence of the drug while interviewing them as a way of ‘determining innocence or guilt’. Encouraged by some initial, albeit later recanted, confessions, House began to claim that it should be used routinely in police investigations.

This probably would have died a death as a dubious medical curiosity had Time magazine not run an article in 1923 entitled “The Truth-Compeller” about House’s theory – making him and the ‘truth drug’ idea national stars.

These approaches became militarised: firstly as ‘narcoanalysis’ was used to treat traumatised soldiers in the Second World War, and secondly as it was taken up by the CIA in the Cold War as a method of interrogation and became a centrepiece of the secret Project MKUltra.

It has continued to be used in criminal investigations in the US, albeit infrequently, although it has popped up in legal rulings.

In 1985 the US Supreme Court rejected an appeal by two people convicted of murder that their ‘narcoanalysis police interview’ made their conviction unsafe.

However, the psychiatrist who conducted the interview didn’t convince any of the judges that ‘narcoanalysis’ was actually of benefit:

At one point he testified that it would elicit an accurate statement of subjective memory, but later said that the subject could fabricate memories. He refused to agree that the subject would be more likely to tell the truth under narcoanalysis than if not so treated.

The concept seemed to disappear after that, but strong suspicions were raised that ‘narcoanalysis’ was still a CIA favourite when the Bush government’s infamous ‘torture memo’ justified the use of “mind-altering substances” as part of ‘enhanced interrogation techniques’.

There is no evidence that ‘narcoanalysis’ actually helps in any way, shape or form, and at moderate to high doses, some of the drugs may actually impede memory or make it more likely that the person misremembers.

I suspect that the actual result of the bizarre ruling in the ‘Colorado shooter’ case will just be that psychiatrists will be able to give a potentially psychotic suspect a simple anti-anxiety drug without the resulting evidence being challenged.

This would be no different than giving an anxious or agitated witness the same drug to help them recount what happened.

But the fact that the judge included ‘lie detectors’ and ‘narcoanalysis’ in his ruling as useful legal tools rather than recognising them as flawed investigative techniques is still very concerning and suggests legal thinking mired in the 1950s.
 

pdf of judge’s ruling.
Link to (ironically locked) article on the history of ‘narcoanalysis’