The internet, depression and drinking a glass of water

A new study has made headlines around the world claiming that internet use is linked to depression, despite better evidence from previous studies that there is no substantial link.

The study itself is a fairly straightforward online survey with the key finding that out of 1,319 people who completed the questionnaires, 18 were identified as ‘addicted’ by Kimberly Young’s Internet Addiction Questionnaire and these people were more likely to score highly on the BDI – a standard questionnaire to measure depression.

The study itself was well conducted, although the finding is hardly surprising: Young’s Internet Addiction Questionnaire (which you can read online here) asks lots of questions about emotional distress, so people who say they’re distressed on one questionnaire will tend to say they’re distressed on the other.

I have criticised the concept of internet addiction on the basis that it doesn’t make sense as a whole, but research has also shown that these ‘diagnostic’ questionnaires are not particularly reliable, meaning they’re not a good guide even to what they claim to measure.

But perhaps the most important point is that this study is just one in a long line of studies that have looked at whether internet use is linked to changes in mood.

Recently, a type of study called a meta-analysis was published that looked at all of these previous studies to see what the overall effect was – in essence, a mathematical aggregation of all the reported findings to get at the big picture.
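
To give a flavour of what that aggregation involves, here is a minimal sketch of the standard way of pooling correlations in a fixed-effect meta-analysis, written in Python. The study correlations and sample sizes below are invented purely for illustration and are not figures from the actual analysis:

import math

# Hypothetical (r, n) pairs: correlation and sample size for each study.
studies = [(0.05, 200), (0.01, 500), (0.03, 350)]

def fisher_z(r):
    # Fisher's z transformation stabilises the variance of a correlation.
    return 0.5 * math.log((1 + r) / (1 - r))

# Each study is weighted by the inverse of its variance, which for
# Fisher's z is simply n - 3.
pooled_z = sum((n - 3) * fisher_z(r) for r, n in studies) / sum(n - 3 for _, n in studies)

# Transform back to the correlation scale to get the overall effect size.
pooled_r = math.tanh(pooled_z)
print(round(pooled_r, 3))

In essence it is a weighted average: each study’s correlation is converted to Fisher’s z, weighted by its sample size, averaged, and converted back.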

This meta-analysis found that there was a statistically reliable link between internet use and depression, but one so small as to be insignificant. In fact, it found that internet use was responsible for between 0.02% and 0.03% of total changes in mood (stats geeks: the variance explained was not reported directly but I calculated it from the r via the coefficient of determination).
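
For anyone who wants to check the arithmetic, the coefficient of determination is simply the square of the correlation coefficient. A tiny sketch, assuming a hypothetical pooled correlation of r = 0.015 (a value back-calculated to roughly match the figures above, not one taken from the paper):

# Hypothetical pooled correlation, chosen only to illustrate the calculation.
r = 0.015
variance_explained = r ** 2                 # coefficient of determination (r squared)
print("{:.2%}".format(variance_explained))  # prints 0.02%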

In other words, internet use explains so little of a person’s depression that it’s irrelevant. It’s like knowing that hypothermia is a serious medical condition and that drinking a glass of water reliably lowers body temperature, but by such a small amount as to be medically unimportant.

Interestingly, I am quoted in some of the news stories about the study. Actually, I was contacted by a BBC journalist and some other stories have seemingly just nicked the quotes (often wrongly describing me as a psychiatrist).

What’s curious is that I sent the BBC journalist a link to the meta-analysis, even explained what it found and what a meta-analysis is, and included comments about why the study doesn’t change the general conclusion.

Instead of focusing on the existing evidence, I am quoted as being a naysayer. I have not been misquoted, but the most important scientific point is omitted in favour of my opinion. This seems to be a common pattern: news stories often privilege opinion over data, whereas science privileges data over opinion.

In fact, the motto of the Royal Society, the world’s oldest scientific society, translates as “on the word of no-one”, but news stories often turn the hierarchy of evidence on its head, giving a skewed impression of the most fundamental way in which science works.

In this case, the coverage suggests that science has established that internet use is strongly linked to depression, when we know that it isn’t.

Link to PubMed entry for latest study.
Link to PubMed entry for meta-analysis.

Blue Brain Year One

Film-maker Noah Hutton has just released an excellent 15-minute documentary on the Blue Brain project that captures the team as they work and explains the goals of the ambitious attempt to simulate animal-scale, and eventually human-scale, neural networks on computer.

It’s an interesting look both inside the scientific mission and inside the mind of project leader Henry Markram, who, it must be said, is largely talking about the potential of the project rather than what it can do now.

It’s probably worth saying that Markram is not known for underselling his efforts, and some of his projections seem a little unrealistic.

At one point he mentions that the project could be used in hospitals so that doctors can simulate the effects of drugs on a digital brain to see if they’ll work before giving patients the real thing. Best of luck with that, chaps.

It’s a great short piece, however, and apparently there are more to come.

Link to Blue Brain: Year One.

Fight club debate on computers and kids’ brains

On Thursday, I shall be taking part in a live debate hosted by The Times Online entitled ‘Is screen culture damaging our children’s brains?’, where I will be debating psychologist Tracey Alloway, who recently made headlines by suggesting Facebook ‘enhances intelligence’ but Twitter ‘diminishes it’.

It’s one of those online chat things but you are welcome to sign up and take part. It happens at 1pm UK time, which turns out to be far-too-early-o’clock Colombian time, so I may be in my dressing gown. Don’t let that put you off.

An article on the same topic will also be coming out on Thursday which should help set the scene and which I’ll link to when it appears.

Link to ‘Is screen culture damaging our children’s brains?’ debate.

Injecting heroin with a doctor

Slate has two articles on an innovative but controversial service in Vancouver, Canada, that provides injecting drug users with a place to safely inject drugs with clean equipment and medical staff on hand.

The project, ‘Insite’, is based on a ‘harm reduction’ approach which is driven by the idea that users should be encouraged to take drugs in the safest way possible.

This is partly an admission that addiction treatment is not very successful on its own, but partly a public health measure in that injecting drug users have much higher rates of diseases such as HIV and hepatitis and are more likely to pass them on to other people.

It is also the case that one of the biggest dangers from injecting drugs is the actual practice of injecting, as unsanitary conditions, ad-hoc ‘cooking up’ and unpredictable street dope all become much more risky when the final product is injected into the bloodstream.

These services can be controversial in some places as they can be seen to be condoning drug use, although some countries are now going further and actually prescribing heroin to addicts.

One of the biggest impacts on society is not the fact that a tiny minority of people are damaging themselves with smack, but that they tend to commit crimes to feed their habit and support a violent criminal network of dealers.

Methadone is a heroin substitute that has been prescribed for years and we know that it can stabilise the lives of users and increase their chance of kicking the habit.

But it is often not what users want. It stops the withdrawals, as it’s another form of opioid drug, but it doesn’t feel the same and still has the danger of overdose. One common problem is known euphemistically in the medical literature as ‘methadone diversion’, where users sell their methadone to buy street drugs.

Several countries have trialled the prescription of heroin itself, with, it turns out, a great deal of success – including better health, a reduction in criminal activity and a higher chance of actually kicking the habit.

This may seem counter-intuitive, but one of the advantages of these projects is that the user is constantly in contact with health professionals who can provide addiction treatment.

The political and local opposition to harm reduction services is usually immense, however. Politicians want to be seen to be ‘tough on drugs’ and no-one, and I mean no-one, wants one of these clinics near where they live.

The Slate articles look into the day-to-day running of the Vancouver programme, talk to some of its clients, and provide an insight into the challenges such services face. There’s also a gallery of photos that captures the project in action.

Link to Slate article ‘Welcome to Insite’.
Link to Slate article ‘Upstairs, Downstairs’.
Link to photos of the Insite project.

Gladiator’s blood as a cure for epilepsy

I just stumbled across this fascinating article from the Journal of the History of Neurosciences about the use of gladiators’ blood as a cure for epilepsy in Ancient Rome. Surprisingly, the practice continued into modern times.

Between horror and hope: gladiator’s blood as a cure for epileptics in ancient medicine.

J Hist Neurosci. 2003 Jun;12(2):137-43.

Moog FP, Karenberg A.

Between the first and the sixth century a single theological and several medical authors reported on the consumption of gladiator’s blood or liver to cure epileptics. The origins of the sacred or apotropaic properties of blood of a slain gladiator, likely lie in Etruscan funeral rites. Although the influence of this religious background faded during the Roman Republic, the magical use of gladiators’ blood continued for centuries. After the prohibition of gladiatorial combat in about 400 AD, an executed individual (particularly had he been beheaded) became the “legitimate” successor to the gladiator. Occasional indications in early modern textbooks on medicine as well as reports in the popular literature of the 19th and early 20th century document the existence of this ancient magical practice until modern times. Spontaneous recovery of some forms of epilepsy may be responsible for the illusion of therapeutic effectiveness and for the confirming statements by physicians who have commented on this cure.

The article has some amazing reports of how the practice continued into the last century:

In his autobiography, the Danish storyteller Hans Christian Andersen reported a striking observation in 1823: ‘‘I saw a pitiful poor person made to drink by his superstitious parents a cup of the blood of an executed person, in an attempt to cure him from epilepsy.’’ At the public execution of a murderer in the provincial town of Hanau near Frankfurt in 1861, a crowd of women had to be prevented by police from dipping rags into the freshly-spilled blood. At about the same time executioners in Berlin were paid two taler per blood-drenched handkerchief.

A last and final dramatic report of this kind was published in a Saxon newspaper in 1908 after the execution of a murderess: ‘‘On the day of the execution an old woman from a neighbouring village pushed her way through the crowds around the court buildings to request a small amount of the delinquent’s blood from the security officials. She wanted to help a young girl related to her who suffered from epilepsy, as the blood of an executed person was believed to have great healing power against this disease’’ (quoted from Seyfarth, 1913, p. 279).

Link to PubMed entry for article.
Link to DOI entry for same.

The rise and fall of antidepressants

Newsweek has an excellent article that charts the rise and fall of antidepressants, from their status as wonder drugs that made people ‘better than well’ to the recent evidence suggesting that, for many people, they’re not much better than placebo.

The piece particularly follows the work of psychologist Irving Kirsch, who was the first to conduct a meta-analysis of the effects of antidepressants, back in 1998.

Titled “Listening to Prozac but hearing placebo”, it suggested that the drugs were hardly more effective than placebo and, for many, marked Kirsch out as a biased and dangerous ‘anti-psychiatrist’.

However, later studies in a similar vein by both Kirsch and others have supported his original findings and many countries have now changed their treatment recommendations as a result.

The Newsweek article tracks this story but also picks up on many important subtleties, notably that the research doesn’t suggest that antidepressants are useless – quite the opposite – just that their effect is only in part due to their direct chemical action; and that many patients in trials work out that they’re not taking placebo because of the side-effects, and this realisation can trigger a stronger placebo effect.

It also integrates evidence from the recent STAR*D study, one of the most complete investigations of the best methods to treat depression.

If you want a good overview of the debate on the effectiveness of these iconic drugs, this is a good place to start.

Additionally, if you’re interested in a good analysis of the most recent study in this area, just published in the Journal of the American Medical Association, the Neuroskeptic blog has a great write-up and analysis of what this means for the concept of depression itself.

Link to Newsweek piece on antidepressants (via @DrDavidBallard).
Link to write-up of JAMA study at Neuroskeptic.

World changing images

BBC Radio 4 has just concluded a wonderful series on medical imaging that covers everything from the microscope, to ultrasound, to the brain scanner.

The series is five 15-minute programmes that tackle the technology and its controversies. The brain scanning programme is particularly good and shows both ends of the spectrum of enthusiasm for the use of functional brain scans to understand human nature.

Because of the BBC’s black hole of death archive, the programmes will start being sucked into the void in three days’ time, so do catch them before then.

The programmes also cover DNA imaging and X-rays, and there is apparently a gallery of images, but I have given up trying to find it on the dreadful Radio 4 website.

Link to ‘Images That Changed The World’ audio links.

Can you actually be frightened to death?

Science isn’t sure whether fear can kill, but several courts have been convinced and have convicted people of murder on the basis that they caused death through fright. An article just published in the American Journal of Cardiology summarises eight such murder trials.

The cases are not, as I first suspected, where someone had deliberately tried to kill someone else using fright as a ‘weapon’ (like in the infamous scene in Belgian serial killer mockumentary Man Bites Dog – clip here – warning: not pleasant).

Instead, they typically describe cases where someone has died of a heart attack in the midst of an armed robbery or assault, despite not being mortally wounded.

In a similar case, State v. Edwards, the defendant and his accomplices entered a bar in Tucson, Arizona and committed a robbery at gunpoint. Shortly after the robbers had fled, the proprietor experienced a heart attack and died. The defendant argued that the victim’s death was accidental and unintended and could not constitute murder. Moreover, the defendant maintained that the evidence was insufficient to prove that the robbery actually caused the victim’s death.

The court disagreed on both counts, finding first that accidental, unintended consequences could form the basis of a murder conviction. Second, the court pointed to the testimony of a pathologist that the death was caused by anxiety resulting from the robbery at gunpoint. The court held that this provided adequate evidence to support causation.

However, this is not the only area where supposedly being ‘frightened to death’ has caught the interest of psychologists. There is a small psychological literature on ‘psychogenic death’ that attempts to explore reports of death after curses, spells or violation of cultural taboos.

This is from an excellent brief article from 2003, published in the journal Mental Health, Religion & Culture:

Landy (1977, p. 327) describes the phenomenon as follows: ‘a process is set in motion, usually by a supposed religious or social transgression that results in the transgressor being marked out for death by a sorcerer acting on behalf of society through a ritual of accusation and condemnation; then death occurs within a brief span, usually 24 to 48 hours’. Ellenberger (1965) distinguishes acute from slow psychogenic death. In some cases, the death can be rapid, in other cases the process occurs over several weeks where the patient sickens and dies. There has been some doubt expressed as to whether voodoo death is part of ‘colonial folklore’ only based on anecdotal reports (Williams, 1928).

Lewis (1977, p. 11) asks, ‘Is it really the case that healthy people have died in a day or three days because they know they were victims of sorcery? Who has seen this happen with his own eyes? Is there no explanation for it but sorcery?’ Yap (1977) calls for concrete findings from anthropologists and medical field workers that can be appraised critically. Questions have arisen as to whether or not these victims had pre-existent pathological conditions predisposing them to death. There is however some direct evidence for its occurrence.

The evidence is not of people just dropping dead, but comes from several documented cases where perfectly healthy people rapidly give up eating and drinking after being ‘cursed’, and dehydration leads to death.

Link to PubMed entry for ‘Homicide by fright’ article.
Link to DOI entry and summary for ‘psychogenic death’ article.

2010-01-29 Spike activity

Quick links from the past week in mind and brain news:

io9 has a great brief summary of a citation analysis that describes how neuroscience became a major scientific discipline in just one decade. Interestingly, it didn’t happen in the Decade of the Brain.

The ability to resist temptation is contagious, according to a new study covered by The Frontal Cortex. I suspect this means I am patient zero of giving in to temptation.

Salon has an interview with psychologist Susan Clancy about her new book ‘The Trauma Myth’ on child abuse, which is likely to be both important and controversial. The comments are a mix of the insightful, angry and loopy.

This chap might have found a photo of Phineas Gage from before his injury.

Radio 4 has a good documentary on ‘Super Recognisers’ that will disappear off the face of the earth in only a few days if you miss your chance to listen to it.

The Prison Photography blog is excellent.

NPR has a brief segment on new evidence suggesting that heavy drinking in teenage years may have a lasting impact on the brain.

Special therapy bears work through mirror neurons (what else) according to a bizarre claim unearthed by The Neurocritic.

NeuroPod has just released a new edition covering optogenetics, AI cockroaches, stem and grid cells.

Does time dilate during a threatening situation? asks Neurophilosophy.

Science Daily reports that thinking of the past or future causes us to sway backward or forward on the basis of a new study.

C.G. Jung’s famous ‘Red Book’ has finally been published and Brain Pickings has a fantastic review and preview.

The Journal of Neurology, Neurosurgery, and Psychiatry has launched a new podcast which is aimed at clinicians and is. a. bit. stilted. but sounds promising.

There’s a good piece about the new and not very effective female ‘sex drug’ flibanserin in Inkling Magazine.

Horizon, the flagship BBC science programme, recently had an episode on Big Pharma, medicalisation and disease mongering. Apart from some minor pharmacological dodginess (ADHD a ‘chemical imbalance’, Ritalin a ‘clever pill’) it’s excellent and features our very own Dr Petra. Torrent here.

A new study finding people’s personality is reflected in their internet use is covered by the BPS Research Digest. See also a new study finding social behaviour is similar both online and offline.

Quirks and Quarks, the excellent Canadian radio show, discusses kuru disease immunity in cannibals.

Why is there no anthropology journalism? asks Savage Minds.

The Economist covers a new study finding that the more widespread a language, the simpler it is, suggesting that languages become streamlined as they spread.

Incoming! APA press release forewarns of imminent clinical psychology fight: psychodynamic therapy best says not yet published meta-analysis.

PsyBlog has an excellent round-up of 10 studies on why smart people do irrational things.

The secrets of looking good on the dance floor and research on the psychology of social dance is covered in Spiegel magazine.

Life magazine has a gallery of famous literary drunks and addicts.

The US is quietly abandoning the ‘war on drugs’ according to an article in The Independent. Does this mean the expansion of military bases in Colombia is to be re-justified as part of a war on salsa music? Kids told to ‘just say no’ to fake tans and enthusiastic rhythm sections.

The BPS Research Digest reports the development of what could be the first anti-lie detector in neuroscience.

Bootleg Botox, a potent neurotoxin, could be a weapon of mass destruction according to a piece in the Washington Post.

Wired reports on the Jan 25th anniversary of the first recorded human death by robot, which occurred in Flint, Michigan, in 1979.

The marriage market and the social economics of high-end prostitutes are tackled in a new study discussed in Marginal Revolution.

Better Thinking Through Chemistry

This chapter was due for inclusion in The Rough Guide Book of Brain Training, but was cut – probably because the advice it gives is so unsexy!

The idea of cognitive enhancers is an obviously appealing one. Who wouldn’t want to take a pill to make them smarter? It’s the sort of vision of the future we were promised on kids’ TV, alongside jetpacks and talking computers.

Sadly, this glorious future isn’t here yet. The original and best cognitive enhancer is caffeine (“creative lighter fluid” as one author called it), and experts agree that there isn’t anything else available to beat it. Lately, sleep researchers have been staying up and getting excited about a stimulant called modafinil, which seems to temporarily eliminate the need for sleep without the jitters or comedown of caffeine. Modafinil isn’t a cognitive enhancer so much as something that might help with jetlag, or let you stay awake when you really should be getting some kip.

Creative types have had a long romance with alcohol and other more illicit narcotics. The big problem with this sort of drug (aside from the oft-documented propensity for turning people into terrible bores) is that your brain adapts to, and tries to counteract, the effects of foreign substances that affect its function. This produces the tolerance that is a feature of most prolonged drug use – whereby the user needs more and more to get the same effect – and also the withdrawal that characterises drug addiction. You might think this is a problem only for junkies but, if you are a coffee or tea drinker, just pause for a moment and reflect on any morning when you’ve felt stupid and unable to function until your morning cuppa.

It might be for this reason that the pharmaceutical industry is not currently focusing on developing drugs for creativity. Plans for future cognitive enhancers focus on more mundane, workplace-useful skills such as memory and concentration. Memory-boosters would likely be most useful to older adults, especially those with worries about failing memories, rather than younger adults.

Although there is no reason in principle why cognitive enhancers couldn’t be found which fine-tune our concentration or hone our memories, the likelihood is that, as with recreational drugs, tolerance and addiction would develop. These enhancing drugs would need to be taken in moderate doses and have mild effects – just as many people successfully use caffeine and nicotine for their cognitive effects on concentration today.

Even if this allowed us to manage the consequences of the brain trying to restore its natural balance, there’s still the very real possibility that use of the enhancing drugs would need to be fairly continuous – just as it is with smokers and drinkers of tea and coffee. And even then our brains would learn to associate the drug with the purpose for which it is taken, which means it would get harder and harder to perform that purpose without the drug, as with the coffee drinker who can’t start work until he’s had his coffee. Furthermore, some reports suggest that those with high IQ who take cognitive enhancers are most likely to mistake the pleasurable effect of the substance in question for a performance benefit, while actually getting worse at the thing they’re taking the drug for.

The best cognitive enhancer may well be simply making best use of the brain’s natural ability to adapt. Over time we improve at anything we practise, and we can practise almost anything. There are a hundred better ways to think and learn – some of them are in this book. By practising different mental activities we can enhance our cognitive skills without drugs. The effects can be long-lasting, the side effects are positive, and we won’t have to put money in the pockets of a pharmaceutical company.

Link to more about The Rough Guide book of Brain Training
Three excellent magazine articles on cognitive enhancers, from: The New Yorker, Wired and Discover

We go with the flow

The Psychologist has a completely fascinating article on how we perceive things to be more appealing, easier to handle and more efficient based on how simple they are to understand – even when this is based on irrelevant or superficial properties, like their name or the font they are described in.

The core idea is that we partly judge things on ‘processing fluency’, that is, how easy it is to immediately grasp something. This seems intuitive, as we tend to prefer things that make sense to us, but it turns out that this preference is also heavily influenced by surface features.

For example, the article discusses the surprising amount of work on how simply changing the font can change our opinion of what the text is describing.

When they were presented [with physical exercise instructions] in an easy-to-read print font (Arial), readers assumed that the exercise would take 8.2 minutes to complete; but when they were presented in a difficult-to-read print font, readers assumed it would take nearly twice as long, a full 15.1 minutes (Song & Schwarz, 2008b). They also thought that the exercise would flow quite naturally when the font was easy to read, but feared that it would drag on when it was difficult to read. Given these impressions, they were more willing to incorporate the exercise into their daily routine when it was presented in an easy-to-read font. Quite clearly, people misread the difficulty of reading the exercise instructions as indicative of the difficulty involved in doing the exercise…

Novemsky and colleagues (2007) presented the same information about two cordless phones in easy- or difficult-to-read fonts. They observed that 17 per cent of their participants postponed choice when the font was easy to read, whereas 41 per cent did so when the font was difficult to read. Apparently, participants misread the difficulty arising from the print font as reflecting the difficulty of making a choice.

The article contains numerous examples of how changing surface features, such as giving something an easy or difficult to pronounce name, alters what we think about it.

However, the piece also mentions that giving something difficult-to-process or unfamiliar features also means we scrutinise it more closely, which means we often pick up errors more easily.

This is a wonderfully elegant example:

As an example, consider the question ‘How many animals of each kind did Moses take on the Ark?’ Most people answer ‘two’ despite knowing that the biblical actor was Noah, not Moses. Even when warned that some of the statements may be distorted, most people fail to notice the error because both actors are similar in the context of biblical stories. However, a change in print fonts is sufficient to attenuate this Moses illusion. When the question was presented in an easy-to-read font, only 7 per cent of the readers noticed the error, whereas 40 per cent did so when it was presented in a difficult-to-read font…

Link to Psychologist article on processing fluency.

Full disclosure: I am an unpaid associate editor and columnist for The Psychologist and I have an unfamiliar first name – draw your own conclusions.

John Cleese on neuroanatomy

British comedian John Cleese tackles the brain and gives a tour of the organ’s major anatomical landmarks in this short video from 2008.

It’s a tour de force of descriptive neuroanatomy and even the most experienced neuroscientist is likely to encounter much that is new and interesting.

It also finishes on a short but important piece of advice that is worth bearing in mind in all lab situations.

Link to John Cleese on the brain (via @brainshow).

Information channelling

The Frontal Cortex has a fantastic piece discussing a new study which finds that people choose TV news based on which channels are more likely to agree with their pre-existing opinions, and how we have a tendency to filter for information that confirms, rather than challenges, what we believe.

Lehrer discusses various ways in which we selectively attend to information we agree with but the best bit is where he goes on to discuss a wonderful study from 1967 where people demonstrated in the starkest way that they’d rather block out information that doesn’t agree with their pre-existing beliefs.

Brock and Balloun played a group of people a tape-recorded message attacking Christianity. Half of the subjects were regular churchgoers while the other half were committed atheists. To make the experiment more interesting, Brock and Balloun added an annoying amount of static – a crackle of white noise – to the recording. However, they allowed listeners to reduce the static by pressing a button, so that the message suddenly became easier to understand. Their results were utterly predictable and rather depressing: the non-believers always tried to remove the static, while the religious subjects actually preferred the message that was harder to hear. Later experiments by Brock and Balloun demonstrated a similar effect with smokers listening to a speech on the link between smoking and cancer. We silence the cognitive dissonance through self-imposed ignorance.

Link to Frontal Cortex piece ‘Cable news’.
Link to summary of 1967 static study.
Link to PubMed entry for same.

The missing psychiatric file of Adolf Hitler

I’ve just found this fascinating 2007 snippet from the European Archives of Psychiatry and Clinical Neuroscience on Adolf Hitler’s mysteriously missing psychiatric file from the time he was admitted to hospital following First World War injuries.

The article mentions that he was reportedly diagnosed with hysterical or non-organic blindness, something that nowadays would be diagnosed as dissociative disorder or conversion disorder, which signifies that a seemingly ‘physical’ problem occurs without any detectable physical origin.

The traditional and still popular explanation is that the mind is converting trauma to a physical symptom to protect itself from distress, although there is not a great deal of evidence for this theory.

However, it seems his file from this hospital admission disappeared and everyone who had knowledge about the case was apparently killed by the SS.

The recent 60 years anniversary of the end of World War II and the Nazi regime may be reason for a short psychiatric-historical note to point out a frequently overlooked detail of Hitler’s life – his hidden psychiatric biography. Besides his extreme anti-semitism, mentally ill were among the most threatened individuals with some 200,000 being killed. This was made public during World War II by the Muenster cardinal Galen who most recently was beatified by pope Benedikt XVI. While Hitler’s late Parkinson disease has attracted some attention, his former functional ‘hysteric’ blindness is almost unknown.

In fact on 14th October 1918 Hitler, who served as a private in World War I, survived a mustard gas attack in Belgium near Ypern. There are some reports that he consecutively had a mild resultant conjunctivitis. He also suffered from nonorganic blindness. His further treatment is nearly unknown. Hitler was transferred to the military hospital in Pasewalk near Stettin/Baltic sea. Prof. Forster, chair at that psychiatric clinic, treated him by using hypnosis. Hitler was discharged on 19th November 1918 and never mentioned this period again.

His treatment is proven by eyewitness of Dr. Karl Kroner who later reported the facts to the US intelligence Office of Strategic Services (OSS). Hitlers’ file disappeared and all people who were closely involved or had special knowledge of this file were killed by the ‘Gestapo’, including Prof. Forster who probably was forced to commit suicide on 9th November 1933. Before that he succeeded in presenting these documents to exile writers in Paris where his brother was employed at the German embassy. The German Jewish writer Ernst Weiss, a physician himself, used the original documents in his novel ‘Der Augenzeuge’ (The Eyewitness) before he committed suicide during the German occupation of Paris on 6th May 1940.

The original file is lost but for all we know Hitler had a psychiatric history, which may not explain his savage ideas but throws an interesting light on his anti-psychiatric attitude.

Maybe it’s in the Albert Hall, along with that other important medical artefact from the Führer.

However, I note from the Wikipedia page on Hitler’s medical history that there have been many claims about Hitler’s health, many of them not well verified.

Nevertheless, he was subject to not one, but two, wartime Freudian character analyses commissioned by the OSS – the forerunner to the CIA. The first was completed by psychologist Henry Murray and the second by psychoanalyst Walter Langer.

The reports have many oddities and are largely opinion but they concluded that Hitler was a neurotic psychopath, probably had paranoid schizophrenia, was likely impotent, was a repressed homosexual and, most famously, would likely kill himself.

Although, to be fair, the latter point did not describe dying a miserable death in a bunker but included various movie-style scenarios where he would blow himself up on a dynamite-rigged mountain, use a single silver bullet or throw himself off a parapet as troops came to take him prisoner.

I’ve no idea how useful these reports ever were but they probably tell us more about the trends in psychology of the time than anything about the Nazi leader’s mind.

UPDATE: Grabbed from the comments… There’s an excellent post on the wartime character analysis reports over at the Providentia blog.

Link to short article on ‘Hitler’s missing psychiatric file’.

The evolution of death and dying

The New Yorker has a wonderful article on the psychology of death and dying which is carefully woven into the curious life story of psychiatrist Elisabeth Kübler-Ross, the originator of the ‘stage’ model of grief.

If you only read one popular article on grief, you’d do a lot worse than reading this carefully researched and sensitively written piece which journeys through both the social and cultural rituals of dying and how psychological theories have changed over the years.

It also tackles the fascinating life of Elisabeth Kübler-Ross who was responsible for the influential but now discredited ‘stage’ model that suggested that both dying and grieving people experience denial, anger, bargaining, depression and acceptance.

Subsequent studies have not supported these stages but Kübler-Ross was a pioneer in encouraging clinicians to address death with their gravely ill patients and her first book, On Death and Dying, opened up the practice of bereavement counselling for people who feel they need help coming to terms with their own death or the loss of loved-ones.

Kübler-Ross later became interested in a range of, it must be said, fairly flaky practices, such as mediumship and ‘channelling’ the dead, and she fell out of favour with the medical mainstream.

Late in life, she was disabled by a stroke and had a great deal of trouble coming to terms with her own mortality, although the experience helped her write her final and well-regarded book, On Grief and Grieving, in which she reflects on her own death and her life’s work.

The New Yorker article looks at Kübler-Ross’ legacy but, much more than that, examines a great deal of what we know about the process of dying and how that knowledge is being integrated into modern medicine. Highly recommended.

 
Link to New Yorker article ‘Good Grief’ (via @mocost).

Forgetting fear

The Times has an excellent article summarising recent research on the possibility of treating traumatic memories by tempering their impact either just after the event or when remembering the experience at a later point.

The ability to update our memories with new information highlights the flexibility of our brain. Every act of remembering gives us an opportunity to shape memories, or even erase them. The discovery of the reconsolidation window has kick-started a lot of new memory research, advances in which could have important implications for people who suffer from unwanted fearful memories. Potential treatments for anxiety, phobias or post-traumatic stress disorder (PTSD) may be close at hand.

It’s a remarkably wide-ranging article that covers both chemical and psychological methods that have been drawn from recent research and is probably the best concise summary of this research you’ll be likely to read for a good while.

Much of the initial interest in this area was in a drug called propranolol, which doesn’t affect the brain’s memory circuits directly but does reduce tension in the body. Several experiments showing that it reduced traumatic responses when taken immediately after a severe event generated a lot of hope that it might be a new way of preventing catastrophic reactions.

Recent findings have dulled the excitement a little though, as two studies have come out on burns victims – one in soldiers and another in children – and the drug had no detectable effect on trauma.

The Times article also mentions that “the drug merely changes the emotional content of memories, rather than erasing them” although this point is controversial.

Some studies that have tested the effect on recalling tragedy or trauma stories have found a genuine reduction in the amount of information recalled, not only the emotional ‘kick’ of the memories.

One similar study didn’t find this effect, and a recent experiment that directly compared propranolol to placebo and the stress hormone cortisol found no effect of propranolol on memory. Another study, however, found it did reduce short-term memory overall, although it also lessened the impact of emotional distractions.

This is an important issue because, as The Times notes, it could have massive implications if memories of a traumatic event form part of a court case and the drug has, in effect, ‘tampered’ with the evidence.

The piece is by Ed Yong and Alice Fishburn, the former of whom you may know from the Not Exactly Rocket Science blog and who has put an interview online with neuroscientist Todd Sacktor, who has completed recent work on the role of the PKMzeta protein in memory.

Link to Times article ‘How to forget fear’.
Link to NERS interview with Todd Sacktor.