Rapture of the deep

When scuba divers swim deep under water they can sometimes start feeling dreamy, light-headed and mentally fuzzy, an effect nicknamed ‘rapture of the deep’ but better known as nitrogen narcosis.

It is caused by changes in the way nitrogen, one of the gases in the diver’s air tank, dissolves in the body under the high pressures found at depth.

No-one is quite sure exactly how it affects the brain, but many divers have noted the similarity between nitrogen narcosis and being drunk.

Psychologist Malcolm Hobbs was intrigued by this connection and conducted a study [pdf], published last year in Undersea and Hyperbaric Medicine, to investigate the psychological similarity between the two states.

The experiment compared the subjective experience and effects on problem solving of alcohol and narcosis, but, also rather elegantly, looked at whether the two effects could be caused by a similar neurobiological process by seeing whether people with high alcohol tolerance also had a high narcosis tolerance.

Hobbs divided a group of divers into experienced and novice divers, as those with more experience should be more tolerant to narcosis, and made a further division between those who drank a lot of booze and those who drank very little, to look at differences in alcohol tolerance.

In the first experiment, Hobbs found an interesting effect: experienced divers adapted to the subjective effects of narcosis, but not the behavioural effects. While they felt more in control than novice divers, they actually weren’t. This chimes with an identical effect seen in heavy versus light drinkers.

But crucially, Hobbs also found that those affected to a greater degree by nitrogen narcosis were affected to a greater degree by alcohol, on both subjective experience and performance on the problem-solving task (and vice versa), indicating that there is cross-tolerance between the two states.

This suggests that they may affect the brain in similar ways. Although more research needs to be done on the actual neurobiology of the two states to be sure of the exact relationship, this study suggests that divers may indeed be ‘drunk’ when experiencing the rapture of the deep.

UPDATE: I just got emailed this interesting snippet by an experienced diver friend (thanks Ben!):

Something extra which happens with narcosis (which deviates from the alcohol analogy) is that, unless you’re already dead, the effects are completely reversible with no discernible side effects (eg hangover). One of the tricks divers use if they recognize narcosis (more often in their buddy than in themselves) is that ascending a few metres will often bring immediate clarity.

Even more interesting is that once clarity is achieved, descent back to the narcotic depth doesn’t necessarily bring back the narcotic effect of the nitrogen, which hasn’t really been explained yet. Theories abound regarding rate of descent and physiological effects of increasing ppN [partial pressure of nitrogen] and how it’s dissolved into various tissues.

Divers have known for years about this and have developed practical methods to deal with its effects (decreasing N content in breathing gases, replacing N with other inert gases etc). Actually, it’s known that oxygen also has a role to play in narcosis (as in nitrous oxide) but since some of it is metabolized, its effects are considerably less than the inert gas it accompanies.

I quite like the feeling of a little narcosis; but it does make time fly, and unfortunately time is the real enemy underwater!
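If you want to put rough numbers on the ‘ppN’ mentioned above, here’s a minimal back-of-envelope sketch in Python. It assumes the standard dive-physics rules of thumb that air is about 79% nitrogen and that ambient pressure in seawater rises by roughly one atmosphere per 10 metres, with Dalton’s law giving the partial pressure; these are textbook approximations, not figures from Hobbs’s paper.

```python
# Rough partial pressure of nitrogen (ppN2) when breathing air at depth.
# Dalton's law: ppN2 = nitrogen fraction x total ambient pressure.
# Both constants below are standard approximations, not values from the paper.

N2_FRACTION = 0.79  # approximate fraction of nitrogen in ordinary air

def ambient_pressure_atm(depth_m: float) -> float:
    """Total pressure (atm) in seawater: roughly 1 atm extra per 10 m."""
    return 1.0 + depth_m / 10.0

def ppn2_atm(depth_m: float) -> float:
    """Partial pressure of nitrogen (atm) breathed from air at depth."""
    return N2_FRACTION * ambient_pressure_atm(depth_m)

for depth in (0, 10, 30, 40):
    print(f"{depth:>3} m: ppN2 = {ppn2_atm(depth):.2f} atm")
```

At 40 metres the nitrogen partial pressure works out at around five times its surface value, and ascending even a few metres cuts it immediately, which makes sense of the quick return of clarity my diver friend describes.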

pdf of full-text scientific paper.
Link to PubMed entry for same.

Synaesthesia in Frankenstein

One of the new ideas in synaesthesia research is that affected people perhaps don’t develop mixed senses as their brains mature; they just fail to lose them. It seems most children might start with naturally mixed senses before perception becomes segregated through pruning of the fuzzy neural pathways.

I’ve just noticed an interesting article in Cognitive Neuropsychology on how this idea actually has long historical roots, and even influenced Mary Shelley’s Gothic classic Frankenstein.

Although Mary Shelley was only 19 when she wrote her timeless novel, Frankenstein (1818), she combined contemporary philosophical and moral issues with a vision of the danger of emerging sciences that still has relevance today. The specific idea of early unity of the senses, very likely inspired by Rousseau, was articulated by Frankenstein’s creation in his first-person account of his early experiences:

“It is with considerable difficulty that I remember the original era of my being: all the events of that period appear confused and indistinct. A strange multiplicity of sensations seized me, and I saw, felt, heard, and smelt, at the same time; and it was, indeed, a long time before I learned to distinguish between the operations of my various senses. [Mary Shelley, Frankenstein (1818), chapter 11]”

Shelley goes on to present the creature as very humanlike, and it appears here that she wished to show that this extended to the earliest moments of his mental life. With the publication of Frankenstein, the unified-senses idea was thus brought into the popular culture, and Shelley’s words were probably read by some cognitive neuropsychologists in elementary school, even if they paid little heed to the sentiment. The idea also lived on within philosophy and, later, in the science of psychology.

In their professional career, very many cognitive neuropsychologists become acquainted with William James, and indeed the majority should recognize the phrase “one great blooming, buzzing confusion”. Most also recognize this as referring to the world of the infant, but few are probably aware that James was writing about his view that information from different senses is first fused in a child before later segregation.

Link to article.
Link to PubMed entry for same.

The phantom from the battle field

The Lancet recently published a fantastic article on one of the earliest cases of phantom limb. It was written by American Civil War surgeon Silas Weir Mitchell, not as a study in a medical journal but as a short story in a popular magazine.

The story was titled The Case of George Dedlow, and in it Mitchell gives a careful medical description of sensations coming from a recently amputated limb, a portrait of how the amputation affected the soldier, and some musings on what it means about our relation to reality.

At this stage in the story, Mitchell uses his fictional character to muse on the neurological phenomenon of phantom limbs. Phantom limbs had been described in the mid-16th century by French military surgeon Ambroise Paré, but very little was known about what caused stump neuralgia (in the 1860s, the only treatments were electrotherapy, leeching, irritation of the surface of the stump, and re-amputation, none of which were very successful).

In The Case of George Dedlow, Mitchell speculates freely about what caused absent limbs to itch and feel pain. According to him, sensory impressions were transmitted through nerves to spinal nerve-cells and then to the brain. When a limb was removed, and until the stump healed, nerves continued to accept sensory impressions and to convey these impressions to the brain. If the stump never fully recovered, the result was constant irritation or a burning neuralgia. As Mitchell later explained in his famous textbook, Injuries of the Nerves and Their Consequences (1872), phantom limbs made “the strongest man…scarcely less nervous than the most hysterical girl”.

Somewhat poignantly, it seems Mitchell was haunted by his own phantoms from the war. In his later years he was troubled by ‘ghosts’ and intrusive memories from his gruesome years as a military surgeon.

It’s a fantastic short article that really conjures up the feel of the time as well as giving an insight into this important point in medical history.

Link to Lancet article.
Link to PubMed entry for same.
Link to text of short story The Case of George Dedlow.

Evolving causal belief

There’s an interesting letter in this week’s edition of Nature from biologist Lewis Wolpert making the speculative but interesting claim that the development of causal belief may have been a key turning point in human evolution.

Wolpert is responding to a recent Nature essay critiquing the idea that closely related species will have evolved similar psychological processes, suggesting instead that shared selection pressures, rather than genetic similarity, have the greater influence on mental make-up.

He responds by saying that we should focus on some of the things that have uniquely evolved in humans rather than shared processes. He cites the ability to understand cause as a key example.

The feature that is peculiar to humans is their understanding about the causal interactions between physical objects (see, for example, L. Wolpert Six Impossible Things Before Breakfast; Faber, 2006). For example, children realize from an early age that one moving object can make another move on impact. It is this primitive concept of mechanics that is a crucial feature of causal belief, and that conferred an advantage in tool-making and the use of tools — which, in turn, drove human evolution.

Animals, by contrast, have very limited causal beliefs, although they can learn to carry out complex tasks. According to Michael Tomasello (The Cultural Origins of Human Cognition; Harvard Univ. Press, 1999), only human primates understand the causal and intentional relations that hold among external entities. Tomasello illustrates this point for non-human primates with the claim that even though they might watch the wind shaking a branch until its fruit falls, they would never shake the branch themselves to obtain the fruit. Some primates are, nevertheless, at the edge of having causal understanding.

Once causal belief evolved in relation to tools and language, it was inevitable that people would want to understand the causes of all the events that might affect their lives — such as illness, changes in climate and death itself. Once there was a concept of cause and effect, ignorance was no longer bliss, and this could have led to the development of religious beliefs.

Link to Wolpert’s letter in Nature.

Russian roulette in the medical literature

I’ve just discovered there’s a small medical literature on deaths by Russian roulette, where people put one bullet in a revolver, spin the cylinder, put the gun to their head and pull the trigger.

A recent article from The American Journal of Forensic Medicine and Pathology has a 10-year case review covering 24 deaths (wow) from the US state of Kentucky alone and serves as a summary of the research into this fate-tempting and most suicidal of games.

It’s a curious set of studies for which the most reliable finding is that people who die by Russian roulette are mostly young men who were drunk or had taken drugs.

On the more unusual side, one study found a link between participation in Russian roulette and “the types and number of tattoos and body piercing”.

The article also briefly describes a number of previous case reports from the literature, including this one which is remarkable for both mathematical and ultimately tragic reasons:

Playing a variation of traditional Russian roulette with his brother and 2 friends, the victim placed 5 live rounds in the cylinder, leaving one empty chamber, of a .357 Taurus revolver. He spun the cylinder, put the gun to his right temple, and pulled the trigger. Postmortem blood toxicology revealed an ethanol level of 0.01% and the presence of diazepam and nordiazepam. The decedent had played Russian roulette on 2 occasions in the previous several weeks, each time placing only one live round in the cylinder.
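The mathematics is remarkable because loading five rounds rather than one inverts the usual odds: a 1-in-6 chance of death becomes a 1-in-6 chance of survival. A quick illustrative calculation (my arithmetic, not anything from the paper):

```python
from fractions import Fraction

CHAMBERS = 6  # standard revolver cylinder

def p_fires(live_rounds: int) -> Fraction:
    """Chance a single spun-and-pulled trigger lands on a live round."""
    return Fraction(live_rounds, CHAMBERS)

one_round = p_fires(1)    # the traditional game
five_rounds = p_fires(5)  # the case described above

print(f"1 live round : P(death) = {one_round} ({float(one_round):.0%})")
print(f"5 live rounds: P(death) = {five_rounds} ({float(five_rounds):.0%})")

# Surviving repeated one-round games (independent spin each time):
for n in (1, 2, 3):
    p_survive = (1 - one_round) ** n
    print(f"P(surviving {n} one-round games) = {float(p_survive):.0%}")
```

Even his earlier one-round games were far from safe: the chance of surviving three independent spun-cylinder pulls is only about 58%.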

Link to study on Russian roulette and risk-taking behaviour.
Link to DOI entry for same.

A hostage to hallucination

I’ve just found a morbidly fascinating 1984 study on hallucinations in hostages and kidnap victims.

The paper is from the Journal of Nervous and Mental Disease and contains case studies of people who have been held captive by terrorists, kidnappers, rapists, robbers, enemy troops and, er… UFOs.

The reasoning behind including two ‘alien abductees’ was to compare hallucinations in verified versus unverified hostage situations. Cases of people who were hostages but did not hallucinate are also included.

The study found that one in four hostages had intense hallucinations, and these were invariably people who were in life-threatening situations. Isolation, visual deprivation, physical restraint, violence and death threats also seemed to contribute to the chance of having a hallucinatory experience.

Case 14

A 23-year-old member of a street gang was taken hostage by a rival gang. He was kept in a warehouse, blindfolded and tied to a chair, for 32 hours. He was severely beaten and forced to record ransom demands on a tape recorder. During captivity he became dissociated – “even when they were hitting on me I just tripped out, got out of my body… it was like I was high on Sherms (phencyclidine).” At one point he felt detached from his body and “floated” to the ceiling where he observed himself being beaten and burned with cigarettes but denied having any pain. He saw colorful geometric patterns in the air and flashes of past memories “like a dream, only I kept seeing devils and cops and monsters… nightmares I guess”. Eventually he was released when his gang paid the ransom.

Some of the case studies are a little disturbing, but it’s worth reading the paper in full if you can, or at least from the beginning of the case studies, as it’s a rarely discussed but remarkably striking aspect of human experience.

Link to article on ‘Hostage Hallucinations’.
Link to PubMed entry for same.

On the information alarmageddon

New York Magazine has an article arguing that the concerns about digital technology drastically affecting our minds are just hype. I really wanted to like it but it’s just another poorly researched piece on the psychology of digital technology.

Research has shown that distraction can improve exactly the sorts of skills that the digital doomsayers say will be broken by the high-tech world, but I’ve never seen it mentioned in any of the recent high-profile articles on the predicted digital meltdown.

In fact, there is a fairly sizeable scientific literature on how interruption affects the ability to complete a task, and instant messaging has been specifically studied.

But despite getting lots of opinions from everyone from attention researcher David Meyer to lifehacker Merlin Mann, only a single ‘study’ on the distracting effect of technology is mentioned in the New York Magazine article: “people who frequently check their email tested less intelligent than people who are actually high on marijuana”.

This is quite amazing because not only was the ‘study’ in question not an actual scientific study, it was a PR stunt for Hewlett-Packard, and this isn’t even an accurate description of it (users were interrupted with email during an IQ test and scored worse, big surprise).

The issue actually breaks down into two parts, one is a scientific question: what is the psychological effect of distraction? and the other, a cultural one: have we become a society where high levels of distraction are more acceptable?

As I mentioned, the first question has been very well researched and the general conclusion is that distraction reduces our ability to complete tasks. Essentially, it’s saying that distraction is distracting, which is hardly headline news.

But it also turns out that distraction is most disruptive to stimulus-based search tasks, when we are flicking our attention around scanning for bits of information. Perhaps unsurprisingly, when we’re on alert for new and different things, something salient like an instant message grabs our attention and knocks us off course.

More thoughtful tasks involving processing meaning are the least affected. This is interesting because most of the digital doomsayers suggest it is exactly this sort of deep thought that is being affected by communication technology.

The other line of argument is that all this distraction makes us less creative because creativity needs focus to flourish.

Although not as well studied, it seems this is unlikely. We assume that distraction reduces creativity, but lab studies tend to show the reverse.

Distraction has also been found to improve decision making, especially for complex fuzzy decisions – again exactly the sort that the doomsayers say will most be at peril.

These studies find that too much concentration reduces our creative thinking because we’re stuck in one mind-set, deliberately filtering out what we’ve already decided is irrelevant and thereby discarding counter-intuitive ideas (actually, this is something the article does touch on). We can speculate that this may be why a preliminary study found that the amphetamine-based concentration drug Adderall reduced creativity.

The cultural issue is perhaps more important, but on an individual level is more easily addressed.

You have control over the technology of distraction. If you can’t concentrate, switch it off. If it is your job to be distracted and it is affecting other essential parts of your role, that is something to take up with your employers.

It’s no different than if you’re being distracted by the sound of traffic and can’t do your job. Maybe you need an office away from the street? If you or your employers can’t do anything about it, maybe that’s just one of the downsides of the job.

What research hasn’t yet shown is that digital technology is having a significant negative influence on our minds or brains. In some cases, it’s showing the reverse.

History has taught us that we worry about any widespread new technology, and this worry is usually expressed in terms of its negative impact on our minds and social relationships.

If you’re really concerned about cognitive abilities, look after your cardiovascular health (eat well and exercise), cherish your relationships, stay mentally active and experience diverse and interesting things. All of which have been shown to maintain mental function, especially as we age.

Technology has an impact on the mind but it’s a drop in the ocean compared to the influence of your health and your relationships.

I’m constantly surprised that the impact of technology is clearly of such widespread interest to merit headline grabbing articles in international publications, but apparently not interesting enough that journalists will actually use the internet to find the research.

It’s like writing a travel guide without ever visiting the country. I’m just guessing the editors have yet to catch on to the scam.

Link to NYMag article ‘In Defense of Distraction’.

Can’t put the thought genie back into the bottle

PsyBlog has an excellent piece on the counter-intuitive psychology of thought suppression – the deliberate attempt to not think of something that almost invariably backfires.

The article is both fascinating from a scientific point of view and important as a personal mental health resource if you’re one of the many people who intuitively think that the best way of dealing with ‘bad’ thoughts is to try and push them out of the mind.

What psychology research has shown us is that not trying to think of something makes us think of it more frequently (the “don’t think of a pink elephant” phenomenon), and that this counter-productive effect is enhanced for emotion-heavy thoughts and in people with mental illnesses where intrusive thoughts are a problem.

Psychologists often use the metaphor of noisy trains passing through the station. Thought suppression is like standing in the middle of the tracks trying to push the train back. You’re just going to get run over. Instead, people are encouraged to just wait on the platform, observe the train of thought and wait for it to pass.

The ability to act as a ‘detached observer’ to the mind’s distressing thoughts is a useful cognitive skill and one that is cultivated by mindfulness meditation, something that has increasing evidence as a useful treatment for mental health problems.

There’s lots of good research on thought suppression, much of which is covered in the PsyBlog article, but this study struck me as particularly inventive:

Wegner and Gold (1995) examined emotional suppression by delving into people’s romantic pasts using a neat comparison between ‘hot flames’ and ‘cold flames’. A ‘hot flame’ is a previous partner who still fires the imagination, while a ‘cold flame’ is a previous partner for whom the thrill is gone. In theory the ‘hot flame’ should produce more intrusive thoughts so people should have more practice suppressing them. Meanwhile because the cold flame doesn’t produce intrusive thoughts, people should have less practice suppressing them.

The results revealed exactly the expected pattern: people found it harder to suppress thoughts about cold flames, presumably because they had less practice.

Link to PsyBlog on ‘Why Thought Suppression is Counter-Productive’.

Tell me about your mother superior

I found this fascinating aside in a 1969 article on ‘Psychiatric Illness in the Clergy’ about a group of monks who underwent psychoanalysis, leading two thirds of them to leave the monastery, with some realising they were “called to married life”.

The Pope immediately banned psychoanalysis from the priesthood as a result:

[Bovet] suggests that many clergy would benefit from psychotherapy during their training. This was attempted in Mexico when in 1961 a group of 60 Benedictine monks underwent group and individual psychoanalysis. However, of the original 60 monks taking part in this experiment, only 20 are still monks; and of the 40 who have left the monastery it is reported that “there are some who realized that they were really called to married life” (Lemercier, 1965).

The Papal Court answered this “threat” with the following decree: “You will not maintain in public or in private psychoanalytical theory or practice, under threat of suspension as a priest, and you are rigorously forbidden under threat of destitution to suggest to candidates for the monastery that they should undergo psychoanalysis” (Singleton, 1967).

This would not be the last time psychotherapists caused stirrings among the faithful.

The book Lesbian Nuns, Breaking Silence contains a chapter by the former Sister Mary Benjamin of the Immaculate Heart of Mary convent in California.

Psychotherapists Carl Rogers and William Coulson arranged for the nuns to take part in encounter groups, essentially a fashionable form of 60s group psychotherapy aimed at ‘personal growth’ in well people rather than patients.

The effect was disastrous for the convent, with hundreds of the nuns defaulting on their vows, and several, including Sister Mary Benjamin, discovering repressed lesbian desires.

The convent eventually collapsed and was closed in 1970.

There’s a brief online article that also recounts this story and I was intrigued to see a footnote at the end:

Having abandoned his once lucrative career, Dr. William Coulson now lectures to Catholic and Protestant groups on the dangers of psychotherapy, with a particular emphasis upon the “encounter group” dynamic.

There’s a whole novel right there in that footnote.

 
Link to summary of ‘Psychiatric Illness in the Clergy’.
Link to online article about Dr William Coulson.

Lithium levels in drinking water linked to fewer suicides

Higher levels of naturally occurring lithium in the water supply are associated with fewer suicides in the local population, reports a study just published in The British Journal of Psychiatry.

Lithium is a naturally occurring chemical element, but it is also used by psychiatrists, in the form of lithium carbonate and lithium citrate, as one of the most effective drug treatments for mood disorders, where it is also known to reduce the risk of suicide.

This new study suggests that even trace amounts might have an influence at the whole-population level, and this is not the first time the link has been made.

A 1990 study found higher levels of lithium in drinking water were linked to fewer incidences of crimes, suicides, and arrests related to drug addictions.
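Both studies are ecological: they correlate a region-level exposure (lithium concentration in tap water) with a region-level outcome (suicide rate). As a minimal sketch of what that kind of analysis looks like, using made-up numbers rather than data from either paper:

```python
# Ecological correlation sketch: one lithium level and one suicide rate
# per region. The values below are invented for illustration only.
from scipy.stats import spearmanr

lithium_ug_per_l = [0.7, 1.2, 3.5, 11.0, 26.4, 59.0]   # hypothetical regions
suicide_per_100k = [28.1, 24.0, 25.3, 20.2, 18.7, 15.5]

rho, p_value = spearmanr(lithium_ug_per_l, suicide_per_100k)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
# A negative rho (higher lithium, lower suicide rate) is the pattern
# the studies report.
```

A negative correlation like this can’t separate a genuinely protective effect from regional confounds, which is one reason the question below remains speculative.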

This leads to the intriguing question of whether lithium should be added to the water supply as a public health measure.

The idea of adding psychoactive substances to the water supply sounds creepy, but some might argue that if we add fluoride simply to prevent tooth decay, boosting lithium concentrations to the high end of naturally occurring levels to reduce deaths could be justified.

Philosophers and conspiracy theorists start your engines.

Link to BJP lithium study.
Link to DOI entry for same.

Blast from the past

New Scientist covers the debate on the causes of the non-specific emotional and cognitive symptoms that are appearing at an alarming rate in US soldiers who have been caught up in blasts while on service.

The controversy centres on whether the symptoms of ‘post concussional syndrome’ (which can include depression, irritability, concentration difficulties, headaches and reduced memory function) are caused by damage to the brain from the shock waves of the explosion, or are largely triggered by an emotional reaction to the stress of war.

It’s an interesting debate, not least because it’s almost 100 years since exactly the same debate raged over shell shock.

This is from an excellent article by medical historian Edgar Jones and colleagues who discuss the similarities between the ‘shell shock’ debates and the current controversy:

Frederick Mott, then Britain’s leading neuropathologist, who was recruited by the War Office to discover the etiology of the disorder, argued that in extreme cases shell shock could be fatal if intense commotion affected “the delicate colloidal structures of the living tissues of the brain and spinal cord,” arresting “the functions of the vital centers in the medulla”. It was also speculated that the disorder resulted from damage to the CNS from carbon monoxide released by the partial detonation of a shell or mortar. In other words, shell shock was formulated as an organic problem even though the pathology remained unclear.

However, research conducted in 1915 and 1916 by Myers, consultant psychologist to the British Expeditionary Force, led to a new hypothesis. Based on his own observations, an increasing appreciation of the stress of trench warfare, and the finding that many shell-shocked soldiers had been nowhere near an explosion but had identical symptoms to those who had, Myers suggested a psychological explanation. For these cases, the term “emotional,” rather than “commotional,” shock was proposed. The psychological explanation gained ground over the neurological in part because it offered the British Army an opportunity to return shell-shocked soldiers to active duty.

As mentioned in the NewSci piece, two big studies have recently found strikingly similar results: many soldiers who have the symptoms of ‘post concussional syndrome’ were never actually in an explosion.

Extreme stress and trauma, of whatever type, seems to predict the likelihood of someone having the symptoms better than actually being caught up in an explosion.

The more things change, the more they stay the same.

Link to NewSci ‘Brain shock: The new Gulf War syndrome’.
Link to ‘Shell Shock and Mild Traumatic Brain Injury: A Historical Review’.

From the four humours to fMRI

The excellent Cognition and Culture blog found a fascinating lecture by the energetic medical historian Noga Arikha about the four humours theory of medicine and how its legacy influences our modern day ideas about the mind and brain.

The four humours theory suggested that the function of the mind and body was determined by the balance of four fluids in the body: black bile, yellow bile, phlegm, and blood.

Not only were specific diseases explained in this way, but so were character traits and, in their excess, mental illness.

Indeed, some of the old names for these fluids still survive as descriptions of character traits (for example, we can still describe someone as phlegmatic or sanguine) even if we’re unaware of their origins.

However, Arikha shows that it’s possible to trace the thinking behind humoural theories right through history into our current ideas about mind and brain in the age of brain scans and cognitive neuroscience.

The talk is based on her book, called Passions and Tempers: A History of the Humours, and the video is a bit shaky at times but worth sticking with as it’s an engrossing lecture.

Link to video of talk by Noga Arikha.

Phantom portraits

I’ve just found a gallery of one of my favourite art-science projects of all time, which used digital photo manipulation to illustrate the phantom limbs of post-amputation patients.

The images are incredibly striking, because they vividly illustrate that phantom limbs are often only phantom part-limbs. Sections can be missing, even in the middle, so a phantom hand can be felt even if a phantom elbow cannot.

Or perhaps a phantom hand can feel as if it protrudes directly from the point of amputation at the shoulder, or perhaps it feels distorted, or perhaps has no intervening phantom arm, or perhaps it is stuck in one position, and so on.

The project was the brainchild of neuropsychologist Peter Halligan, neurologist John Kew and photographer Alexa Wright. Actually, Peter is an ex-boss and I spent several years of my PhD with a huge picture of RD in my office and it never failed to amaze me.

Unfortunately, the pictures in the online gallery are viewable but a little small, although there are some larger versions if you scroll down in this essay.

Link to After Images online gallery.

Reverse psychology in a pill: anti-placebo

You may be aware of the placebo effect, where an inert pill has an effect because of what the patient thinks it does. You may even be aware of the nocebo effect, where an inert pill causes ‘side-effects’. But a fascinating 1970 study reported evidence for the anti-placebo effect, where an inert pill has the opposite effect of what it is expected to do.

Storms and Nisbett were two psychologists interested in attribution, the process of how we explain the causes of events and the impact this has on how we feel.

We know that attributions have a big impact on our level of physical and emotional health. For example, your heart is racing when you’re about to give a talk. If you attribute it to a weak heart, you may start worrying whether you might pass out and become incredibly stressed, but if you attribute it to the situation, you might just think it’s a natural reaction to the event and feel primed and ready.

In anxiety disorders, we know that people often attribute natural bodily reactions to frightening causes, which makes people feel more on edge, and hence, their body kicks into an even higher gear, and so on. The cycle continues, to fever pitch. In essence, it’s anxiety-fuelled anxiety.

Insomnia has an element of this. People can be worried that they’re not sleeping, and so get anxious thoughts when they go to bed, and so feel on edge, ad nocturnum, until the early hours.

So rather than getting people to fill in questionnaires about causes of insomnia, a typical method in attribution research, Storms and Nisbett wanted to test these ideas in the real world.

They recruited a group of patients with insomnia and told them they were doing a four-night study on dreaming and asked them to rate their difficulty in falling asleep each night.

The first two nights were exactly that, a sleeping and rating exercise, but on the third night the participants were given pills. One group was told that the pill would make them feel more aroused, like a shot of caffeine, while the others were told that the pill would make them feel more relaxed, like a sleeping pill.

On the fourth night, the group were given the ‘opposite’ pill, but in reality, all the pills were identical and completely inert, containing nothing more than sugar.

Now here’s the thing. The insomnia patients taking the ‘relaxation’ pills slept really badly, and the patients taking the ‘arousal’ pills slept much better.

What seemed to be happening was that patients taking ‘uppers’, normally trapped in a cycle of anxious self-monitoring, could attribute any arousal they had to the pill. Any sign of feeling wired wasn’t them, it was the pill, so they could relax and fall asleep easily.

In contrast, those who had taken the ‘downers’ thought that any arousal must be their insomnia causing them problems, and it must be really bad, because it was getting to them despite the supposed sleeping pill they’d taken. In other words, they were freaking out because they couldn’t sleep despite the ‘medication’.

It turns out that this simple experiment wasn’t easily replicated but the problem was solved in 1983 when it was realised that this effect only held for people with insomnia who obsessively self-monitored.

But what these experiments tell us is that the effects of medication, the symptoms of illness and even the process of ‘being sick’ are partly dependent on our own ideas about what’s happening.

Link to PubMed entry for original Storms and Nisbett study.
Link to 1983 replication.

The medieval senses and the evil eye

The latest edition of neurology journal Brain has an extended review of three books about the history of the senses which gives a fascinating insight into how the meaning of our sensory experiences has changed over the centuries.

This paragraph is particularly interesting as it relates medieval theories of perception to the superstition of the ‘evil eye’ where you could curse someone by looking at them.

While we now think of vision as a system for interpreting passively received light, the ‘evil eye’ makes much more sense when you realize that medieval people thought that light rays could fundamentally influence what they touched and even that the eyes actively sent out rays that could influence the objects within sight.

In 1492, learned debates also influenced how the world was perceived. As medical historians Nancy Siraisi and James T. McIlwain, also a neuroscientist, point out, medieval scholars would have located sensory perception in the brain (Siraisi, 1990; McIlwain, 2006). However, they would have perceived the five senses as active entities conveying information about the outside world to the internal senses of common sense, imagination, judgement, memory and fantasy (the ability to visualize).

Scholars differed considerably over how this worked in practice: for example, were rays emitted from the eyes towards the viewed object or was it the other way round? Either theory allowed for these rays to influence both viewer and object, thus explaining the widespread concept of the evil eye, or a belief still current in the 18th century that what a mother saw affected her foetus. The brain, however, was not the only sensitive organ of the body.

The heart was believed to be the centre of the animal soul, and thus closely associated with more carnal senses such as touch. The brain, the centre of the rational soul, was more closely associated with sight; the eyes often viewed as the ‘windows of the soul’. Sight, therefore, was given pre-eminence in the pre-modern world as it is today, but often for spiritual reasons due to the inter-dependence of religion and rational knowledge (scientia).

Thus even if the brain functioned in the past very much as it does today, the emotional and moral meaning of sensory experience differed dramatically.

The whole review is worth reading in full, not just because of the insights into medieval psychology, but also because these new books introduce ‘sensory history’ – a history of ideas about how we experienced the world through our bodies.

Link to review.
Link to DOI entry for same.

Predicting the determined self-castrator

The Journal of Sexual Medicine has a surprising study looking at psychological attributes that predict which castration enthusiasts will actually go on to remove their own testicles, in contrast to those who just fantasise about it.

This is the abstract from the scientific paper:

A passion for castration: characterizing men who are fascinated with castration, but have not been castrated

Roberts LF, Brett MA, Johnson TW, Wassersug RJ.

J Sex Med. 2008 Jul;5(7):1669-80.

Introduction. A number of men have extreme castration ideations. Many only fantasize about castration; others actualize their fantasies.

Aims. We wish to identify factors that distinguish those who merely fantasize about being castrated from those who are at the greatest risk of genital mutilation.

Methods. Seven hundred thirty-one individuals, who were not castrated, responded to a survey posted on http://www.eunuch.org. We compared the responses of these “wannabes” to those of 92 men who were voluntarily castrated and responded to a companion survey.

Main Outcome Measures. Respondents answered the questionnaire items relating to demographics, origin of interest in castration, and ambition toward eunuchdom.

Results. Two categories of wannabes emerged. A large proportion (~40%) of wannabes’ interest in castration was singularly of a fetishistic nature, and these men appeared to be at a relatively low risk of irreversible genital mutilation. Approximately 20% of the men, however, appeared to be at great risk of genital mutilation. They showed a greater desire to reduce libido, change their genital appearance, transition out of male, and prevent sexually offensive behavior. Nineteen percent of all wannabes have attempted self-castration, yet only 10% have sought medical assistance.

Conclusions. We identify several motivating factors for extreme castration ideations and provide a classification for reasons why some males desire orchiectomies. Castration ideations fall under several categories of the Diagnostic and Statistical Manual of Mental Disorders, 4th Ed. (DSM-IV), most notably a Gender Identity Disorder other than male-to-female (MtF) transsexual (i.e., male-to-eunuch) and a Body Identity Integrity Disorder. Physicians need to be aware of males who have strong desires for emasculation without a traditional MtF transsexual identity.

We reported on an earlier study by the same research group last year, which discovered that ‘voluntary eunuchs’ report being pleased to have had their testicles removed and seem mentally healthy.

Link to PubMed entry for study.