Through the eyes of the psychopath

The New Yorker has an engaging article about psychopaths and what psychologists are starting to learn about the psychology and neuroscience of people who are thought to lack empathy.

Psychopathy doesn’t necessarily imply violence. The most commonly used modern definition, based on the work of psychologist Robert Hare, suggests that psychopathy includes things like a lack of conscience, manipulative behaviour, impulsiveness and an anti-social lifestyle.

The condition was first described clinically in 1801, by the French physician Philippe Pinel. He called it ‘mania without delirium.’ In the early nineteenth century, the American physician Benjamin Rush wrote about a type of ‘moral derangement’ in which the sufferer was neither delusional nor psychotic but nevertheless engaged in profoundly antisocial behavior, including horrifying acts of violence. Rush noted that the condition appeared early in life. The term ‘moral insanity’ became popular in the mid-nineteenth century, and was widely used in the U.S. and in England to describe incorrigible criminals. The word ‘psychopath’ (literally, ‘suffering soul’) was coined in Germany in the eighteen-eighties. By the nineteen-twenties, ‘constitutional psychopathic inferiority’ had become the catchall phrase psychiatrists used for a general mixture of violent and antisocial characteristics found in irredeemable criminals, who appeared to lack a conscience.

In the late nineteen-thirties, an American psychiatrist named Hervey Cleckley began collecting data on a certain kind of patient he encountered in the course of his work in a psychiatric hospital in Augusta, Georgia. These people were from varied social and family backgrounds. Some were poor, but others were sons of Augusta’s most prosperous and respected families. Cleckley set about sharpening the vague construct of constitutional psychopathic inferiority, and distinguishing it from other forms of mental illness. He eventually isolated sixteen traits exhibited by patients he called ‘primary’ psychopaths; these included being charming and intelligent, unreliable, dishonest, irresponsible, self-centered, emotionally shallow, and lacking in empathy and insight.

However, the article focuses on the work of psychologist Kent Kiehl, who has conducted a great deal of recent brain imaging research on criminal psychopaths and argues that the core problem is a dysfunction of the paralimbic system.

This includes areas such as the orbitofrontal cortex, anterior cingulate and amygdala, which are known to be involved in emotional reactions and are often thought to be particularly involved in social interaction and empathy.

However, as the article recounts, getting inmates at maximum security prisons involved in cognitive science research has its own special challenges, although these seem to have been somewhat mitigated by Kiehl’s use of a portable fMRI machine.

To be honest, the article focuses a little too much on the personalities, particularly when the science is so interesting, but it does cover the bases well and does make for an engaging read.

Link to New Yorker article ‘Suffering Souls’.

Synaesthesia induced by hypnosis

Wired Science has an interesting preview of an upcoming study that used hypnosis to induce colour-number synaesthesia in highly hypnotisable participants.

Synaesthesia is where the senses merge, and in colour-number synaesthesia, the affected people experience colours associated with specific numbers.

This new study used hypnosis to induce exactly this experience in people who didn’t have it before:

The researchers, led by Roi Kadosh of University College, London and Luis Fuentes of Spain’s University of Murcia, put three women and one man under hypnosis, then instructed them to perceive digits in color: one as red, two as yellow, three as green, and so on.

Upon waking, the subjects found it difficult to find numbers printed in black ink against correspondingly colored backgrounds. The numbers seemed to blend in — a telltale sign of synesthesia. When the hypnosis was removed, the ability vanished.

How the synesthesia formed so suddenly isn’t clear, but the researchers said that new neural connections are probably not responsible. “Such new anatomical connections could not arise, become functional, and suddenly degenerate in the short time scale provided by the current experiment,” they wrote.

Instead they suggest that hypnosis broke down neurological barriers between sensory regions. Marks agreed, but cautioned against extrapolating the findings too broadly: Many different varieties of synesthesia exist, from seeing emotions to tasting sounds, and may have different neurological and psychological origins.

Hypnosis has been studied before for its ability to induce anomalous colour experiences.

In a study published in 2000, researchers used hypnosis to induce the experience of colour when participants were viewing a black and white image, as well as the reverse.

What was most fascinating about this particular study was that it was run in a PET scanner, and the researchers discovered that the colour-focused hypnotic suggestions actually altered the function of the colour perception areas in the visual cortex.

In other words, it is likely that hypnosis was not simply leading the people to make false claims, but was actually affecting what they perceived.

Link to ‘Hypnosis Lets Regular People See Numbers as Colors’.
Link to PubMed entry for colour study (with full-text link).

Pentagon requests robot packs to hunt humans

New Scientist reports on a new Pentagon request to develop a pack of robots “to search for and detect a non-cooperative human”.

I strongly believe that everyone who takes a course in artificial intelligence should be made to watch the post-apocalyptic film The Terminator as a stark warning, in the same way that everyone who works with MRI scanners is made to watch serious videos about ‘what can go tragically wrong and how you can prevent it’.

I also suspect, though, that the students who come out of those lectures rooting for the robots are recruited into military research teams.

From the Pentagon document:

Typical robots for this type of activity are expected to weigh less than 100 Kg and the team would have three to five robots.

PHASE I: Develop the system design and determine the required capabilities of the platforms and sensors. Perform initial feasibility experiments, either in simulation or with existing hardware. Documentation of design tradeoffs and feasibility analysis shall be required in the final report.

PHASE II: Implement the software and hardware into a sensor package, integrate the package with a generic mobile robot, and demonstrate the system’s performance in a suitable indoor environment. Deliverables shall include the prototype system and a final report, which shall contain documentation of all activities in this project and a user’s guide and technical specifications for the prototype system.

PHASE III: Robots that can intelligently and autonomously search for objects have potential commercialization within search and rescue, fire fighting, reconnaissance, and automated biological, chemical and radiation sensing with mobile platforms.

PHASE IV: Die puny humans die!

PHASE V: To the bunkers! Run for your lives! Arggghhhhh!

PHASE VI: Sarah Connor, we’re going to send you back in time to make a movie to warn everybody about the coming annihilation of the human race. Recruit a political leader so people will take it seriously – like Governor Schwarzenegger, for example.

Earlier this year, Israel announced that it wants to develop an AI-controlled missile system that “could take over completely” from humans. If you’re still chuckling, the UK military satellite system is called Skynet.

Link to NewSci on Pentagon opening Pandora’s box.
Link to Pentagon solicitation request.

Monochrome dreaming

Watching black and white television as a child may explain why older people are less likely to dream in colour than younger people, according to a new study reported in New Scientist.

The study is from psychologist Ewa Murzyn, who was interested in how early experience could affect our dream life.

She first asked 60 subjects – half of whom were under 25 and half of whom were over 55 – to answer a questionnaire on the colour of their dreams and their childhood exposure to film and TV. The subjects then recorded different aspects of their dreams in a diary every morning.

Murzyn found there was no significant difference between results drawn from the questionnaires and the dream diaries – suggesting that the previous studies were comparable.

She then analysed her own data to find out whether an early exposure to black-and-white TV could still have a lasting effect on her subjects’ dreams, 40 years later.

Only 4.4% of the under-25s’ dreams were black and white. The over-55s who’d had access to colour TV and film during their childhood also reported a very low proportion of just 7.3%.

But the over-55s who had only had access to black-and-white media reported dreaming in black and white roughly a quarter of the time.

It’s an interesting study because, as we recently discussed, philosopher Eric Schwitzgebel argued that exposure to TV was an unlikely explanation for the effect where we’ve tended to report more coloured dreams in modern times and suggested this actually showed we’re not very good at introspecting into our own minds.

This study provides some evidence that the effect may be more reliable than we think.

However, I’m still puzzled by why television would seem to have such a big influence so many years later when most of the visual experience the person would have received as a child, even if a heavy TV watcher, would be from the ‘real’ coloured world.

Curious.

Link to NewSci on black and white dreams study (thanks Laurie!).
Link to scientific paper.
Link to PubMed entry for same.

Myths of the sleep deprived

New Scientist has an interesting piece by sleep psychologist Jim Horne who sets about busting the myth that modern society causes large scale sleep deprivation.

It’s full of fascinating facts and uses the phrase “to eke out the very last quantum of sleepiness” which is just lovely.

Until recently, people living above the Arctic circle slept much longer in winter than in summer. There are reports from the 1950s of Inuit sleeping up to 14 hours a day during the darkest months compared with only 6 in the summertime. Given the opportunity, we can all learn to significantly increase daily sleep on a more or less permanent basis. When it is cut back to normal we are sleepy for a few days, and then the sleepiness disappears.

Far from our being chronically sleep-deprived, things have never been better. Compare today’s sleeping conditions with those of a typical worker of 150 years ago, who toiled for 14 hours a day, six days a week, then went home to an impoverished, cold, damp, noisy house and shared a bed not only with the rest of the family but with bedbugs and fleas.

What of the risk of a sleep shortage causing obesity? Several studies have found a link, including the Nurses’ Health Study, which tracked 68,000 women for 16 years (American Journal of Epidemiology, vol 164, p 947).

The hazard, though real, is hardly anything to worry about. It only becomes apparent when habitual sleep is below 5 hours a day, which applies to only 5 per cent of the population, and even then the problem is minimal. Somebody sleeping 5 hours every night would only gain a kilogram or so of fat per year. To put it in perspective, you could lose weight at the same rate by reducing your food intake by about 30 calories per day, equivalent to about one bite of a muffin, or by exercising gently for 30 minutes a week.
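Horne’s arithmetic roughly checks out. Here is a quick sanity check, assuming the commonly cited figure of about 7,700 kcal per kilogram of body fat (that conversion figure is my assumption, not something stated in the article):

```python
# Sanity check: does ~30 surplus kcal/day really come to ~1 kg of fat a year?
KCAL_PER_KG_FAT = 7700  # commonly cited figure; an assumption, not from the article

daily_surplus_kcal = 30
yearly_surplus_kcal = daily_surplus_kcal * 365  # 10,950 kcal over a year

kg_gained_per_year = yearly_surplus_kcal / KCAL_PER_KG_FAT
print(round(kg_gained_per_year, 2))  # about 1.4 -- "a kilogram or so"
```

So one muffin-bite per day of surplus energy does indeed land in the “kilogram or so” range the article describes.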

One of the lessons from sleep research is that we’re actually pretty bad at judging how much sleep we need and even how much we actually get.

This seems to be particularly the case for people with insomnia who tend to underestimate the amount they sleep and overestimate the time it takes them to drop off.

The article is a great guide to sleep myths and how they’re addressed by the scientific research, and, surprisingly for New Scientist, it is open-access.

NewSci staffer having sleepless nights over their closed-access policy or just someone asleep at the wheel? Answers on a night cap please…

Link to NewSci piece ‘Time to wake up to the facts about sleep’.

The beauty algorithm and coding for the brain

The New York Times has a fascinating piece on some new software that automatically tweaks pictures of human faces to make them more attractive by reducing the concept of facial beauty to simple vector-based algorithms.

The image on the right is a ‘before and after’ picture of the software at work, and the researchers have a page for the project with many more examples and the full-text of the academic paper.

The researchers asked participants to rate the attractiveness of a series of faces and they then used software to calculate distances and directions between key facial landmarks.

By combining the attractiveness ratings and the landmark vectors they created a statistical model of which general facial attributes are most attractive. Their software allows new faces to be subtly altered to more closely approximate the general model of attractiveness.
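A minimal sketch of that general approach, using made-up landmark data and a simple rating-weighted average of similar faces in place of the researchers’ actual statistical model (every name and number here is an illustrative assumption):

```python
import math

def distance(a, b):
    """Euclidean distance between two landmark-distance vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def beautify(face, training_faces, ratings, k=2, strength=0.5):
    """Nudge a face's landmark-distance vector toward an attractiveness-
    weighted average of its k geometrically most similar rated faces.

    face           -- vector of inter-landmark distances for the input face
    training_faces -- list of such vectors for faces rated by participants
    ratings        -- attractiveness score for each training face
    """
    # Find the k training faces closest in shape to the input face.
    nearest = sorted(zip(training_faces, ratings),
                     key=lambda fr: distance(face, fr[0]))[:k]
    # Build a target vector, weighting each neighbour by its rating.
    total = sum(r for _, r in nearest)
    target = [sum(f[i] * r for f, r in nearest) / total
              for i in range(len(face))]
    # Move only part of the way, so the face stays recognisable.
    return [x + strength * (t - x) for x, t in zip(face, target)]
```

Moving only partway toward the attractive target is what keeps the alteration subtle, which is the point the article emphasises.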

I’m fascinated by the fact that software advances are increasingly taking advantage of the quirks of our mind and brain.

The MP3 format is perhaps the best-known example: it allows audio files to be compressed because it takes advantage of a psychological effect called auditory masking, where, when two sounds of certain frequencies are present, we can only perceive one.

The MP3 encoding algorithm simply scans sound files for times when auditory masking would eliminate the perception of one sound, and then actually eliminates the data from the file, thereby making it smaller.
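A toy sketch of just that masking step (the real encoder uses a full psychoacoustic model; the frequency window and loudness threshold below are invented purely for illustration):

```python
def mask_components(components, freq_window=100.0, threshold_db=20.0):
    """Toy frequency-masking filter: drop any spectral component that sits
    close in frequency to a much louder one, on the grounds that a listener
    would not perceive it anyway.  The numeric thresholds are made up.

    components -- list of (frequency_hz, level_db) pairs
    """
    kept = []
    for freq, level in components:
        # A component is masked if some other component is both much
        # louder and nearby in frequency.
        masked = any(
            other_level - level > threshold_db
            and abs(other_freq - freq) < freq_window
            for other_freq, other_level in components
        )
        if not masked:
            kept.append((freq, level))
    return kept

# A quiet 470 Hz tone next to a loud 440 Hz tone gets discarded;
# the distant 2000 Hz tone survives.
print(mask_components([(440.0, 80.0), (470.0, 50.0), (2000.0, 55.0)]))
```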

Another wonderful idea is chroma subsampling, used in JPEG and digital video compression. It’s based on the finding that our visual system is less accurate at pinpointing colour differences than brightness differences.

Chroma subsampling takes advantage of this by storing colour information at a lower resolution than brightness information. For example, rather than storing separate colour information for every pixel, it will store it for every four. When we see the image, we often can’t tell the difference.
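A sketch of the idea for a single colour (chroma) plane. This is a toy version: real codecs use schemes like 4:2:0 with more careful filtering, and the brightness plane would be stored at full resolution alongside:

```python
def subsample_chroma(chroma, block=2):
    """Toy chroma subsampling: replace each block x block tile of a colour
    plane with its average, so only one colour value is stored per tile
    (one per four pixels when block=2)."""
    h, w = len(chroma), len(chroma[0])
    out = []
    for y in range(0, h, block):
        row = []
        for x in range(0, w, block):
            # Average the colour values inside this tile.
            tile = [chroma[y + dy][x + dx]
                    for dy in range(block) for dx in range(block)
                    if y + dy < h and x + dx < w]
            row.append(sum(tile) / len(tile))
        out.append(row)
    return out

# A 4x4 colour plane collapses to 2x2: one stored value per four pixels.
plane = [[0, 0, 4, 4], [0, 0, 4, 4], [8, 8, 12, 12], [8, 8, 12, 12]]
print(subsample_chroma(plane))
```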

This is particularly true for moving images, and you’ll sometimes notice when you pause YouTube videos that the colours seem fuzzy and bleed from where they’re supposed to be (have a look at this YouTube still I used on a recent post), even though you hardly notice this when the video is playing.

These software advances wouldn’t have happened without the psychology research to find the bugs / features in human perception and it’s curious to think that these new developments build on both the digital and neural platforms.

What will be most interesting is if software starts to take advantage of cognitive features found only in certain members of the population (for example, some women have four types of colour receptor in the retina, rather than the usual three).

In other words, we might find that some important software advance will only work on some people (or rather, will be developed with only some people in mind), and so these people might be preferentially hired to work with certain applications.

If these applications become particularly high value (usually due to their use in the military or intelligence services), people might start attempting to engineer themselves or others to have the uncommon attribute.

Sci-fi writers, start your engines.

Link to NYT piece ‘The Sum of Your Facial Parts’.
Link to researcher’s page with photos and full-text.

Feeling out of control sparks magical thinking

Psychology Today journalist Matthew Hutson covers some fascinating experiments just published in this week’s Science that found that reducing participants’ control increased the tendency for magical thinking and the perception of illusory meaning in random or patternless visual scenes.

Hutson covers all six experiments, but here’s a sample from his article which should give you the general idea:

In the fourth study, people who recalled a situation where they lacked control were more likely to see nonexistent images in snowy pictures and were also more likely to suspect conspiracies in ambiguous vignettes. (In one story, three local construction companies raise their prices after their owners all spend the same weekend at one bed and breakfast. In another, the protagonist was denied a promotion right after his boss and a workmate exchanged a flurry of emails.)

The fifth experiment showed that describing the stock market as volatile (versus stable) renders people more likely to spot false correlations in reports on company financials—and then make stock investments based on their unfounded conclusions.

Finally, the sixth study showed that feeling good about yourself reduces the frantic grasping for straws. There were three groups. One group recalled not having control, another recalled not having control and then performed a self-affirmation task, and a third group did neither. The first group saw more figures in snowy pictures and perceived more conspiracies than the other groups did. Apparently, increasing self-esteem fosters a sense of control over one’s life and reduces the need to seek additional stability in random noise.

Two of the ‘snowy pictures’ are shown on the right. The one on the top is completely random; the other has an embedded picture.

This is particularly interesting to me, because one of my own studies I completed with some colleagues in Cardiff also involved getting participants to perceive images in random visual patterns.

We did something a little different though, in that we didn’t have any hidden images, so every time someone saw something we knew it was illusory.

However, we also managed to alter how often people saw the images, using magnetic stimulation (a technique called TMS) to alter the function of the temporal lobes, which have previously been thought to be involved in the magical thinking spectrum – from everyday examples to diagnosable psychosis.

This study was inspired by an earlier study by neuroscientist Peter Brugger, who found that people who professed a belief in ESP (‘telepathy’) were more likely to see meaningful patterns in visual noise than those who didn’t.

Both the new study and our study are interesting because they show how this type of magical thinking can be manipulated.

However, this new study takes it to a whole new level, because it involves a whole range of magical thinking tests (not just the ‘snowy patterns’) and shows how a number of them are subject to the tides of emotion and feelings of being in control.

Link to Hutson’s excellent write-up.
Link to study in Science.
Link to DOI entry for same.

Fearing pharmaceutical modifications

Psychology Today journalist Matthew Hutson covers an interesting study that investigated which drug-based enhancements people are most comfortable with and which changes to the self people view negatively.

It seems drugs that potentially change our fundamental character traits are treated with most suspicion whereas those that change our abilities are thought to be the most acceptable.

Collaborators Jason Riis at NYU, Joseph Simmons at Yale, and Geoffrey Goodwin at Princeton first asked people to rate how fundamental a series of traits were to personal identity. In order of rated importance, the traits were: reflexes, rote memory, wakefulness, foreign language ability, math ability, episodic memory, concentration, music ability, absent-mindedness, self-control, creativity, emotional recovery, relaxation, social comfort, motivation, mood, self-confidence, empathy, and kindness. So people tend to think that emotional traits are more fundamental than cognitive ones.

The researchers then found that people are most reluctant to take pills that enhance the highly fundamental traits. Their most cited concern was personal authenticity…. When rating which types of enhancements should be banned, people instead based their decisions on concerns about competitions and fairness–morality rather than identity.

Link to write-up.
Link to study abstract.

Neuroaesthetics and the state of the art

Seed Magazine has an excellent article by Mo Costandi discussing how the study of neuroaesthetics – the neuroscience of art and beauty – is really starting to take off with a dedicated research centre recently launched in London.

I love the idea of neuroaesthetics but remain a little skeptical, not least because some of the literature gives the impression that it’s revolutionising our understanding of art when psychologists have been researching it since the beginning of psychology. I’ve yet to see the ‘neuro’ aspect add anything particularly novel so far.

I’ve got a fascinating but out of print book called Cognitive Processes in the Perception of Art that has a collection of papers from a five day conference on art and cognition from 1983.

The chapters cover much of the same sort of thing that is discussed under the neuroaesthetics banner (just without the brain scans) – including methods, symbolism, visual perception, music, improvisation, aesthetics, beauty and synaesthesia.

The introduction is interesting as an overview of the fragmented history of the field, most of which seems to have been undertaken in the expectation that this was something new and exciting:

…since 1876, when Fechner initiated the empirical approach to art through his book ‘Vorschule der Aesthetik’ psychology has been characterized by different ‘schools’; there has been continual dispute about the proper subject-matter of the discipline and about the theories and methods which should be applied to it. In many cases, the various approaches – such as Behaviourism, Gestalt Theory, Psychoanalysis, Humanistic Psychology, Information Theory, and Cognitive Psychology – have made distinctive contributions to the arts. One consequence has been that particular artistic phenomena have been selectively examined and then assimilated to preferred theories and methods of working, and hence these phenomena have escaped broad and systematic investigation as distinctive phenomena in their own right. Approaches to the arts have often been superficial and fragmentary, as Kose points out in his chapter, traditional approaches to the study of art often reveal more about the workings of psychological investigation than they do about art.

I’ve still yet to see anything that advances on this position.

Furthermore, theories that simply redescribe what you’re trying to explain are generally thought to be useless, and the test of a good theory is that it can make accurate predictions. Where relevant, it also suggests where interventions will have predictable effects.

Consequently, I often wonder whether neuroaesthetics will ever lead to a new and innovative type of artwork or art practice.

One of the most interesting things I’ve read recently was a discussion on the empyre mailing list (thanks Julian!) with various artists discussing their work in the cognitive and neurosciences. I warn you, it’s a pain in the arse to read because it’s only available as list archives.

Nevertheless, it mentioned a piece called ‘Ghosts in the Machine’ which sounds fantastic:

Ghosts in the Machine is a generative, closed system. Random noise from a CCD camera is analyzed for patterns. An algorithm looks for patterns that match the basic geometry and physiognomy of the human face. What it actually finds are pixels on a screen forming blobs and patches of colour that have no actual relation to a real world face. They have no indexical relation to an object. They are not images of people, but another kind of image loaded with meaning, which arises accidentally, but irresistibly, from the hybrid interaction between machine and body. To all intents and purposes when these patches of pixels look like faces, they are images of faces. That such obscure images resolve themselves into faces without conscious effort, and that remain even when attending closely to them, suggests that it is paradoxically their lack of objective meaning that generates their form. It is the very ambiguity and indeterminacy of the images that allows the brain to reconfigure them as indexical.

It’s part of the Einstein’s Brain Project which aims to explore “the notion of the brain as a real and metaphoric interface between bodies and worlds in flux, and that examines the idea of the world as a construct sustained through the neurological processes contained within the brain”.

Link to Seed article ‘Beauty and the Brain’.
Link to details of cognitive processes in art book.
Link to Einstein’s Brain Project.
Link to good neuroaesthetics primer.

Taking responsibility

Cato Unbound has a thought-provoking essay arguing that we need to radically re-think our relationship to psychoactive substances of all kinds to encourage informed responsible drug use rather than relying on the impossibility of prohibition to protect society.

The piece is by the founders of the Erowid drugs information and experience exchange site, who have been at the forefront of promoting education and information as the basis of responsible drug use.

“Know your body. Know your mind. Know your substance. Know your source.” One of Erowid’s earliest slogans, this directive encourages people to pay close attention to multiple aspects of their psychoactive substance use. These include understanding the individuality of response; avoiding drugs contraindicated because of health issues; learning enough about each substance to avoid unexpected effects and overdoses; and choosing both substance and information sources carefully in order to reduce risks. While these principles may seem obvious, they are seldom taught in contemporary drug education.

Alcohol is a good case to study, as its use is accepted in our culture and is not illegal for those over 21. Yet healthy and pragmatic drinking practices are seldom taught by parents, schools, or the government. By the time young adults reach the legal drinking age in the United States the vast majority of them have already consumed alcohol. In 2006, according to the National Survey on Drug Use and Health, the average age at which Americans first tried alcohol was 16.5, with only one in ten waiting until they were legally of age to drink.[14] And they haven’t just had a sip; nearly 40% of 20-year-olds have gotten drunk in the last month.[15] The opportunity to teach responsible use of alcohol—the most commonly consumed and arguably one of the most dangerous strong psychoactives[16]—is missed. The situation is much worse for controlled substances.

Teaching responsible, intentional use to young people does not require giving detailed instructions on how to use illegal psychoactives. The general principles can be taught through education about prescribed medications, alcohol, or other legal drugs. There are many practical lessons about how to safely and responsibly use psychoactives, whether learned from personal subjective experience, research, or the hard-won wisdom of others.

They make the important point that this applies to all drugs, illicit, commercial, medical, natural and artificial – from aspirin to angel dust.

Link to ‘Towards a Culture of Responsible Psychoactive Drug Use’.

Drug-fuelled shooting as a spectator sport

The Atlantic has a provocative article arguing that drug-fuelled shootings would make competitive sport more interesting, although probably not in the way you’re thinking.

The piece discusses beta blockers such as propranolol, drugs that have their major effect on the peripheral part of the autonomic nervous system.

They don’t actually make the user feel less psychologically anxious, but just reduce the normal ‘fight or flight’ pumped feeling, so the bodily effects of anxiety such as shaking, sweating, heart pounding and muscle tension are reduced.

These drugs are used widely by professional musicians to stop performance jitters and the Atlantic article argues that they should be allowed in sports like shooting and archery so competitors aren’t disadvantaged by performance anxiety.

From a competitive standpoint, this is what makes beta blockers so interesting: they seem to level the playing field for anxious and non-anxious performers, helping nervous performers much more than they help performers who are naturally relaxed. In the British study, for example, the musician who experienced the greatest benefit was the one with the worst nervous tremor. This player’s score increased by a whopping 73%, whereas the musicians who were not nervous saw hardly any effect at all.

One of the most compelling arguments against performance enhancing drugs is that they produce an arms race among competitors, who feel compelled to use the drugs even when they would prefer not to, simply to stay competitive. But this argument falls away if the effects of the drug are distributed so unequally. If it’s only the nervous performers who are helped by beta blockers, there’s no reason for anyone other than nervous performers to use them.

Link to ‘In Defense of the Beta Blocker’ (via 3QD).

Who needs sleep? The evolutionary slumber party

PLoS Biology has a cozy essay entitled “Is Sleep Essential?” that addresses the mystery of the purpose of sleep.

The article looks at sleep across the whole of the animal kingdom to examine how different species sleep and whether there are any animals that don’t sleep at all.

There are no convincing cases of sleepless animals it seems, and the authors, neuroscientists Chiara Cirelli and Giulio Tononi, argue that sleep is therefore likely to be an essential function of living creatures.

The three corollaries of the null hypothesis [‘sleep is not required’] do not seem to square well with the available evidence: there is no convincing case of a species that does not sleep, no clear instance of an animal that forgoes sleep without some compensatory mechanism, and no indication that one can truly go without sleep without paying a high price. What many concluded long ago still seems to hold: the case is strong for sleep serving one or more essential functions. But which ones?

The article goes on to examine the hypotheses that sleep is important for regulating the body’s core functions, the brain, individual cells and that it is common to all species and must involve something that cannot be provided by quiet wakefulness.

More interesting is the question of whether all animals dream – and perhaps most intriguing, if so, how they might dream.

Indeed, it would be interesting to discover whether dreaming is a necessary function of sleep, or whether it is specifically linked to certain neurocognitive processes or even particular creatures.

Link to PLoS Biology article ‘Is Sleep Essential?’ (via Wired Science).

Strip Club Hunter, or the attractions of anatomy

It’s hard to start a paragraph with “I was strolling through London’s red light district the other evening…” without seeming a little dubious, but it’s the truth, so I shall have to begin by sounding suspect.

If your suspicions have already been raised, I doubt that if I say that I became interested in one of London’s biggest strip clubs for its importance in the history of neuroanatomy that I will seem at all convincing. But it was also the case, so I shall also have to begin by sounding a little implausible.

The photo on the left depicts the neon drenched Windmill Theatre, the first venue in London to have risqué shows displaying the naked bodies of young women to breathless crowds of young men.

In the 1930s the owners realised there was a loophole in the law, and that if the naked girls stood still, they weren’t acting and so weren’t subject to legislation banning nude actors. Decades of titillating ‘living statue’ shows followed, using increasingly inventive ways of presenting the spectacle of the unclothed and unmoving girl.

The theatre and the Windmill Girls, like the one on the right, became legendary, even being the subject of a recent Hollywood movie. Time could not stand still, however, and with changing morals, inevitably, the law changed, and along with it, the theatre. It now operates as a standard lap dancing club in the centre of Soho.

While the Windmill Theatre advertises its pedigree in large strips of red neon, the seemingly nondescript building to the right has nothing but a modest blue plaque to mark its heritage, but it drew similarly excited crowds wanting to glimpse the anatomy of the naked.

The plaque reads “Hunter, William. This was the home and museum of Dr William Hunter, Anatomist (1718-1783)”. While the plaque and the association with one of history’s great anatomists give it an air of respectability that the gleaming Windmill lacks, it was no less salacious in its day.

For over a thousand years, medical men had used the 2nd century Greek physician Galen as their guide to the structure of the human body. The trouble was, Galen was often wrong, and his work had only recently been challenged because a taboo over dissecting the dead had long prevented direct study of human anatomy.

Two local men decided that Galen would have to go, and thankfully for us, they were riotously successful. William Hunter, to whom the Soho plaque is dedicated, is now famed for his contribution to anatomy, and his brother, John Hunter is considered the first scientific surgeon – the founder of modern surgery.

The Hunter brothers were living in a time when the taboo over cutting up corpses was slowly being broken, but dissections were still considered seedy: a kind of edgy horror show for the strong of stomach, and certainly not for the ladies.

To compound the air of disgust, bodies were acquired on a ‘no questions asked’ basis, and many were rumoured to be from the murdered poor, or from bodies stolen from graves.

On one horrific occasion in 1784, the physician John Sheldon, proprietor of the Blenheim Street School of Anatomy, was presented with his recently deceased sister by one of the school’s regular ‘suppliers’.

But the first of these independent schools of anatomy was opened by William and John Hunter on Great Windmill Street, where the famous strip club now stands. William Hunter (shown on the left) actually lived on the same site, with his brother living round the corner in Golden Square, before moving to a large house in the prestigious Leicester Square, where his bust can still be seen.

One of the school’s star pupils was Sir Charles Bell, the noted physician who revolutionised the understanding of the nervous system through his careful anatomical dissections and clinical studies, and whose name lives on in the numerous eponymous structures and disorders scattered through the neurology textbooks.

The Hunter brothers did more than just tutor, however; they catalogued virtually every new discovery, anatomical oddity and grotesque pathology they found.

This systematic study led to many new discoveries, particularly in comparative anatomy and the understanding of the nervous system. In fact, you can still visit the Hunters’ collection at the Royal College of Surgeons’ Hunterian Museum, which, as I’ve noted before, is full of neuroanatomical curiosities.

Great Windmill Street has hosted anatomists, professional and pornographic, for centuries, and still continues its proud tradition, although not necessarily in the form that the Hunters would have imagined.

So that’s my excuse, and I’m sticking to it.

Experienced drivers perceive the road differently

Experienced drivers are not only more skilled at the actions of driving, but also learn to perceive and attend to the road in a different way.

We found that novices’ eye-movements were different from those of the more experienced drivers in several ways, though the extent of scanning on a particular section of dual carriageway was particularly limited. We have since examined this effect in the laboratory using video-based stimuli replicating the same impoverished scanning in novice drivers (e.g. Underwood, Chapman, Bowden, & Crundall, 2002).

We have also further explored why this might be the case, examining the possibility of whether this was due to the novice drivers having a deficient mental model or whether they were simply overloaded by the requirement to control the car (a process which requires less attention with increased experience), and found that even when car-control demands were eliminated, the effect persisted (Underwood et al., 2002).

Another aspect that appears to be important in understanding this effect is the extent of the inexperienced drivers’ peripheral attention (Crundall, Underwood, & Chapman, 1999, 2002). We found that the less experienced drivers have a smaller field of peripheral vision, and are more likely to miss even abrupt onsets. This is especially the case when they are focusing on something that is potentially dangerous.

For example if the car ahead brakes suddenly, a novice driver will focus so much attention on that car that they may miss the errant cyclist emerging from the side road. More experienced drivers have a wider spread of peripheral attention however, and this appears to be linked to their spread of search.

The paragraphs above are an excerpt from a commentary on an interesting article, in the latest edition of the British Journal of Psychology, about the relevance of lab studies to the real world. I’ll post more about the main article shortly, but this snippet just caught my attention, if you’ll excuse the pun.

Link to PubMed entry for commentary paper.

Colic psychology

I’ve just found a surprisingly psychological New Yorker article on colic, the persistent and mysterious episodes of crying that affect some newborn babies.

I always thought that colic was just discomfort caused by trapped wind, but apparently this is just one theory and the cause of colic is still medically unexplained.

The crying tends to stop after a few months and although thought to be physically harmless it can cause a great deal of discomfort to both baby and parents.

The New Yorker article, written by the talented physician and writer Jerome Groopman, notes that some of the most important discoveries about colic have come not from studying the biology of the baby’s digestive system but from the psychology of parenting and carer-child interaction.

Lester believes that some infants who suffer from colic are “hypersensitive to normal stimuli”: they perceive and react to changes in their bodies (such as hunger or gas pangs) or in their environment (such as loud noises or the experience of being touched) more acutely than do other babies. In the mid-nineties, he studied forty-five children between the ages of three and eight who had had colic as infants (and had been seen at his clinic). He found that thirty-four of them—about seventy-five per cent—suffered from behavioral problems, including a limited attention span, tantrums, and irritation after being touched or coming in contact with particular fabrics or tags in their clothing. “Some of the kids would get very annoyed and refuse to put on a hat,” he told me. The children apparently objected to the sensation of having fabric on their head.

Lester speculates that many colicky infants are so sensitive to stimuli that physical contact with their parents is unlikely to soothe them, a theory that may be supported by data from societies in which babies are held continuously. Ronald Barr, the co-author of the 1997 study on infant cries, has analyzed data gathered by Harvard researchers between 1969 and 1971, during a study of the !Kung San, a tribe of hunter-gatherers in Botswana who practice a version of attachment parenting. “We found that the !Kung San carry their babies upright, have skin-to-skin contact day and night, breast-feed every 13.69 minutes for the first one to two years of life, and respond within fifteen seconds to any fret or whimper,” Barr, who now teaches at the University of British Columbia, told me. “The duration of the crying is fifty per cent less among the !Kung San compared with Western babies, but the !Kung San still have what we call colic, with episodes of inconsolable crying.”

A great deal of clinical psychology work concerned with difficult behaviour in children focuses on how people respond to certain behaviours. It is often the case that our natural reactions inadvertently reinforce and maintain the problem.

This can be the case even with severe difficulties like self-harm. Imagine that the parents of a child go through a period where they are so caught up in work they don’t have much time for the child no matter what he or she does.

The child accidentally harms themselves and suddenly gets a great deal of attention, because the parents, who are not ‘bad parents’ but just massively overworked, want to make sure their child is OK.

The child works out that harming themselves gets them attention, but this causes resentment, so the parents act more negatively towards the child when he or she does not self-harm, meaning that caring attention is all the more attractive.

Although this type of cycle is most likely to crop up with children with learning disabilities, you can see how less severe versions (replace self-harm with tantrums) could easily occur. Or perhaps how the same cycle could occur in a child with learning disabilities in a specialised care environment (replace parents with staff).
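The feedback loop described above can be sketched as a toy operant-conditioning model. Everything here is invented purely for illustration (the update rule and the 0.10 and 0.01 step sizes are arbitrary assumptions, not drawn from the clinical literature), but it shows the basic mechanism: behaviour that is rewarded with attention becomes more frequent, and behaviour that goes unrewarded slowly extinguishes.

```python
def update(p, behaviour_rewarded):
    """One day's change in the daily probability of the problem behaviour.

    Toy model: attention acts as a reinforcer (a large step up),
    while unreinforced behaviour extinguishes slowly (a small step down).
    The step sizes 0.10 and 0.01 are arbitrary, chosen only to illustrate
    that reinforcement outpaces extinction.
    """
    if behaviour_rewarded:
        return min(1.0, p + 0.10)   # attention reinforces the behaviour
    return max(0.01, p - 0.01)      # without reward, it fades much more slowly

# Reinforcement: attention follows the behaviour for 10 days running
p = 0.05
for day in range(10):
    p = update(p, behaviour_rewarded=True)
# p has climbed from 0.05 to the ceiling of 1.0

# Extinction: the behaviour goes unrewarded for 20 days
for day in range(20):
    p = update(p, behaviour_rewarded=False)
# p has only dropped back to 0.80 - the cycle is much easier to start than to stop
```

The asymmetry between the two step sizes is the point of the sketch: a short period of inadvertent reinforcement can entrench a behaviour that then takes far longer to fade, which is why clinical work focuses on how carers respond rather than on the behaviour alone.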

Similar sorts of response-reaction cycles seem to occur in colic and Groopman’s article recounts how for even the youngest babies, social relationships are of prime importance.

Link to New Yorker article ‘Colic Conundrum’.

Recreational drug preference linked to medical speciality

Following our piece on several cases of drug addiction in anaesthetists, I just found some interesting studies on how recreational drug preference varies between medical specialities. It seems working in psychiatry and emergency medicine is linked to the highest rates of drug use, with surgeons having some of the lowest levels.

This study seems to be the most comprehensive on doctors of all levels of seniority:

Emergency medicine physicians used more illicit drugs. Psychiatrists used more benzodiazepines. Comparatively, pediatricians had overall low rates of use, as did surgeons, except for tobacco smoking. Anesthesiologists had higher use only for major opiates. Self-reported substance abuse and dependence were at highest levels among psychiatrists and emergency physicians, and lowest among surgeons. With evidence from studies such as this one, a specialty can organize prevention programs to address patterns of substance use specific to that specialty, the specialty characteristics of its members, and their unique practice environments that may contribute risk of substance abuse and dependence.

A 1992 study looked at exactly the same thing in junior doctors, and again found similar results – psychiatrists and emergency doctors tended to be more likely to use drugs, while surgeons were among the least likely:

Emergency medicine and psychiatry residents showed higher rates of substance use than residents in other specialties. Emergency medicine residents reported more current use of cocaine and marijuana, and psychiatry residents reported more current use of benzodiazepines and marijuana. Contrary to recent concerns, anesthesiology residents did not have high rates of substance use. Family/general practice, internal medicine, and obstetrics/gynecology were not among the higher or lower use groups for most substances. Surgeons had lower rates of substance use except for alcohol. Pediatric and pathology residents were least likely to be substance users.

A similar study on nurses was conducted by the same team a couple of years earlier and found similar results:

As hypothesized, rates varied greatly by speciality. Oncology nurses reported the highest past-year prevalence for all substances combined (42%), followed by psychiatry (40%) and emergency and adult critical care (both 38%).

Emergency and pediatric critical care nurses had the highest prevalence of marijuana / cocaine use (7%), followed by adult critical care nurses (6%). Prescription-type drug use was less varied across specialties: those with the highest prevalence of use were oncology, rehabilitation, and psychiatry. For cigarette smoking, psychiatry had the highest prevalence (23%), followed by emergency and gerontology (both 18%). Pediatric critical care nurses were least likely to smoke (8%). Binge drinking was high among oncology, emergency, and adult critical care nurses.

Link to abstract of recreational drug preference in doctors study.
Link to full text of drugs in junior doctors study.
Link to PubMed entry for same.
Link to full text of study on nurses.
Link to PubMed entry for same.