A hitchhiker’s guide to the inherited mind

New Scientist has a fantastic article on making sense of cognitive genetics studies – the science that links particular versions of genes to behaviour – taking the use and abuse of the MAOA gene as its example. If the name doesn’t ring a bell, you may remember it being dubbed the ‘warrior gene’, which, as well as being inaccurate, was one of its least embarrassing moments.

For many decades, genetics and psychology only really interacted through the twin study, which, by comparing identical with non-identical twins, can indicate how much of the variation among the twins you’ve studied is due to the environment and how much is inherited.
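
To make the logic concrete, here’s a minimal sketch of Falconer’s classic formula, which turns the two twin correlations into a rough heritability estimate (the correlation values below are purely illustrative, not from any real study):

```python
# Falconer's formula: a rough partition of trait variance from twin data.
# r_mz and r_dz are the trait correlations within identical (monozygotic)
# and fraternal (dizygotic) twin pairs respectively.

def falconer_heritability(r_mz: float, r_dz: float) -> dict:
    """Split trait variance into heritable, shared-environment and
    unique-environment components under the classical twin model."""
    h2 = 2 * (r_mz - r_dz)   # MZ twins share twice the segregating genes of DZ twins
    c2 = r_mz - h2           # shared (family) environment, i.e. 2*r_dz - r_mz
    e2 = 1 - r_mz            # unique environment plus measurement error
    return {"heritability": h2, "shared_env": c2, "unique_env": e2}

# Hypothetical correlations for some behavioural trait:
print(falconer_heritability(r_mz=0.70, r_dz=0.45))
# roughly 50% heritable, 20% shared environment, 30% unique environment
```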

As it became possible to identify individual genes and, more importantly, as automated ‘gene chip’ technology made this economical, studies began looking at differences between groups of people distinguished simply by carrying different versions of the same gene.

The idea is to see how a single gene influences behaviour, but because the gene and the everyday effect are so far apart (it’s like trying to detect the effect of a day of farm weather on the flavour of your lunch), the story often gets mangled in the retelling.

The New Scientist article, by Not Exactly Rocket Science’s Ed Yong, tells the story of MAOA and its headline-making link with aggression, but it also serves as an essential hitchhiker’s guide to the science and pitfalls of linking genetics with behaviour.

However, the clearest sign yet that the gene is no ruthless determinant of behaviour came in 2002 when Avshalom Caspi and Terrie Moffitt of Duke University in Durham, North Carolina, published their findings about a sample of 442 men from New Zealand who they had followed from birth. A third of these men carried the MAOA-L variant. Now, aged 26, this group was indeed more likely than the others to have developed antisocial disorders and violent behaviour – but only if they had been poorly treated or abused as children. Moffitt and Caspi concluded that the so-called “warrior gene” affects a child’s sensitivity to stress and trauma at an early age. Childhood trauma “activates” bad behaviour, but in a caring environment its effect is quashed.

Since then, similar interactions between nature and nurture have become part and parcel of the MAOA story. Carriers of MAOA-L are more likely to show delinquent behaviour if they were physically disciplined as children. They are also more likely to be hyperactive in late childhood if their first three years were stressful, and to develop conduct disorders if their mothers smoked cigarettes while pregnant with them. The list goes on. Likewise, Beaver found that MAOA-H carriers were more likely to commit fraud, but only if they hung around with delinquent peers.
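
The statistical shape of these findings is a gene-by-environment interaction: the variant only predicts the outcome in combination with an environmental exposure. Here’s a hedged illustration of the idea – entirely simulated data and a plain least-squares fit, not the authors’ actual analysis:

```python
# A toy gene-by-environment interaction, loosely in the spirit of the
# MAOA findings. All data are simulated; this illustrates the statistical
# model, not the original analyses.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
gene = rng.integers(0, 2, n)    # 1 = carries the hypothetical risk variant
abuse = rng.integers(0, 2, n)   # 1 = maltreated in childhood

# Simulate antisocial-behaviour scores where the variant mainly matters
# for people who were also maltreated (the interaction effect).
score = 0.2 * gene + 0.5 * abuse + 1.5 * gene * abuse + rng.normal(0, 1, n)

# Fit score ~ gene + abuse + gene:abuse by ordinary least squares.
X = np.column_stack([np.ones(n), gene, abuse, gene * abuse])
coefs, *_ = np.linalg.lstsq(X, score, rcond=None)
for name, b in zip(["intercept", "gene", "abuse", "gene x abuse"], coefs):
    print(f"{name:>13}: {b:+.2f}")
# The 'gene x abuse' coefficient dominates: the variant predicts higher
# scores mainly in the maltreated group, not on its own.
```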

Link to NewSci article on MAOA, genes and behaviour.

Cultural differences in childhood amnesia

Childhood amnesia is the phenomenon whereby we are generally unable to remember the earliest years of childhood. This is often assumed to be purely because the brain is too underdeveloped to successfully store and organise memories, but an interesting study from 2000 reported that the extent of childhood amnesia differs between cultures and sexes.

Cross-cultural and gender differences in childhood amnesia

Memory. 2000 Nov;8(6):365-76.

MacDonald S, Uesiliana K, Hayne H.

In two experiments, we examined cross-cultural and gender differences in adults’ earliest memories. To do this, we asked male and female adults from three cultural backgrounds (New Zealand European, New Zealand Maori, and Asian) to describe and date their earliest personal memory. Consistent with past research, Asian adults reported significantly later memories than European adults, however this effect was due exclusively to the extremely late memories reported by Asian females. Maori adults, whose traditional culture includes a strong emphasis on the past, reported significantly earlier memories than adults from the other two cultural groups. Across all three cultures, the memories reported by women contained more information than the memories reported by men. These findings support the view that the age and content of our earliest memories are influenced by a wide range of factors including our culture and our gender. These factors must be incorporated into any comprehensive theory of autobiographical memory.

This doesn’t mean that brain development plays no role, of course, but it raises the question of how many of the things we recall from childhood are influenced by culture.

For example, memories that seem genuinely to be from the early years may actually derive from the retelling of family stories, or from photographs, which we subsequently absorb as our own memories thanks to source amnesia.

It could be that this form of social remembering differs between cultures, or is influenced by the sex of the child, which may encourage people to report earlier or later memories – or, alternatively, may actually strengthen genuine memories as they are retold during our early years.

pdf of full text of culture and childhood amnesia study.
Link to PubMed entry for same.

In the eye of the storm

Wired magazine’s Haiti Rewired blog has an excellent piece on the ‘psychological typhoon eye’ phenomenon, discovered in studies of the 2008 Sichuan earthquake in China, in which those closest to the centre of the devastation actually reported less concern about their safety and health.

The effect was initially reported shortly after the disaster and was found to still be present in a follow-up study one year later.

From the Wired piece:

Two suggestions have been provided to account for the psychological eye, namely “psychological immunization” or “cognitive dissonance”. The former seemed like a plausible explanation after the initial survey, since there is wide anecdotal documentation of “coping measures” adopted by those who experience significant personal trauma or hazards. However, the fact that subsequent surveys found relatives experiencing a variation of the psychological eye, suggests that the extent of personal experience, which strongly drives psychological immunization, is not sufficient to account for the observed effect.

Festinger’s theory of cognitive dissonance is defined as an uncomfortable psychological state in which two opposing cognitions are experienced and need to ultimately be reconciled. In the example of the psychological eye, the devastation of the area creates a sense of danger, yet the individual may have no choice but to remain close by, counter to the survival instinct. To reconcile these conflicting beliefs, the individual may unconsciously lower self-assessed risk to justify remaining in the area. Cognitive dissonance is very difficult (impossible?) to modify in the field, as noted by the authors, and thus, this proposal will remain more speculative until follow-up studies in a controlled fashion can be done.

The author, Nature’s Noah Gray, goes on to suggest that “Surveyors must maintain a cautious and healthy skepticism when interviewing survivors and assessing areas for aid because information provided and opinions given will not likely reflect the dire situations being experienced.”

One difficulty in these situations is that mental health workers often arrive hurriedly from other countries and may not fully understand how trauma and psychological distress are experienced by the local population, or how these reactions interact with other sorts of decision-making.

We tend to assume that trauma is a universal reaction to a difficult situation but this singular concept is something of a mirage – common psychological reactions to devastation have differed over time and differ between cultures.

The model of trauma enshrined in the diagnosis of post-traumatic stress disorder (PTSD) simply doesn’t fit the common reactions of people from many cultures, despite being the most common conceptual tool used by Western mental health workers.

In a 2001 article for the British Medical Journal, psychiatrist Derek Summerfield noted:

Underpinning these constructs is the concept of “person” that is held by a particular culture at a particular point in time. This embodies questions such as how much or what kind of adversity a person can face and still be “normal”; what is reasonable risk; when fatalism is appropriate and when a sense of grievance is; what is acceptable behaviour at a time of crisis including how distress should be expressed, how help should be sought, and whether restitution should be made.

In these cases, not understanding the local culture may lead aid workers to assume that individuals don’t understand the risks of the situation when, in fact, each may be basing their risk assessment on different priorities – as has been found in studies on cultural differences in risk perception.

Treating trauma seems like a no-brainer. It intuitively seems like one of the most worthy and naturally important responses to a disaster, which is probably why disaster areas are now often flooded with ‘trauma counsellors’ after the event (Ethan Watters’s book Crazy Like Us charts the response to the 2004 tsunami in Sri Lanka, where floods of well-meaning but poorly trained therapists arrived in the following weeks, much to the bafflement of the locals and the annoyance of the established relief organisations).

However, this is one of the few areas where well-meaning but poorly prepared therapists can actually do harm. Although experiencing extreme danger raises the risk of mental illness, contrary to popular belief only a minority of people caught up in disasters will experience psychological trauma, and immediate psychological treatment, whether in single or multiple sessions, has been found to be useless or to make matters worse.

The psychological impact of devastation changes through time and space and we need to be careful to understand its local significance lest we inadvertently amplify the chaos.

Link to Haiti Rewired on the ‘psychological typhoon eye’.

Towards an aesthetics of urban legends

The Point of Inquiry podcast has a great discussion with psychologist Scott Lilienfeld about his new book ‘50 Great Myths of Popular Psychology’ and why scientific-sounding mental fairy tales persist despite having no good evidence to support them.

The most interesting bit is where Lilienfeld tackles why such myths have their psychological power, which to me is by far the most intriguing aspect of why certain stories perpetuate.

Some ideas seem to have properties that give them social currency. Here’s one of my favourites, and you can try it out yourself – the usual format of the conversation goes something like this:

– Remember Bobby McFerrin, the ‘Don’t Worry Be Happy’ guy?
– Yeah, I remember him.
– Killed himself.
– Huh, that figures.

This myth has no evidence for it whatsoever – Bobby McFerrin is alive and well – but it became so widespread that Snopes created a page debunking the story.

What is it about this story that makes it so easily accepted? Or perhaps, we should ask, what is it about this story which makes it so attractive to pass on to others?

There has been a considerable amount of research on the psychology of rumours that attempts to explain why we are motivated to spread them. A fantastic book called Rumor Psychology reviews the research, which indicates that uncertainty, importance or outcome-relevant involvement, lack of control, anxiety, and belief are crucial – but this doesn’t seem to apply to all such rumours (as an aside, it’s interesting that these principles seem rarely to be applied in military PsyOps campaigns – see, for example, the PsyWar.org Iraq war leaflet archive).

On a personal level, you can see how these principles might apply to trite ‘men are from Mars, women are from Venus’ pop relationship psychology, but they don’t seem to apply quite so well to the commonly repeated myth that we use only 10% of our brains.

And when we consider the ‘Bobby McFerrin topped himself’ story, none of these factors seems relevant. Perhaps this is better thought of as ‘gossip’, but unfortunately the psychology of gossip is much less developed and relies largely on pseudo-evolutionary ideas about social bonding and the like (Robin Dunbar’s book Grooming, Gossip, and the Evolution of Language is perhaps the most developed example of this).

I often wonder if we need an experimental aesthetics of information that helps us understand why such stories are inherently attractive, in the same way that studies have begun to focus on what makes certain tunes catchy.

Link to Point of Inquiry podcast on PopPsy myths.

At the yawn of time

The journal Frontiers of Neurology and Neuroscience has a paper that looks at how rates of yawning change throughout our lives.

It has a slightly surreal feel to it, and I can’t help imagining yawn scientists carefully tracking the behaviour across the globe with overly complicated machines, like something out of a Roald Dahl book.

Yawning throughout Life.

Front Neurol Neurosci. 2010;28:26-31.

Giganti F, Salzarulo P.

Yawning is a behavior that begins in the first stages of life. It has not only been observed in infants and in newborns, but also in fetuses of 12-14 weeks’ gestational age. Yawning frequency changes over the life span. In preterm infants, the number of yawns decreases between 31 and 40 weeks’ postconceptional age, mainly during the day. In this period of life, yawning is an isolated behavior rarely occurring in bursts, and its frequency is quite low with respect to adults. The incidence of yawning seems to increase when children attend elementary school, whereas this is reduced in the elderly. Aged people yawn less than younger ones, mainly during morning and mid-afternoon. In adults, the time course of yawning is associated with the time course of sleepiness, except upon awakening when the high frequency of yawns is not associated with high sleepiness. In adults, yawning frequency increases in the early morning and in the late evening, whereas at the earliest stages of development (fetuses and preterm infants) yawning does not show diurnal variations. Yawning seems to be involved in the modulation of arousal process across the whole life span. In preterm infants, yawning is often followed by motor activation and it is more common during waking than sleep; in adults, yawning occurs mainly at sleep onset and upon awakening.

Link to PubMed entry for paper on ‘Yawning throughout Life’.

The personality of the Messiah

What is Jesus’ Myers-Briggs personality profile? Rather to my surprise, it turns out that lots of people have tried to answer this question.

The Myers-Briggs Type Indicator (MBTI) questionnaire was created as a systematic approach to classifying people’s personality based on categories originally proposed by Swiss psychoanalyst Carl Jung.

The Mormon Matters website has a completely charming article that attempts to analyse Jesus’ personality in terms of the Myers-Briggs types and concludes he’s an INFP – an Introverted, iNtuiting, Feeling, Perceiving type.

If this seems a little flippant to you – pay attention, Anglican vicars: the Sermons That Work website has a pre-written sermon that discusses Our Lord’s Myers-Briggs type and informs the flock that he’s likely an INFJ – an Introverted, iNtuiting, Feeling, Judging type.

Profiling Jesus seems to have become a minor pastime in some circles. In fact, Yahoo! Answers has a thread where people were discussing the possibilities. The thread is marked as a ‘Resolved Question’ (!) with the best answer being voted as ENFJ – an Extroverted, iNtuiting, Feeling, Judging type.

Anecdotal evidence! I hear you cry. Fear not, there is some peer-reviewed data on the personality of the Messiah.

The Journal of Psychology and Theology published a paper entitled “Students’ perceptions of Jesus’ personality as assessed by Jungian-type inventories” back in 2004. You can read the full text online, but the abstract alone is pure joy:

The present study was the first phase of an exploration of college students’ perceptions of the personality of Jesus Christ as assessed by two Jungian-type inventories, the Myers-Briggs Type Indicator (Myers, 1998) and the Keirsey Temperament Sorter II (Keirsey, 1998), which categorize personality along four dimensions: Extraversion/Introversion, Sensing/Intuition, Thinking/Feeling, and Judgment/Perception. Along with an overall exploration of students’ perceptions, the present study focused on whether students were likely to make self-based attributions in their perceptions of Jesus’ personality. Results indicated that students perceived Jesus to be an Extravert Feeler and made self-based attributions along the Sensing/Intuitive dimension, with 43% perceiving Him to be an Intuitive-Feeler and 37% perceiving Him to be a Sensing-Judger. Perceptions of Jesus as a Judger or Perceiver were divided, with those placing more importance on modeling Jesus more likely to see Him as a Judger, and those placing less importance on modeling Him perceiving Jesus as a Perceiver.

Since we’re already working on his Myers-Briggs profile, I wonder if someone would hazard a guess at how he would score on the… oh stop it already. You’ll only give Dan Brown ideas.

Link to study on students’ MBTI profiles for Jesus.

Bernardino Álvarez, asylum bandit

The founder of the oldest psychiatric hospital in Latin America was an ex-soldier turned criminal who broke out of jail, escaped the law with the help of a prostitute, and eventually ended up destitute after spending his entire fortune caring for the mentally ill.

I’ve just discovered the amazing story of Bernardino Álvarez after reading up on the (surprisingly sparse) literature on the history of psychiatry in Latin America, and particularly on the Hospital de San Hipólito in Mexico City, the oldest institution on the continent.

The hospital, still in existence as the Fray Bernardino Hospital (although apparently not in the original building), was founded in 1567 by Álvarez – a remarkable chap who became interested in caring for the mentally ill after attempting to make amends for a life spent fighting, gambling, debauching, whoring and living off daring crime sprees.

This is from what seems to be the only English language article on his life, from a 1972 edition of the American Journal of Psychiatry. It reads like a movie script:

After arriving at what is now Mexico City he was sent to the countryside and fought in several actions in the war against the chichimecas in the north of New Spain. Apparently he was a soldier without too many scruples, for a biographer says that “hate, tears and curses” usually followed him. He wanted a shortcut to wealth, however; he disliked discipline and had no taste for the military life.

After this campaign Álvarez returned to Mexico City, then a lively and tempting emporium. Soon he was in trouble, gambling and robbing the gambling houses, drinking heavily, rebelling against the law, joining the delinquents of the city, and eventually being chosen the leader of a small gang. “A handsome and perfidious demon”: this is the way he was described at that time. Finally he and his band were apprehended, imprisoned, and sentenced to forced labor in China. They escaped from prison, though, killing three guards in the process. Some of the band were eventually caught again and hanged but Álvarez, through the aid of a close friend, a prostitute, got arms, money, and horses. He fled to Acapulco and then by sea to Peru.

He later became a wealthy and legitimate businessman and, shocked by the way the mentally ill were treated, used his money to build the first mental hospital in the New World.

He was so dedicated that he apparently spent his entire fortune on his new-found mission, and was living in a meagre cell in his own hospital by the time he died.

UPDATE: Thanks to Avicenna, who points out that there’s a full version of the article online here.

Link to PubMed entry for article on Bernardino Álvarez.