Twenty years of fMRI

Functional magnetic resonance imaging, better known as fMRI, is 20 years old this week. October’s NeuroPod marks the celebrations by looking back at the brain scanning technology, its successes, and its troublesome teenage years.

The imaging technique was first announced in a 1991 study published in Science, which described how a standard MRI scanner could be used to track where oxygenated and deoxygenated blood flowed in the brain.

The technique takes advantage of the fact that haemoglobin, the iron-containing protein that carries oxygen to essential tissues in the body, has different magnetic properties when it is carrying oxygen than when it is oxygen-depleted.

The scanner is essentially a large electromagnet that aligns the spins of the protons in the body’s hydrogen atoms, combined with a radio-frequency pulse that knocks them out of alignment.

Like a compass needle that has been jolted, the protons then swing back into alignment. The speed of return depends on the density of the body tissue, just as the speed of a compass needle returning to north depends on the density of the liquid in which it rests.

fMRI tunes in to the different magnetic echoes, or more technically, the magnetic resonance, of the protons realigning in oxygenated and deoxygenated blood.

As more active brain areas need more oxygenated blood, it’s possible to infer which tasks or mental activities are most associated with activity in certain brain areas by statistically comparing maps of magnetic resonance differences when people undertake different mental tasks in the scanner.

Although the technique can pinpoint where these changes take place in the brain, down to about the nearest millimetre, blood flow is not the same as actual brain activity, so it is not a precise measure.

Furthermore, changes can only be tracked in time slices of a second or more, clearly missing some of the changes in the fast moving brain, and statistical choices during analysis can affect the outcome sometimes as much as the task itself.
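To make the idea of ‘statistically comparing maps’ concrete, here’s a toy sketch in Python. It is illustrative only – real fMRI analysis involves far more preprocessing and modelling, and the data here are simulated, not real scans. It fakes a small grid of voxels, injects a response at one voxel during a task, and runs a voxelwise t-test of task against rest:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated (not real) signal for a 4x4 grid of "voxels",
# measured over 50 task blocks and 50 rest blocks.
task = rng.normal(loc=0.0, scale=1.0, size=(50, 4, 4))
rest = rng.normal(loc=0.0, scale=1.0, size=(50, 4, 4))

# Pretend one voxel responds to the task: add a small signal change.
task[:, 2, 2] += 1.0

# Voxelwise two-sample t-test: where does the task map differ from rest?
t_map, p_map = stats.ttest_ind(task, rest, axis=0)

# The "active" voxel should show the largest t statistic.
active = np.unravel_index(np.argmax(t_map), t_map.shape)
print(active)  # expected: (2, 2), the voxel we injected signal into
```

Even in this toy version you can see where the statistical choices come in: whether a voxel counts as ‘active’ depends entirely on where you draw the significance threshold on `p_map`.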

But despite these disadvantages, and combined with data from other types of study and imaging technique, fMRI has become an essential scientific tool in the quest to understand the link between the mind and brain.

The piece has interviews with neuroscientists Karl Friston and Russ Poldrack, both involved in fMRI since its early days, who talk about the genuine progress and unfortunate hype that surrounds the technique.

It’s a fantastic look back at the first two decades of fMRI, and the other sections of the NeuroPod podcast are equally interesting.

Link to October’s NeuroPod.

Invasion of the disembody snatchers

The latest edition of The Psychologist has a fantastic article on the psychology of horror, taking in everything from the popularity of cultural themes like zombies and vampires to research into the enjoyment of slasher films.

It’s a really comprehensive look at both the psychological concept – the feeling of horror – and where its origins may lie in our evolutionary and cultural past, as well as numerous studies on how we react to fear and horror, both in real life and in entertainment.

This bit particularly caught my eye.

Related to this is the ‘snuggle theory’ – the idea that viewing horror films may be a rite of passage for young people, providing them with an opportunity to fulfil their traditional gender roles. A paper from the late 1980s by Dolf Zillmann, Norbert Mundorf and others found that male undergrads paired with a female partner (unbeknown to them, a research assistant), enjoyed a 14-minute clip from Friday the 13th Part III almost twice as much if she showed distress during the film. Female undergrads, by contrast, said they enjoyed the film more if their male companion appeared calm and unmoved. Moreover, men who were initially considered unattractive were later judged more appealing if they displayed courage during the film viewing.

Surely asking people to watch horror films with a companion who is secretly working with psychologists to study their reactions to fear is a fantastic plot for a horror film.

Yours for only $1 million, Wes Craven.

Link to ‘The Lure of Horror’ in The Psychologist.

Declaration of interest: I’m an unpaid associate editor and occasional columnist for The Psychologist. I avoid exploring abandoned houses on the edge of town.

Nasal mummy exit

A new study just published in the Journal of Comparative Human Biology takes an enthusiastic look at exactly how the Ancient Egyptians removed the brain from cadavers before they were mummified.

You’ll be pleased to know that a variety of techniques were used over the millennia, although unfortunately none makes for particularly good dinner-time conversation, all being slightly gory.

But for those not gathered round the table, the article is joyously over-detailed. In this part, the authors consider the history of scientific attempts to understand how you get a brain out of a dead person working only through the nose.

Speculation surrounding the steps following perforation has inspired experimental attempts at excerebration in sheep and human cadavers. The general consensus is that either the brain was macerated by means of the vigorous insertion and rotation of the perforation tool or other similar instrument, or that the brain was simply allowed to liquefy in the hot Egyptian environment. The first method, consistent with the account of Herodotus, is withdrawal of residues on the perforation tool or its like and Macalister (1894) refers to a three-toothed hook pictured in Chabas’ Études sur l’Antiquité Historique (p. 79) that may have been used to this effect.

Similarly, Pirsig and Parsche (1991) suggest that a bamboo rod tied with linen may have sufficed for this piecemeal extraction of semi-liquid brain. Both of these techniques are time intensive, with the rod drawing out little of the brain on each retraction. Alternative to, or in conjunction with, the previous method it has been suggested that the liquefied or semi-liquid brain might be allowed to drain from the cranium by placing the body prone. This process might also be expedited by flushing the cranium with water or other fluids, such as the cedar oil used to dissolve organs in Herodotus’ account of the “second process” of mummification.

The ‘experimental attempt’ at trying this out on a human cadaver is referenced to a 1911 German book by Karl Sudhoff, with a title that translates to ‘Egyptian mummification instruments’.

I can’t imagine exactly how the experiment came about but presumably the chap got so enamoured with the tools he was collecting he just wanted to ‘have a bit of a go himself’.

Link to locked article. Or rather, entombed.

Bad celebrity tie-ins

No celebrity disaster is too tragic to remind us of an interesting fact about cognitive science. Some lowlights from the genre.

Lindsay Lohan is likely to be jailed for violating her probation says The Christian Science Monitor – clearly an example illustrating recent findings from research on how behavior is influenced by like-minded cohorts rather than essential values.

Charlie Sheen? say CBS. I suspect you want to hear about a new study on the cognitive science of self-deception. Guest appearance by Colonel Qaddafi.

An anti-semitic tirade by Mel Gibson reported by The LA Times. Quick, look over there! Wha..? Oh, nothing. The neural basis of alcohol-related disinhibition.

The New York Times don’t know how Amy Winehouse tragically died, but if you’re thinking what I’m thinking (wink, wink) then why wouldn’t you want to hear about the role of genes, environment and psychology in overdose and addiction?

But this, from The Globe and Mail, surely takes the biscuit. It contains a paragraph that will probably be stolen by The Onion.

But neuroscientists, despite 15 years of brain-imaging studies, are unable to define the circuitry involved in creative thinking. They don’t know what is different about the brains of creative geniuses like Steve Jobs, the visionary co-founder of Apple Inc. who died on Wednesday.

Elvis, of course, was a neuroscientist.

Ten years of the language gene that wasn’t

It’s now ten years since mutations in the FOXP2 gene were linked to language problems, which led to lots of overblown headlines about a ‘language gene’, which it isn’t.

The actual science is no less interesting, however, and Discover Magazine has a fantastic article that looks back on the decade since the gene’s discovery and what it tells us about the complex genetics that support linguistic development and expression.

There’s also a fascinating bit about the history of attempts to explain how humans developed language, which apparently got so ridiculous that speculation was banned by learned societies in the 19th century:

Lacking hard evidence, scholars of the past speculated broadly about the origin of language. Some claimed that it started out as cries of pain, which gradually crystallized into distinct words. Others traced it back to music, to the imitation of animal grunts, or to birdsong. In 1866 the Linguistic Society of Paris got so exasperated by these unmoored musings that it banned all communication on the origin of language. Its English counterpart felt the same way. In 1873 the president of the Philological Society of London declared that linguists “shall do more by tracing the historical growth of one single work-a-day tongue, than by filling wastepaper baskets with reams of paper covered with speculations on the origin of all tongues.”

Like a 19th century reverse scientific X-Factor where people voted to ban people from speculating further. I think I may have found a gap in the market.

Link to Discover article on ‘The Language Fossils’.

Make study more effective, the easy way

Decades-old research into how memory works should have revolutionised University teaching. It didn’t.

If you’re a student, what I’m about to tell you will let you change how you study so that it is more effective, more enjoyable and easier. If you work at a University, you – like me – should hang your head in shame that we’ve known this for decades but still teach the way we do.

There’s a dangerous idea in education that students are receptacles, and teachers are responsible for providing content that fills them up. This model encourages us to test students by the amount of content they can regurgitate, to focus overly on statements rather than skills in assessment and on syllabuses rather than values in teaching. It also encourages us to believe that we should try and learn things by trying to remember them. Sounds plausible, perhaps, but there’s a problem. Research into the psychology of memory shows that intention to remember is a very minor factor in whether you remember something or not. Far more important than whether you want to remember something is how you think about the material when you encounter it.

A classic experiment by Hyde and Jenkins (1973) illustrates this. The researchers gave participants lists of words and later tested their recall of them. To affect how they thought about the words, half the participants were told to rate the pleasantness of each word, and half were told to check whether the word contained the letters ‘e’ or ‘g’. This manipulation was designed to affect ‘depth of processing’. Participants in the pleasantness-rating condition had to think about what each word meant and relate it to themselves (how they felt about it) – “deep processing”. Participants in the letter-checking condition just had to look at the shapes of the letters; they didn’t even have to read the word if they didn’t want to – “shallow processing”. The second, independent, manipulation concerned whether participants knew they would be tested later on the words. Half of each group were told this – the “intentional learning” condition – and half weren’t told, so the test would come as a surprise – the “incidental learning” condition.

I’ve made a graph so you can see the effects of these two manipulations.

As you can see, there isn’t much difference between the intentional and incidental learning conditions. Whether or not a participant wanted to remember the words didn’t affect how many words they remembered. Instead, the major effect is due to how participants thought about the words when they encountered them. Participants who thought deeply about the words remembered nearly twice as many as participants who only thought shallowly about the words, regardless of whether they intended to remember them or not.
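The 2×2 pattern can also be summarised numerically. The figures below are purely illustrative – they are not Hyde and Jenkins’ actual data, just numbers chosen to mirror the pattern described above – but a quick sketch in Python shows how the two effects compare in size:

```python
# Hypothetical mean recall scores (words remembered), chosen only to
# mirror the reported pattern: depth matters, intention barely does.
recall = {
    ("deep", "intentional"): 16.0,
    ("deep", "incidental"): 15.5,
    ("shallow", "intentional"): 8.5,
    ("shallow", "incidental"): 8.0,
}

def mean(values):
    return sum(values) / len(values)

# Average over the other factor to isolate each main effect.
deep = mean([v for (depth, _), v in recall.items() if depth == "deep"])
shallow = mean([v for (depth, _), v in recall.items() if depth == "shallow"])
intentional = mean([v for (_, intent), v in recall.items() if intent == "intentional"])
incidental = mean([v for (_, intent), v in recall.items() if intent == "incidental"])

print(f"depth effect:     {deep - shallow:+.2f} words")            # +7.50
print(f"intention effect: {intentional - incidental:+.2f} words")  # +0.50
```

Laid out this way, the point is hard to miss: averaged across conditions, the deep processors recall nearly twice as many words as the shallow processors, while wanting to remember adds almost nothing.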

The implications for how we teach and learn should be clear. Wanting to remember, or telling people to remember, isn’t effective. If you want to remember something you need to think about it deeply. This means you need to think about what the material means, both in relation to other material you are trying to learn, and to yourself. Other research in memory has shown the importance of schema – memory patterns and structures – for recall. As teachers, we try and organise our course material for the convenience of students, to best help them understand it. Unfortunately, this organisation – the schema for the material – then becomes part of the assessment and something which students try to remember. What this research suggests is that, merely in terms of remembering, it would be more effective for students to come up with their own organisation for course material.

If you are a student, the implication of this study and those like it is clear: don’t stress yourself with revision where you read and re-read textbooks and course notes. You’ll remember better (and understand much better) if you try and re-organise the material you’ve been given in your own way.

If you are a teacher, like me, then this research raises some disturbing questions. At a University the main form of teaching we do is the lecture, which puts the student in a passive role and, essentially, asks them to “remember this” – an instruction we know to be ineffective. Instead, we should be thinking hard, always, about how to create teaching experiences in which students are more active, and about creating courses in which students are permitted and encouraged to come up with their own organisation of material, rather than just forced to regurgitate ours.

Reference: Hyde, T. S., & Jenkins, J. J. (1973). Recall for words as a function of semantic, graphic, and syntactic orienting tasks. Journal of Verbal Learning and Verbal Behavior, 12(5), 471–480.

Now available in Italian: Insegnare ed apprendere in modo efficace (thanks Giuliana!)

Steven Pinker: a life in brawls

There’s an excellent interview with Steven Pinker on the BBC Radio 4 programme The Life Scientific that takes a look back at his work and his involvement with a long list of enjoyable controversies.

For those over-saturated with discussion about his new book on the decline of violence, The Life Scientific interview is actually a refreshing retrospective that reviews his career as a whole.

It tackles everything from the cognitive science of word learning to brawls over the influence of genetics on human behaviour (bonus segment: Oliver James making a tit of himself in a live radio debate).

A thoroughly engrossing discussion although if you want the podcast you’ll have to download it from a separate page (linked below) because linking to the podcast is a bit too advanced for the BBC.

Link to BBC Pinker interview and streaming audio.
Link to podcasts of The Life Scientific interviews.