Magic in mind

Interest in the cognitive science of magic is really hotting up, with Nature Neuroscience having just published a review article jointly authored by some leading cognitive scientists and stage illusionists. They argue that by studying magic, neuroscientists can learn powerful methods for manipulating attention and awareness in the laboratory, which could give insights into the neural basis of consciousness itself.

The neuroscientists involved are Stephen Macknik and Susana Martinez-Conde, while the magicians are Mac King, James Randi, Apollo Robbins, Teller from Penn and Teller, and John Thompson.

If this collection of names sounds familiar, it’s because this time last year the same group presented a symposium at the Association for the Scientific Study of Consciousness on ‘The Magic of Consciousness’.

The new article rounds up the conference discussion and The Boston Globe has a piece looking at some of the highlights.

This is not the only cognitive science article that explores what neuroscience can learn from the mystic arts: psychologist Gustav Kuhn makes a similar case in a forthcoming article [pdf] for Trends in Cognitive Sciences.

Kuhn has done some fantastic experimental studies looking at eye movements and attention of people watching magic tricks.

It’s not only an academic interest, as Kuhn is apparently an illusionist himself and one of a number of psychologists who also happen to be stage magicians. Just off the top of my head, psychologists Richard Wiseman and Robert Moverman are also ex-professional conjurers. I’ve come across several others, so perhaps the surprise is not that these new articles have been published, but that they took so long.

Both articles look at some common and not so common magic tricks and explain the cognitive science behind how they work:

Persistence of vision is an effect in which an image seems to persist for longer than its presentation time [12, 13, 14]. Thus, an object that has been removed from the visual field will still seem to be visible for a short period of time. The Great Tomsoni’s (J.T.) Coloured Dress trick, in which the magician’s assistant’s white dress instantaneously changes to a red dress, illustrates an application of this illusion to magic. At first the colour change seems to be due (trivially) to the onset of red illumination of the woman. But after the red light is turned off and a white light is turned on, the woman is revealed to be actually wearing a red dress. Here is how it works: when the red light shuts off there is a short period of darkness in which the audience is left with a brief positive after-image of the red-dressed (actually white-dressed but red-lit) woman. This short after-image persists for enough time to allow the white dress to be rapidly removed while the room is still dark. When the white lights come back, the red dress that the assistant was always wearing below the white dress is now visible.

Link to Nature Neuroscience article (via BB).
pdf of Trends in Cognitive Sciences article.
Link to Boston Globe write-up.

Encephalon 51 arrives with a flourish

The 51st edition of the Encephalon psychology and neuroscience writing carnival has just been published online and is graciously hosted by The Mouse Trap.

It has a distinctly poetic theme on this occasion, with a set of cognitive science haikus enlivening proceedings.

A couple of my favourite posts include one on the continuing mirror neuron hype and another on the cultural feedback loop between psychiatry and our expression of mental distress.

Link to Encephalon 51.

On the edge of truth

Discover Magazine has a brief but interesting interview with ex-NSA psychologist Eric Haseltine, who directed research into interrogation and lie detection.

He discusses the use of new technologies that measure body and brain function – i.e. the still not-yet-very-good ‘brain scan lie detectors’ – but also talks about the skills humans need to be able to pick up when someone is trying to deceive them.

Interestingly, he cites the development of human skills as where the biggest advances are likely to be made in the future:

What is the hottest area today in deception detection?

Human lie detectors. I think the low-tech training of humans to be better interpreters of information is where the most productive work is going to be. The reason being that you can either train a human to do it or train a computer to do it, and human brains are still much better computers than computers are.

Link to Discover Magazine interview with Haseltine.
Link to New Yorker article on the shortcomings of ‘brain scan lie detection’.
Link to past interview with Haseltine on US national security.

Interview with self-trepanner, Heather Perry

Neurophilosophy has a fantastic interview with Heather Perry, a 37-year-old British woman who organised a modern-day trepanation – drilling a hole in her own skull – in an attempt to alter her state of consciousness.

Perry gives a lucid insight into her motivations and describes the somewhat ad-hoc operation in gory detail:

How exactly did you perform the trepanation?

I used a hand trepan initially, but that wasn’t proving to be terribly successful. Then there was a problem with the people who owned the property we were staying in, so we decided we’d have to just leave it. I wrapped my head up in a towel and we got out of there. A couple of days later, we had another go. We abandoned the hand trepan and got an electric drill instead. I injected myself with a local anaesthetic and then slashed a big T-shaped incision in my scalp, right down to the bone. I was sat there in the bathroom feeling quite relaxed and they started with the drill. It didn’t take that long at all, probably about 20 minutes. Eventually I could feel a lot of fluid moving around. Apparently, there was a bit too much fluid shifting around, because they’d gone a little bit too far and I was leaking some through the hole, but this wasn’t especially dangerous as there are three layers of meninges before you get to the brain.

It’s an interesting read not least because Perry is rather circumspect when discussing the procedure.

You might expect someone who had arranged for a hole to be drilled in her skull to be completely convinced by the rather far-out claims made for trepanation.

While she does mention some claimed effects and findings, she seems quite measured in her assessment and largely seems to have tried the procedure as an exploration rather than a ‘cure’ in any specific sense.

Link to Neurophilosophy interview with Heather Perry.

On the brains of the assassins of Presidents

This is a wonderfully written summary telling the story of how a father and son, both doctors, were involved in the post-mortem brain examinations of the assassins of US Presidents James Garfield and William McKinley.

The article is by neuroanatomist Duane Haines although, unfortunately, I haven’t read it as I don’t have access to the full paper. Luckily, the abstract is a joy to read in itself: a curious slice of neurological history in 300 words.

Spitzka and Spitzka on the brains of the assassins of presidents.

J Hist Neurosci. 1995 Sep-Dec;4(3-4):236-66.

Haines DE.

Although four American Presidents have been assassinated (Lincoln, Garfield, McKinley, Kennedy), only the assassins of Garfield (Charles Julius Guiteau) and McKinley (Leon Franz Czolgosz) were tried, convicted, and executed for their crime. In 1882 Edward Charles Spitzka, a young New York neurologist with a growing reputation as an alienist, testified at the trial of Guiteau.

He was the only expert witness who was asked, based on his personal examination of the prisoner, a direct question concerning the mental state of Guiteau. Spitzka maintained the unpopular view that Guiteau was insane. In spite of aggressive and spirited testimony on Spitzka’s part, Guiteau was convicted and hanged. However, even before the execution it was acknowledged, by some experts, that Spitzka was undoubtedly right.

About 20 years later, in 1901, Edward Anthony Spitzka, the son of Edward Charles Spitzka, was invited to conduct the autopsy on Czolgosz, the assassin of McKinley. At the time Spitzka the younger, who had just published a detailed series of papers on the human brain, was in the fourth year of his medical training. It was an unusual series of fortuitous events that presumably led to Edward A. Spitzka conducting the autopsy on the assassin of the President of the United States while still a medical student. This, in light of the fact that other experts were available.

Each Spitzka went on to a career of note and each made a number of contributions in their respective fields. It is however, their participation in the ‘neurology’, as broadly defined, of the assassins of Presidents Garfield and McKinley that remains unique in neuroscience history. Not only were father and son participants in these important events, but these were the only times that assassins of US Presidents were tried and executed.

Edward Spitzka was also known as one of the main proponents of the idea that masturbation caused madness, and wrote an 1887 article outlining 12 cases of ‘masturbatic insanity’.

Link to PubMed entry.

Constraining the ancient mind

As part of Seed Magazine’s series on innovative thinkers in science, they have published a podcast interview with archaeologist Lambros Malafouris, who is pioneering the study of ancient cultural artefacts as a way of constraining theories in evolutionary psychology.

One of the criticisms of some evolutionary psychology is that it too often involves over-interpretation and ‘just so’ stories – explanations of why we have certain psychological attributes that are stories rather than hypotheses that can be easily tested.

Malafouris has taken the novel approach of using the findings from archaeology to systematically generate and test theories of the evolution of the mind. He seems particularly interested in embodied cognition, the idea that the mind can only be understood in relation to how it interacts with the world through body and action.

The mainstream approach to cognition holds that it happens in the mind and that material culture is nothing more than an outgrowth of our mental capacities. Archaeologist Lambros Malafouris is challenging this deep-seated idea with a radical new notion: the hypothesis of extended mind, which posits that material culture is not a reflection of the human mind but an actual part of it. Take, for instance, a blind man’s stick. “Where does the blind man end and the rest of the world begin?” he says. “You might see the stick as something external, but it plays a very important role in the perceptual system of this person. It extends the boundaries of this human – the stick becomes an integral part of the cognitive architecture.”

If material culture is an extension of human cognition, our engagement with it has actively shaped the evolution of human intelligence, Malafouris argues. For example, ancient clay tablets that allowed people to actually write down records were not mere objects, he says. Instead, they became integral adjuncts of the human memory system. The invention of such a technology “changes the structure of the human mind,” says Malafouris, a post-doctoral fellow at the University of Cambridge. Rather than happening wholly in the head, he argues, cognition develops and evolves through the interplay between intelligence and material culture.

In fact, there’s an increasing focus on related ideas. Some of my favourite studies have been done by psychologist Dennis Proffitt, who has found numerous effects of tool use on thinking and perception.

In one study, he found that we perceive distances as shorter when we have a tool in our hand, but only when we intend to use it.

Malafouris draws on these ideas, adding to the relatively new but exciting field of cognitive archaeology.

Link to Seed interview with Lambros Malafouris.

Avalanche of new SciAmMind articles

The new edition of Scientific American Mind has just appeared with a whole host of new articles freely available online, covering the psychology of storytelling, gifted children, genius, animal intelligence, scent, smell and learning through error.

My favourite is the article on the psychology of storytelling and narrative, and why it could be intricately bound up with the cognitive abilities we’ve developed to navigate the social world.

The article is quite wide-ranging, dipping into anthropology, cognitive and evolutionary psychology to explore why stories are so central to cultures across the world.

Perhaps because theory of mind is so vital to social living, once we possess it we tend to imagine minds everywhere, making stories out of everything. A classic 1944 study by Fritz Heider and Marianne Simmel, then at Smith College, elegantly demonstrated this tendency. The psychologists showed people an animation of a pair of triangles and a circle moving around a square and asked the participants what was happening. The subjects described the scene as if the shapes had intentions and motivations—for example, “The circle is chasing the triangles.” Many studies since then have confirmed the human predilection to make characters and narratives out of whatever we see in the world around us.

But what could be the evolutionary advantage of being so prone to fantasy? “One might have expected natural selection to have weeded out any inclination to engage in imaginary worlds rather than the real one,” writes Steven Pinker, a Harvard University evolutionary psychologist, in the April 2007 issue of Philosophy and Literature. Pinker goes on to argue against this claim, positing that stories are an important tool for learning and for developing relationships with others in one’s social group. And most scientists are starting to agree: stories have such a powerful and universal appeal that the neurological roots of both telling tales and enjoying them are probably tied to crucial parts of our social cognition.

Link to August 2008 SciAmMind.

Cognitive restructuring and the fist bump terrorists

The recent satirical New Yorker cover depicting Obama and his wife as fist-bumping Islamic terrorists comes under fire in an article for The Chronicle by psychologist Mahzarin Banaji, who argues that it irresponsibly creates an implicit association between “Obama and Osama”. Banaji is almost certainly right, but she neglects higher levels of cognition which can make this association ineffectual.

Banaji is best known for her extensive work on the implicit association test (IAT), which we discussed only the other day. What this and other work has shown is that despite our conscious thoughts (“hair colour has no association with intelligence”) we might still have an unconscious bias that associates certain concepts (‘blonde’ and ‘dim’).
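The IAT’s published scoring is more involved than this, but the core idea – that an implicit bias shows up as a reaction-time difference between ‘congruent’ and ‘incongruent’ concept pairings – can be sketched in a few lines. Below is a simplified, illustrative D-score-style measure; the `iat_d_score` helper and the example reaction times are hypothetical, not the actual scoring procedure used by Banaji and colleagues:

```python
from statistics import mean, stdev

def iat_d_score(congruent_rts, incongruent_rts):
    """Simplified IAT-style effect size: the difference in mean
    reaction time (ms) between incongruent and congruent blocks,
    divided by the standard deviation of all trials pooled."""
    pooled = list(congruent_rts) + list(incongruent_rts)
    return (mean(incongruent_rts) - mean(congruent_rts)) / stdev(pooled)

# Hypothetical reaction times in milliseconds: a positive score
# means slower responses on incongruent pairings, i.e. an
# implicit association with the congruent pairing.
congruent = [650, 700, 620, 680, 710]
incongruent = [820, 860, 790, 900, 840]
print(round(iat_d_score(congruent, incongruent), 2))  # prints 1.75
```

Dividing by the pooled variability, rather than using the raw millisecond difference, is what lets scores be compared across people who respond at different overall speeds.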

Along these lines, Banaji suggests that Barry Blitt, the artist who created the picture, has harmed the political debate by unintentionally strengthening an inappropriate link:

The brain, Blitt would be advised to understand, is a complex machine whose operating principles we know something about. When presented with A and B in close spatial or temporal proximity, the mind naturally and effortlessly associates the two. Obama=Osama is an easy association to produce via simple transmogrification. Flag burning=unpatriotic=un-American=un-Christian=Muslim is child’s play for the cortex. Learning by association is so basic a mechanism that living beings are jam-packed with it – ask any dog the next time you see it salivating to a tone of a bell. There is no getting around the fact that the very association Blitt helplessly confessed he didn’t intend to create was made indelibly for us, by him.

It is not unreasonable, given the inquiring minds that read The New Yorker, to expect that an obvious caricature would be viewed as such. In fact, our conscious minds can, in theory, accomplish such a feat. But that doesn’t mean that the manifest association (Obama=Osama lover) doesn’t do its share of the work. To some part of the cognitive apparatus, that association is for real. Once made, it has a life of its own because of a simple rule of much ordinary thinking: Seeing is believing. Based on the research of my colleague, the psychologist Daniel Gilbert, on mental systems, one might say that the mind first believes, and only if it is relaxing in an Adirondack chair doing nothing better, does it question and refute. There is a power to all things we see and hear – exactly as they are presented to us.

It strikes me that Banaji is perhaps being a little disingenuous here. Certainly the cover does strengthen that unconscious association but, as is the intention of most satire, it also attempts to add another association to the mix – that of absurdity.

In other words, the idea of the cartoon is presumably to trigger the association Obama = terrorist, but also to include another so it becomes Obama = terrorist = absurd. It’s the humourist’s equivalent of the reductio ad absurdum argument.

Of course, this can rely as much on the same implicit associations as Banaji mentions, but it can also be seen to work very effectively through a process of reinterpretation that alters the impact of automatic connections by changing their meaning.

In fact, this process can be so powerful that it is used to treat psychiatric problems.

In clinical work it is called ‘cognitive restructuring’. For example, in panic disorder, people begin to interpret normal bodily reactions (increased heart rate, temperature etc.) as signs of an impending heart attack or other danger, which leads to more anxiety, further catastrophic interpretations and a spiral of terrifying anxiety.

Cognitive restructuring teaches people that these bodily changes and worried thoughts aren’t signs of an impending heart attack, they’re normal reactions, and the spiral of anxiety is not a risk to your health, just a pattern you’ve got into. In other words, they begin to believe something different about the significance of the link.

Humour also relies on a process of reinterpretation. Most theories of humour stress that it usually requires the reframing of a previously held association.

However, the key to good satire is that this reframing should be obvious and we might speculate that the reframing effect should be more powerful than the effect of simply reviving the old association.

We can perhaps wonder, then, whether the controversy over the New Yorker cover is not that it made an association between Obama and terrorism, but that it was not effective enough in making it obviously absurd.

I suspect one of the difficulties is that the cartoon was actually attempting to satirise not Obama, but the media discussion of him. This is always a risky strategy because it requires so much cognitive abstraction that the automatic association is far more apparent.

Link to Banaji’s article in The Chronicle.

2008-08-01 Spike activity

Quick links from the past week in mind and brain news:

Awesome Developing Intelligence post gives a remarkably concise review of cognitive science and discusses what this tells us about the best targets for cognitive enhancement.

BookForum looks at two memoirs that recount the psychological and physical intricacies of illness of the body and brain.

The mighty Language Log has a great analysis looking at the fallacies of yet another popular piece on sex differences in mind and brain.

The Economist has an article on the science of cognitive nutrition.

The ideas behind ‘critical neuroscience’ are discussed by Neuroanthropology.

Eric Schwitzgebel on the Wittgensteinian puzzle of whether philosophy solves problems with language or problems with the world.

ABC Radio National’s The Philosopher’s Zone has an interesting discussion on the philosophy of moral dilemmas.

While we’re on the subject of morality, the NYT Freakonomics blog has two guest posts on moral hypocrisy.

Sharp Brains has a special on mind and brain haikus.

ABC Radio National’s In Conversation looks at the anthropology of sisters, mothering and motherhood across the world’s cultures.

Dr Petra has the most sensible post you’ll read about the recent news reports on Viagra supposedly increasing sexual function in women who take antidepressants.

Advances in object recognition around age 2 may herald symbolic thought, reports Science News.

Pure Pedantry has an interesting commentary on the merits of postponing your alcoholism.

Perpetually falling woman learns to balance with her tongue. The Telegraph has a story about a woman who has lost her sense of balance owing to brain injury.

The Primary Visual Cortex is an excellent new blog on vision science and perception.

A robot that “resembles the love child of a monkey and an iMac”. The Times has an excellent piece on robots designed to emotionally interface with humans.

Not Exactly Rocket Science looks at a new study on language evolution in the lab and Wired Science has some further in-depth analysis.

A new book called ‘Brain Research for Policy Wonks’ is reviewed by Nature Reviews Neuroscience.

New Scientist has a special article and video report on the somewhat recursively titled ‘Seven Reasons Why People Hate Reason’.

The psychology of motivation – when passionate interest becomes a business – is discussed by The Washington Post.

The New York Times examines the methods and motivations of web trolls.

An eye-tracking study that compared how individuals with Williams syndrome (“hyper social”) and autism (“hypo social”) view pictures of social scenes is covered by The Neurocritic.

It is scientists who seek to get heaven in their heads

The wonderful image is an original drawing by the artist Masonic Boom, aka Kate St.Claire, as part of her series of psychological self portraits.

The quote in the image is from the author and philosopher G.K. Chesterton.

He was once asked by The Times to write an article on ‘What is wrong with the world?’ and sent the following piece:

Dear Sirs,

I am.

Sincerely yours,
G. K. Chesterton

Thanks to Katie for allowing us to feature the image – it’s really worth seeing full size at the link below.

Link to full size image on Flickr.
Link to Masonic Boom collection.

The theatre of hysteria

I’m currently reading Elaine Showalter’s book Hystories, a cultural history of the concept of ‘hysteria‘, a term which has variously described the supposed effects of a ‘wandering womb’, unexplained neurological symptoms, panic, nervousness or just ‘making a fuss’.

She describes where medicine and media have collided, and highlights how popular interest in the condition has driven a long-standing tradition of fictional interpretations that have developed alongside medical understanding.

Showalter writes from a feminist angle, although she is generally even-handed with the evidence and not shy of highlighting the excesses of some past feminist writing on the subject.

One particularly interesting part is where she discusses how theatre interpreted the work of 19th century French neurologist Jean-Martin Charcot as it was happening.

Charcot is perhaps most famous for his work on hysteria and held regular Tuesday lectures at the Salpêtrière hospital in Paris where he would theatrically demonstrate the symptoms of hysteria in favourite female patients who apparently ‘performed’ with an equal flourish.

As we mentioned previously, one of the reasons Charcot’s work was so widely known is because he used the newly developed technology of photography to create striking and sometimes pseudo-erotic portraits documenting the bodily contortions of his (largely) female patients. The picture on the right is of Augustine, one of his ‘star patients’.

These have been the inspiration for numerous contemporary plays, ballets, exhibitions and novels.

What I didn’t know was that these are not a modern phenomenon; shows based on Charcot’s work have been popular since he first began publishing and lecturing (from p100):

As Charcot’s clinic achieved celebrity in the 1890s, images of hysteria cross over to theatre and cabaret. At the Chat Noir and Folies Bergère, performers, singers, and mimes who called themselves the “Harengs Saurs Épileptiques” (The Epileptic Sour Herrings) or “Hydropathes” mimicked the jerky, zigzag movements of the hysterical seizure…

The poses of grande hystérie enacted at the Friday spectacles of the Salpêtrière closely resembled the stylized movements of French classical acting. Indeed, hysterical women at the clinic and fallen women in melodrama were virtually indistinguishable; the theatre critic Elin Diamond comments that both displayed “eye rolling, facial grimaces, gnashing teeth, heavy sighs, fainting, shrieking and choking; ‘hysterical laughter’ was a frequent stage direction as well as a common occurrence in medical asylums”…

Arthur Symons regarded the Moulin Rouge dancer Jane Avril as the embodiment of the age’s “pathological choreography.” These resemblances were not coincidental: writers, actresses, cabaret performers and dancers like Avril attended Charcot’s matinees and then worked the Salpêtrière style into their own performances.

An interesting twist is that Avril was actually treated by Charcot as a young girl after she ran away from an abusive mother and was admitted to the Salp√™tri√®re for ‘insanity’.

Link to details of Showalter’s book Hystories.
Link to first chapter.

The Maudsley cat

The not very good photo is of Coco, the Maudsley Hospital cat and one in a long line of felines who reside in psychiatric hospitals. Not all psychiatric hospitals have cats, but they’re not uncommon and exist as a sort of informal tradition of live-in feline therapy.

They’re very popular with both staff and patients, but their presence tends to drive managers up the wall, which just makes them all the more endearing. I’ve worked in three hospitals that have cats and almost invariably they live in the older adults ward, keeping the older folks company (and vice versa, of course).

The older adults ward at the Maudsley is called the Felix Post unit, after the distinguished psychiatrist of the same name. Coco’s predecessor was naturally called Felix, leading to occasional confusion where people assumed the ward was named after the cat.

As I hadn’t seen Coco all summer I enquired and it turns out he’s “gone to Liverpool”, which I’m assured isn’t a euphemism to protect those of fragile mood, but a genuine change in his location as the ward manager moved with Coco in tow. So for the first time in decades, the Maudsley is without a hospital cat.

Promising Alzheimer’s drug announced

The results of a moderate-sized trial of a new Alzheimer’s drug have just been announced and, if reliable, they suggest the treatment could be one of the most important medical breakthroughs of the century.

Alzheimer’s disease is a type of dementia, a degenerative disorder in which the brain starts to degrade more quickly than would be expected through normal ageing.

One of the common features of Alzheimer’s disease is the accumulation of neurofibrillary tangles in the brain. These are clumps of tau protein that accumulate inside dying neurons. There have been debates about whether these cause the problems or are just the result, but most researchers are now coming round to the idea that tau protein tangles are the main problem.

The drug has been given the trade name ‘rember’ and was initially thought to be useful because it dissolved tangles in the test tube. It has just been tested in a Phase II trial, the results of which have been announced at an Alzheimer’s research conference.

The results of the first announced trial have not been published but there are details in the conference press release, which I’ve included below the fold.

What’s most impressive from the preliminary details is that the drug seemed to slow or even stop cognitive decline in some cases, as well as eliminating the decline in blood flow in the areas usually most affected by the disease, suggesting that it is halting the spread of tangles.

Interestingly, the company behind the drug, TauRx, have just launched their website today to catch the wave of publicity.

However, I’m wondering whether there’s more to it than meets the eye because, if I’ve got it right, the drug isn’t actually new.

Its chemical name is methylthioninium chloride but it’s also known as methylene blue and was synthesised way back in 1876. It was shown to be active against malaria by Paul Ehrlich in 1891 and later as a useful antibacterial drug (have a look at this fascinating NYT article from 1910).

In the late 1980s it was tried as a treatment for manic-depressive disorder and found to be useful.

If this seems surprising, you may be interested to know that methylene blue was the basic compound from which the first antipsychotic drug, chlorpromazine or Thorazine, was made (in case you’re wondering, this family of antipsychotics can also work as anti-bacterial drugs, but they are not used this way because other drugs have fewer side-effects).

If this is really just methylene blue, what this means in financial terms is that the drug can’t be patented.

In other words, anyone can make the drug, which means it’s much harder to make money on it as pricing becomes competitive. In contrast, a patent gives you a time-limited monopoly – albeit one that can earn billions.

A widely available cheap generic drug that treats a major disease is actually a fantastic thing for society, but developing them is not typical behaviour for pharmaceutical companies who tend to shun unpatentable drugs.

Also, it’s probably true to say that the history of drug development shows a typical three-stage process:

1. We’ve found a miracle cure!
2. We’ve found a miracle cure, but it can kill people.
3. It’s not a miracle cure, it can kill people, but it’s worth the risk in many cases.

So, time will tell how useful it is in the real world, but pretty much everyone has their fingers crossed that it will work out as a useful treatment.

Link to write-up from The Telegraph.


Is the cinematograph making us stupid?

I’ve just found an eye-opening 2003 article in the Journal of the American Medical Association on the work of 19th-century neurologists George Beard and Silas Weir Mitchell, who thought the pace of life and the effect of new technology were harming the minds and brains of citizens in 1800s America – echoing similar concerns we still hear today.

The two physicians were influential in pushing the idea that these effects resulted in ‘neurasthenia‘, a kind of fuzzy catch-all diagnosis for mental or emotional malaise.

What’s interesting is we’re experiencing something almost identical over 100 years later.

As we’ve noted several times, leading scientists or commentators can make international headlines by simply suggesting that new technology is harming the mind, brain and relationships of the modern citizen, despite a general lack of evidence or flat out evidence to the contrary.

The JAMA article notes how neurasthenia was associated with the cultural concerns of the time:

Families migrated from the countryside to the city, men left traditional jobs as tradesmen and farmers to join the growing ranks of businessmen and office workers, women went from being mothers and daughters to also being university students and physicians, and technological developments such as telegraphs, telephones, and railroads became increasingly common parts of everyday life. As a diagnosis, neurasthenia commanded an intuitive legitimacy because it incorporated the anxieties that arose from these changes into the way people thought of their health. It could attribute a bank manager’s headaches to his hectic schedule and the obsession for detail his job demanded.

Similarly, a young woman’s depression could be understood as neurasthenia brought on by the mental drain of attending a newly founded coeducational university, where she competed for grades. In many cases, diagnoses of neurasthenia attached themselves to traditional ideals, such as the restorative virtues of farming vis-à-vis the fast-paced stress of modern business or the Victorian belief in women’s disposition for motherhood rather than scholarship. For Beard and Mitchell, neurasthenic patients were casualties of modern society whose bodies and minds simply could not keep up with the seemingly accelerated lifestyles of men and women in the latter part of the 19th century.

It’s a lovely illustration of the fact that since the dawn of popular medicine, our cultural concerns about changes in society are likely to be expressed in the language of illness and disease.

The article also notes that then, like now, the concerns are accompanied by an encouragement to return to the traditional ways of doing things (in this day and age – encouraging kids to ‘play proper games’ or have ‘genuine relationships’) rather than highlighting ways of healthy adaptation to the new technology.

This is not to say that all fears about new technologies are unfounded, but it’s clear that they are quickly medicalised and get far more prominence than the evidence supports, both in the 19th century and in the 21st.

Link to JAMA article ‘Neurasthenia and a Modernizing America’.

A party game that goes down like a red balloon

I just found this clever advert for The Economist which has an immediate impact but becomes a bit awkward if you think about it for too long.

Presumably, it’s meant to convey the idea that the magazine is ‘mind expanding’. But as we mentioned in an earlier post, we tend to ascribe different sorts of properties to the mind and brain.

One key difference is that we don’t ascribe physical properties to the mind, which is a bit of a pain when you’re trying to create a visual advert. So the designers went for a brain.

But ‘brain expanding’ is just kind of awkward. It makes me think of hydrocephalus – a condition where faulty fluid drainage causes internal pressure which literally balloons the brain.

In young children with soft skulls this causes skull deformation; in adults it just tends to squash the brain against the side of the skull. Either way, it usually needs surgical intervention to insert a shunt valve to treat the drainage problem, else brain damage and death follow in a high proportion of cases.

Nevertheless, if you can get your hands on any of these balloons you’ve instantly got yourself a neurosurgery party game for kids. The first kid to fashion a shunt out of a drinking straw gets a special John Holter prize.

Yes, I know I should get out more.

Link to Economist advert.

Juggling can change brain structure within 7 days

A new study just published in PLoS One reports that learning to juggle alters the structure of motion detection areas in the brain within as little as 7 days.

Led by neuroscientist Joenna Driemeyer, the study builds on previous research that also found juggling could alter brain structure, although that earlier study waited three months before the brain was checked for alterations using high resolution structural MRI scans.

This new study also took 20 non-jugglers and asked them to learn to juggle, but scanned them after 7, 14 and 35 days.

After only 7 days, a motion specialised part of the occipital lobe known as V5 had increased in density. In both studies, the changes were maintained over the subsequent weeks of practice, but these areas returned to their pre-learning state after several weeks without juggling.

This is an interesting example of rapid ‘neuroplasticity’, the ability of the brain to adapt structurally to new situations.

However, the authors are careful to note that they can’t tell whether the participants’ brains had generated more neurons, whether existing cells had grown in size, whether additional glial cells had developed, or whether there were simply changes in how much blood or other brain fluid packed the area.

Also, the changes seemed to occur at the beginning of the learning process, with further practice maintaining them but not causing additional ones. This led the researchers to speculate that learning a variety of new things, rather than simply practising old skills, may be most effective in terms of altering brain structure.

Link to ‘Changes in Gray Matter Induced by Learning – Revisited’.
Link to PubMed entry for paper.

Full disclosure: I’m an unpaid member of the PLoS One editorial board.