Turn left at the surge of excitement

We covered Christian Nold’s brilliant project to create emotion maps of cities before, and I had the pleasure of going to the launch of his new book on Emotional Cartography on Friday. It’s awesome for lots of reasons, but one of the best ones is that you can download it free from the project website.

Nold came up with the idea of fusing a GSR device – a skin conductance monitor that measures arousal – with a GPS receiver, allowing stress to be mapped to particular places. He then gets people to walk around and creates maps detailing the high-arousal areas of cities.
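The fusion itself is conceptually straightforward: both devices log against a clock, so each skin conductance sample can be paired with the nearest GPS fix and flagged when arousal is high. Here's a minimal sketch of the idea in Python – the function names, data layout and threshold are my own illustration, not Nold's actual software:

```python
from bisect import bisect_left

def nearest_fix(gps_fixes, t):
    """Return the GPS fix (timestamp, lat, lon) closest in time to t.
    gps_fixes must be sorted by timestamp."""
    times = [f[0] for f in gps_fixes]
    i = bisect_left(times, t)
    candidates = gps_fixes[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda f: abs(f[0] - t))

def arousal_map(gsr_samples, gps_fixes, threshold):
    """Tag each (timestamp, conductance) GSR sample with a location,
    flagging it as 'high arousal' when conductance exceeds the threshold."""
    return [
        {"lat": lat, "lon": lon, "gsr": g, "high": g > threshold}
        for (t, g) in gsr_samples
        for (_, lat, lon) in [nearest_fix(gps_fixes, t)]
    ]

# Toy walk: GPS logged every 5 s, GSR sampled more often
gps = [(0, 51.501, -0.142), (5, 51.502, -0.141), (10, 51.503, -0.140)]
gsr = [(1, 0.8), (3, 0.9), (6, 2.4), (9, 2.1)]
points = arousal_map(gsr, gps, threshold=2.0)
```

Matching on nearest timestamp is the simplest possible approach; a real system would also need to calibrate the baseline per person, since absolute skin conductance varies widely between individuals.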

The biomapping website has some of the fantastic maps from the project.

His book, called Emotional Cartography: Technologies of the Self, contains some of the wonderful map images, but also chapters by artists, psychogeographers, designers, cultural researchers, futurologists and neuroscientists who examine the relationship between space and the self.

One of the chapters is written by our very own Tom Stafford who explores the neuroscience of the self through a case study of an amnesic patient from the scientific literature called SS, who seemed to be unaware of his own depression because of his profound memory problems. Tom also gave a great talk at the launch, which you can also read online.

If you want to read the book, and I highly recommend it, you can download it as a screen-quality or print-quality PDF, and it’s released under a Creative Commons license so you can take it to your nearest copy shop if you want a hard copy.

Link to Emotional Cartography website.
Link to Biomapping website.

The beautiful baby brain

Jonah Lehrer has an excellent piece in today’s Boston Globe about how babies’ brains develop and what psychologists are starting to understand about the infant mind.

It’s largely riffing on the work of Alison Gopnik, one of the world’s leading developmental psychologists, who has long argued that babies might be more conscious than adults and that we learn to filter the world and mentally manage its initial chaos.

While this less focused form of attention makes it more difficult to stay on task – preschoolers are easily distracted – it also comes with certain advantages. In many circumstances, the lantern mode of attention can actually lead to improvements in memory, especially when it comes to recalling information that seemed incidental at the time.

Consider this memory task designed by John Hagen, a developmental psychologist at the University of Michigan. A child is given a deck of cards and shown two cards at a time. The child is told to remember the card on the right and to ignore the card on the left. Not surprisingly, older children and adults are much better at remembering the cards they were told to focus on, since they’re able to direct their attention. However, young children are often better at remembering the cards on the left, which they were supposed to ignore. The lantern casts its light everywhere.

I’m a bit sceptical of one bit of the article though, where it claims that babies have more neurons than adults, as researchers have only very recently attempted to make this estimate and, in fact, found that babies and adults have about the same number in the cortex, which makes up the vast majority of the brain.

In terms of synapses, the connections between neurons, this varies with the age of the infant. For example, have a look at this graph of synapse density as we grow, taken from a study of the human cortex.

Newborns start with fewer synapses than adults but this number rockets, so by six months of age we have approximately twice as many connections. This tails off as the brain prunes connections on a ‘use it or lose it’ basis.

I’m always slightly awestruck whenever I view that graph as it is a vivid illustration of the incredibly rapid changes that take place as we grow and learn to make sense of the world.

It’s this same sense of awe that the Boston Globe manages to capture as it explains how understanding the baby’s brain can help us make sense of the adult mind.

Link to Boston Globe article ‘Inside the baby mind’.

Should we be trying to stop dream violence?

The Onion has a video of a funny spoof news report on “Should We Be Doing More To Reduce The Graphic Violence In Our Dreams?”

It gets a little bit gory towards the end, so if that’s not your thing, you may prefer another one of their recent reports on the news that “70 percent of all praise is sarcastic”.

Link to ‘Should We Be Doing More To Reduce The Violence In Our Dreams?’
Link to ’70 Percent of All Praise is Sarcastic’.

CIA psychology through the declassified memos

I’ve been reading the recently released CIA memos on the interrogation of ‘war on terror’ detainees. The memos make clear that the psychological impact of the process is the most important aim of interrogation, from the moment the detainee is captured through the various phases of interrogation.

Although disturbing, they’re interesting for what they reveal about the CIA’s psychologists and their approach to interrogation.

General framework
It is clear that empirical psychological science is core to interrogation-based intelligence gathering on both the individual and general approach levels. In clinical psychology, this is known as the scientist-practitioner model, where scientific research is used to understand types of problems and design interventions, but also where an iterative hypothesis-testing information-gathering process is applied to each individual.

The memos state that psychologists are involved in both directing interrogations and mental health assessments, making it likely that the majority of military psychologists are originally trained as clinical psychologists.

Indeed, after a visit to Guantanamo Bay, American Psychological Association president Ronald Levant wrote about his trip in an article for Military Psychology, noting “I turned to see a former doctoral student in clinical psychology from Nova Southeastern University (NSU), who is now a military psychologist”. NSU strongly emphasises the scientist-practitioner model, and it is this style of clinical psychologist that probably makes up the bulk of the CIA’s ‘Behavioral Science Consultation Teams’ (BSCTs).

It is also clear that the CIA are interested in finding out two types of information: one, intelligence from the detainees, and two, which methods are most effective in doing so. It is interesting that all references to the impact and effectiveness of the interrogation methods are based on single cases (x has started giving intelligence after the use of y) or data from the US Military’s own SERE interrogation resistance programme, run on its own personnel.

There is no significant blacked out text in these sections, indicating that there are unlikely to be other key sources of evidence (such as secret research on the effectiveness of torture). In other words, Guantanamo and other interrogation facilities are as much interrogation labs as they are interrogation centres.

Integrated physiological monitoring
The memo [pdf] that discusses the interrogation of ‘al-Qaeda operative’ Abu Zubaydah has an interesting part where it states that “in an initial confrontational incident, Zubaydah showed signs of sympathetic nervous system arousal”. This would suggest that the detainees are wired up to a system that detects physiological arousal – probably GSR, blood pressure, heart rate or a similar combination.

This would allow the interrogators to look for patterns in stress responses and focus on areas where stress was present despite an outward appearance of calm. The memo also notes that Zubaydah “appears to have a fear of insects”. Assuming that detainees would not voluntarily disclose their phobias, we can assume that likely phobias are detected by exposing the detainee to photos or situations related to common fears and then monitoring the detainee for abnormal stress responses.
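We can only guess at the detail, but flagging an ‘abnormal stress response’ is, at its simplest, an outlier-detection problem: compare each reading against the person’s baseline and flag large deviations. A toy sketch of that general idea – purely illustrative, not a description of any actual system:

```python
from statistics import mean, stdev

def stress_spikes(readings, z_threshold=2.0):
    """Flag indices where a physiological reading (e.g. skin conductance)
    lies more than z_threshold standard deviations above the series mean."""
    mu = mean(readings)
    sigma = stdev(readings)
    return [i for i, r in enumerate(readings)
            if sigma > 0 and (r - mu) / sigma > z_threshold]

# A mostly calm series with one strong response at index 5
baseline = [1.0, 1.1, 0.9, 1.0, 1.05, 3.5, 1.0, 0.95]
print(stress_spikes(baseline))  # → [5]
```

A single global mean and standard deviation is the crudest possible baseline; skin conductance drifts over time, so anything realistic would use a rolling window.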

Profiling
The summary of the psychological profile of Zubaydah is notable for the fact it doesn’t use the psychoanalytic or psychodynamic language more favoured by FBI profilers, instead using the relatively plain language of cognitive and psychometric approaches. For example, it describes his “coping resources”, rather than his ‘defences’, “problems” rather than ‘conflicts’ and makes no reference to any unconscious desires or motivations.

The profile is apparently “based on interviews with Zubaydah, observations of him, and information collected from other sources such as intelligence and press reports”. As with the FBI, there are likely to be formal psychometric methods for analysing self-written text to help inform the personality profile, although the complete profile is probably put together by a psychologist who integrates the various sources of information with only a conservative level of interpretation.

Confused understanding of ‘learned helplessness’
A couple of the memos note that the whole interrogation procedure and environment is designed “to create a state of learned helplessness”. This is a concept originally developed by psychologist Martin Seligman, who found that dogs given inescapable electric shocks would eventually just give up trying to avoid them and remain passive while electrocuted. The theory was related to depression, where people with no control over their unpleasant lives supposedly just learnt to be withdrawn and passive.

The concept is not particularly well validated, but even if it was and you were an interrogator, you’d want to avoid learned helplessness at all costs, because the detainee would see no point in co-operating. Furthermore, the acceptance of the theory is in direct contrast to the claims that the interrogations should not cause “severe physical or mental pain or suffering.” Learned helplessness is, by definition, the effect of chronic uncontrollable suffering.

What the interrogators want, and indeed, what the memos describe, is not learned helplessness, but a state in which the detainees know and can demonstrate that co-operation is the only method that allows them control over their environment. This is more akin to sociologist Erving Goffman’s concept of a total institution.

Clues and curiosities
One memo [pdf] mentions the concept of ‘resistance posture’, meaning the act of resisting the interrogators’ demands. The fact that a specific term is used, and that it is additionally referred to as something that could be measured (‘This sequence “may continue for several more iterations as the interrogators continue to measure the [detainee’s] resistance posture”’), suggests that this might be a specific psychological concept that is being empirically measured, perhaps through a combination of behavioural and physiological responses, presumably to help distinguish between resistance and genuinely not knowing the answer to a question.

It’s interesting that there is no reference to any neuroscience-based research or monitoring to justify conclusions, despite the widespread reports of the US secret services funding billions of pounds of research in this area. This may be because it’s too secret to release to the public, but it is just as likely that, as with other brain-based ‘prediction’ methods (neuromarketing, brain-scan ‘lie detection’) the data is less useful than more straightforward and better validated psychological and physiological methods.

As has been picked up by Wired, the claim that 180 hours of sleep deprivation is not harmful in the long term is based on a selective and limited reading of the scientific literature and is disputed by the people who carried out the research.

Link to PDFs of released memos.

The suicidal attraction of the Golden Gate Bridge

I’ve just found this morbidly fascinating article from a 2003 edition of The New Yorker that discusses the attraction of San Francisco’s Golden Gate Bridge to people who are suicidal.

It’s full of interesting snippets, like the fact that suicidal people tend to ignore the nearby and equally fatal Bay Bridge in favour of its more famous and more attractive cousin.

It also has quotes from some of the very few people who have ever jumped off the bridge and survived, and describes exactly what impact such a jump has on the body.

The article also touches on the debates over the erection of a suicide barrier on the landmark (it was finally decided in 2008 to put one in place) and the people-based suicide prevention methods.

It also has this lovely snippet about one of the police patrolmen, who has a wonderfully gentle way of talking to suicidal people:

Kevin Briggs, a friendly, sandy-haired motorcycle patrolman, has a knack for spotting jumpers and talking them back from the edge; he has coaxed in more than two hundred potential jumpers without losing one over the side. He won the Highway Patrol’s Marin County Uniformed Employee of the Year Award last year.

Briggs told me that he starts talking to a potential jumper by asking, “How are you feeling today?” Then, “What’s your plan for tomorrow?” If the person doesn’t have a plan, Briggs says, “Well, let’s make one. If it doesn’t work out, you can always come back here later.”

Apparently the article was the inspiration for the 2006 documentary film The Bridge which covered similar territory.

Link to New Yorker article ‘Jumpers’.

2009-04-24 Spike activity

Quick links from the past week in mind and brain news:

The first Neuroanthropology Conference kicks off in October and looks awesome.

Twitter causes immorality nonsense deftly dispatched by bloggers. Most mainstream press lost the plot although Time did a good job and Wired Science were keeping it real.

The Guardian reviews neurophysiologist Kathleen Taylor’s new book on cruelty.

AI system examines mysterious and ancient symbols from the long-lost Indus Valley civilization and suggests that they may represent a spoken language, reports Wired.

The Financial Times has a look at the Wellcome Collection’s latest exhibition on ‘madness and modernity’.

The links between autism and genius are explored by The Economist.

Not Exactly Rocket Science has a brilliant article on how touch-related brain activity reduces after only a couple of weeks of having your hand in a cast.

There are a couple of wonderful girl-with-exposed-brain paintings here.

The New York Times reports on mental illness, the musical! (thanks Daniel!)

BBC Radio 4’s Health Check has a programme on meningitis and supernumerary phantom limbs.

Newsweek has an interesting Q&A on the psychology of memory.

An extended and interesting article on the psychology of how we relate to the environment is published by The New York Times.

NeuroImage has an article arguing for community neuroimaging databases. Hallelujah and amen!

Is there a link between autistic traits and anorexia? asks New Scientist.

Frontal Cortex has an excellent piece on the commuter’s paradox – where we consistently underestimate the pain of a long commute.

Rapid emotional swings could predict violence in psychiatric patients suggests new research covered by Science News.

BBC News on the impressive ‘Blue Brain’ project, whose team seem to like talking themselves up rather a lot. Apparently it is just a “matter of money” to simulate a whole brain (oh, and a good conceptual understanding of how the brain actually works beyond simplified models of the neocortical column).

18 ways attention goes wrong. PsyBlog continues riffing on attention by listing several related problems.

Psychiatric Times has an excellent article on the philosophy of psychiatry and how we define what counts as a mental illness. Bonus ‘internet addiction’ slapdown included.

Neuronarrative on a study suggesting that TV may be a surrogate for social interaction.

New ‘mind reading’ consumer EEG headsets about to hit the shelves with dull-looking games, according to New Scientist. They look fantastic, but don’t believe the hype – the fun will be in equipment hacks and data aggregation projects.

The Economist has a couple more good articles: one on the cognitive benefits of bilingual babies and the other on preconscious action selection and free will.

Makers of the antidepressant Lexapro (escitalopram) may be gearing up for the latest in a long line of heavyweight US government fines for illegal promotion, reports Furious Seasons.

From the four humours to fMRI

The excellent Cognition and Culture blog found a fascinating lecture by the energetic medical historian Noga Arikha about the four humours theory of medicine and how its legacy influences our modern day ideas about the mind and brain.

The four humours theory suggested that the function of the mind and body was determined by the balance of four fluids in the body: black bile, yellow bile, phlegm, and blood.

While specific diseases were explained in this way, so were character traits and, in their excess, mental illness.

Indeed, some of the old names for these fluids still survive as descriptions of character traits (for example, we can still describe someone as phlegmatic or sanguine) even if we’re unaware of their origins.

However, Arikha outlines how it’s possible to trace the thinking behind humoural theories right through history into our current ideas about mind and brain in the age of brain scans and cognitive neuroscience.

The talk is based on her book, called Passions and Tempers: A History of the Humours, and the video is a bit shaky at times but worth sticking with as it’s an engrossing lecture.

Link to video of talk by Noga Arikha.

Phantom portraits

I’ve just found a gallery of one of my favourite art science projects of all time which used digital photo manipulation to illustrate the phantom limbs of post-amputation patients.

The images are incredibly striking, because they vividly illustrate that phantom limbs are often only phantom part-limbs. Sections can be missing, even in the middle, so a phantom hand can be felt even if a phantom elbow cannot.

Or perhaps a phantom hand can feel as if it protrudes directly from the point of amputation at the shoulder, or perhaps it feels distorted, or perhaps has no intervening phantom arm, or perhaps it is stuck in one position, and so on.

The project was the brainchild of neuropsychologist Peter Halligan, neurologist John Kew and photographer Alexa Wright. Actually, Peter is an ex-boss and I spent several years of my PhD with a huge picture of RD (above) in my office and it never failed to amaze me.

Unfortunately, the pictures in the online gallery are viewable but a little small, although there are some larger versions if you scroll down in this essay.

Link to After Images online gallery.

Reverse psychology in a pill: anti-placebo

You may be aware of the placebo effect, where an inert pill has an effect because of what the patient thinks it does. You may even be aware of the nocebo effect, where an inert pill causes ‘side-effects’. But a fascinating 1970 study reported evidence for the anti-placebo effect, where an inert pill has the opposite effect of what it is expected to do.

Storms and Nisbett were two psychologists interested in attribution, the process of how we explain the causes of events and the impact this has on how we feel.

We know that attributions have a big impact on our level of physical and emotional health. For example, your heart is racing when you’re about to give a talk. If you attribute it to a weak heart, you may start worrying whether you might pass out and become incredibly stressed, but if you attribute it to the situation, you might just think it’s a natural reaction to the event and feel primed and ready.

In anxiety disorders, we know that people often attribute natural bodily reactions to frightening causes, which makes people feel more on edge, and hence, their body kicks into an even higher gear, and so on. The cycle continues, to fever pitch. In essence, it’s anxiety-fuelled anxiety.

Insomnia has an element of this. People can be worried that they’re not sleeping, and so get anxious thoughts when they go to bed, and so feel on edge, ad nocturnum, until the early hours.

So rather than getting people to fill in questionnaires about causes of insomnia, a typical method in attribution research, Storms and Nisbett wanted to test these ideas in the real world.

They recruited a group of patients with insomnia and told them they were doing a four-night study on dreaming and asked them to rate their difficulty in falling asleep each night.

The first two nights were exactly that, a sleeping and rating exercise, but on the third night the participants were given pills. One group was told that the pill would make them feel more aroused, like a shot of caffeine, while the others were told that the pill would make them feel more relaxed, like a sleeping pill.

On the fourth night, the group were given the ‘opposite’ pill, but in reality, all the pills were identical and completely inert, containing nothing more than sugar.

Now here’s the thing. The insomnia patients taking the ‘relaxation’ pills slept really badly, and the patients taking the ‘arousal’ pills slept much better.

What seemed to be happening was that patients taking ‘uppers’, normally trapped in a cycle of anxious self-monitoring, could attribute any arousal they had to the pill. Any sign of feeling wired wasn’t them, it was the pill, so they could relax and fell asleep easily.

In contrast, those who had taken the ‘downers’ thought that any arousal must be their insomnia causing them problems, and it must be really bad, because it was getting to them despite the supposed sleeping pill they’d taken. In other words, they were freaking out because they couldn’t sleep despite the ‘medication’.

It turns out that this simple experiment wasn’t easily replicated but the problem was solved in 1983 when it was realised that this effect only held for people with insomnia who obsessively self-monitored.

But what these experiments tell us is that the effects of medication, the symptoms of illness and even the process of ‘being sick’ are partly dependent on our own ideas about what’s happening.

Link to PubMed entry for original Storms and Nisbett study.
Link to 1983 replication.

Taking pride in your posture

A simple but elegant study just published in the European Journal of Social Psychology found that getting people to generate words about pride caused them to unknowingly raise their posture, while asking them to generate words about disappointment led to an involuntary slouch.

The research team, led by psychologist Suzanne Oosterwijk, asked people to list words related to ‘pride’ and ‘disappointment’, and some emotionally neutral control categories of ‘kitchen’ and ‘bathroom’, while being secretly filmed.

‘Pride’ caused a slight increase in posture height, while ‘disappointment’ caused the participants to markedly slouch.

The researchers suggest that the activation of the concept of disappointment led to a spontaneous bodily simulation of the feeling. They link this to the idea of embodied cognition that suggests that our mental life is fundamentally connected to acting on the world.

As we discussed last year, research has suggested that bodily expressions of pride and shame are the same across cultures, indicating that this connection between action and emotion may be a core part of our emotional make-up.

Link to abstract of study (via the BPSRD).

The medieval senses and the evil eye

The latest edition of neurology journal Brain has an extended review of three books about the history of the senses which gives a fascinating insight into how the meaning of our sensory experiences has changed over the centuries.

This paragraph is particularly interesting as it relates medieval theories of perception to the superstition of the ‘evil eye‘ where you could curse someone by looking at them.

While we now think of vision as a system for interpreting passively received light, the ‘evil eye’ makes much more sense when you realize that medieval people thought that light rays could fundamentally influence what they touched and even that the eyes actively sent out rays that could influence the objects within sight.

In 1492, learned debates also influenced how the world was perceived. As medical historians Nancy Siraisi and James T. McIlwain, also a neuroscientist, point out, medieval scholars would have located sensory perception in the brain (Siraisi, 1990; McIlwain, 2006). However, they would have perceived the five senses as active entities conveying information about the outside world to the internal senses of common sense, imagination, judgement, memory and fantasy (the ability to visualize).

Scholars differed considerably over how this worked in practice: for example, were rays emitted from the eyes towards the viewed object or was it the other way round? Either theory allowed for these rays to influence both viewer and object, thus explaining the widespread concept of the evil eye, or a belief still current in the 18th century that what a mother saw affected her foetus. The brain, however, was not the only sensitive organ of the body.

The heart was believed to be the centre of the animal soul, and thus closely associated with more carnal senses such as touch. The brain, the centre of the rational soul, was more closely associated with sight; the eyes often viewed as the ‘windows of the soul’. Sight, therefore, was given pre-eminence in the pre-modern world as it is today, but often for spiritual reasons due to the inter-dependence of religion and rational knowledge (scientia).

Thus even if the brain functioned in the past very much as it does today, the emotional and moral meaning of sensory experience differed dramatically.

The whole review is worth reading in full, not just because of the insights into medieval psychology, but also because these new books introduce ‘sensory history’ – a history of ideas about how we experienced the world through our bodies.

Link to review.
Link to DOI entry for same.

Predicting the determined self-castrator

The Journal of Sexual Medicine has a surprising study looking at psychological attributes that predict which castration enthusiasts will actually go on to remove their own testicles, in contrast to those who just fantasise about it.

This is the abstract from the scientific paper:

A passion for castration: characterizing men who are fascinated with castration, but have not been castrated

Roberts LF, Brett MA, Johnson TW, Wassersug RJ.

J Sex Med. 2008 Jul;5(7):1669-80.

Introduction. A number of men have extreme castration ideations. Many only fantasize about castration; others actualize their fantasies.

Aims. We wish to identify factors that distinguish those who merely fantasize about being castrated from those who are at the greatest risk of genital mutilation.

Methods. Seven hundred thirty-one individuals, who were not castrated, responded to a survey posted on http://www.eunuch.org. We compared the responses of these “wannabes” to those of 92 men who were voluntarily castrated and responded to a companion survey.

Main Outcome Measures. Respondents answered the questionnaire items relating to demographics, origin of interest in castration, and ambition toward eunuchdom.

Results. Two categories of wannabes emerged. A large proportion (~40%) of wannabes’ interest in castration was singularly of a fetishistic nature, and these men appeared to be at a relatively low risk of irreversible genital mutilation. Approximately 20% of the men, however, appeared to be at great risk of genital mutilation. They showed a greater desire to reduce libido, change their genital appearance, transition out of male, and prevent sexually offensive behavior. Nineteen percent of all wannabes have attempted self-castration, yet only 10% have sought medical assistance.

Conclusions. We identify several motivating factors for extreme castration ideations and provide a classification for reasons why some males desire orchiectomies. Castration ideations fall under several categories of the Diagnostic and Statistical Manual of Mental Disorders, 4th Ed. (DSM-IV), most notably a Gender Identity Disorder other than male-to-female (MtF) transsexual (i.e., male-to-eunuch) and a Body Identity Integrity Disorder. Physicians need to be aware of males who have strong desires for emasculation without a traditional MtF transsexual identity.

We reported on an earlier study by the same research group last year, which discovered that ‘voluntary eunuchs’ report that they are pleased that they have had their testicles removed and seem mentally healthy.

Link to PubMed entry for study.

Inside Britain’s highest security psychiatric hospital

The Independent has an article giving a rare look inside Broadmoor Hospital, one of only four high-security psychiatric hospitals in the UK, which houses some of the most severely dangerous offenders with mental illness.

Broadmoor is the oldest and best-known high-secure hospital in Britain, having housed a string of high-profile murderers and other violent offenders from Victorian times to the present day.

The article focuses on the Paddock Centre, a new section to treat people with a dangerous and severe personality disorder (DSPD).

DSPD is not a medical diagnosis; it is a category created by the UK government to classify a group of offenders with a diagnosable personality disorder who are thought to be at risk of violent offending in the future.

The category was devised because the government wanted to find a way in which psychiatrists could treat persistently violent offenders with an antisocial personality disorder diagnosis, because the mental health act only allowed people to be detained if their condition was treatable.

Since there was no treatment, psychiatrists couldn’t detain such people and refused to do so, so the government created the category and changed the law so they could.

Hence we now have the rapidly expanding DSPD Programme and Broadmoor houses the Paddock Centre, the biggest DSPD centre in the country.

The category has caused a great deal of ethical debate and even heated argument, as it allows currently untreatable people to be detained on the basis of risk, rather than for committing a specific crime.

However, the Independent article is more focused on the day-to-day running of the unit, talking to its lead psychiatrists and giving a picture of how it functions.

Journalistic insights into Broadmoor are incredibly infrequent, so this is a rare opportunity to get a glimpse of what goes on. The only other recent example I can think of was a 2004 edition of BBC All in the Mind that you can still listen to online.

Link to Independent ‘Exclusive: Inside Broadmoor’.

The risks of cognitive enchantment

The New Yorker has a fantastic in-depth article about ‘cognitive enhancement’ that talks to some of the neuroscientists studying the effects and some of the mind tweakers who regularly pop pills to give themselves an edge.

One of the issues it touches on is whether cognitive enhancers really ‘enhance’ people, and there’s good evidence that for the highest achievers, the pills might not be of much benefit.

Even worse, it’s also likely that stimulant drugs such as Ritalin and Adderall could actually impair your performance even though you might feel as if you’ve had a mental boost.

Amphetamine increases focus and confidence and produces a euphoric feeling. Although the effects are less marked in the slow-release stimulants used for ADHD and appropriated for illicit mind tweaking, the effect is certainly still there.

What we do know, however, is that people with certain genotypes actually show a decrease in working memory performance when they take amphetamine.

And it turns out that these are the people most likely to already be at the high end of mental performance. This is from a classic study on the effect:

Amphetamine enhanced the efficiency of prefrontal cortex function assayed with functional MRI during a working memory task in subjects with the high enzyme activity val/val genotype [of the COMT gene], who presumably have relatively less prefrontal synaptic dopamine, at all levels of task difficulty.

In contrast, in subjects with the low activity met/met genotype who tend to have superior baseline prefrontal function, the drug had no effect on cortical efficiency at low-to-moderate working memory load and caused deterioration at high working memory load

In other words, it’s possible that high achievers might be popping stimulants, feeling like it boosts their performance, when in fact, it’s doing exactly the opposite.

The article explores more than just this area though, and is incredibly wide-ranging, looking at the neuroscience, the underground use of the drugs, legal aspects, new and current compounds, and so on.

It’s also one of the most interesting articles I’ve read on the subject for a while, which, for an area that attracts a lot of attention, has got to be a good thing.

Link to ‘The underground world of “neuroenhancing” drugs’.

Choice blindness

New Scientist has a fascinating article on some ‘I wish I’d thought of that’ research into how we justify our choices, even when the thing we’ve chosen has been covertly swapped. It turns out that most of the time we don’t notice the change and proceed to give reasons why the thing we didn’t choose was the best choice.

It’s a fantastic use of stage magician’s sleight of hand to make a change outside conscious awareness.

We have been trying to answer this question using techniques from magic performances. Rather than playing tricks with alternatives presented to participants, we surreptitiously altered the outcomes of their choices, and recorded how they react. For example, in an early study we showed our volunteers pairs of pictures of faces and asked them to choose the most attractive. In some trials, immediately after they made their choice, we asked people to explain the reasons behind their choices.

Unknown to them, we sometimes used a double-card magic trick to covertly exchange one face for the other so they ended up with the face they did not choose. Common sense dictates that all of us would notice such a big change in the outcome of a choice. But the result showed that in 75 per cent of the trials our participants were blind to the mismatch, even offering “reasons” for their “choice”.

The idea riffs on the well-known psychological phenomenon of change blindness, but it is also a lovely example of what Daniel Dennett called “narratization”: the mind’s ability to make a coherent story out of what’s happening, with you as the main character, even when it’s clear that the outcome was determined externally. In a well-known article, Dennett cites this process as the key to our understanding of the ‘self’.

This was vividly demonstrated in split-brain patients, in whom images can be shown to each hemisphere independently.

Each hand picks out a different picture, because the information is only accessible to the hemisphere that controls that side of the body. But when patients are asked why they chose the two pictures, they tell a story about how the pictures are related, even though they were never conscious of seeing both.

There’s a great summary in this New York Times piece from 2005, that comes highly recommended.

The New Scientist article covers this new technique for investigating the process, with a nifty video of the sleight of hand in action.

Link to NewSci on ‘Choice blindness: You don’t know what you want’.

Seized by the anti-storm

Newsweek has an excellent article on the neuroscience and personal impact of epilepsy. It’s well-researched, gripping in parts and bang up-to-date as it takes us through how neurologists tackle the seizure-prone brain.

I was particularly impressed by the following section as it avoids the common cliché of the epileptic ‘brain storm’ because, as we’ve discussed before on Mind Hacks, a seizure is not a storm of random brain activity.

In fact, it’s completely the opposite. During a seizure neurons become super-synchronised, pulsing together, so they can’t do their normal job. In effect, it’s an anti-storm.

Conceptually, the job of the cardiologist is straightforward: he needs to restore a damaged heart to its normal rhythm. But epilepsy is the opposite. A normal brain is governed by chaos; neurons fire unpredictably, following laws no computer, let alone neurologist, could hope to understand, even if they can recognize it on an EEG. It is what we call consciousness, perhaps the most mathematically complex phenomenon in the universe.

The definition of a seizure is the absence of chaos, supplanted by a simple rhythmic pattern that carries almost no information. It may arise locally (a “partial” seizure), perhaps at the site of an old injury, a tumor or a structural malformation. A network of neurons begin firing in unison, enlisting their fellows in a synchronous wave that ripples across the brain.

Or it may begin everywhere at once (“generalized” epilepsy), with an imbalance of ions across the cell membrane, usually the result of an inherited mutation. At a chemical signal, whose origin is still a mystery, billions of neurons drop the mundane business of running the body and join in a primitive drumbeat, drowning out the murmur of consciousness. And so in contrast to the cardiologist, the epilepsy doctor must attempt to restore not order, but chaos.

The article is very much epilepsy from the medical perspective, but it is probably the single best mainstream piece I’ve read that attempts to tackle this area.

If you only read a handful of epilepsy articles in your life, make this one of them. Well done Newsweek.

Link to Newsweek article ‘In the Grip of the Unknown’.