Beyond paddling: children and technology

One of the most sensible articles yet published on children, technology and the brain has just appeared in the scientific journal Neuron. It’s titled “Children, Wired: For Better and for Worse” and has been made open-access so you can read it in full online.

You’ll notice a few things that are different from your usual article about the impact of technology: it is written by cognitive scientists who are actually involved in the research; it is published in a peer-reviewed scientific journal; it discusses the whole range of evidence; and it hasn’t made any headlines.

Although it’s an academic article, it’s surprisingly readable and if you’re interested in the area, I highly recommend it.

This is not least because it points out lots of counter-intuitive findings in the scientific literature that are never covered by the people who usually spin the ‘I think it’s trash culture so it must be doing harm’ line.

For example, educational or ‘brain boosting’ applications may actually slow learning while ‘mindless’ video games can have sustained benefits:

Technology specifically developed for the purpose of enhancing cognitive abilities, such as infant-directed media including the ‘‘Baby Einstein’’ collection or various ‘‘brain games’’ designed for adults, may lead to no effects or, worse, may lead to unanticipated negative effects (Owen et al., 2010; Zimmerman et al., 2007). Meanwhile, technological applications that on the surface seem rather mindless (such as action video games) can result in improvements in a number of basic attentional, motor, and visual skills (Green and Bavelier, 2008; Greenfield, 2009).

It’s worth noting that there is good evidence that some educational TV programmes and software have a beneficial effect, but the point remains that you can’t guess the effect from the label.

The article is great at picking up on these complexities and noting the importance of fully considering content and context as well as the way technology delivers it.

My only quibble is a throwaway line where the authors consider addiction to video games and note we need to consider neurological evidence because: “The fronto-striatal pathway, which has been strongly implicated in both drug addiction and behavioral disorders such as pathological gambling is also activated by interaction with certain types of media technology, video games in particular”

As the ‘reward system’, of course, it’s strongly activated by lots of things we find pleasurable or useful – like listening to music, consuming soft drinks, co-operating with others and receiving a compliment.

There is nothing inherently pathological about the activity of this system, so we need to be careful that we are guided by what actually impacts on people’s lives and not get too dazzled by the bright lights of brain scanners. But this is a minor point in an overwhelmingly excellent piece.

The take home point is that the ‘technology is damaging the brain / eating our children / harming our culture’ stories are over-simplified to the point of absurdity. No-one could get away with a scare story about the whole of ‘transport’ but you can with ‘technology’ because it plays to our anxious stereotypes.

This is not to say that there aren’t some genuine areas of concern but these are little different from every other media that has come before: violence has a small but significant effect on aggression and doing anything to the detriment of a balanced education and active life will affect school progress and health.
 

Link to Neuron article with full text pdf link (via @bradleyvoytek).
Link to DOI entry.

Air gun psychology

An amusing YouTube video demonstrates Ivan Pavlov’s principle of classical conditioning with an air gun, a novelty alarm and a reluctant college roommate.

Pavlov discovered that we learn to associate an established response with a new event simply by repeatedly pairing the new event with a situation that already causes the response. Famously, he could trigger salivation in a dog with just the sound of a bell, simply by ringing the bell every time food was presented.

This video uses exactly the same principle, but instead of food, an airgun pellet is fired at a college roommate causing a painful reaction, and instead of a bell, an annoying novelty alarm is sounded.
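
For the quantitatively minded, the textbook account of how this kind of association builds up trial by trial is the Rescorla-Wagner learning rule. Here’s a minimal Python sketch (not from the original video or post – the parameter values are purely illustrative) of how the conditioned response strengthens with each pairing:

# Rough Rescorla-Wagner simulation: the associative strength V of the
# alarm grows each time it is paired with the unconditioned stimulus.

def rescorla_wagner(trials=10, alpha=0.3, lam=1.0):
    # alpha: learning rate (stimulus salience, illustrative value)
    # lam: the maximum association the pairing can support
    v = 0.0  # associative strength of the alarm before any pairings
    history = []
    for _ in range(trials):
        v += alpha * (lam - v)  # prediction error drives the learning
        history.append(round(v, 3))
    return history

print(rescorla_wagner())
# e.g. [0.3, 0.51, 0.657, ...] - the response strengthens with each pairing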

Science. Standing on the shoulders of giants.
 

Link to YouTube video.

Guided by voices

RadioLab has a fantastic mini-edition about the link between our internal thought stream and the development of auditory hallucinations – the experience of ‘hearing voices’.

The programme discusses the theory that the experience of hearing hallucinated ‘voices in your head’ occurs when we lose the ability to recognise our internal thoughts as our own.

Although there is some good evidence that, for example, people diagnosed with schizophrenia who hear voices are less able to recognise their own actions as self-generated, one crucial aspect not explained by the theory is why many ‘voice hearers’ experience voices with distinct identities.

For example, one person might hear the voice of their dead parent along with someone they knew from childhood, whereas another might have discovered the identities of their voices over time, simply from hearing them speak, and the voices seem to have no relation to specific people they’ve met in their lives.

The programme suggests an idea which, as far as I know, has never been discussed in the scientific literature: that the identities of the voices could originate in childhood, when we learn to internalise the voices of the people who give us instructions – an approach based on the theories of Lev Vygotsky.

It’s a delightful idea, if a little blue-sky, and is accompanied by a brilliant demonstration of the type of study that focuses on hallucinated voices.
 

UPDATE: There’s further discussion with references to Vygotsky’s work on self-talk and internalised thought from the interviewee, psychologist Charles Fernyhough, over at a great post on his blog.

 

Link to RadioLab ‘Voices in Your Head’ edition.

Why are overheard phone conversations so distracting?

Psychological Science has a brilliantly conceived study that explains why overhearing someone talk on a mobile phone is so much more annoying than simply overhearing two people in conversation.

It turns out that a one-sided conversation (brilliantly named a ‘half-a-logue’) draws in more of our mental resources because the information is less predictable – like being fed a series of verbal cliff-hangers.

Overheard Cell-Phone Conversations: When Less Speech Is More Distracting.

Psychol Sci. 2010 Sep 3. [Epub ahead of print]

Emberson LL, Lupyan G, Goldstein MH, Spivey MJ.

Why are people more irritated by nearby cell-phone conversations than by conversations between two people who are physically present? Overhearing someone on a cell phone means hearing only half of a conversation-a “halfalogue.” We show that merely overhearing a halfalogue results in decreased performance on cognitive tasks designed to reflect the attentional demands of daily activities. By contrast, overhearing both sides of a cell-phone conversation or a monologue does not result in decreased performance. This may be because the content of a halfalogue is less predictable than both sides of a conversation. In a second experiment, we controlled for differences in acoustic factors between these types of overheard speech, establishing that it is the unpredictable informational content of halfalogues that results in distraction. Thus, we provide a cognitive explanation for why overheard cell-phone conversations are especially irritating: Less-predictable speech results in more distraction for a listener engaged in other tasks.

 

Link to PubMed entry for study.

A stranger in half your body

An amazing study has just been published online in Consciousness and Cognition about a patient with epilepsy who felt the left half of his body was being “invaded by a stranger” when he had a seizure. As a result, he felt he existed in one side of his body only.

The research is from the same Swiss team who made headlines with their study that used virtual reality to make participants feel they were in someone else’s body, and one where brain stimulation triggered the sensation of having an offset ‘shadow body’ in patients undergoing neurosurgery.

The researchers suggest that having an integrated sense of our own bodies involves three types of perception: self-location – the area where we experience the self to be located; first-person perspective – the perceived centre of the conscious experience; and self-identification – the degree to which we identify sensations with our own bodies.

They report two case studies of patients with neurological disorders where self-identification goes haywire. This is the first:

Patient 1 is a 55 year old, left-handed male patient suffering from epilepsy since the age of 14 years. His simple partial sensorimotor seizures [where he remained ‘awake’ throughout] affecting his left hand had been well controlled under anti-epileptic medication until the onset of paroxysmal episodes of vertigo 9 years before the current hospitalization.

At that time he additionally started to experience the following, highly stereotypical pattern of symptoms: without any prior warning he would first have the impression of an increasing pressure in the entire left hemi-body. This sensation increased progressively in strength leading eventually to the sensation that he was invaded by a stranger in his left hemi-body.

At this time he also sensed that the left half of his head, the upper part of his left trunk, his left arm and his left leg were no longer belonging to him (no misattribution), that these parts were disconnected from the rest of his body, and that his body was divided into two parts (Fig. 1A [see image above]). Sometimes this was followed by the impression that the left arm was moving unintentionally and would disappear behind the patient’s back. During these episodes he never experienced any deformation or other changes of his body or the environment.

Furthermore, no autoscopic hallucinations, no sensation of floating or disembodiment, no change in visuo-spatial or first-person perspective, no disturbance of language or vision and no loss of contact or consciousness were noted. During these sensations the patient localized the self as within the right side of his body (shown in grey in Fig. 1 [above]). He managed to remain calm and was able to continue standing, walking, and even give oral presentations while in front of audiences at work (surrounding persons usually did not notice his seizure manifestations). These simple partial seizures occurred on a daily basis and lasted about 1 min.

These sorts of cases are useful because they help us understand whether theories about the brain and its relation to our experience are realistic.

For example, one test of the idea that body self-consciousness has three components (self-location; first-person perspective; and self-identification) would be to see if there are any patients who show disturbances to only one of these experiences due to a neurological problem.

This patient shows exactly this, giving us some additional evidence that the three-component idea is useful. It is not the only evidence we need, of course, but it still makes an important contribution.
 

Link to PubMed entry for study.
Link to DOI entry.

Chomsky’s Universal Glamour

Satirical website Newsbiscuit has a funny piece about linguist Noam Chomsky being a new judge on X-Factor.

Professor of linguistics and political campaigner Noam Chomsky has been confirmed as the new judge on TV talent show The X Factor. ‘Cheryl Cole was still recovering from malaria and we needed someone who could fill the intellectual void,’ said programme creator Simon Cowell, ‘Professor Chomsky is perfect and the audience just loves him.’

In his first outing as judge, Chomsky quickly made his mark. ‘Your act is part of a propaganda state promoting a culture-ideology of comforting illusion’, he told one hopeful young girl, before adding, ‘I’m saying yes.’

Chomsky then set about a teenage boy-band, describing them as ‘yet another example of pre-packaged ideological oppression whose lyrics systematically fail to demonstrate even a basic understanding of what happened to East Timor in 1975,’ he paused for effect, ‘But, I’m giving you a second chance…You’re through to the next round.’

 

Link to Newsbiscuit story.

NeuroPod on James, genes and jammin’

The latest edition of NeuroPod, the Nature Neuroscience podcast, has just appeared online. It’s a particularly good one and tackles the 100th anniversary of William James’ death, a barely known gene that has been linked to severe brain malformations, monkey anxiety and psychedelic psychiatry.

The author of William James‘ biography, Linda Simon, is interviewed about the life of the great man and founder of modern psychology. The interview makes for a brilliant potted biography of James and is a particular highlight of the show.

I also enjoyed the interview with Franz Vollenweider, co-author of the recent article on the use of psychedelic drugs in psychiatry, who gives a classic scientist’s answer to the question “Have you ever taken these drugs yourself?” He replies “When we did the very first psilocybin study, we had no idea about the dose…”

You’ll have to listen to the show to hear how it went.
 

Link to NeuroPod homepage.
mp3 of this episode.

Peculiar disturbances of vision

I have found what is reportedly the first description of a hallucinogenic ‘magic mushroom’ trip in the Western medical literature. It is from a 1926 paper on different types of mushroom poisoning that was published in the Journal of Pharmacology and Experimental Therapeutics and was written by William W. Ford.

He lists various types of poisoning, including stomach upsets, cramps, vomiting and convulsions, but the final category is where he tackles the effects of what are now known as ‘magic mushrooms’ that contain the naturally occurring hallucinogenic drug psilocybin.

5. Mycetismus cerebralis. Here the patients show peculiar cerebral symptoms four or five hours after the fungi are eaten. They are greatly exhilarated, laugh immoderately on slight occasion, develop a staggering gait and show peculiar disturbances of vision. The symptoms are transient, the patients being restored to health in twenty-four to forty-eight hours, except for a peculiar sensation which they describe as a feeling “as if they were walking on air.” This subjective sensation may last several days.

The plants responsible for this peculiar poisoning are Panaeolus papilionaceus and Panaeolus campanulatus. They are of interest to us chiefly in that they grow on lawns together with the edible mushroom, Agaricus campestris, and also rarely in beds for the artificial propagation of this species.

Although leaders of the psychedelic movement have created back histories about Western use of hallucinogenic mushrooms – suggesting they were used for spiritual purposes by druids, the Ancient Greeks, and folk healers to the present day – historian Andy Letcher has noted that the effects of these fungi have, as far as we know, always been treated as accidental poisoning.

For example, he notes in his book on the cultural history of the ‘magic mushroom’ that throughout the 18th and 19th centuries medical records describe how the consumption of what we would now recognise as ‘magic mushrooms’ was treated with emetics, cathartics, the stomach pump and, occasionally, leeches, as any other poisoning would be.

The medical article by William W. Ford, published 30 years before the active ingredient of magic mushrooms would be isolated, marked the first time that the hallucinogenic effects had been identified as a distinct form of poisoning.

We know now that the toxicity of psilocybin is very low and the main dangers are being inebriated (off your face) or accidentally picking and eating genuinely poisonous mushrooms.

I learnt about this early medical report from an academic article by Andy Letcher in which he analyses how ‘magic mushrooms’ have been discussed in popular culture and science.
 

Link to online version of paywalled 1926 paper.
Link to NYT review of Shrooms: A Cultural History of the Magic Mushroom.

You are the last piece in the puzzle

The Economist has an excellent article that discusses the increasingly diverse ways in which data from your social network – drawn from services like Facebook, or from telephone calls and payment patterns – are being used to obtain personal information about you.

This is not information which you have explicitly stated or included, but which can be found out or ‘mined’ from your patterns of behaviour and your connections to other people.

The piece looks at ways in which software, specifically designed for the task, is being increasingly deployed by companies and security agencies to profile their targets.

Telecoms operators naturally prize mobile-phone subscribers who spend a lot, but some thriftier customers, it turns out, are actually more valuable. Known as “influencers”, these subscribers frequently persuade their friends, family and colleagues to follow them when they switch to a rival operator. The trick, then, is to identify such trendsetting subscribers and keep them on board with special discounts and promotions. People at the top of the office or social pecking order often receive quick callbacks, do not worry about calling other people late at night and tend to get more calls at times when social events are most often organised, such as Friday afternoons. Influential customers also reveal their clout by making long calls, while the calls they receive are generally short.

The piece goes on to explain how such analyses have been used in everything from targeting advertising to tracking down Saddam Hussein.
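
As a toy illustration of the kind of call-pattern mining the article describes (the scoring heuristic below is invented for this example and bears no relation to the real commercial systems), you could score subscribers on a few of the features mentioned above – long outgoing calls, short incoming calls, many distinct callers:

from collections import defaultdict

# Made-up call records: (caller, callee, duration in seconds).
calls = [
    ("ana", "ben", 620), ("ana", "cal", 540), ("ben", "ana", 45),
    ("cal", "ana", 30), ("dan", "ben", 400), ("ben", "dan", 350),
]

out_time = defaultdict(int)   # total seconds of outgoing calls
in_time = defaultdict(int)    # total seconds of incoming calls
callers = defaultdict(set)    # distinct people who call each subscriber

for caller, callee, duration in calls:
    out_time[caller] += duration
    in_time[callee] += duration
    callers[callee].add(caller)

def influence_score(person):
    # Invented heuristic: influencers make long calls, receive short ones,
    # and are called by many different people.
    return out_time[person] - in_time[person] + 100 * len(callers[person])

for person in sorted(set(out_time) | set(in_time), key=influence_score, reverse=True):
    print(person, influence_score(person))

Real systems apply far richer features (callback latency, timing of calls, who follows whom when they switch networks) at the scale of whole subscriber bases, but the principle – inferring status from patterns of behaviour rather than stated information – is the same.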
 

Link to ‘Untangling the social web’.

The labyrinth of Inception

When you have a hammer, everything can look like a nail and people have been banging the shit out of Inception. The sci-fi movie of the year has attracted numerous ‘neuroscience of Inception’ reviews despite the fact that the film has little to say about the brain and is clearly more inspired by the psychological theories of Carl Jung than by neurobiology.

It’s easy to see why the movie has attracted neuroscience fans, including a brain-based review in this week’s Nature. It’s a science fiction film, the dream entry device presumably alters the brain, and director Christopher Nolan’s previous film Memento was carefully drawn from a detailed reading of the science of brain injury and memory loss.

Inception itself, however, contains so little direct reference to the brain (I counted about three lines) that you have to do some pretty flexible interpretation to draw firm parallels with brain science. Perhaps most tellingly, for a film supposedly about neuroscience, the dream entry devices don’t even connect to the brain and nothing is made of how they achieve their interface.

But for those familiar with the theories of Carl Jung, the psychoanalyst and dissenter from Freud’s circle, the film is rich with both implicit and explicit references to his work.

As with all psychoanalysts, Jung was concerned with the subconscious mind and believed that it contains powerful emotional processes that, when malformed or disturbed, can break through and cause immense distress to our conscious lives. To protect us, the subconscious tries to hide these forces behind symbols, which appear, most vividly, in dreams.

This is why Freud called dreams “the royal road to the unconscious” and Jung’s work is also based on this core assumption.

Similarly, in Inception, dreams are a way of accessing the subconscious of the dreamer, to the point where they can be used to steal secrets. This dream invasion work is not easy, of course, primarily because the subconscious mind attempts to defend against invaders (a defense mechanism in psychoanalytic terms) and the dreamspace needs to be explored and interpreted by the invaders to get to the secret itself.

This is not the only challenge, as other people in the dream are projections of the dreamer’s subconscious where, in line with the definition from psychoanalysis, personal feelings are perceived as residing in other people.

In the film, the young architect, Ariadne, is hired to build dreams in the form of mazes, and the labyrinth is one of the film’s central symbols (the name Ariadne, by the way, comes from the Greek legend in which she leads Theseus out of the Minotaur’s labyrinth – Jung referred to being lost in life as ‘losing the Ariadne thread’).

In Jungian psychology the labyrinth is one of the most powerful symbols of the subconscious. In his book ‘Man and His Symbols’, he explains its meaning:

“The maze of strange passages, chambers, and unlocked exits in the cellar recalls the old Egyptian representation of the underworld, which is a well-known symbol of the unconscious with its abilities. It also shows how one is “open” to other influences in one’s unconscious shadow side and how uncanny and alien elements can break in.”

Ariadne is hired because Dom Cobb can no longer create dreams: the subconscious representation of his late wife, Mal, who killed herself as a result of Cobb’s dream work, appears and violently attempts to stop him. Cobb names her his ‘shade’, directly referencing the Jungian concept of the shadow, in which we are haunted by the parts of ourselves of which we are most ashamed and which we most try to repress.

While Cobb’s main objective is to get back to his children, his main challenge is to overcome his shadow that causes conflicts in his subconscious. Normally, if you wrote a sentence like that about a film you would be using a Jungian interpretation, but in the case of Inception this is also the literal state of affairs.

This is not the only psychological journey that happens in the film, as Cobb’s journey is paralleled by that of Robert Fischer, the target of the dream invaders. Fischer’s father is dying, leaving both the state of the family corporation and the father-son relationship unresolved.

The situation is a representation of the Arthurian grail legend, the Fisher King. In the tale, the king responsible for protecting the Holy Grail is wounded and his kingdom decays in parallel to his damaged body. The knight Perceval learns he could heal the king and his kingdom by asking the right questions.

Not coincidentally, Jung was intensely interested in the Grail legend throughout his life, as he thought it was one of the best representations of the ‘collective unconscious’, where common psychological themes of humanity appear as what he called ‘archetypes’.

His wife, Emma Jung, a psychoanalyst in her own right, wrote a book on the psychological meaning of the legend drawn from Carl Jung’s theories and identified the key theme of the tale as ‘individuation’ – that is, the healthy development of ourselves as distinct individuals through resolving our relationships with those around us and the conflicts within us.

In Inception, Robert Fischer’s journey ends with him resolving his relationship with his wounded father and saving his ‘kingdom’ by learning that his father had always wanted him to be his own man rather than try to be his father – which, as we learn at the end, is at the core of his subconscious. Again, this is not an interpretation; it is the literal truth of the film.

There are lots of other subtle pointers in the film which may or may not be deliberate. Is it a coincidence that the lead character, Dom Cobb, shares a name with Stanley Cobb, the person most responsible for introducing Jungian analysis to the United States? Or that Ariadne gets the job by drawing a mandala-style maze, a symbol that Jung believed was a representation of the unconscious self? Or that Mal’s madness is portrayed as her subconscious breaking through into reality, in line with Jung’s definition?

Regardless of whether these are subtle hints or not, the film is Jungian at its core, and what is most interesting for me is that Nolan is deploying different theories of the mind as themes in his films. While Memento was obviously neuropsychological, Inception is clearly Jungian.
 

Link to Wikipedia Inception page.
Link to more on Jung and Inception.

The McDonaldization of the Mind

Last week’s ABC Radio National All in the Mind had a fantastic interview with journalist Ethan Watters whose book ‘Crazy Like Us: The Globalization of the American Psyche’ has been making waves with its criticism of the cultural dominance of American psychiatry.

I’ve promised a long overdue review of the book to The Psychologist so I won’t go into too many details here, except to say that although the book is not without its flaws, it remains an important take on how the DSM diagnostic manual is becoming a lens through which both professionals and, more importantly, regular folks are interpreting their own distress.

This is not just a case of people using foreign terms to label mental disorder. The effect is much more profound. Our knowledge of illness, both physical and mental, affects how we experience distress.

Watters’ book investigates how ideas taken from Western society about the nature of mental illnesses are affecting other cultures, in terms of disease mongering by drug companies, inappropriate treatments being foisted on people in times of distress and local concerns being ignored because they “don’t fit the picture”.

These criticisms are not new, but Watters drags them from the depths of the anthropological research and vividly illustrates them by weaving them into personal and social stories from across the world.

The All in the Mind interview summarises and explores some of the most important and well-done pieces in the book and is definitely worth a listen.
 

Link to All in the Mind interview with Ethan Watters.

2010-09-03 Spike activity

Quick links from the past week in mind and brain news:

More on our psychedelic drugs and psychiatry series: Nature’s The Great Beyond blog has an introduction to the series and The Guardian has a great overview from our very own Mo Costandi.

Time magazine asks why do heavy drinkers outlive non-drinkers? Your mileage may vary.

How good are we at estimating other people’s drunkenness? asks The BPS Research Digest.

IEEE Spectrum magazine has a good piece on attempts by commercial companies to get still-not-very-good ‘fMRI lie detection’ accepted into court.

There’s an excellent analysis of the recent study that found only 23% of people are without ‘personality disorder symptoms’ over at Neuroskeptic.

Wired Science covers research on how behaviour change spreads more rapidly through online networks when they’re more densely connected than real-life social networks.

Fake patients and simulated symptoms are discussed in an engaging analysis of the (in)famous Rosenhan experiment at Frontier Psychiatrist.

BBC Radio 4 has a documentary on the ‘Pont St Esprit affair’ where a French town went strange for a few days, with the CIA suspected of spiking the townspeople with LSD. More commentary on the documentary maker’s blog here.

There’s an excellent essay on taking the science vs post-modernism debate beyond extremism over at Fistful of Science.

Wired Danger Room notes that the head of the US Military in Afghanistan has been making snide comments probably referencing the Human Terrain System – the military’s crack team of ‘weaponized anthropologists’.

Zipf’s law, the long-tail and the pattern of common and lesser-spotted words in language are tackled over at Child’s Play.

The New York Times has an extensive article asking can preschoolers be depressed?

You know those visual illusions that are two pictures at once but you seem to be only able to see one at a time and ‘flip’ between them? New Scientist discusses how the brain makes the switch.

Spoonful of Medicine briefly covers a study finding that regular cannabis smokers are more sensitive to pain than non-smokers.

My brief piece on strange objects that get stuck in MRI scanners is up at Wired UK: “An MRI machine disarmed an off-duty US police officer… The gun was pulled by the magnetic force, jamming her hand between the pistol and the machine and trapping the officer.”

Frontal Cortex covers the identifiable victim bias, where we’re more likely to have sympathy with individual victims than groups, in light of the trapped miners in Chile.

People who do ‘mental work-outs’ seem to get Alzheimer’s later than other people, but they can be hit harder when it strikes, according to new research covered by Science News.

Neuroanthropology has one of its last posts on its old site on how the Sapir-Whorf hypothesis (words shape our thoughts) gets inappropriately bashed as ‘dead science’. You know the blog has moved right?

There’s a piece on Bronze Age brain surgery over at New Scientist.

Advances in the History of Psychology blog is back after its summer recess.

All hail the launch of Philosophy TV. Looks great.

The Beast File has a brilliant video giving a visual guide to the history of MDMA / Ecstasy.

Experimental philosophy is discussed by Joshua Knobe, one of the field’s founders, over at Philosophy Bites.

The Human Edge over at National Public Radio asks if believing in God is evolutionarily advantageous.

An Ophthalmologist on Mars

Oliver Sacks is interviewed on NeuroTribes where he talks about his forthcoming book and his own experience of spectacular hallucinations that occurred after he developed a tumour behind his retina.

NeuroTribes is a new blog by ace science writer and Wired veteran Steve Silberman. It is part of the new PLoS science blog network and in the inaugural post Silberman has scooped a fascinating interview with the great neurologist and raconteur himself.

Here he discusses how his hallucinations, caused by the brain trying to ‘fill in’ or ‘guess’ what should be in the damaged part of the retina, are affected by smoking pot.

I also had — and still have — almost continuous hallucinations of a low order: geometric things, especially broken letters, some of them like English letters, some like Hebrew letters, some like Greek, some runes, and some a bit like numbers. They tend to have straight lines rather than curves, but they rarely form actual words. This is not something I said in the book, but if I smoke a little pot, they sometimes become words. And they tend to be in black and white — but when I smoke a little pot, they’re in color.

Silberman: That’s wonderful. What do the words say?

Sacks: Short English words of no particular significance like “may,” or pseudo-words, like “ont.” Also, since my back surgery last year, I’ve been on nortriptyline, which is supposed to block the gating mechanism for pain in the spinal cord. I only take a small dose, because it gives me an intensely dry mouth. But even the small dose has a striking effect of enhancing dreams and involuntary imagery, and upgrading my hallucinations from black-and-white to color, and from geometric patterns to faces and landscapes.

The interview is both playful and profound and makes a great teaser for his forthcoming book, which is apparently due out in October.
 

Link to Oliver Sacks interview on NeuroTribes.

The class of 77%

A study just published in the British Journal of Psychiatry has found that only 23% of the population are without symptoms of personality disorder.

If you’re not familiar with it, personality disorder is a somewhat controversial diagnosis which essentially classifies people whom we might otherwise call ‘extremely difficult’, but to the point where they cause themselves significant life problems.

This new survey used the standard diagnostic criteria, but instead of giving people a “you’ve got it or you haven’t” all-or-nothing diagnosis (given when a certain threshold of symptoms is reached), the researchers totalled up the symptoms to make a sliding scale.

The study found that even those who wouldn’t qualify for a diagnosis but still had some symptoms were more likely to have had a history of running away from home, police contacts, homelessness and sexual abuse and were less likely to be employed.

Of course, what the study could be describing is simply that people who have had a rough time come out the worst for wear.

The question is not so much whether this is a high or low figure, but at what point psychiatry and mental health services should offer assistance.

For many years psychiatry has been suffering from ‘mission creep’ where things previously thought to be unhelpful but normal (e.g. low mood after a divorce, shyness) have become classified and promoted as mental illnesses with the accompanying pharmacological treatment.

At what point we decide that something is a mental illness has become one of the central psychological and cultural questions of the 21st century.
 

Link to summary of study at the British Journal of Psychiatry.

Visions of a psychedelic future

This post is part of a Nature Blog Focus on hallucinogenic drugs in medicine and mental health, inspired by a recent Nature Reviews Neuroscience paper ‘The neurobiology of psychedelic drugs: implications for the treatment of mood disorders’ by Franz Vollenweider and Michael Kometer.

This article will be available, open-access, until September 23. For more information on this Blog Focus please visit the Table of Contents.
 


In a hut, in a forest, in the mountains of Colombia, I am puking into a bucket. I close my eyes and every time my body convulses I see ripples in a lattice of multi-coloured hexagons that flows out to the edges of the universe.

Two hours earlier, I had swallowed a muddy brown brew known as yagé, famous for its hallucinogenic effects, its foul taste, and the accompanying waves of nausea that eventually lead to uncontrollable vomiting.

Yagé has been used for hundreds, if not thousands, of years – not as a recreational drug but as a psychological and spiritual aid that holds a central place in indigenous religion.

Romualdo, a displaced Witoto shaman who led the ceremony, was convinced of its mental health benefits and had confidently assured me that, after the puking, I would remain in a state of heightened conciencia where I could “ask questions, solve difficulties and communicate with spirits.” “Come with a question,” he told me, “you’ll feel better afterwards.”

The main active ingredient in yagé, known outside Colombia as ayahuasca, is dimethyltryptamine or DMT, a hallucinogenic drug from the tryptamine family that works – like LSD and psilocybin – largely through its effect on serotonin receptors.

Psychedelic drugs, mental health and brain science have traditionally made for a heated combination, but a recent scientific article, published in September’s Nature Reviews Neuroscience, has attempted to more coolly assess the growing research on the potential of hallucinogens to treat depression and anxiety.

Lab studies and medical trials form a small but robust body of knowledge that reveals reliable benefits and promising future avenues. The dissociative anaesthetic ketamine has been found to lift mood – even in cases of severe depression – while psilocybin, present in mushrooms across the world, has been shown to have anxiety-reducing properties.

But while no serious adverse reactions have occurred during the trials, the full range of potential risks is still not fully understood, meaning the treatments remain firmly in the lab.

The caution is warranted. Psychiatrists are more than aware of hospital admissions triggered by the same drugs taken outside of controlled conditions, and so the compounds will remain as experimental treatments until the risks are fully known.

Nevertheless, the science is now developed enough for new ideas to be generated based solely on a neurobiological understanding of the drugs.

The authors of this latest review, neuroscientists Franz Vollenweider and Michael Kometer, note that success with psychedelics that largely work on the glutamate system – such as ketamine and PCP – may be due to the fact that these circuits regulate long-term brain changes. This suggests a potential path to extending the mood lifting effects of these drugs beyond the initial ‘trip’.

One key advance would be an understanding of how the chemical structure of a particular hallucinogen relates to the experience it creates, allowing researchers a neurological toolkit that could be used to trigger the beneficial effects while toning down the extreme unreality that some people find unpleasant.

Yet, it is still not clear whether such benefits are separable from the psychedelic effects and it may be that the ‘active ingredient’ lies somewhere between an altered state of consciousness and a reflective mind, as some studies on drug-assisted psychotherapy suggest.

It is also clear that a great number of ritual hallucinogens, widely used by indigenous people for their psychological benefits, have yet to be explored.

The preliminary studies on users of yagé indicate that it has potential benefits for mental health, although it remains largely untouched by more rigorous tests.

As my own investigation ends, I leave the isolated hut feeling exhausted and disoriented as the clear morning light refracts through my thoughts and casts bright trickling colours into unfilled spaces.

As Romualdo promised, I feel better, elated even, but the questions I brought remain unanswered and have similarly refracted into a thousand intricate doubts.
 

Link to Nature Blog Focus on psychedelics Table of Contents.
Link to Nature Reviews Neuroscience article.

I, Jacques Derrida, Used To Be A 97lb Weakling!

Anthropologist Pascal Boyer has written a wonderfully contrarian essay for Cognition and Culture criticising the “crashingly banal assumptions” behind supposedly radical theories of human nature.

While Boyer is clearly making mischief, his main criticism of the post-modernist idea that human nature is entirely socially constructed is spot on.

While there are clear social influences in how we understand ourselves, the extreme relativism of saying everything is ‘defined by culture’ tends to evaporate when examined too closely.

But on closer inspection, it generally turns out that the initial, amazing, challenging statements in fact disguised crashingly banal assumptions. Suppose you point out to your academic ideologue that, for instance, if maleness and manhood really are completely unrelated… then it is puzzling that an extraordinarily vast number of [socially constructed] “men” happen to be [chromosomal] “males”, and that such a coincidence is spooky. You will probably be told that you did not quite understand the original statement. What it meant was that the meaning of maleness could not be derived from possession of the Y chromosome…

Or if you point out that some forms of insanity occur in many cultures at the same rates, that they trigger highly similar behaviors, are associated with the same genetic predispositions and correlate with similar neuro-functional features, you will be told that you did not understand. What was meant was that the cultural construal of madness was not derived directly from brain dysfunction…

At which point, you might be forgiven to think something like “so that was what all the fuss was about?” and you would be right of course. When push comes to shove, the flamboyant, earth-shattering, romantic, swash-and-buckle assault on our entrenched certainties seems to be, well, a bit of a damp squib.

 

Link to Boyer’s essay ‘There is no such thing as sexual intercourse’.