Erotic asphyxia and the limits of the brain

A guy who enjoyed whacking off while trying to strangle himself has provided important evidence that an outward sign considered to indicate severe irreversible brain damage can be present without any lasting effects.

It was long thought that a body response called decerebrate rigidity – where the body becomes stiff with the toes pointing and the wrists bending forward – was a sign of irreversible damage to the midbrain.

This sign is widely used in medical assessments to infer severe brain damage and has been observed in videos of people being executed by hanging.

A new study in The American Journal of Forensic Medicine and Pathology provides striking evidence that it is possible to recover from decerebrate rigidity, thanks to self-taped videos of a man who would strangle himself with a pair of pyjama pants suspended from the shower while masturbating.

The practice is known as autoerotic asphyxia and is based on the idea that restricted oxygen can enhance sexual pleasure – although it is not recommended, not least because the medical literature is awash with cases of people who have died while attempting it.

Indeed, the gentleman described in the study did eventually die while hanging himself, and when the forensic team investigated his house, they found videos in which he had filmed himself undertaking the risky sexual practice.

The three videos show him hanging himself while masturbating to the point where he loses consciousness and has the equivalent of an epileptic tonic-clonic seizure as he crashes to the ground. Each time, he regains consciousness with no noticeable lasting effects.

In one of the videos, 20 seconds of decerebrate rigidity are clearly present. This was previously thought to be a sign of severe permanent brain damage – and yet he comes round, picks himself up and seems unaffected.

The study makes the interesting point that we still know very little about the effects of oxygen starvation on the brain.

For example, the widely quoted figure about brain cells dying after three to five minutes without oxygen is based entirely on animal studies and we don’t actually know the limit for humans.

As the authors note: “There is no study to document this threshold of 3 to 5 minutes of ischemia [oxygen deprivation] to cause irreversible brain damage in human beings. Nevertheless, data obtained from animal studies were applied to human beings and the source of the threshold was later forgotten and assumed to be reliable.”
 

Link to PubMed entry for study.

Arrow in the head

The image is a 3D CT scan of someone who was shot in the head with an arrow which penetrated their brainstem.

It’s taken from a recent case study, which notes that these injuries have virtually disappeared from the West although they remain more common in other parts of the world, including some tribal areas of India, where this injury occurred.

The case is reported in the Journal of Emergencies, Trauma and Shock and is of a 35-year-old man who was admitted to hospital conscious, in severe pain, with partial facial paralysis, unable to open his jaw, and with an arrow sticking out of the back of his head.

The report notes that the patient made a full recovery, although it stresses the importance of not pulling out arrows without surgery, because they can cause life-threatening damage to blood vessels if removed without careful monitoring.

As far as I can tell, this is the only report in the medical literature of an arrow stuck in the brain after being fired in anger, as all the others are either from sporting accidents or suicide attempts.
 

Link to Journal of Emergencies, Trauma and Shock case study.

Fifteen brain encounters

I’ve just finished Carl Zimmer’s new e-book Brain Cuttings that collects fifteen of his previous long-form mind and brain articles and, I have to say, I thoroughly enjoyed it.

I was kindly sent an advance copy of the book which is only available as a pdf for your Kindle or other electronic reader. As I don’t own one, I took the file to the local copy shop and got it printed out (a paper version of the ‘iPad’ known as the ‘Pad’).

I was particularly impressed by the sheer range of the pieces that cover everything from the neurobiology of astrocytes (in a chapter entitled The Brain’s Dark Matter) to an account of a trip to the Singularity Summit, a conference of techno-utopians who are working towards augmented immortality for the human race.

The piece on the Singularity is probably the stand-out section of the book as it takes a level-headed look at the movement’s claims for brain enhancement and super-intelligence without engaging in literary eye-rolling or ever losing a sense of wonder for the genuine scientific advances incorporated into the ideas.

In terms of the science, the book is absolutely faultless, which is sadly not something your average reader can take for granted when it comes to neuroscience or psychology journalism, and Zimmer writes in a remarkably clear style that makes absorbing even some of the most technical aspects seem as natural as breathing.

At times, I yearned for a little more exploration of the characters we encounter on the journeys, but the length of the pieces means they tend to focus more on the ideas than the scenes.

I’m not familiar with the e-book market but $11 for a 100 page book struck me as a little steep. However, I note that the book is an experiment in itself and is only available electronically, something of a first from such a well-established author.

Whether you are an enthusiast, a professional psychologist or neuroscientist, or a combination, you will probably learn much from the book due to its breadth of vision. Regardless of who you are, you are sure to enjoy the engaging immersions in some of the most interesting ideas in contemporary science.
 

Link to Brain Cuttings page.
Link to Zimmer’s blog post about the new book.

Ten minutes of consciousness

I have to admit, I’m a little bored with consciousness, and my heart slightly sinks when I see yet another piece that rehashes the same old arguments. However, I thoroughly enjoyed this refreshing Christof Koch talk where he engagingly describes his own approach to the neural basis of conscious experience.

The talk is from a recent debate on consciousness that was covered by The Guardian and serves as a great introduction to some of the major issues in the field.

Despite a minor relapse of his Mac-affliction halfway through (sufferers may note that there is now a maintenance treatment that can ease the path to full remission), the talk is ten well-spent minutes which might just re-ignite your interest in consciousness.
 

Link to video of consciousness talk at The Guardian.

Distracted by the data

Wired has an incisive article looking at the science behind the ‘technology and multi-tasking are damaging the brain’ scare stories that regularly make the media.

The piece does a fantastic job of actually looking at specific studies on multi-tasking and distraction and questioning whether the ‘tech scare’ headlines are warranted given the findings.

The conclusion is neither that ‘all is well’ nor that ‘we are all doomed’ but that we really have very little data – although none of it so far has given any credence to popular concerns that technology is impairing our intelligence.

The piece also hits on the crucial idea that talking of ‘technology’ or the ‘internet’ as a coherent whole is unhelpful because they take such different forms, each with potentially different effects:

A solid consensus on digital multitasking is unlikely to be reached anytime soon, perhaps because the internet and technology are so broadly encompassing, and there are so many different ways we consume media. Psychological studies have seen a mix of results with different types of technology. For example, some data shows that gamers are better at absorbing and reacting to information than nongamers. In contrast, years of evidence suggest that television viewing has a negative impact on teenagers’ ability to concentrate. The question arises whether tech-savvy multitaskers could consume different types of media more than others and/or be affected in different ways.

A research paper authored by a group of cognitive scientists titled “Children, Wired: For Better and for Worse” (pdf) sums it up best:

“One can no more ask, ‘How is technology affecting cognitive development?’ than one can ask, ‘How is food affecting physical development?’” the researchers wrote. “As with food, the effects of technology will depend critically on what type of technology is consumed, how much of it is consumed, and for how long it is consumed.”

There are some quotes from me included, but don’t let that put you off, as it remains a lively discussion of the science behind a common 21st century talking point.
 

Link to Wired piece on media, technology and brain studies.

The war changed me: brain injury after combat

The Washington Post has an amazing series of video reports on US soldiers who have returned from Iraq and Afghanistan with brain injury – often leading to a marked change in personality.

The reports cover veterans who have suffered numerous types of brain injury, from shrapnel-driven penetrating brain injuries to concussion-related mild-traumatic brain injury, and how they have adapted after returning home.

The treatments range from brain surgery to psychological rehabilitation and the interviews cover the difficulties from all angles – including the patients’, families’ and medical professionals’ perspectives.

It’s probably worth noting that combat brain injury is not an area without controversy, and the Washington Post piece follows the US military orthodoxy on mild traumatic brain injury – that all problems after blast concussion are due to brain damage.

Owing to the widespread use of improvised explosive devices many soldiers get caught up in a blast without suffering any visible physical injury, although they may suffer concussion due to the shock waves.

The US military is focused on the idea that subsequent problems – such as headaches, mood problems, trouble concentrating, irritability, impulsiveness or personality change – are due to the physical effects of the blast. This cluster of symptoms is often diagnosed as post-concussion syndrome or PCS.

Studies have found that, on average, soldiers who have experienced a blast are more likely to show changes in the micro-structure of the brain. However, not every soldier who shows the PCS symptom cluster has detectable brain changes or shows a clear link between the symptoms and the blast.

This latter point is important because the same symptoms can arise from combat stress – being caused by the emotional impact of war rather than a direct result of a shock to the brain.

It becomes clear why this distinction is crucial when you read the US military’s guidance on screening for mild traumatic brain injury which doesn’t have a specific way of distinguishing the effects of emotional stress from the effects of brain disturbance.

Because blasts are such common experiences for soldiers, this could lead to false positives, where veterans are thought to have brain damage when they would be better off being treated for combat stress.

In fact, this exact scenario played out in World War I, where ‘shell shock’ was originally thought to be caused by blast waves affecting the brain (hence the name), only for it to be discovered later that not everyone with the symptoms had even been near a shell explosion.

Regardless of this medical debate, The Washington Post special report is a fantastic combination of real life experiences and the neuroscience of combat-related brain injuries. Highly recommended.
 

Link to WashPost ‘Coming home a different person’ with intro clip.
Link straight to main menu.

Hypnosis in the lab: the suggestion of altered states

I’ve got an article in The Guardian online about how hypnosis is being increasingly used in the neuroscience lab to simulate unusual mental states and alter the normal flow of automatic psychological processes.

After years of neglect, it turns out hypnosis is a useful experimental tool that allows temporary changes to both the conscious and unconscious mind that are normally very difficult to achieve.

Whenever AR sees a face, her thoughts are bathed in colour and each identity triggers its own rich hue that shines across her mind’s eye. This experience is a type of synaesthesia which, for about one in every 100 people, automatically blends the senses. Some people taste words, others see sounds, but AR experiences colour with every face she sees. But on this occasion, perhaps for the first time in her life, a face is just a face. No colours, no rich hues, no internal lights.

If the experience is novel for AR, it is equally new to science because no one had suspected that synaesthesia could be reversed. Despite the originality of the discovery, the technique responsible for the switch is neither the hi-tech of brain stimulation nor the cutting-edge of neurosurgery, but the long-standing practice of hypnosis.

As it turns out, our scientific paper on the cognitive neuroscience of hypnosis and ‘hysteria’ has also just been published in the Journal of Neurology, Neurosurgery and Psychiatry.

‘Hysteria’ is the traditional name for an interesting condition, now often diagnosed as ‘conversion disorder’, where people are paralysed, blind, have seizures or show other seemingly neurological problems without any evidence of nervous system damage that could explain the problem.

The 19th Century French neurologist Jean-Martin Charcot proposed that hypnosis and hysteria might work in a similar way – brain circuits outside of conscious control might be inhibiting or ‘shutting down’ other functions.

The idea was dismissed for many years, but we review neuroimaging and neuropsychology studies that suggest he might have been on the right track and something similar may explain why people can seem to lose conscious control over their body and senses during both hysteria and hypnosis.

The Guardian article explores the use of hypnosis in neuroscience more widely, how it is becoming an important experimental tool, and dispels some of the common myths about the effects.

One of the problems with researching or using hypnosis in the lab is its association in popular culture with quacks and stage hypnotists, which means many scientists give it a wide berth, as they did with consciousness research a decade ago.

You’ll notice the piece has been given an odd title and a cheesy picture which I suspect is similar to how articles on consciousness are typically accompanied by a picture of a brain flying through space.
 

Link to Guardian piece on hypnosis in neuroscience.
Link to abstract of paper on neuroscience of hysteria and hypnosis.

Neurosurgery simulated

Ohio State University have created a fantastic interactive web application where you play the part of a neurosurgeon operating on a patient who needs a deep brain stimulation device installed to treat their Parkinson’s disease.

When I first loaded it up and saw the cartoon-like style I thought it would just be a bit of eye-candy but it turns out to be quite a detailed guide to exactly how this sort of surgery is undertaken.

It’s great if you’re just curious, as there’s plenty to learn about the procedure, but even if you’re a neuroscience fanatic there are questions throughout the demo that allow you to flex your problem-solving skills.

I have to say, I learnt loads from it, and the best bit is where you get to hear the firing patterns of different areas as the recording electrode is inserted.

A cleverly designed and engaging bit of software.
 

Link to EdHeads deep brain stimulation neurosurgery demo.

Towards an operating system for brain hacking

Electronic devices that interface directly with the brain are now being produced by labs around the world but each new device tends to work in a completely different way. An article in Technology Review argues that we need an agreed neural operating system so brain-machine interfaces can more easily work together.

Although current devices tend only to measure brain activity or stimulate cortical areas, it won’t be very long before devices typically do both – detecting and reacting to neural states – possibly forming a dynamic network of electronic devices that regulate brain activity.

To avoid the ‘Mac vs PC problem of the brain’, neuroscientist Ed Boyden highlights the importance of having devices that speak a common language to avoid both wasted scientific effort and potentially dangerous miscommunication.

Some examples of this kind of “brain coprocessor” technology are under active development, such as systems that perturb the epileptic brain when a seizure is electrically observed, and prosthetics for amputees that record nerves to control artificial limbs and stimulate nerves to provide sensory feedback. Looking down the line, such system architectures might be capable of very advanced functions–providing just-in-time information to the brain of a patient with dementia to augment cognition, or sculpting the risk-taking profile of an addiction patient in the presence of stimuli that prompt cravings.

Given the ever-increasing number of brain readout and control technologies available, a generalized brain coprocessor architecture could be enabled by defining common interfaces governing how component technologies talk to one another, as well as an “operating system” that defines how the overall system works as a unified whole–analogous to the way personal computers govern the interaction of their component hard drives, memories, processors, and displays.
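The closed-loop ‘read, decide, stimulate’ pattern the article describes can be sketched with a common interface in a few lines of Python. To be clear, all the class and function names here (`Sensor`, `FakeEEG`, `coprocessor_loop` and so on) are my own illustrative inventions, not any real device API:

```python
from abc import ABC, abstractmethod

class Sensor(ABC):
    """Anything that can report a scalar measure of neural activity."""
    @abstractmethod
    def read(self) -> float: ...

class Stimulator(ABC):
    """Anything that can deliver a corrective stimulus."""
    @abstractmethod
    def stimulate(self, amplitude: float) -> None: ...

class FakeEEG(Sensor):
    """Stand-in sensor that replays a pre-recorded activity trace."""
    def __init__(self, samples):
        self._samples = iter(samples)
    def read(self) -> float:
        return next(self._samples)

class LogStimulator(Stimulator):
    """Stand-in stimulator that just records the pulses it would deliver."""
    def __init__(self):
        self.pulses = []
    def stimulate(self, amplitude: float) -> None:
        self.pulses.append(amplitude)

def coprocessor_loop(sensor: Sensor, stimulator: Stimulator,
                     threshold: float, n_samples: int) -> None:
    """Read activity and stimulate whenever it crosses the threshold."""
    for _ in range(n_samples):
        activity = sensor.read()
        if activity > threshold:
            stimulator.stimulate(activity - threshold)

# Any sensor/stimulator pair implementing the shared interface can be wired
# together - which is the whole point of a common 'operating system'.
eeg = FakeEEG([0.2, 0.9, 0.4, 1.1])
stim = LogStimulator()
coprocessor_loop(eeg, stim, threshold=0.8, n_samples=4)
print(len(stim.pulses))  # two threshold crossings trigger two pulses
```

The design point is that the loop never needs to know which concrete device it is talking to, only that each speaks the agreed interface.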

Although not mentioned in the article, another advantage of a common platform for brain devices would be security, as current devices are often completely open and designed to be easily controllable from the outside.
 

Link to TechReview article on ‘Brain Coprocessors’.

Dreams of a consciousness measuring device

The New York Times has an excellent article about Giulio Tononi, one of the few neuroscientists trying to understand consciousness in a way that may have a direct practical application – to create a medical device that can tell whether you are conscious or not.

To be honest, I’ve been a bit bored with consciousness, not in an existential sense, you understand, but in terms of the science, which tends towards tinkering with interesting but possibly inconsequential effects.

The NYT article, however, is completely riveting, as it discusses Tononi’s quest to understand consciousness to the point of building a ‘consciousness meter’.

Although it may sound fanciful, it could have an important medical application – to help anaesthetists determine when a patient is actually aware of what’s happening to them.

If you’re not familiar with surgery, you’d think this was easy enough to determine, except that muscle relaxant drugs are often administered.

This means that even if you’re awake, you can’t communicate the fact, occasionally leading to terrifying cases of people who are conscious but paralysed while operated on.

So ideally, anaesthetists would like a machine that gives a consciousness ‘read out’ from the brain. There is something called the bispectral index, which claims to measure depth of anesthesia, although it turns out not to be a very good guide to consciousness.

Of course, to create a device to measure consciousness, we need to understand its neuroscience, and Tononi has a unique theory he is working on:

Consciousness, Dr. Tononi says, is nothing more than integrated information. Information theorists measure the amount of information in a computer file or a cellphone call in bits, and Dr. Tononi argues that we could, in theory, measure consciousness in bits as well. When we are wide awake, our consciousness contains more bits than when we are asleep.

For the past decade, Dr. Tononi and his colleagues have been expanding traditional information theory in order to analyze integrated information. It is possible, they have shown, to calculate how much integrated information there is in a network. Dr. Tononi has dubbed this quantity phi, and he has studied it in simple networks made up of just a few interconnected parts. How the parts of a network are wired together has a big effect on phi. If a network is made up of isolated parts, phi is low, because the parts cannot share information…

Dr. Tononi argues that his Integrated Information Theory sidesteps a lot of the problems that previous models of consciousness have faced. It neatly explains, for example, why epileptic seizures cause unconsciousness. A seizure forces many neurons to turn on and off together. Their synchrony reduces the number of possible states the brain can be in, lowering its phi.
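The intuition that isolated parts cannot share information can be made concrete with a toy calculation. What follows is not Tononi’s phi, which is far harder to compute, but its much simpler cousin, multi-information: the sum of the parts’ entropies minus the joint entropy, which is zero exactly when the parts are independent. The function names are my own:

```python
import math
from collections import Counter

def entropy(dist):
    """Shannon entropy in bits of a {outcome: probability} mapping."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def multi_information(states):
    """Sum of the parts' entropies minus the joint entropy.

    Zero when the parts vary independently; positive when they share
    information. (A crude, easily computed cousin of Tononi's phi.)
    """
    n = len(states)
    joint = {s: c / n for s, c in Counter(states).items()}
    n_parts = len(states[0])
    marginal_sum = 0.0
    for i in range(n_parts):
        part = Counter(s[i] for s in states)
        marginal_sum += entropy({v: c / n for v, c in part.items()})
    return marginal_sum - entropy(joint)

# Two units that always agree: observing one tells you about the other.
coupled = [(0, 0), (1, 1), (0, 0), (1, 1)]
# Two units that vary independently: no shared information.
isolated = [(0, 0), (0, 1), (1, 0), (1, 1)]

print(multi_information(coupled))   # 1.0 bit of shared information
print(multi_information(isolated))  # 0.0 bits
```

The coupled pair integrates one bit of information while the isolated pair integrates none, echoing the quoted point that a network made of isolated parts scores low because the parts cannot share information.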

The NYT piece is a fantastic look into the ideas behind the theory and the exciting possibilities it presents.
 

Link to NYT on ‘Sizing Up Consciousness by Its Bits’.

An uneven hail of bullets

Gunshot wounds to the head are a major cause of death among soldiers in combat but little is known about where bullets are more likely to impact. A study just published in the Journal of Trauma looked at common bullet entry points among soldiers who died in combat and found clear patterns – but the researchers are not sure why.

The study, led by physician Yuval Ran, looked at Israeli combat deaths from 2000 to 2004 and tracked where bullet entries appeared on the skull (illustrated above), finding that the lower back of the skull (occipital region) and the front temple areas (anterior-temporal regions) were the most likely to be hit.

The results of our study show that in a combat setting, the occipital and anterior-temporal regions are most frequently hit, as opposed to the anterior-parietal and the posterior-temporal regions, which are rarely hit. Moreover, most of the parietal injuries were in proximity to the occipital bone. In an attempt to explain these findings, we presented them to sniper instructors, only to learn that snipers always aim to center mass, and aiming at high distances to different skull areas is not probable. At this time, we have no plausible theory to explain these findings.

Your first thought may be that the distribution is because helmets better protect certain parts of the head, but as the researchers note, helmets have been shown to be almost entirely ineffective in protecting against direct gunfire.

Getting shot in the head is not just an unfortunate event, it is the result of the interaction between the shooter and the target, and each of their behaviours could affect where bullets are more likely to land.

The researchers also note that the results are strikingly similar to the only other study looking at the location of fatal gunshot wounds to the head, despite the fact that this earlier study only included civilian shootings.

While there is no current theory as to why fatal gunshot wounds are more likely to be distributed as they are, the article suggests that this could be used to save lives in combat.

Helmets effective against direct gunfire are not worn by soldiers because sufficient armouring would make them too heavy, but simply adding protective armour over the most commonly hit areas would make for a lighter helmet that could still stop the majority of fatal bullet wounds.
 

Link to PubMed entry for study.

The death of ‘right brain thinking’

A new study published in Psychological Bulletin has just reviewed all the neuroscience research on creative thinking and found no good evidence for the pop-culture idea that the right side of the brain is more involved in ‘creative thinking’.

Sadly, the full text isn’t available online, but the abstract of the study contains all the punchlines:

A review of EEG, ERP, and neuroimaging studies of creativity and insight.

Psychol Bull. 2010 Sep;136(5):822-48.

Dietrich A, Kanso R.

Creativity is a cornerstone of what makes us human, yet the neural mechanisms underlying creative thinking are poorly understood. A recent surge of interest into the neural underpinnings of creative behavior has produced a banquet of data that is tantalizing but, considered as a whole, deeply self-contradictory. We review the emerging literature and take stock of several long-standing theories and widely held beliefs about creativity.

A total of 72 experiments, reported in 63 articles, make up the core of the review. They broadly fall into 3 categories: divergent thinking, artistic creativity, and insight. Electroencephalographic studies of divergent thinking yield highly variegated results. Neuroimaging studies of this paradigm also indicate no reliable changes above and beyond diffuse prefrontal activation. These findings call into question the usefulness of the divergent thinking construct in the search for the neural basis of creativity.

A similarly inconclusive picture emerges for studies of artistic performance, except that this paradigm also often yields activation of motor and temporoparietal regions. Neuroelectric and imaging studies of insight are more consistent, reflecting changes in anterior cingulate cortex and prefrontal areas.

Taken together, creative thinking does not appear to critically depend on any single mental process or brain region, and it is not especially associated with right brains, defocused attention, low arousal, or alpha synchronization, as sometimes hypothesized. To make creativity tractable in the brain, it must be further subdivided into different types that can be meaningfully associated with specific neurocognitive processes.

 

Link to PubMed entry for study (via @sarcastic_f).

Dopamine crystal method

A beautiful image of dopamine crystals viewed with polarized light.

Photo by Spike Walker for Wellcome Images. Click for source

From the description: “A polarized light micrograph of dopamine crystals. Dopamine is a naturally occurring precursor of norepinephrine that affects various brain processes, many of which control movements, emotional responses and the experiences of pain and pleasure. Dopamine receptors are especially clustered in the midbrain. The drug L-DOPA, used to help sufferers of Parkinson’s disease, is converted in the brain to dopamine.”
 

Link to Creative Commons licensed image at Wellcome (via NewSci).

Visions of a psychedelic future

This post is part of a Nature Blog Focus on hallucinogenic drugs in medicine and mental health, inspired by a recent Nature Reviews Neuroscience paper ‘The neurobiology of psychedelic drugs: implications for the treatment of mood disorders’ by Franz Vollenweider and Michael Kometer.

This article will be available, open-access, until September 23. For more information on this Blog Focus please visit the Table of Contents.
 


In a hut, in a forest, in the mountains of Colombia, I am puking into a bucket. I close my eyes and every time my body convulses I see ripples in a lattice of multi-coloured hexagons that flows out to the edges of the universe.

Two hours earlier, I had swallowed a muddy brown brew known as yagé, famous for its hallucinogenic effects, its foul taste, and the accompanying waves of nausea that eventually lead to uncontrollable vomiting.

Yagé has been used for hundreds, if not thousands, of years – not as a recreational drug – but as a psychological and spiritual aid that holds a central place in indigenous religion.

Romualdo, a displaced Witoto shaman who led the ceremony, was convinced of its mental health benefits and had confidently assured me that, after the puking, I would remain in a state of heightened conciencia where I could “ask questions, solve difficulties and communicate with spirits.” “Come with a question,” he told me, “you’ll feel better afterwards.”

The main active ingredient in yagé, known outside Colombia as ayahuasca, is dimethyltryptamine or DMT, a hallucinogenic drug from the tryptamine family that works – like LSD and psilocybin – largely through its effect on serotonin receptors.

Psychedelic drugs, mental health and brain science have traditionally made for a heated combination, but a recent scientific article, published in September’s Nature Reviews Neuroscience, has attempted to more coolly assess the growing research on the potential of hallucinogens to treat depression and anxiety.

Lab studies and medical trials form a small but robust body of knowledge that reveals reliable benefits and promising future avenues. The dissociative anaesthetic ketamine has been found to lift mood – even in cases of severe depression – while psilocybin, present across the world in mushrooms and fungi, has been shown to have anxiety-reducing properties.

But while no serious bad reactions have happened during the trials, the full range of potential risks is still not fully understood, meaning the treatments remain firmly in the lab.

The caution is warranted. Psychiatrists are more than aware of hospital admissions triggered by the same drugs taken outside of controlled conditions, and so the compounds will remain as experimental treatments until the risks are fully known.

Nevertheless, the science is now developed enough for new ideas to be generated based solely on a neurobiological understanding of the drugs.

The authors of this latest review, neuroscientists Franz Vollenweider and Michael Kometer, note that success with psychedelics that largely work on the glutamate system – such as ketamine and PCP – may be due to the fact that these circuits regulate long-term brain changes. This suggests a potential path to extending the mood lifting effects of these drugs beyond the initial ‘trip’.

One key advance would be an understanding of how the chemical structure of a particular hallucinogen relates to the experience it creates, allowing researchers a neurological toolkit that could be used to trigger the beneficial effects while toning down the extreme unreality that some people find unpleasant.

Yet, it is still not clear whether such benefits are separable from the psychedelic effects and it may be that the ‘active ingredient’ lies somewhere between an altered state of consciousness and a reflective mind, as some studies on drug-assisted psychotherapy suggest.

It is also clear that a great number of ritual hallucinogens, widely used by indigenous people for their psychological benefits, have yet to be explored.

The preliminary studies on users of yagé indicate that it has potential benefits for mental health, although it remains largely untouched by more rigorous tests.

As my own investigation ends, I leave the isolated hut feeling exhausted and disoriented as the clear morning light refracts through my thoughts and casts bright trickling colours into unfilled spaces.

As Romualdo promised, I feel better, elated even, but the questions I brought remain unanswered and have similarly refracted into a thousand intricate doubts.
 

Link to Nature Blog Focus on psychedelics Table of Contents.
Link to Nature Reviews Neuroscience article.

You are in a maze of twisty little proteins, all alike

Time magazine has a brief but somewhat intricate article describing the relationship between the synapse and the APC protein.

It has a spectacularly complex and labyrinthine metaphor that doesn’t really help me understand what’s being discussed but is, nonetheless, a joy to read.

Findings by neuroscientists in various Tufts graduate programs—published in the August 18 issue of The Journal of Neuroscience—show a link between the APC protein and the development, or lack thereof, of something called a synapse.

If a synapse were a traffic intersection, the APC protein would be those annoying bridge tolls that are later used to develop roadways. Only instead of freight and passengers, the vehicles on this highway would carry information between neurons. A lack of tax collection will inhibit the development of the intersection—think potholes and faded, unreadable road signs—so that vehicles won’t be able to cross in the intended and most efficient way. In the same way, APC impairment blocks the synapse maturation that is crucial to the mental processes of learning and using memory.

 

Link to brief Time article (via @stevesilberman).

An archive of buried brains, restored

The New York Times reports how a carefully assembled archive of human brains with tumours, collected by the pioneering neurosurgeon Harvey Cushing, was left to gather dust and decay at Yale University. Recently restored, it gives a glimpse of the early days of neurosurgery, before brain scans or the consistent use of anaesthetic.

These chunks of brains floating in formaldehyde bring to life a dramatic chapter in American medical history. They exemplify the rise of neurosurgery and the evolution of 20th-century American medicine — from a slipshod trial-and-error trade to a prominent, highly organized profession.

These patients had operations during the early days of brain surgery, when doctors had no imaging tools to locate a tumor or proper lighting to illuminate the surgical field; when anesthesia was rudimentary and sometimes not used at all; when antibiotics did not exist to fend off potential infections. Some patients survived the procedure — more often if Dr. Cushing was by their side.

The collection has now been restored and organised and can be seen at the Cushing Centre at Yale.

Don’t forget to have a look at some of Cushing’s original photos of post-operation patients on the left-hand side of the NYT article.
 

Link to NYT piece ‘Inside Neurosurgery’s Rise’ (via @bmossop).