The Brain That Wouldn’t Die!

The classic 1960s B-movie The Brain That Wouldn’t Die has fallen into the public domain and is now available to download or watch online.

It’s another classic story of boy meets girl, boy loses girl in terrible car crash, boy keeps girl’s head alive in neuroscience lab while looking for attractive new body.

Needless to say, it all ends in tears, but not before a journey that takes us from the lab, to a cat fight in a strip bar, and back again.

All in the best possible B-movie taste of course with some er… ‘unique’ dialogue that should give any experimental scientist cause for thought:

“The paths of experimentation twist and turn through mountains of miscalculations and often lose themselves in error and darkness!”

Wise words indeed.

Link to download from the Internet Archive.
Link to stream from Google Video.

Inside the psychotic world of Grand Theft Auto

A brief article published in the Journal of the Royal Society of Medicine in 2001 reported the case of a young man who suffered delusions that he was a player inside a computer game.

The game isn’t mentioned by name, but it seems to be Grand Theft Auto.

The authors of the case study point out that they’re not suggesting that computer games cause psychosis, but they comment on how it’s a somewhat unusual illustration of how ideas from a person’s life get incorporated into the themes of psychosis.

A young man was admitted from prison to a psychiatric facility after reports that he had been acting in a bizarre manner. He had been arrested for stealing motor vehicles and assaults with weapons. At interview he was found to be experiencing the delusion that he was a player inside a computer game (adult-certificate game, widely available) in which points are scored for stealing cars, killing assailants and avoiding police vehicles.

Psychotic symptoms had emerged slowly over two years. His family had noticed him becoming increasingly withdrawn and isolated from social activities. He developed delusions that strangers were planning to kill him and also experienced auditory hallucinations, constantly hearing an abusive and derogatory voice. Previously a computer enthusiast, he began to play computer games incessantly. He felt that the games were communicating with him via the headphones.

In a complex delusional system he came to believe he was inside one of these games and had to steal a car to start scoring points. He broke into a car and drove off at speed, believing he had ‘invulnerable’ fuel and so could not run out of petrol. To gain points he chose to steal increasingly powerful vehicles, threatening and assaulting the owners with weapons. Later he said he would have had no regrets if he had killed someone, since this would have increased his score.

After arrest and while in prison he continued to believe he was in the game, despite initial medication. When he was admitted to hospital six weeks later, part of ward management was to deny him access to computer games. Nothing abnormal was found on physical examination, blood investigations, drug screen, electroencephalography or a computed tomographic brain scan. Paranoid schizophrenia was diagnosed and he responded well to further treatment with antipsychotic medication.

Similarly, ‘rock and roll delusions’ have occasionally been reported in the medical literature (David Bowie seems to be a favourite).

Link to JRSM full-text article ‘Computer Game Delusions’.

Can’t compute the wood for the trees

Computer scientist David Gelernter has written an in-depth article for Technology Review where he criticises the possibility of creating artificial consciousness, but has high hopes for unconscious artificial intelligence.

My case for the near-impossibility of conscious software minds resembles what others have said. But these are minority views. Most AI researchers and philosophers believe that conscious software minds are just around the corner. To use the standard term, most are “cognitivists.” Only a few are “anticognitivists.” I am one. In fact, I believe that the cognitivists are even wronger than their opponents usually say.

But my goal is not to suggest that AI is a failure. It has merely developed a temporary blind spot. My fellow anticognitivists have knocked down cognitivism but have done little to replace it with new ideas. They’ve showed us what we can’t achieve (conscious software intelligence) but not how we can create something less dramatic but nonetheless highly valuable: unconscious software intelligence. Once AI has refocused its efforts on the mechanisms (or algorithms) of thought, it is bound to move forward again.

Gelernter is a great writer and an interesting guy, not least because of his brush with death, courtesy of disturbed anti-technologist Ted Kaczynski aka ‘The Unabomber’.

Link to TechReview article ‘Artificial Intelligence Is Lost in the Woods’.

2007-06-29 Spike activity

Quick links from the past week in mind and brain news:

Couple of good radio shows on philosophy: In Our Time on the history of ‘common sense philosophy’ and The Philosopher’s Zone has a special on the late Richard Rorty.

When do children think wishes can come true? Mixing Memory examines a psychology study that aimed to find out.

Scientific American investigates the neuroscience of irrationality and economic decision-making.

New Hitachi ‘brain-machine interface’ uses infrared light to read brain activity.

Prospect Magazine has a short article on the psychology of suicide bombers.

Experts say video games are not an addiction. Pope still Catholic.

Why do we find it harder to recognise faces of other races than our own? Cognitive Daily looks at the influence of experience.

Supporters of a ‘child bipolar disorder’ champion write to the Boston Herald with a strong defence of his work.

New Scientist covers a virtual world that can be explored through the power of thought (with video).

Wired looks at some of the revelations about behavioural control studies from recently de-classified CIA documents.

When brain damage helps. Developing Intelligence looks at a study that found that patients with frontal lobe damage actually do better on some reasoning tasks.

Is there such a thing as photographic memory? Scientific American ‘asks the expert’.

The excellent NYC radio show RadioLab has a special on Memory and Forgetting, featuring a well-known science blogger.

The hardest cut: Penfield and the fight for his sister

In 1935, the world-renowned neurosurgeon Wilder Penfield published three remarkable case studies describing the psychological effects of frontal lobe surgery.

They remain a fascinating insight into the link between brain and behaviour, but one case was unlike anything Penfield had tackled before.

It described the fight to save the life of his only sister.


Continue reading “The hardest cut: Penfield and the fight for his sister”

Is bigotry a mental illness?

The Psychiatric Times has an interesting article discussing whether bigotry should be classified as a mental illness. The author concludes no, but the discussion gives an important insight into how we decide what is a mental illness and what is not.

Most people might think that an opinion, no matter how disagreeable, shouldn’t get someone diagnosed with a mental disorder.

The difficulty comes when deciding what criteria you should use to decide that someone’s mental state has gone beyond what is normal and should be considered an illness.

Generally, if a mental state is considered to cause distress or impairment, it’s considered to be a sign of mental illness.

This goes for physical illness as well. A physical difference is only considered an illness if it causes problems as a result.

However, someone who is extremely racist might genuinely suffer problems as a result of their opinions.

As we reported previously, a small group of psychiatrists are pushing for a diagnosis of ‘racist disorder’ to be included in the next revision of the diagnostic manual on this basis.

One argument to be wary of in the justification of this, or any other mental disorder, is that ‘it must exist because biological differences can be found between people thought to have the condition and those without’.

As the mind and behaviour are simply reflections of brain function, any difference, no matter how trivial (ice cream preference, for example), will have a related biological difference.

As with physical illness, biological differences in themselves can’t define an illness, because they have to be linked to what is considered serious distress or impairment in everyday life.

Biology might tell us why the difference occurs, but it can’t tell us whether the difference should be considered good or bad.

This decision is essentially a value judgement, because what counts as serious, distressing, impairing or relevant to everyday life isn’t cut-and-dried; it is decided on the basis of a consensus of opinions.

In some cases, such as cancer, it’s easy, because everyone agrees that an early painful death is bad.

In other cases, particularly for mental illnesses, the issues can be a lot less straightforward because there are few obvious and direct effects of mental states.

These issues ask us to question what we consider an illness and highlight that the decision is based as much on social considerations and context as on the science of biology.

The Psychiatric Times article tackles exactly these sorts of issues in its discussion of bigotry, and is a great guide to the philosophical issues involved in classifying mental disorder.

If you want to explore further, the Stanford Encyclopedia of Philosophy has a great entry on mental illness that tackles many of the conceptual difficulties.

Link to Psychiatric Times article ‘Is bigotry a mental illness?’
Link to Stanford Encyclopedia of Philosophy entry on mental illness.

Kidman new face of brain game, will it sharpen the mind?

As a sure sign that cognitive improvement games have gone mainstream, Nicole Kidman has been announced as the new face of Nintendo’s latest ‘brain training’ title.

The idea that mental training will actually help boost your mental skills is relatively new.

It was traditionally thought that the mind and brain just start losing their edge after young adulthood and your best hope was to learn to use your remaining resources more effectively as you age.

However, studies started to appear in the late 1990s suggesting that practising certain tasks could act as a sort of ‘mental workout’, actually improving mental abilities directly in people with disorders like Alzheimer’s disease and schizophrenia.

Most people weren’t fully convinced of the benefits in healthy older people until a key study was published last year in the Journal of the American Medical Association that showed modest but reliable improvements, even after five years.

The effects were typically small (often too small to be picked up without standard tests), but interestingly, the training also had a knock-on effect on the participants’ ability to look after themselves effectively on a day-to-day basis.

It seems that cognitive training may have a stronger effect in people with mental impairments. A recent review of 17 studies found a positive effect on mental abilities, everyday activities and mood in people with Alzheimer’s.

However, as far as I know, no controlled trials have ever been published on any off-the-shelf ‘brain training’ game, including Nintendo’s. You’d guess from the medical literature that they might have a similar effect, but it’s yet to be shown for sure.

Link to BBC News article ‘Kidman to be new face of Nintendo’.
Link to JAMA article ‘Long-term Effects of Cognitive Training…’

Formula 1 and Iraqi psychiatry on AITM new series

A new series of BBC Radio 4’s All in the Mind has just kicked off with the first programme investigating the psychology of Formula 1 drivers and including an interview with an Iraqi psychiatrist involved in rebuilding the country’s mental health services.

The programme talks to Jenson Button, Honda’s top driver; Tony Lycholat, Head of Human Performance at Honda; and Dr Kerry Spackman, a neuroscientist who is a consultant to the McLaren team.

In relation to mental health in Iraq, Dr Sabah Sadik is interviewed about his role as National Advisor for Mental Health to the Iraqi Ministry of Health.

The Iraqi mental health system has virtually collapsed since the invasion in 2003, and as recently reported by the Washington Post, the conflict has left intense psychological scars on many of the country’s children.

Link to first in the new series of BBC All in the Mind.

Psychiatrists top list of drug maker gift recipients

The New York Times continues its theme of investigating psychiatry and mental health with an article noting that US psychiatrists receive drug company ‘gifts’ worth the largest amount among all the medical specialities.

The data comes from only two states, because they are the only ones that have made their records of payments to doctors public.

The practice is widespread and usually doesn’t take the form of direct cash payments, but instead funds everything from trips to conferences (which are often little more than marketing presentations in luxurious holiday destinations), to expensive meals and outings, to footing the bill for medical school events and symposiums.

The extent of the funding is quite eye-opening: the article reports that the average payment to each psychiatrist in Vermont last year was over $45,000.

Vermont officials disclosed Tuesday that drug company payments to psychiatrists in the state more than doubled last year, to an average of $45,692 each from $20,835 in 2005. Antipsychotic medicines are among the largest expenses for the state’s Medicaid program.

Over all last year, drug makers spent $2.25 million on marketing payments, fees and travel expenses to Vermont doctors, hospitals and universities, a 2.3 percent increase over the prior year, the state said.

The number most likely represents a small fraction of drug makers’ total marketing expenditures to doctors since it does not include the costs of free drug samples or the salaries of sales representatives and their staff members. According to their income statements, drug makers generally spend twice as much to market drugs as they do to research them.

The state of psychiatric drug marketing is shocking. It’s gone beyond the point of promotion to what seems to be little more than outright bribery.

As you might expect, this practice has a strong and significant effect on the prescribing behaviour and attitudes of doctors, when medical decisions should be taken on the best empirical evidence rather than on marketing information provided by commercial vendors.

UPDATE: An important clarification from Doctor X, taken from the comments:

While I am concerned about the influence of big pharma on psychiatry, I was taken aback by the figures presented in the Times story. I did a little checking and found that the Times article grossly misrepresented the facts as presented in the original Vermont report. The $45,000 per year figure is for the top 11 psychiatrists who are recipients of pharma money. The report does not indicate the average or median for psychiatrists across the state, but extrapolating from the report figures it looks like $1000.00 per year is probably more typical and closer to the median figure for all psychiatrists. The mean is probably in the neighborhood of $4,000 per psychiatrist, a figure that is probably skewed upward by a heavily lopsided distribution of money and fees paid to top recipients.

Further explanation here.
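Doctor X’s point about the lopsided distribution is easy to demonstrate: a handful of very large payments drags the mean far above the median. The figures below are invented purely for illustration and are not taken from the Vermont report.

```python
import statistics

# Hypothetical payments: most psychiatrists receive around $1,000 a year,
# while a small group of top recipients receives very large sums.
payments = [1000] * 90 + [45000] * 10  # invented figures for illustration

mean = statistics.mean(payments)      # pulled upward by the top recipients
median = statistics.median(payments)  # unaffected by the extreme values

print(f"mean:   ${mean:,.0f}")    # mean:   $5,400
print(f"median: ${median:,.0f}")  # median: $1,000
```

With this invented distribution, quoting the mean (or, worse, the average among only the top recipients) gives a very different impression from the median, which is exactly the confusion Doctor X describes in the Times figures.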

Link to NYT article ‘Psychiatrists Top List in Drug Maker Gifts’.

Enough about you doctor, what about me?

The New York Times reports on a new study that examined how doctors disclose information about themselves during patient consultations. The study found that disclosures are usually for the benefit of the doctor and rarely help the patient.

The study recorded 113 doctor-patient interactions and analysed the conversation for themes, timing, effect and number of self-disclosures.

Self-disclosure is usually specifically covered in clinical training and, if done carefully, is thought to enhance the relationship with the patient and make them feel more at ease.

In this case, the research team found that none of the self-disclosures were primarily focused on patient concerns and only 4% were useful, providing education, support, explanation, or acknowledgment, or prompting some indication from the patient that it had been helpful.

The study also contains a few transcripts, including this gem:

Physician: No partners recently?

Patient: I was dating for a while and that one just didn’t work out. . . . about a year ago.

Physician: So you’re single now.

Patient: Yeah. It’s all right.

Physician: [laughing] It gets tough. I’m single as well. I don’t know. We’re not at the right age to be dating, I guess. So, let’s see. No trouble urinating or anything like that?

As was found in a previous study, it was also found that the longer the doctor talked about themselves, the less likely it was to be useful.

We tend to think of medical diagnosis as a scientific process, but so much of it relies on conversation: with patients, to get their experience of symptoms, and with colleagues, to get their opinions and advice. In other words, it relies as much on negotiation as on diagnostic tests.

Another key element is how the doctor transforms the patient’s personal problem into a medical one, so he or she can apply medical knowledge and problem-solving techniques to it.

As found by a key study in medical sociology, doctors use various non-scientific strategies to interpret the objective medical symptoms while making a diagnosis.

When medicine is described as ‘part art, part science’, the art seems to be in how doctors interact with their patients and interpret their concerns, which seems to be just as important as the medical tests.

Link to NYT article ‘Study Says Chatty Doctors Forget Patients’.
Link to abstract of study.

Harnessing humans for subconscious computing

Technology Review has an article on using humans as part of a digital face-recognition system. Uniquely, you don’t have to take part in any deliberate recognition task: the system uses electrical readings to automatically measure the response of the brain, even if you’re not aware of it.

The system, developed by Microsoft Research, takes advantage of the fact that when we see something we recognise as a face, a specific electrical signal is generated by face-perception brain activity that can be picked up by electrodes.

Crucially, this brain activity happens automatically; we don’t have to make a special effort.

Last year, I wrote an article entitled ‘Hijacking Intelligence’, noting that software is increasingly being designed to use humans as ‘biological subroutines’ for the things computers find most difficult.

Labelling pictures is one such task – it’s something humans find trivial, computers find difficult, and it’s needed in large numbers to create an index for image searches.

To get round this problem, Google designed an online game that involved labelling pictures. Humans play for fun, while Google get the benefit of your intelligence for their database.

This new system takes it a step further, as you don’t have to be doing anything related for it to take advantage of your ‘mental work’.

For example, a picture could flash up every time you hit save on a word processor, or every time you look at a certain website.

Each time your brain signals that you’ve seen a face, the system reads your recognition activity and sends it back to the main database to classify the image.
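As a rough sketch of the underlying idea (not Microsoft’s actual system, and with entirely invented numbers), a small, stimulus-locked electrical response can be pulled out of noisy electrode readings by averaging across many presentations, after which a simple amplitude threshold is enough to flag ‘face seen’:

```python
import random

random.seed(42)

N_SAMPLES = 50  # samples per recording epoch (hypothetical)
PEAK_AT = 17    # sample index of the evoked response (hypothetical)

def epoch(face: bool) -> list[float]:
    """One noisy recording epoch; a face adds a small deflection at PEAK_AT."""
    signal = [random.gauss(0.0, 1.0) for _ in range(N_SAMPLES)]
    if face:
        signal[PEAK_AT] += 2.0  # evoked response, buried in the noise
    return signal

def average(epochs: list[list[float]]) -> list[float]:
    """Average across epochs: random noise cancels, the evoked response survives."""
    return [sum(samples) / len(samples) for samples in zip(*epochs)]

def face_detected(epochs, threshold=1.0) -> bool:
    """Flag a face if the averaged amplitude at the expected peak beats a threshold."""
    return average(epochs)[PEAK_AT] > threshold

faces = [epoch(True) for _ in range(200)]
scrambled = [epoch(False) for _ in range(200)]

print(face_detected(faces))      # True: the averaged peak stands clear of the noise
print(face_detected(scrambled))  # False: averaging leaves only near-zero noise
```

A real system has to find the response in single trials or small batches, and without knowing the peak location in advance, which is far harder; the sketch only shows why a face-evoked signal is recoverable at all.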

This might be one way of sifting through security images to see which should be inspected in more detail.

As a substitute for advertising, maybe you’d be offered free internet access if you had the system installed. Your brain would pay the bills.

While the system has only been developed as a proof of concept, it’s interesting, if a little scary, to speculate how technology will harness our mental skills, even when we’re not aware of it.

Link to Technology Review article ‘Human-Aided Computing’.

Tooth marks reveal childhood trauma

Childhood stress can interfere with the development of the teeth to the extent that a traumatic experience leaves a recognisable line in the tooth enamel that remains as a record of past traumas.

I discovered this when reading about a study published in the Annals of the New York Academy of Sciences [pdf] that used these lines to compare the number of childhood traumatic experiences that occurred in people diagnosed with schizophrenia and healthy controls.

New approaches to the problem of estimating stress during early brain development are required. In this regard, human enamel has promise as accessible repositories of indelible information on stress between gestation and the age of 13. Stressful experiences induce long-term activation of the sympatho-adrenal system, slowing of tropic [growth-related] parasympathetic functions, and they then induce disrupted secretion of the enamel matrix.

During the brain development (in infancy, childhood and preadolescence), ameloblast activity in human enamel is slowed during 1 to 2 days of extreme stress, and the segment of enamel rods is smaller and often misshapen, making a particular dark line seen by the use of a microscope (we referred this line to Pathological Stress Line, PSL in short). Retzius reported that this line is incremental lines reflecting the layered apposition of enamel during amelogenesis (Retzius, 1937), and after that this line is termed the Retzius line. The line is conceptually akin to tree rings which are markers of environmental adversity in the tree’s life.

Schizophrenia was once thought to be largely caused by genetic factors, but in the last decade a number of studies have shown that childhood trauma contributes to the chance of developing the disorder.

One difficulty with this type of research is that it often relies on people remembering back to their childhood after the onset of psychosis, which could mean that the memories aren’t perfectly reliable in some cases.

Stress-induced lines in tooth enamel are one way of looking at the link between trauma and schizophrenia that doesn’t rely on potentially hazy memories of the past.

Link to abstract of study.
pdf of scientific paper.

Why don’t ethics professors behave better?

You would think that someone who spent their whole life working out how to be ethical would be more moral in everyday life. Philosopher Eric Schwitzgebel has found that this isn’t the case, and asks the question “Why don’t ethics professors behave better than they do?”.

Initially, this was based on a hunch, but Schwitzgebel, with colleague Joshua Rust, has begun to do research into the question. They’ve found some surprising results.

At a recent philosophy conference, he offered chocolate to anyone who filled in a questionnaire asking whether ethicists behaved better than other philosophers.

It wasn’t long before an ethics professor stole a chocolate without filling in a questionnaire. (This reminds me of a famous psychology study that found that trainee priests on their way to give a talk on ‘The Good Samaritan’ mostly ignored someone in need if they were in a hurry!).

When the results came in, ethicists rated other ethicists as behaving better, but other philosophers rated them as no more moral than everyone else.

In another study, Schwitzgebel investigated whether people interested in moral issues are more likely to steal books. By looking at library records, he’s found that books on ethics are more likely to be stolen than other philosophy books.

So why aren’t ethics professors more ethical than the rest of us? Schwitzgebel wonders whether it is because there is a difference between emotional engagement with moral issues and a more detached reasoning style that is necessary for careful analysis, but which may not make someone feel compelled to act more ethically.

Ominously, he notes that “More and more, I’m finding myself inclined to think that philosophical reflection about ethical issues is, on average, morally useless”.

It is interesting that there are similar problems in other professions. For example, doctors don’t follow health advice adequately and are much more likely to suffer from mental illness.

As an aside, Schwitzgebel has made all his papers and publications available online and has a fantastic blog that is well worth keeping tabs on.

Link to Schwitzgebel’s articles on ‘The problem with ethics professors’.
Link to Schwitzgebel’s homepage with publications and blog links.

Law, ethics, brain scans and mind reading

ABC Radio National’s All in the Mind has just broadcast the first of a two-part series on using neuroscience to read the mind.

The first programme investigates whether neuroscience can tell us anything about criminality and violence, and what role brain-based evidence will play in the court room.

The programme talks to many of the delegates from last April’s The Law and Ethics of Brain Scanning conference which was one of the first to consider the legal issues of brain scans in detail.

All of the conference talks have been put online as mp3 files so you can listen to the talks yourself if you want to hear more.

In the meantime, this edition of All in the Mind covers the key issues and next week’s will investigate some more (as yet undisclosed) aspects of so-called ‘mind-reading’ technology.

Link to AITM on ‘Mind Reading’.
Link to The Law and Ethics of Brain Scanning conference audio.

Encoding memory: from a free issue of SciAm

To celebrate the launch of a redesign, Scientific American have made the July edition freely available online as a pdf file. The cover story examines the search for how the brain encodes memories.

The issue is only available online until the end of June (one more week!) so you’ll need to be quick, but it’s a copy of the entire issue.

On a related note, the June 25th podcast is on the neurology of boxing-related brain damage.

pdf of July 2007 Scientific American (via Neurophilosopher).
Link to July edition table of contents.

Oldest children have highest IQ: a family effect?

Science has just published a study of almost a quarter of a million people providing strong evidence that oldest children have slightly higher IQs, and, most interestingly, that this isn’t a biological effect – it’s likely to do with family environment and upbringing.

In fact, first-born children are known to have a number of psychological differences. For example, they are less likely to be gay, show differences in autistic-like traits, and are typically less severely affected by schizophrenia if it occurs.

These differences have often been explained by a theory that the mother’s immune system adapts during the first pregnancy, might not be fully attuned to later pregnancies, and might therefore affect the brain development of subsequent children.

To test this idea, the Science study looked at the records of almost 250,000 Norwegian army recruits, all of whom had routine IQ tests and full medical and family histories.

It turned out, as has been found many times before, that first-born children had higher IQs by about 3 points on average.

Crucially, it also turned out that some second-born children who had an older sibling who had died young also had higher IQs.

In other words, although they were second-born biologically, they were brought up as the oldest child after their sibling passed away.

Being brought up as the oldest child seems to be the crucial factor: family rank, not birth order, affects IQ. This suggests that the immune system theory is unlikely to explain the effect.

This has generated a great deal of discussion and many parents are interested in whether they can provide the ‘first child advantage’ to their younger children as well.

The New York Times featured the study and, after all the interest it generated, has just published a follow-up article discussing the role of family dynamics in the development of intelligence.

Some psychologists are suggesting that the effect might arise because older children get the chance to coach the junior family members, which may help them consolidate knowledge and provide practice in manipulating information.

It’s also interesting that a recent study on birth-order in Thai medical students found exactly the reverse pattern. Younger siblings were found to be more intelligent and have more positive personality factors.

All of these studies suggest that culture and environment are crucial factors during childhood, both for mental and emotional development.

Link to abstract of Science study (thanks Laurie!).
Link to NYT write-up.
Link to NYT on intelligence and family dynamics.