Indie reports on surprising structure of artists’ brains

Artists’ brains are ‘structurally different’ according to The Independent, which reports on a small, thought-provoking but as yet quite preliminary study.

The image used to illustrate the article (the one on the right) is described as showing “more grey and white matter in artists’ brains connected to visual imagination and fine motor control”.

This could be a bit alarming, especially if you are an artist, because that’s actually a map of a mouse brain.

Whether artists have ‘different brains’ or not, in any meaningful sense, is perhaps slightly beside the point, but you can rest assured that they’re not so different that they will give you a sudden desire to scamper around looking for cheese.

Put your hands up and move away from the therapy

An editorial in Molecular Psychiatry has been titled “Launching the War on Mental Illness” – which, considering the effects of war on mental health, must surely win a prize for the most inappropriate metaphor in psychiatry.

But it also contains a curious Freudian slip. Five times in the article, the project is described as the ‘War on Mental Health’, which is another thing entirely.

…how can we then proceed to successfully launch a ‘War on Mental Health’? Our vision for that is summarized in Figure 3 and Table 1.

Sadly, Figure 3 and Table 1 don’t contain a description of a world with continuous traffic jams, rude waiters and teenagers constantly playing R&B through their mobile phone speakers.
 

Link to Launching the ‘War on Mental Illness’ (thanks @1boringyoungman)

Scraping the bottom of the biscuit barrel

As a wonderful demonstration of how media outlets will report the ridiculous as long as ‘neuroscience’ is mentioned, I present the ‘Oreos May Be As Addictive As Cocaine’ nonsense.

According to Google News, it has so far been reported by 209 media outlets, including some of the world’s biggest publications.

That’s not bad for some non-peer-reviewed, unpublished research described entirely in a single press release from a Connecticut college and conducted on rats.

The experiment, described in five lines of the press release, is this:

On one side of a maze, they would give hungry rats Oreos and on the other, they would give them a control – in this case, rice cakes. (“Just like humans, rats don’t seem to get much pleasure out of eating them,” Schroeder said.) Then, they would give the rats the option of spending time on either side of the maze and measure how long they would spend on the side where they were typically fed Oreos…

They compared the results of the Oreo and rice cake test with results from rats that were given an injection of cocaine or morphine, known addictive substances, on one side of the maze and a shot of saline on the other. Professor Schroeder is licensed by the U.S. Drug Enforcement Administration to purchase and use controlled substances for research.

The research showed the rats conditioned with Oreos spent as much time on the “drug” side of the maze as the rats conditioned with cocaine or morphine.

Needless to say, South American drug lords are probably not shutting up shop just yet.

But this is how you make headlines around the world and get your press release reported as a ‘health story’ in the international media.

As we’ve noted before, the ‘as addictive as cocaine’ cliché gets wheeled out on a regular basis even for the most unlikely of activities, but this really takes the biscuit (“Bad jokes addictive as cocaine” say British scientist’s readers).

However, the alternative conclusion that ‘Cocaine is no more addictive than Oreos’ seems not to have been as popular. Only Reason magazine opted for this one.

The reason this sort of press release makes headlines is simply that it agrees with the already established tropes that obesity is a form of ‘addiction’ and is ‘explained’ by some vague mention of the brain and dopamine.

The more easily we agree with something, the less critical thinking we apply.
 

Link to a more sensible take from Reason magazine.

This complex and tragic event supports my own view

As shots rang out across the courtyard, I ducked behind my desk, my adrenaline pumping. Enraged by the inexplicable violence of this complex and multi-faceted attack, I promised the public I would use this opportunity to push my own pet theory of mass shootings.

Only a few days have passed since this terrible tragedy and I want to start by paying lip service to the need for respectful remembrance and careful evidence-gathering before launching into my half-cocked ideas.

The cause was simple. It was whatever my prejudices suggested would cause a mass shooting and this is being widely ignored by the people who have the power to implement my prejudices as public policy.

I want to give you some examples of how ignoring my prejudices directly led to the mass shooting.

The gunman grew up in an American town and had a series of experiences, some common to millions of American people, some unique to him. But it wasn’t until he started to involve himself in the one thing that I particularly object to, that he started on the path to mass murder.

The signs were clear to everyone but they were ignored because other people haven’t listened to the same point of view I expressed on the previous occasion the opportunity arose.

Research on the risk factors for mass shootings has suggested that there are a number of characteristics that have an uncertain statistical link to these tragic events but none that allow us to definitively predict a future mass shooting.

But I want to use the benefit of hindsight to underline one factor I most agree with and describe it as if it can be clearly used to prevent future incidents.

I am going to try and convince you of this in two ways. I am going to selectively discuss research which supports my position and I’m going to quote an expert to demonstrate that someone with a respected public position agrees with me.

Several scientific papers in a complex and unsettled debate about this topic could be taken to support my position. A government report also has a particular statistic which I like to quote.

Highlighting these findings may make it seem like my position is the most probable explanation despite no clear overall conclusion but a single quote from one of the experts will seal the issue in my favour.

“Mass shootings” writes forensic psychiatrist Anand Pandya, an Associate Clinical Professor in the Department of Psychiatry and Behavioral Neurosciences at the UCLA School of Medicine, Los Angeles, “have repeatedly led to political discourse”. But I take from his work that my own ideas, to quote Professor Pandya, “may be useful after future gun violence”.

Be warned. People who don’t share my biases are pushing their own evidence-free theories in the media, but without hesitation, I can definitely say they are wrong and, moreover, biased.

It is clear that the main cause of this shooting was the thing I disliked before the mass shooting happened. I want to disingenuously imply that if my ideas were more widely accepted, this tragedy could have been averted.

Do we want more young people to die because other people don’t agree with me?

UPDATE: Due to the huge negative reaction this article has received, I would like to make some minor concessions to my critics while accusing them of dishonesty and implying that they are to blame for innocent deaths. Clearly, we should be united in the face of such terrible events and I am going to appeal to your emotions to emphasise that not standing behind my ideas suggests that you are against us as a country and a community.

I’m experiencing a lot of automaticity right now

Funny or Die is supposedly a comedy site but they seem to have a brief video tutorial on how to undertake neurally informed domestic negotiations.

The credits of the video give special thanks to Dr Dan Siegel – founder of ‘the exciting field of interpersonal neurobiology’.

I think that might be a joke, though, as the video seemed relatively free of flowery neurojargon.

‘digital dementia’ lowdown – from The Conversation

The Headlines

The Telegraph: Surge in ‘digital dementia’

The Daily Mail: ‘Digital dementia’ on the rise as young people increasingly rely on technology instead of their brain

Fox News: Is ‘digital dementia’ plaguing teenagers?

The Story

South Korea has the highest proportion of people with smartphones, 67%. Nearly 1 in 5 use their phone for more than 7 hours in a day, it is reported. Now a doctor in Seoul reports that teenagers are presenting with symptoms more usually found in those with head injury or psychiatric illness. He claims excessive smartphone use is leading to asymmetrical brain development, emotional stunting and could “in as many as 15 per cent of cases lead to the early onset of dementia”.

What they actually did

Details from the news stories are sketchy. Dr Byun Gi-won, in Seoul, provided the quotes, but it doesn’t seem as if he has published any systematic research. Perhaps the comments are based on personal observation?

The Daily Mail quotes an article which reported that 14% of young people felt that their memory was poor. The Mail also contains the choice quote that “[Doctors] say that teenagers have become so reliant on digital technology they are no longer able to remember everyday details such as their phone numbers.”

How plausible is this?

It is extremely plausible that people should worry about their memories, or that doctors should find teenagers uncooperative, forgetful and inattentive. The key question is whether our memories, or teenagers’ cognitive skills, are worse than they ever have been – and whether smartphones are to blame for this. The context for this story is a recurring moral panic about young people, new forms of technology and social organisation.

For a long time it was TV; before that it was compulsory schooling (“taking kids out of their natural environment”). When the newspaper became common, people complained about the death of conversation. Plato even complained that writing augured the death of memory and understanding. The story also draws on the old left brain-right brain myth, which – despite being demonstrably wrong – will probably never die.

Tom’s take

Of course, it is possible that smartphones (or the internet, or TV, or newspapers, or writing) could damage our thinking abilities. But all the evidence suggests the opposite, with year-by-year and generation-by-generation rises found in IQ scores. One of the few revealing pieces of research in this area showed that people really are more forgetful of information they know can be easily retrieved, but actually better able to remember where to find that information again.

This isn’t dementia, but a completely normal process of relying on our environment to store information for us. You can see the moral panic driving these stories reflected in the use of that quote about teenagers not being able to remember phone numbers. So what! I can’t remember phone numbers any more – because I don’t need to. The only evidence for dementia in these stories is the lack of critical thought from the journalists reporting them.

Read more

Vaughan Bell on a media history of information scares.

Christian Jarrett on Why the Left-Brain Right-Brain Myth Will Probably Never Die

The Conversation

This article was originally published at The Conversation.
Read the original article.