The hot hand smacks back

The idea of the ‘hot hand’, where a player who makes several successful shots has a higher chance of making some more, is popular with sports fans and team coaches, but has long been considered a classic example of a cognitive fallacy – an illusion of a ‘streak’ caused by our misinterpretation of naturally varying scoring patterns.

But a new study has hard data to show the hot hand really exists and may turn one of the most widely cited ‘cognitive illusions’ on its head.

A famous 1985 study by psychologist Thomas Gilovich and his colleagues looked at the ‘hot hand’ belief in basketball, finding that there was no evidence of any ‘scoring streak’ in thousands of basketball games beyond what you would expect from natural variation in play.

Think of it like tossing a weighted coin. Although the weighting, equivalent to the player’s skill, makes landing a ‘head’ more likely overall, every toss of the coin is independent. The last result doesn’t affect the next one.

Despite this, sometimes heads or tails will bunch together, and this is what people erroneously interpret as the ‘hot hand’ or being on a roll, at least according to the theory. Because the basketball research seemed to show exactly this pattern, the ‘hot hand fallacy’ was born and the idea of ‘scoring streaks’ came to be dismissed as a sports myth.
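To make the coin-toss picture concrete, here is a minimal simulation sketch (my own illustration, not the study’s actual analysis; the function names and the hot_boost parameter are invented for the example). It compares an independent ‘weighted coin’ shooter with a toy ‘hot hand’ shooter whose chance of scoring rises after a made shot, then checks whether a hit is more likely after a hit than after a miss – the kind of dependence the hot hand implies.

import random

def simulate(n_shots, p_base, hot_boost=0.0):
    # With hot_boost=0 every shot is an independent weighted coin;
    # with hot_boost>0 a made shot raises the chance of making the next one.
    shots, prev = [], 0
    for _ in range(n_shots):
        p = min(1.0, p_base + hot_boost * prev)
        prev = 1 if random.random() < p else 0
        shots.append(prev)
    return shots

def p_hit_after(shots, outcome):
    # Probability of a hit, given that the previous shot had the stated outcome.
    pairs = [(a, b) for a, b in zip(shots, shots[1:]) if a == outcome]
    return sum(b for _, b in pairs) / len(pairs)

random.seed(1)
for label, boost in [("independent", 0.0), ("toy hot hand", 0.15)]:
    s = simulate(100_000, p_base=0.5, hot_boost=boost)
    print(label, "P(hit|hit) =", round(p_hit_after(s, 1), 3),
          "P(hit|miss) =", round(p_hit_after(s, 0), 3))

For the independent shooter the two conditional probabilities come out the same, bar sampling noise, even though streaks still appear by chance; only the toy hot-hand process shows a genuine gap between them.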

Some have suggested that while the ‘hot hand’ may be an illusion, in practical terms it might be useful on the court.

Better players are more likely to have a higher overall scoring rate and so are more likely to have what seem like streaks. Passing to that guy works out, because the better players have the ball for longer.

But a new study led by Markus Raab suggests that the hot hand does indeed exist. Each shot is not independent, and players who hit the mark may raise their chances of scoring the next time. They seem to draw inspiration from their successes.

Crucially, the researchers chose their sport carefully because one of the difficulties with basketball – from a numbers point of view – is that players on the opposing team react to success.

If someone scores, they may find themselves the subject of more defensive attention on the court, damping down any ‘hot hand’ effect if it did exist.

Because of this, the new study looked at volleyball, where the players are separated by a net and play on different sides of the court. Additionally, players rotate position after every rally, meaning it’s more difficult to ‘clamp down’ on players from the opposing team if they seem to be doing well.

The research first established that belief in the ‘hot hand’ was common among volleyball players, coaches and fans, and then looked to see whether scoring patterns supported it – whether scoring a point made a player more likely to score another.

It turns out that over half the players in Germany’s first-division volleyball league show the ‘hot hand’ effect – streaks of inspiration were common and points were not scored in an independent ‘coin toss’ manner.

What’s more, players were sensitive to who was on a roll and used the effect to the team’s advantage – more commonly passing to those on a scoring streak.

So it seems the ‘hot hand’ effect exists. But this opens up another, perhaps more interesting, question.

How does it work? Because if teams can understand the essence of on-court inspiration, they’ve got a recipe for success.
 

Link to blocked study. Clearly a losing strategy.
Link to full text which has mysteriously appeared online.

The personality of sperm donors

The biggest ever study on the personality of sperm donors has just been published.

Each donor was asked to fill out the Temperament and Character Inventory personality scale, also known as the TCI, and the results were compared to those of a similar group of men who hadn’t whacked off into a plastic tube for the benefit of society.

So who donates sperm?

With regard to personality, we found significant differences on the temperament dimension of harm avoidance between the sperm donors and the comparison group, with lower means for sperm donors. This indicates that the sperm donors described themselves as being less worried, uncertain, shy and less subject to fatigue

Furthermore, we also found significant differences on the character dimensions, where the sperm donors showed higher means on self-directedness. This indicates that they perceived themselves as more autonomous individuals, with a capacity for responsibility, as behaving in a more goal-directed manner, and to be more resourceful and self-acceptant than the comparison group.

The sperm donors also showed significantly higher means on cooperativeness. This means that they described themselves as being more integrated with society and having a greater capacity for identification with and acceptance of other people than the comparison group.

The personality dimensions from the Temperament and Character Inventory have been found to be among the most heavily influenced by genetics, so knowing that your average sperm donor is a generally nice chap is very useful information.

I suspect, however, that ‘not easily fatigued’ may be a selection bias due to the demands of the job.
 

Link to paywalled study. No chance of a donation then?

The psychiatry of vegetarianism

A fascinating but unfortunately locked review article on the psychology of vegetarianism has this paragraph on how avoiding the pleasures of cooked flesh has been seen as a mental illness in times past.

How vegetarians are seen has shifted radically over time. During the Inquisition, the Roman Catholic Church declared vegetarians to be heretics, and a similar line of persecutions occurred in 12th century China (Kellman, 2000). In the earlier half of the twentieth century, the sentiment toward vegetarians remained distinctly negative, with the decision not to eat meat being framed as deviant and worthy of suspicion.

Major Hyman S. Barahal (1946), then head of the Psychiatry Section of Mason General Hospital, Brentwood, wrote openly that he considered vegetarians to be domineering and secretly sadistic, and that they “display little regard for the suffering of their fellow human beings” (p. 12). In this same era, it was proposed that vegetarianism was an underlying cause of stammering, the cure for which was a steady diet of beefsteak.

In contrast, research shows that attitudes to vegetarianism have since shifted to be, shall we say, somewhat more positive.
 

Link to locked article. Forbidden fruit and all that.

A review of Pinker’s The Better Angels of Our Nature

I’ve written an in-depth review of Steven Pinker’s new book on the decline of violence for the latest Wilson Quarterly.

I thought getting a free copy and working on a review would be great fun, but was rather taken aback when the 848-page book landed on my doorstep. I shouldn’t have been, because there isn’t a wasted page.

I go into the details of some of Pinker’s key arguments in the review, but as you can see from this extract, the book is definitely worth reading.

Despite my concerns about how Pinker portrays individual psychology and neuroscience, The Better Angels of Our Nature is so comprehensive that these faults represent only a fraction of the book. Taken as a whole, it is powerful, mind changing, and important. Pinker does not shy away from the gritty detail and is not to be taken lightly—quite literally in fact, as at more than 800 pages his book could easily be used as a weapon if you remained unpersuaded by its arguments. But this avalanche of information serves to demonstrate convincingly and counterintuitively that violence is on the decline.

In many ways, violence is a disease of the emotions. While we should never ignore the victims, it can be managed and curbed so it affects as few people as possible and remains minimally contagious. Many illnesses that once felled multitudes are now largely vanquished through greater knowledge and simple preventive measures; a similar process has made us all less likely to be targets, and perpetrators, of brutality. As Pinker argues, this is an achievement we should take pride in.

You can read the full text of the review by clicking on the link below. Thanks to The Wilson Quarterly for making it available online.
 

Link to review of Pinker’s new book in The Wilson Quarterly.

Glitches in The Technology of Orgasm

We’ve covered The Technology of Orgasm before, a hugely influential book arguing that 19th century doctors were using Victorian vibrators to cure ‘female hysteria’ through the induction of [serious look] ‘hysterical paroxysms’, but it seems that the main argument may not be as breathtaking as it first appears.

Cory Silverberg discusses how historians of sex have been less than impressed with the idea, and the issue has now become a hot topic because the book, written by Rachel Maines, has been made into a film starring Maggie Gyllenhaal.

The Technology of Orgasm is a somewhat controversial book. Controversial in that the thesis of the book has been almost universally accepted and embraced by the mainstream press and the sex toy industry, while at the same time being quite seriously critiqued by historians of sexuality. In her book Maines contends that the vibrator was regularly used by doctors to treat “hysteria” which they had previously been treating by manually stimulating women to orgasm. Included in this argument is the idea that the women didn’t know they were having orgasms and the doctors didn’t seem to worry about the professional boundaries involved in essentially masturbating their patients.

Silverberg also notes a comprehensive page by historian Lesley Hall who has detailed difficulties with the ‘Victorian vibrator cure’ idea.

The page also has loads of other fascinating information about 19th century sex.

Don’t be put off by the page’s dreadful green background – as the title suggests, it is full of wonderful ‘Victorian sex factoids’, including why it is unlikely that Queen Victoria ever used cannabis to help alleviate period pains.
 

Link to Cory Silverberg’s coverage of the new film (via @DrPetra).
Link to Lesley Hall’s page on ‘Victorian Sex Factoids’.

A case of simulated fragmentation

The New York Times has an excerpt of a book that claims to expose one of the most famous psychiatric cases in popular culture as a fraud.

Based on an analysis of previously locked archives, the book suggests that the patient at the centre of the ‘Sybil’ case of ‘multiple personality disorder’ was, in fact, faking and admitted as much to her psychiatrist.

The diagnosis, now named dissociative identity disorder, is controversial because the idea that someone can genuinely have several ‘personalities’ inside a single body has not been well verified and diagnoses seemed to boom after the concept became well-known.

This particular case became well known because it was written up as a best-selling 1973 book and was later turned into a successful film of the same name.

The book and the film are thought to have been key in shaping the concept of the diagnosis and making it popular during the late 70s and 80s.

However, detective work by author Debbie Nathan seems to have uncovered medical notes suggesting that the psychiatrist at the centre of the case, Cornelia Wilbur, may have known for some time that her patient had admitted to faking.

One May afternoon in 1958, Mason walked into Wilbur’s office carrying a typed letter that ran to four pages. It began with Mason admitting that she was “none of the things I have pretended to be.”

“I am not going to tell you there isn’t anything wrong,” the letter continued. “But it is not what I have led you to believe. . . . I do not have any multiple personalities. . . . I do not even have a ‘double.’ . . . I am all of them. I have been essentially lying.”

Before coming to New York, she wrote, she never pretended to have multiple personalities. As for her tales about “fugue” trips to Philadelphia, they were lies, too. Mason knew she had a problem. She “very, very, very much” wanted Wilbur’s help. To identify her real trouble and deal with it honestly, Mason wrote, she and Wilbur needed to stop demonizing her mother. It was true that she had been anxious and overly protective. But the “extreme things” — the rapes with the flashlights and bottles — were as fictional as the soap operas that she and her mother listened to on the radio. Her descriptions of gothic tortures “just sort of rolled out from somewhere, and once I had started and found you were interested, I continued. . . . Under pentothal,” Mason added, “I am much more original.”

 

Link to excerpt of book in the New York Times.

Games of Invention

I’ve been collecting card decks. First I got the Oblique Strategies, Brian Eno’s deck of worthwhile dilemmas. When I’m stuck with something I’m working on I sit completely still for a few moments, holding the problem in mind. Then I take a breath, draw a card and apply what’s written to my problem.

Trying this now I get:
“Make something implied more definite (reinforce, duplicate)”

Other cards say things such as “Remove elements in decreasing order of importance”, “Honour thy error as a hidden intention” or simply “Water”

As often as not, this process frees me from the rut I’m in. I don’t always get the answer in a flash, but mentally I get moving again.

The Oblique Strategies work because they use our talent for justification to stimulate invention. Justification is the mental skill of tracing causes to understand a situation. It is closely related to deductive reasoning. Most of us get a lot of practice at justification and deduction. We’re used to tracing causation and necessity down the loops and chicanes of “if-then” rules, used to figuring out what is allowed, forbidden and required. These are useful skills for understanding laws, code and the bureaucracies of advanced industrial society, but it is a mental set for reducing possibility, not for increasing it.

Edward de Bono, the guy who coined the term "lateral thinking", talks about how this talent we all cultivate for deduction and justification can be hijacked in the service of creativity and invention. Rather than ask of ourselves, with our highly cultivated deduction machinery, "what is the next best move?", we instead make a blind move in the space of possibilities. We force ourselves, for example, to remove the most important element in our design, or to apply the idea of water. This blind move, whatever it is, shifts us to asking "how could the world get this way?" We can then use our deduction machinery to build a bridge back from the move we’ve forced ourselves to make, finding reasons why or how this could be the next best move. The results can be so inventive they feel like they come from outside ourselves, but they are really just our ordinary logical machinery thrown into reverse by the need to justify a blind move.

The next deck of cards I bought was Stephen Anderson’s Mental Notes, a set of 50 insights from psychology designed as prompts for web designers. The insights are grouped under categories such as "Persuasion" or "Attention" and each card has a short description of a psychological phenomenon and notes on how to create or encourage it.

What I love about the cards is that they capture a huge amount of information from the field of psychology, but in a completely different way from how psychologists usually present it. Experts write textbooks laboriously cataloguing phenomena, enumerating arguments for and against their nuances. The Mental Notes don’t do this – brevity is the soul of their wit. The other thing academic psychologists do is try to reduce phenomena to their essences, sifting the real and eternal from the incidental, the ephemeral and the secondary. The Mental Notes could have done this, but they don’t. To ask why there are separate cards for "Scarcity", "Limited Choice", "Limited Duration" and "Limited Access" when these describe essentially the same thing would be to miss the point. As they stand, the cards present the information in a form that can immediately be taken up, thought about concretely and applied to the design problem you are dealing with. Reduction to essences would be counter-productive here.

The third set of cards I’ve bought is Dan Lockton’s "Design with Intent" toolkit. These cards are an attempt to catalogue patterns in design that influence behaviour, things like "prominence", "decoys" or "threat of injury". What’s nice about these cards is that they recognise explicitly that they are prompts. The main text of each card is a question: "Can you direct users’ attention to what you want, by making it more prominent, obvious or exaggerated?", "Can you add ‘decoy’ choices, making the others (which you want people to pick) look better in comparison?"

Collecting information like this in cards recognises that the creative process needs an element of randomness, that making thoughts physical makes it easier for us to play games of invention with ourselves, and that too much organisation can sometimes restrict what we know – the information might be all there in a textbook, but the ends are all tied off, stopping our current state of mind latching onto what is needed. Invention comes naturally from inside ourselves, but sometimes we need a spark to set it off. We need external prompts which ask us questions we didn’t think to ask of ourselves alone, which lift us into seeing more of ourselves than we would on our own.

Links

Oblique Strategies
Anderson’s Mental Notes
Dan Lockton’s website

This is the text of an article I originally wrote for the boys at Rattle and their newspaper the Rattle Review. It is republished here with their permission.

The cutting edge of the easy high

Perhaps the most complete scientific review of what we know about synthetic cannabis or ‘spice’ products has just appeared in Frontiers in Behavioral Neuroscience.

These ‘legal highs’ are typically sold as nudge-nudge wink-wink ‘incense’ but contain synthetic cannabinoids which have a similar effect to smoking dope but are legal in many countries.

We covered the history of these compounds recently and we also discussed the market approach of the neuroscientist-packing ‘legal high industry’ back in 2009.

Essentially, the industry is based on the fact that their psychopharmacologists can churn out new substances faster than governments can regulate against them, with the web providing a distributed marketplace that opens up the customer base.

This new article takes a scientific look at what compounds are actually appearing in ‘synthetic marijuana’ (of which there are many and various) as well as examining the known effects, good and bad.

If you’re not into phrases like “well-characterized aminoalkylindole class of ligands” you may want to skip the neurochemistry and just focus on the availability and effects.

It’s probably the most complete review of these compounds available to date, so definitely worth a look if you’re tracking the ‘synthetic blow’ story.
 

Link to ‘Beyond THC’ on cannabinoid designer drugs (via @sarcastic_f)

The father of Randle P. McMurphy

An article in the Journal of Medical Humanities takes a fascinating look at one of playwright Samuel Beckett’s early novels – an exploration of madness and mental health care that foreshadowed One Flew Over the Cuckoo’s Nest.

Beckett is best known for Waiting for Godot, but his novel Murphy was one of the best-known literary treatments of mental ill health until Ken Kesey’s famous work appeared.

It turns out that Kesey gives a knowing nod to Beckett’s earlier work through his character Randle McMurphy.

As far as twentieth-century accounts of mental health nursing and psychiatry go, Beckett’s (1937) tale of Murphy has been much over-shadowed by Ken Kesey’s One Flew Over the Cuckoo Nest. For better or for worse, Kesey’s nurse Ratchet became the epitome of the 20th century asylum attendant. But it was a notable act of approbation by Kesey to name his main protagonist, Randle P. MacMurphy, with due deference to Beckett; ‘MacMurphy’ literally meaning ‘son of Murphy.’

The comparison between the two novels is interesting, because Kesey drew his inspiration from his time working as a staff member on a psychiatric ward while Beckett drew his inspiration from being a patient.
 

Link to locked article (the humanities are deadly in the wrong hands).

Preferences of the lady wooers

A study on female breast size attractiveness just published in the Archives of Sexual Behavior highlights the remarkable gap between academic discourse and everyday language.

Female Breast Size Attractiveness for Men as a Function of Sociosexual Orientation (Restricted vs. Unrestricted)

Arch Sex Behav. 2011 Oct 6.

Zelazniewicz AM, Pawlowski B.

Mate preferences are context-dependent and may vary with different ecological conditions and raters. The present study investigated whether sociosexual orientation influenced men’s rating of attractiveness of female breast size. Participants (N = 128) rated female breast attractiveness as a function of size (five levels) and viewing angles (front view, oblique view, and side view). Men were divided into two groups (restricted and unrestricted), based on their responses to the Revised Sociosexual Orientation Inventory (SOI-R). As predicted, men with higher SOI-R scores (unrestricted) generally gave higher ratings than did men who scored lower on the SOI-R (restricted), but the difference was significant only at larger breast sizes. We also found that medium to large sizes were rated as the most attractive by both male groups and that viewing angle changed rating of female attractiveness and breast presented in oblique view were rated generally higher than in side view. The results of the study indicate that sociosexuality influences male perception of female breast attractiveness and confirm that accentuation of female-specific physical traits produces a stronger response in unrestricted than in restricted men.

Translation: Guys who want to shag around prefer bigger tits.

Obviously, you can’t use the word tits in a scientific article, so you’d have to say ‘Gentlemen who want to woo more ladies prefer larger hooters’.

And they say science isn’t relevant to the man in the street.
 

Link to full text of open-access study.

Bookended by amnesia and neurofeedback

A new edition of RadioLab has just hit the wires. It riffs on the concept of loops and is bookended by an opening piece on transient global amnesia and a closing piece on the use of neurofeedback to control pain.

The programme is a sublime, lucid trip into a series of cycles, from the effects of memory disruption to the unprovability of mathematics.

Our lives are filled with loops that hurt us, heal us, make us laugh, and, sometimes, leave us wanting more. This hour, Radiolab investigates the strange things that emerge when something happens, then happens again, and again, and again, and again, and again, and … well, again.

As always, enchanting stuff.
 

Link to RadioLab edition on loops.

Entertainingly mislead me

A beautifully recursive study has shown that viewing an episode of the psychology of deception TV series Lie To Me makes people worse at distinguishing truth from lies.

The TV series is loosely based on the work of psychologist Paul Ekman who pioneered the study of emotions and developed the Facial Action Coding System or FACS that codes even the slightest of changes in facial expression.

Although in popular culture Ekman and the FACS are often associated with the detection of lies through changes in ‘micro expressions’, there is actually no good research to show it can help detect falsehoods.

However, the TV series relies heavily on this premise, suggesting that lie detection has more of a scientific basis than it actually does and that it is possible to detect deception through careful observation of specific behaviours.

This, however, is not very accurate. The authors of the study don’t mince their words:

Lie to Me is based on the premise that highly accurate deception detection is possible based on real-time observation of specific behaviors indicative of lying. The preponderance of research demonstrates that the exact opposite is true.

Lie to Me also suggests that certain people are naturally gifted lie detectors. This is also inconsistent with the preponderance of research. Thus, when looking at the evidence generated across several hundred individual studies, the idea of Lie to Me is highly implausible and almost certainly misleading.

Rather shrewdly, this new study, led by psychologist Timothy Levine, decided to test whether this misleading view of lie detection might actually influence the viewer’s ability to detect lies.

They split participants into three groups: one watched an episode of Lie to Me, another watched an episode of Numb3rs – in which crimes are solved by a genius math professor – and a final group didn’t watch anything.

Afterwards, everyone saw a series of 12 interviews – half of which were honest and half of which involved lies – and was asked to rate the truthfulness of each interviewee.

Normally, when we do tasks like this, where honesty and deception are present in equal numbers, we tend to over-rate how truthful people are – probably because in everyday life most people are being genuine with us, so we have a tendency to assume people are telling the truth, even when we know there’s some falsehood to be found.

In the study, those who had just watched Lie To Me didn’t show this truth-accepting bias – they were more skeptical – but, crucially, they were actually worse at distinguishing truth from lies than the others.

They applied their skepticism in a blanket fashion and became less accurate as a result.
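As a rough way to see why lower truth bias doesn’t automatically mean better lie detection, here is a toy sketch (my own illustration with made-up judgements, not the study’s data). It scores two hypothetical raters on the same 12 interviews: one truth-biased, one a blanket skeptic. Their overall proportion of ‘truthful’ calls differs wildly, yet neither discriminates well, and the skeptic can easily end up less accurate.

# 1 = the interviewee was honest, 0 = they lied (6 of each, as in the study design)
honest = [1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0]

# Hypothetical judgements: a truth-biased rater calls nearly everything true,
# a blanket skeptic calls nearly everything false.
truth_biased    = [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1]
blanket_skeptic = [0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0]

def describe(judgements, truth):
    bias = sum(judgements) / len(judgements)                      # proportion judged truthful
    accuracy = sum(j == t for j, t in zip(judgements, truth)) / len(truth)
    return bias, accuracy

for name, judgements in [("truth-biased", truth_biased), ("blanket skeptic", blanket_skeptic)]:
    bias, acc = describe(judgements, honest)
    print(f"{name}: judged {bias:.0%} truthful, {acc:.0%} accurate")

Dropping the truth bias only helps if the extra ‘lie’ calls land on the actual lies; applied indiscriminately, as in the Lie To Me group, it just trades one kind of error for another.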

In other words, not only does the programme misrepresent the psychology of lie detection, but this has an effect on the psychology of the viewers themselves.

Which, by the way, would make a great plot device for Lie To Me.
 

Link to locked study (via @velascop)

A history of the mid-life crisis

Scientific American’s Bering in Mind has a fantastic article on how the concept of the mid-life crisis was invented and whether it has any evidence behind it beyond the occasional inadvisable pair of cycling shorts and sudden interest in cheesy sports cars.

It turns out that the idea of the ‘mid-life crisis’ is surprisingly new – first touted in 1965 – but was invented to refer to a crisis of creativity in geniuses – rather than a sudden urge to dye one’s greying hair.

There isn’t actually any evidence that middle age is more of a time of crisis than any other period of life, but the concept has stuck.

In the decades since Jacques and Levinson posited their mostly psychoanalytic ideas of the midlife crisis, a number of more empirically minded psychologists have attempted to validate it with actual data. And with little success. Epidemiological studies reveal that midlife is no more or less likely to be associated with career disillusionment, divorce, anxiety, alcoholism, depression or suicide than any other life stage; in fact, the incidence rates of many of these problems peak at other periods of the lifespan.

Adolescence isn’t exactly a walk in the park either—as a teen, I’d worry so much about the uncertainties of my future that I vividly recall envying the elderly their age, since for them, no such uncertainties remained. Actually, old people—at least Swiss old people—aren’t fans of the “storm and stress” of adolescence, either. Freund and Ritter asked their elderly respondents which stage of their lives they’d prefer to return to, if they could. Most said middle age.

From another point of view, of course, the concept could also be a socially convenient way of helping to curtail certain behaviours in men once their actions are no longer thought to be age-appropriate.

That’s my theory and I’m sticking to it.
 

Link to Bering in Mind on the mid-life not so crisis.

Epilepsy as a door between worlds

There’s a wonderful anthropology study on beliefs about epilepsy among the Guaraní people in Bolivia in the latest Epilepsy and Behavior.

The Guaraní believe that people with recurrent seizures are a gateway between the worlds of life and death.

Among the Guaraní, epilepsy is called mano-mano, which literally means “die-die” and refers to the concept of death with a notion of frequency (die several times) and also of being in a constant passage between life and death. In other terms, this word means always being on the border between life and death, reflecting the fact that mano-mano produces a constant interruption of life or a “partial death.”…

In fact, the expression mano-mano is meaningful. It refers to the idea of a round trip between life and death. This concept addresses the loss of consciousness and shows that epilepsy is recognized mostly in terms of generalized seizures. The uncertain state between life and death is seen as a kind of “third possible condition” for a human being, a state that generates hesitation over what attitude to hold. PWE [people with epilepsy] are omano-mano-vae, the “undeads,” different from the other members of the community and considered both as victims of this life–death relationship and as enablers of the meeting of these two worlds…

The representation of epilepsy as a state of human being and the perception of this in a vision that involves the entire community allow an interpretation of Guaraní attitudes toward PWE [people with epilepsy]. Guaraní PWE are rarely condemned, misjudged, or isolated as in other cultures. Apparently, PWE do not represent a threat to the Guaraní, who seem to hold the attitude of helping and protecting PWE. As noted, the restrictions and prohibitions cited by the Guaraní appear to derive from the need to take care of PWE, as heavy work, traveling alone, and being involved in problems are believed to worsen the condition or trigger seizures in PWE.

It’s worth noting that while their perception of people with epilepsy is generally positive, several of the people interviewed gave advice about avoiding contact between affected people and children or pregnant women.

As the researchers note, this raises questions “that could be related to a belief that was not mentioned: possible transmission of the disease to those who are considered the weakest and most defenseless in the community.”

However, the Guaraní outlook is markedly positive compared with the widespread belief that epilepsy is the result of evil spirits or a divine punishment.

In contrast, the Guaraní most frequently cited epilepsy as being caused by a “failure to observe the Yekuaku, a fasting period linked to special events”.
 

Link to locked anthropology study.

A profession with “no” at its core

I’ve just finished Randy Olson’s “Don’t Be Such a Scientist: Talking Substance in an Age of Style” (after loving his article in New Scientist, “Top five tips for communicating science”). Olson is a marine biologist turned filmmaker, so he knows the world of science from the inside as well as from an outsider’s perspective.

This book is 75% solid gold – absolutely essential perspective for scientists who want to communicate outside their specialism. But it is also 25% misleading and elitist simplification. At heart, Randy Olson’s message as a populariser ends up pandering to a mistaken belief in scientific exceptionalism – that what scientists do and who scientists are is so beyond the ken of the rest of the population that it cannot be conveyed to them, that we have to use a pound of silly songs and fart jokes to make the public swallow an ounce of important information. Sorry, Randy, but when you underestimate the public taste you end up demeaning it.

Part of the 75% I loved is Olson’s perspective on the value of acting and improv classes for science communicators. This is something close to my heart, after my own mind-blowing experience of improv training. An essential – some would say the essential – of improv is to avoid negating your fellow improvisers’ suggestions. Whatever happens, improvisers are taught to accept and build – using a “yes and” mindset instead of a “no but” one. This lends itself to humour and creativity. Science, on the other hand, tends to downplay “yes and” in favour of “no but”, lending itself to rigour and certainty, at the risk of cynicism and myopia. Olson puts this particularly well in the following passage:

The entire profession of science has at its core a single word, and that word is “no”. Science is a process not of affirming ideas but of attempting to falsify ideas in the search for truth. This is what a hypothesis is – an idea that can be tested and possibly falsified and rejected.
When you give a scientist a paper, he or she reads it with the assumption that the writer is guilty of being wrong until proven innocent. The writer proves his or her innocence by either presenting data or citing sources. With each statement made in the paper, the scientist reading it says, “I’m not sure I believe this.” As the author presents graphs and tables of data and cites sources, the good critical scientist attempts to falsify what is being said.
Eventually, after the scientist has examined the data, looked up the cited sources and found that in fact, despite considerable effort, the hypothesis presented cannot be falsified – only then does the scientist finally start to relax a bit and say, “Well, okay, I think I can probably live with this.”
Tough business. It really is. As I waded through my first decade of rejection in Hollywood as a filmmaker, people would ask me whether I found the rejection hurtful or depressing. And I would respond, “Are you shitting me? Do you have any idea what it’s like to deal with the rejection of scientists? Hollywood folks reject things on the basis of the idea that ‘it just didn’t grab me,’ and they can’t even articulate the reason for their decision. When scientists reject you they hit you with a stack of data and sources that are the basis for it. That’s the sort of specific, substantive rejection that truly hurts.” (p128-129)

Link to page for Randy Olson’s “Don’t be such a scientist”

I wrote about improv for Prospect magazine, here

The death of atypical antipsychotics

The British Journal of Psychiatry has just published the latest in a long line of studies to find that the newer ‘atypical’ or ‘second generation’ antipsychotic drugs are barely better than the old-style medications. A stinging editorial accompanies the piece, calling out years of drug company marketing spun as an illusory advance in medical science.

Unfortunately both are locked (after all, you’d just worry yourself with all those facts) but here is the last paragraph of the editorial. It leaves no ass unkicked.

In creating successive new classes of antipsychotics over the years, the industry has helped develop a broader range of different drugs with different side-effect profiles and potencies, and possibly an increased chance of finding a drug to suit each of our patients. But the price of doing this has been considerable – in 2003 the cost of antipsychotics in the USA equalled the cost of paying all their psychiatrists.

The story of the atypicals and the SGAs [‘second-generation antipsychotics’] is not the story of clinical discovery and progress; it is the story of fabricated classes, money and marketing. The study published today is a small but important piece of the jigsaw completing a picture that undermines any clinical or scientific confidence in these classes.

With the industry reputation damaged by evidence of selective publishing and its deleterious effects, and the recent claims that trials of at least one of the new atypicals have been knowingly ‘buried’, it will take a great deal for psychiatrists to be persuaded that the next new discovery of a drug or a class will be anything more than a cynical tactic to generate profit. In the meantime, perhaps we can drop the atypical, second-generation, brand new and very expensive labels: they are all just plain antipsychotics.

 

Link to locked editorial ‘The rise and fall of the atypical antipsychotics’.