Predictably irrational, variably dishonest

Behavioural economist Dan Ariely was the guest on the latest edition of ABC Radio National’s All in the Mind, where he discussed why we’re so bad at predicting what’s best for us, and why honesty is such shifty behaviour.

As well as being a researcher, Ariely is the author of a psychology book called Predictably Irrational, which is currently riding high in the book charts.

It’s worth catching the mp3 version of the programme, as it’s slightly extended, and I found the last part, where Ariely talks about honesty, the most interesting.

Using various experimental conditions where participants are given varying degrees of room for dishonesty, Ariely notes that people tend to be dishonest enough to give themselves an advantage, but suggests we’re not so dishonest as to feel bad about ourselves.

In other words, he’s suggesting that honesty involves a cognitive dissonance-style reasoning process, balancing our desire for personal gain against our willingness to believe in ourselves as a ‘good person’ – an idea explored further in a forthcoming paper [pdf] by Nina Mazar and Dan Ariely.

If you’re interested in a good overview of the psychology of honesty and deception, I’ve just read a fantastic paper [pdf] by the same pair, which is fascinating as much for its insights into what influences our level of honesty as for its recommendations about applying the research to encourage people to be more honest.

It notes that getting people to focus on themselves increases honesty, as does getting them to focus on moral ideas, such as the Ten Commandments.

In their experiment, participants were told to write down either as many of the Ten Commandments as they could remember (increased self-awareness of honesty) or the names of ten books that they read in high school (control). They had two minutes for this task before they moved on to an ostensibly separate task: the math test. The task in the math test was to search for number combinations that added up to exactly ten. There were 20 questions, and the duration of the experiment was restricted to five minutes. After the time was up, students were asked to recycle the test form they worked on and indicate on a separate collection slip how many questions they solved correctly. For each correctly solved question, they were paid $.50.

The results showed that students who were made to think about the Ten Commandments claimed to have solved fewer questions than those in the control condition. Moreover, dishonesty was reduced to the point that declared performance was indistinguishable from that of another group whose responses were checked by an external examiner, suggesting that the heightened self-awareness in this case was powerful enough to eliminate dishonesty completely.

However, I wonder whether the effect of focusing on the Ten Commandments was due to their moral or supernatural associations.

I am reminded of Eric Schwitzgebel’s ongoing project on why ethics professors, who think about moral issues a lot, are no more moral (and perhaps less!) than other people, and a study [pdf] by psychologist Jesse Bering that found that simply telling participants that the lab was haunted increased honesty in a computer task.

Link to Dan Ariely on All in the Mind.
pdf of Mazar and Ariely’s paper on the psychology of dishonesty.


  1. Posted April 1, 2008 at 1:53 am | Permalink

    I often find myself arguing with rational choice theorists – I like what Ariely reveals. He recently gave a talk on his book. I particularly enjoyed his discussion of one of his experiments on cheating, in which an actor wearing a rival school’s sweatshirt blatantly cheats in front of everyone.

  2. Posted April 1, 2008 at 12:17 pm | Permalink

    Ariely was also interviewed on the BBC’s mid-week programme.
    I wonder how many parents (like me) tell their children it’s better to rip plasters off quickly than to ease them off slowly. Having heard Ariely talk about it, it of course made absolute sense that this might be expedient for the parent… so never again.

  3. Posted June 9, 2011 at 7:16 pm | Permalink

    “Researchers have explained this paradox about self-deception under the assumption that a person can simultaneously store both true and false knowledge, with a bias toward the true knowledge being stored in the unconscious and the false (i.e., misrepresented) knowledge being stored in the conscious”.


    Thank you for sharing.
    Dawn Pugh
