Adjust the facts, ma’am

The Boston Globe has an interesting piece on democracy, knowledge and reasoning biases, highlighting how we often decide facts are true based more on our pre-existing political biases than on the evidence for their accuracy.

The article is full of fascinating snippets from recent studies. One, for example, found that the people who are least well informed are the ones most likely to believe their opinions on a topic are correct.

However, there is also some intriguing discussion about how we filter, fudge and integrate new information into our existing beliefs:

New research, published in the journal Political Behavior last month, suggests that once those facts — or “facts” — are internalized, they are very difficult to budge. In 2005, amid the strident calls for better media fact-checking in the wake of the Iraq war, Michigan’s Nyhan and a colleague devised an experiment in which participants were given mock news stories, each of which contained a provably false, though nonetheless widespread, claim made by a political figure: that there were WMDs found in Iraq (there weren’t), that the Bush tax cuts increased government revenues (revenues actually fell), and that the Bush administration imposed a total ban on stem cell research (only certain federal funding was restricted). Nyhan inserted a clear, direct correction after each piece of misinformation, and then measured the study participants to see if the correction took.

For the most part, it didn’t. The participants who self-identified as conservative believed the misinformation on WMD and taxes even more strongly after being given the correction. With those two issues, the more strongly the participant cared about the topic — a factor known as salience — the stronger the backfire. The effect was slightly different on self-identified liberals: When they read corrected stories about stem cells, the corrections didn’t backfire, but the readers did still ignore the inconvenient fact that the Bush administration’s restrictions weren’t total.

It turns out that new information doesn’t always persuade us, but the article does a good job of outlining how both psychological and situational factors influence our openness to updating our knowledge about the world.

Link to Boston Globe piece ‘How facts backfire’.
Link to study on facts and political bias (open access).
