I have a guest post for the Research Digest, snappily titled ‘People who think their opinions are superior to others are most prone to overestimating their relevant knowledge and ignoring chances to learn more’. The paper I review is about the so-called “belief superiority” effect, which is defined as thinking that your views are better than other people’s (i.e. not just that you are right, but that other people are wrong). The finding that people high in belief superiority are more likely to overestimate their knowledge is a twist on the famous Dunning-Kruger phenomenon: it shows that it isn’t just ignorance that predicts overconfidence, but also the specific belief that everyone else is mistaken.
Here are the first lines of the Research Digest piece:
We all know someone who is convinced their opinion is better than everyone else’s on a topic – perhaps, even, that it is the only correct opinion to have. Maybe, on some topics, you are that person. No psychologist would be surprised that people who are convinced their beliefs are superior think they are better informed than others, but this fact leads to a follow-on question: are people actually better informed on the topics for which they are convinced their opinion is superior? This is what Michael Hall and Kaitlin Raimi set out to check in a series of experiments in the Journal of Experimental Social Psychology.
Read more here: ‘People who think their opinions are superior to others are most prone to overestimating their relevant knowledge and ignoring chances to learn more’
8 thoughts on “Believing everyone else is wrong is a danger sign”
Read the paper to see!
This seems related to the idiot effect. I do know that particular groups, such as experts, have been shown to be more prone to it (along with groups like Fox News viewers, who are simultaneously more informed and more misinformed than the general public).
But not everyone who thinks they are an expert actually is an expert, such that the ignorant and ill-informed are often unaware of their state of intellectual paucity. And being certain that others are wrong wouldn’t offer much room for self-awareness, intellectual humility, and a continual process of learning.
I meant to say “smart idiot effect”.
Believing everyone else is wrong is a danger sign. Caveat: sometimes it is true and sometimes it is not, so how do you test and tell the difference?
Yet, when you ask for investment advice, some experts say: “Do the opposite of whatever everyone else is doing.”
I appreciate your post. Being open-minded to potentially being wrong is so important, I touch on that idea in 1 of my blog posts as well – http://www.sobeit32.com/open-mindedness-it-is-better-to-understand-than-to-be-understood/
The three-year-old post seems more relevant now than it did then. There seems to be a general, and growing, lack of humility at present. We cannot become experts in any field with a few hours of watching television, or searching and reading echo-chamber-type sources. Even when I want to grasp a very narrowly defined point about a medicine, for example, to better discuss with physicians, I need to really work and spend quite a bit of time with reputable sources.
As an undergrad, I had a professor who said in our first class that we weren’t entitled to an opinion, but rather had to study enough to “earn” the right. I figured he was responding to our tendency to act like one-semester “professors,” but it’s a life-long issue.