Why Facts Don't Win Arguments

Have you ever noticed how easy it is to dismiss a positive remark, while a negative one can bounce around in your head for days? What makes opposing, adverse, or unfavorable information so sticky? And why do conversations that don't align with our mental models of the world consume so much mental energy?

Think about the last time you engaged in an online discussion with someone who was either misinformed or just plain ignorant about a topic you have researched, studied, and hold strong convictions about. Take climate change, for example. Many of the most vocal climate change deniers will freely admit they aren't "experts" before launching into a litany of reasons why the science is wrong.

A classic example of misinformed ignorance posing as expertise is Senator James Inhofe (R-OK), and not just because his "proof" of the global warming hoax was the snowball he brought for Senate show-and-tell. Inhofe has repeatedly maintained that "man-made global warming is the greatest hoax ever perpetrated on the American people." It's worth noting that, according to Oil Change International, Inhofe has received over $2 million in donations from the fossil fuel industry. His ignorance is astounding, as is his inability to hear any data, research, or information that conflicts with his position.

Don't get me wrong. Scientific skepticism is healthy – necessary, even. It forces scientists to examine claims (their own and those of others) and systematically question all information in search of flaws and fallacies. But deniers like Inhofe vigorously criticize any evidence that substantiates climate change and embrace any argument that refutes it. Presenting actual facts and data that challenge their thinking only makes them dig in their heels and become even more certain of their position.

Skepticism is healthy both for science and society. Denial is irresponsible and dangerous.

There are several related cognitive processes that explain what’s going on here.

Confirmation Bias

Confirmation bias is that unconscious force that nudges us to seek out information that aligns with our existing belief system.  Once we have formed a view, we embrace information that confirms that view while ignoring information to the contrary. We pick out the bits of data that confirm our prejudices because it makes us feel good to be “right.” When we want to be right badly enough, we become prisoners of inaccurate assumptions.

Extending the climate change example, a 2018 study surveyed more than 1,000 residents of South Florida who were at risk from either the direct or indirect effects of flooding to their homes, including a decrease in property values as coastal property comes to be perceived as a less desirable place to live.

Half of the participants received a map of their own city illustrating what could happen just 15 years from now, at the present rate of sea level rise, if a Category 3 hurricane hit with accompanying storm surge flooding. Those who had viewed the maps were less likely to say they believed that climate change was taking place than those who had not. They were also less likely to believe that climate change was responsible for the increased intensity of storms.

The Backfire Effect

A second cousin to confirmation bias is the backfire effect. Not only do we search out information consistent with our beliefs, but we instinctively and unconsciously protect those beliefs when confronted with information that conflicts with them. It's a defense mechanism that kicks in when someone presents information – even statistically sound research or data – that disputes our position. Those facts and figures backfire and only strengthen our misconceptions. In addition, the cognitive dissonance produced by conflicting evidence actually builds new neural connections that further entrench our original convictions.

A 2006 study examined why sound evidence fails to correct misperceptions. Subjects read fake news articles about polarizing political issues that included either a misleading claim from a politician, or the same misleading claim followed by a correction. People on opposing sides of the political spectrum read the same articles and the same corrections, and when the new evidence threatened their existing beliefs, they doubled down. The corrections backfired: the evidence made them more certain that their original beliefs were correct.

The Dunning-Kruger Effect

The Dunning-Kruger Effect is based on the notion that we all have pockets of incompetence, and that there is an inverse correlation between knowledge or skill and confidence. People who are ignorant or unskilled in a particular subject area tend to believe they are much more competent than they are. Bad drivers believe they're good drivers, cheapskates think they are generous, and people with no leadership skills think they can rule the world. How hard can it be?

Those who have the slightest bit of experience think they know it all. Then, as people gain experience, they begin to realize how little they actually know. This is the point at which they search out the knowledge they need to build their expertise. Those at the level of genius recognize their talent and demonstrate confidence commensurate with their ability. There is also a corollary to the effect: just as highly incompetent people overestimate their abilities, highly competent people tend to underestimate theirs. Dunning and Kruger found that most people, regardless of how they perform on any given task, rate themselves at 7 or 8 out of 10.

The bottom line: the next time you are convinced of your "rightness," it might be worth taking a minute to examine your biases.

Mark Twain may have said it best:

"It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so."

