Nicholas D. Kristof writes for the New York Times. In an article entitled “Media should try to fight, not spread, fear and lies,” he offered an interesting observation about fake news and biased reporting.

Information is often passed along by the media and by everyday people without anyone verifying its truthfulness. Fact-checking can be time-consuming and tedious, but Kristof adds an interesting angle on the way we process information.

According to this journalist, social psychology experiments have found that when people are presented with factual corrections that contradict their beliefs, they may cling to their mistaken beliefs more strongly than ever. This is called the “backfire effect.” I had never heard the term before, so I decided to check it out.

In 2006, Brendan Nyhan and Jason Reifler of the University of Michigan and Georgia State University created fake newspaper articles about polarizing political issues. The articles were written in a way that would confirm a widespread misconception about certain ideas in American politics. As soon as a participant finished reading a fake article, the researchers handed over a true article that corrected the first.

They repeated the experiment with several “hot button” issues, such as stem cell research and tax reform, and again found that the corrections tended to increase the strength of participants’ misconceptions. This held true even when people on opposing sides of an issue read the same articles and then the same corrections. When the new evidence was interpreted as threatening to their beliefs, the corrections backfired: instead of changing people’s minds, they strengthened the very beliefs they were meant to correct.

This is nothing new. Hundreds of years ago, Francis Bacon (1561-1626) said, “The human understanding when it has once adopted an opinion draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects, in order that by this great and pernicious predetermination the authority of its former conclusion may remain inviolate.”

Thomas Gilovich, professor of psychology at Cornell University, concludes, “When examining evidence relevant to a given belief, people are inclined to see what they expect to see, and conclude what they expect to conclude.”

I do not intend to suggest that a person should be open to just anything. I am not suggesting that we discard our understanding of or position on any issue. I believe there are some absolutes in life; not everything is negotiable. Strong convictions and firm beliefs are desirable, but we need to be open to the possibility that there is a different perspective we have not yet seen. We could be mistaken. Our opinions and beliefs might be subject to correction. There could be more than one way to look at a particular topic.

God, help us to be open to truth!

Jamie Jenkins