Why Do People Believe Stupid Stuff, Even When They're Confronted With the Truth?
June 26, 2011 |
This story is cross-posted from You Are Not So Smart.
The Misconception: When your beliefs are challenged with facts, you alter your opinions and incorporate the new information into your thinking.
The Truth: When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.
Wired, The New York Times, Backyard Poultry Magazine – they all do it. Sometimes, they screw up and get the facts wrong. In ink or in electrons, a reputable news source takes the time to say “my bad.”
If you are in the news business and want to maintain your reputation for accuracy, you publish corrections. For most topics this works just fine, but what most news organizations don’t realize is a correction can further push readers away from the facts if the issue at hand is close to the heart. In fact, those pithy blurbs hidden on a deep page in every newspaper point to one of the most powerful forces shaping the way you think, feel and decide – a behavior keeping you from accepting the truth.
In 2006, Brendan Nyhan and Jason Reifler at The University of Michigan and Georgia State University created fake newspaper articles about polarizing political issues. The articles were written in a way that would confirm a widespread misconception about certain ideas in American politics. After a participant read a fake article, researchers handed over a true article that corrected the first. For instance, one article suggested the United States found weapons of mass destruction in Iraq. The next said the U.S. never found them, which was the truth. Those opposed to the war or who had strong liberal leanings tended to disagree with the original article and accept the second. Those who supported the war and leaned more toward the conservative camp tended to agree with the first article and strongly disagree with the second. These reactions shouldn’t surprise you. What should give you pause, though, is how conservatives felt about the correction. After reading that there were no WMDs, they reported being even more certain than before that there actually were WMDs and that their original beliefs were correct.
They repeated the experiment with other wedge issues like stem cell research and tax reform, and once again, they found corrections tended to increase the strength of the participants’ misconceptions if those corrections contradicted their ideologies. People on opposing sides of the political spectrum read the same articles and then the same corrections, and when new evidence was interpreted as threatening to their beliefs, they doubled down. The corrections backfired.
Once something is added to your collection of beliefs, you protect it from harm. You do it instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens them instead. Over time, the backfire effect makes you less skeptical of whatever allows you to continue seeing your beliefs and attitudes as true and proper.
In 1976, when Ronald Reagan was running for president of the United States, he often told a story about a Chicago woman who was scamming the welfare system to earn her income.
Reagan said the woman had 80 names, 30 addresses and 12 Social Security cards, which she used to get food stamps along with more than her share of money from Medicaid and other welfare entitlements. He said she drove a Cadillac, didn’t work and didn’t pay taxes. He talked about this woman, whom he never named, in just about every small town he visited, and it tended to infuriate his audiences. The story solidified the term “Welfare Queen” in American political discourse and influenced not only the national conversation for the next 30 years, but public policy as well. It also wasn’t true.