I have been ceaselessly curious (and more than a bit self-righteous) about people's seeming unwillingness, perhaps inability, to think critically.
Why, oh why do we hold our big brainy heads up high as the sole rational beings on this earth only to ignore, deny, misinterpret or grossly misrepresent facts? (No, John McCain and people who listen seriously to John McCain, Phoenix, Arizona is not the "number-two kidnapping capital of the world.")
I was starting to think that the only solution to my affliction was to take a break from political news of any kind. But then, as if it were my birthday, NPR sent me "In Politics, Sometimes the Facts Don't Matter," a Talk of the Nation piece.
A new study by Brendan Nyhan and Jason Reifler, published in the journal Political Behavior, investigates what happens when people are misinformed, rather than just measuring how often they are uninformed. The authors drew on literature in psychology showing that "people tend to display bias in evaluating political arguments and evidence, favoring those that reinforce their existing views and disparaging those that contradict their views."
Moreover, they found that among the most ardent believers there is often a "backfire" effect, which makes them even more attached to their beliefs after being presented with corrections. (The full article, "When Corrections Fail: The Persistence of Political Misperceptions," can be downloaded here.)
This does not bode well for meaningful discourse in politics.
(Aside: I immediately thought of the humor inherent in a scientist testing whether his own belief, namely that people cling to their beliefs despite contrary evidence, is itself true.)
With fact-checkers popping up to sort through all the crap doled out by the media and politicians from both sides (and to provide Daily Show researchers with leads), this work brings up an important point. Will it help?
There is evidence to show that people will change their views if information is given in certain ways (perhaps delivered personally by a Truth messenger?), though not the ways of the general media. The authors note, "preference-inconsistent information is likely to be subjected to greater skepticism than preference-consistent information, but individuals who are 'confronted with information of sufficient quantity or clarity... should eventually acquiesce to a preference-inconsistent conclusion.'"
Also, as a fact becomes more widely embraced by the public and media, people will begin to accept it, despite their contrary convictions. So, no, people are not simply misperception-defending belief machines. (Yay!)
But we are stubborn as hell, it seems. Yes, I believe it. In fact, I welcome this news. It's nice to have some explanation of why people seem to turn a blind eye (or a left hook) to reality. I can't help but hope that there's more to the story, though. Please let there be a way for us humans to overcome our deep-seated fear of ideological change!
Nyhan likens the political parties that people align themselves with to teams, which makes people all the more susceptible to this defensive processing. Is there a way to help facts flourish, thereby enriching our democratic society?
Some say a partial solution is to hold the elite more accountable, introducing a shame factor to combat the team-spirit effect. Certainly, that is a worthy activity. Now, if we can get more networks than Comedy Central and PBS on board...
One of the first thoughts I had while listening to the show was, "If I were presented with data that contradicted my beliefs, I know I would change my mind."