By Jonathan M. Gitlin | Published: September 24, 2008 - 07:30PM CT
We like to think that people will be well informed before making important decisions, such as whom to vote for, but the truth is that's not always the case. Being uninformed is one thing, but a population that's actively misinformed presents problems for participation in the national debate and the democratic process. If the findings of some political scientists are right, attempting to correct misinformation may do nothing more than reinforce the false belief.
This sort of misinformation isn't hypothetical. In 2003, a study found that viewers of Fox News were significantly more misinformed about the Iraq war: far greater percentages of those viewers erroneously believed that Iraq possessed WMDs, or that there was a credible link between the 9/11 attacks and Saddam Hussein, than those who got their news from other outlets like NPR and PBS. Misinformation like this has led to the rise of websites like FactCheck and SourceWatch.
Saying that correcting misinformation does little more than reinforce a false belief is a controversial claim, but it's based on a number of studies that examine the effect of political or ideological bias on fact correction. In these studies, volunteers were shown news items or political ads that contained misinformation, followed by a correction. For example, a study by John Bullock of Yale showed volunteers a political ad created by NARAL that linked Justice John Roberts to a violent anti-abortion group, followed by news that the ad had been withdrawn. Notably, Democratic participants had a worse opinion of Roberts after seeing the ad, even once they were told it was false.
Over half (56 percent) of Democratic subjects disapproved of Roberts before seeing the misinformation. That figure rose to 80 percent after the ad, and even after the correction, 72 percent still held a negative opinion. Republican volunteers, by contrast, showed only a small increase in disapproval after watching the ad (from 11 percent to 14 percent).
Along those lines, a pair of political scientists, Brendan Nyhan of Duke and Jason Reifler of Georgia State, have shown a similar effect, this time concerning misinformation about the presence of WMDs in Iraq, tax cuts, or stem cell research. Participants were shown news reports that contained inaccuracies, followed by a correction. The news reports were not real, but were presented to the volunteers as coming from either the New York Times or Fox News. Again, the findings suggest that facts contradicting political ideology were simply not taken in; if anything, challenging misbelief with fact-checking had the counterintuitive effect of reinforcing that misbelief.
Unlike Bullock, Nyhan and Reifler demonstrated this cognitive-dissonance effect only in Republican volunteers, and they acknowledge that follow-up studies are needed with liberal or Democratic volunteers.
These findings, if correct, have worrying implications. Not only does cognitive dissonance get in the way of rational decision-making, it also suggests that there's little point in arguing with someone who holds an opposing belief. Could this effect explain why, despite repeated refutations in the media, the percentage of Americans who believe Sen. Obama to be a Muslim continues to grow? The research might also apply beyond the political to other entrenched attitudes; I'm thinking of the constant flame wars between fans of the PS3 and Xbox 360, or between Mac and PC users. Is all that time spent in the Battlefront or the Soap Box wasted?
This effect could also complicate efforts to educate people about controversial or politically charged topics; I'm thinking here of climate change and evolution skeptics, both groups that have been targeted by think tanks and interest groups with vested interests in challenging accepted facts. It also points to the rationale behind ideological media outlets like Fox News or Air America, where viewers can have facts that support their worldview continually reinforced. Sadly, that's bad news for anyone interested in honest and open public debate.