Sociologists at the University of North Carolina and Northwestern University examined an earlier case of deep commitment to the inaccurate: the belief, among many conservatives who voted for George W. Bush in 2004, that Saddam Hussein was at least partly responsible for the attacks on 9/11.
Of the 49 people in the study who believed in such a connection, only one abandoned that belief when presented with prevailing evidence that it wasn't true.
The rest came up with an array of justifications for ignoring, discounting or simply disagreeing with contrary evidence — even when it came from President Bush himself.
"I was surprised at the diversity of it, what I kind of charitably call the creativity of it," said Steve Hoffman, one of the study's authors and now a visiting assistant professor at the State University of New York at Buffalo.
The voters weren't dupes of an elaborate misinformation campaign, the researchers concluded; rather, they were actively engaged in reasoning that the belief they already held was true.
This type of "motivated reasoning" — pursuing information that confirms what we already think and discarding the rest — helps explain why viewers gravitate toward partisan cable news and why each side tends to see what it wants in The Colbert Report. But when it comes to justifying demonstrably false beliefs, the reasoning grows more strained.
By the time the interviews were conducted, just before the 2004 election, the Bush Administration was no longer asserting a link between Saddam Hussein and al-Qaeda. The researchers chose the topic because, unlike many questions in politics, it had a correct answer.
Subjects were presented during one-on-one interviews with a newspaper clip of this Bush quote: "This administration never said that the 9/11 attacks were orchestrated between Saddam and al-Qaeda."
The Sept. 11 Commission, too, found no such link, the subjects were told.
"Well, I bet they say that the commission didn't have any proof of it," one subject responded, "but I guess we still can have our opinions and feel that way even though they say that."
Reasoned another: "Saddam, I can't judge if he did what he's being accused of, but if Bush thinks he did it, then he did it."
Others declined to engage with the information at all. Most curious to the researchers were the respondents who reasoned that Saddam must have been connected to Sept. 11 — because why else would the Bush Administration have gone to war in Iraq?
"I think we'd all like to believe that when people come across disconfirming evidence, what they tend to do is to update their opinions," said Andrew Perrin, an associate professor at UNC and another author of the study.
That some people might not do that even in the face of accurate information, the authors suggest in their article, presents "a serious challenge to democratic theory and practice."
"The implications for how democracy works are quite profound, there's no question in my mind about that," Perrin said. "What it means is that we have to think about the emotional states in which citizens find themselves that then lead them to reason and deliberate in particular ways."
Evidence suggests people are more likely to absorb facts in certain emotional states and social situations. Some may never change their minds. For the rest, policy-makers could try to foster those more receptive states — for example, by minimizing the fear that often clouds a person's ability to assess facts and that has characterized the current health care debate.
Hoffman's advice for crafting such an environment: "The congressional town hall meetings, that is a sort of test case in how not to do it."