“Some of these stories are closer to my own life than others are, but not one of them is as close as people seem to think.” Alice Munro, from the introduction to The Moons of Jupiter

"Talent hits a target no one else can hit; genius hits a target no one else can see." Arthur Schopenhauer

“Why does everything you know, and everything you’ve learned, confirm you in what you believed before? Whereas in my case, what I grew up with, and what I thought I believed, is chipped away a little and a little, a fragment then a piece and then a piece more. With every month that passes, the corners are knocked off the certainties of this world: and the next world too. Show me where it says, in the Bible, ‘Purgatory.’ Show me where it says ‘relics, monks, nuns.’ Show me where it says ‘Pope.’” –Thomas Cromwell imagines asking Thomas More—Wolf Hall by Hilary Mantel

My favorite posts to get started: The Self-Righteousness Instinct, Sabbath Says, Encounters, Inc., and What Makes "Wolf Hall" so Great?.

Tuesday, July 13, 2010

I Got Your Facts Right Here

Imagine two people debating, say, immigration reform. One of them says we absolutely must stem the flow of immigrants to get control of the crime wave plaguing the border. The other says there is no crime wave: the frequency of violent crime has remained stable in recent years, or in some areas actually gone down. They each go their separate ways convinced the other is wrong. But then the second of them recalls where he read about the steady crime rate, finds the article, follows it to a link to FBI crime statistics, and sends that link to the first: "You were wrong."

Does the anti-immigration advocate change his mind? Dana Milbank of The Washington Post experienced something very similar to this scenario, and in his case the answer wasn't just no--being presented with the facts actually strengthened his interlocutor's demonstrably wrong position. He discussed this experience today on Talk of the Nation, alongside Brendan Nyhan, who recently did research at the University of Michigan that came to the same conclusion.

One of the themes they discussed was that certain beliefs are harder to give up than others. It's especially the beliefs espoused by our "team" that we refuse to reconsider. What's interesting about this tribal dynamic is that it suggests not only that we fail to apply the standards of our morality to members of rival tribes, but that we apparently don't afford them the same epistemic status either. Their facts aren't as good as our facts.

We're all susceptible to this trap of cognitive dissonance--in fact, smart people are even more susceptible because they're better able to pull justifications out of their asses for sticking to their favorite ideas. But I propose as one partial remedy that we champion not any ideology or political position, but rather the epistemology that will lead us to one. In science, admitting you're wrong is seen as progress, not as defeat (ideally). And so teamishness is avoided (ideally).

Of course, once we arrive at the best approximation of the truth, we still have to work out our values. For instance, we might choose not to punish all immigrants for the crimes of a few as a matter of principle.

1 comment:

caynazzo said...

Shermer "Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons."

Worded this way, the above sounds circular or self-refuting.

I actually prefer the way you said it.