The “illusion of explanatory depth” — believing we know more about a policy position than we really do — may account for the intensity of partisan politics. “It is not until we are asked to explain how ... a system works — whether it’s what’s involved in a trade deal with China or how a toilet flushes — that we realize how little we actually know,” write Steven Sloman and Philip M. Fernbach. That knowledge often leads to more moderate positions. Their essay first appeared Oct. 21, 2012, in the Sunday New York Times.

If we are reminded of anything this election season, it is that America is a house divided against itself. The anger and mistrust between Democrats and Republicans, liberals and conservatives, often seems as bitter as it is reflexive. Most worrisome of all, we have grown so accustomed to this divide that we no longer flinch at the brazen political attacks on either side — even when the logic underlying these attacks is hard to fathom.

Take the case of two political ads recently shown on television. One, from Mitt Romney, asserts that the employment situation in the United States “doesn’t have to be this way if Obama would stand up to China.” The other, from President Obama’s camp, implies that a Romney presidency would be bad for the coal industry, in part because Mr. Romney has a Swiss bank account.

The truth is, even experts have difficulty spelling out how changing our trade policy with China will make more than a modest dent in unemployment numbers — or how, with or without a foreign bank account, Mr. Romney’s proposed policies are likely to hurt the domestic coal industry. But that doesn’t matter.

Such attack ads work, in large part, because we don’t understand them. These statements take advantage of a fact about human psychology called the “illusion of explanatory depth,” an idea developed by the Yale psychologist Frank Keil and his students. We typically feel that we understand how complex systems work even when our true understanding is superficial. And it is not until we are asked to explain how such a system works — whether it’s what’s involved in a trade deal with China or how a toilet flushes — that we realize how little we actually know.

In our own research we have found this pattern when people are asked to explain how political policies work. In a forthcoming article in Psychological Science, written with Todd Rogers of the Harvard Kennedy School’s Center for Public Leadership and Craig Fox of U.C.L.A.’s Anderson School of Management, we report on experiments showing how people often believe they understand what is meant by well-worn political terms like the “flat tax,” “sanctions on Iran” or “cap and trade” — even when they don’t.

That’s not much of a shocker, of course. The real surprise is what happens after these same individuals are asked to explain how these policy ideas work: They become more moderate in their political views — either in support of such policies or against them. In fact, not only do their attitudes change, so does their behavior. In one of our experiments, for example, after attempting to explain how various policy ideas would actually work, people became less likely to donate to organizations that supported the positions they had initially favored.

Interestingly, asking people to justify their position — rather than asking them to explain the mechanisms by which a policy would work — doesn’t tend to soften their political views. When we asked participants to state the reasons they were for or against a policy position, their initial attitudes held firm. (Other researchers have found much the same thing: Merely discussing an issue often makes people more extreme, not less.)

Why, then, does having to explain an opinion often end up changing it? The answer may have to do with a kind of revelatory trigger mechanism: asking people to “unpack” complex systems — getting them to articulate how something might work in real life — forces them to confront their lack of understanding.

The challenge in an election season that largely takes place in the form of 30-second advertisements and fire-up-the-base rallies is that rarely is anybody — candidate or voter — asked to explain his or her positions. American political discourse, in short, is not discourse at all.

So what can be done to turn it into one? The answer implied by our research is not that we should all become policy wonks. Instead, we voters need to be more mindful that issues are complicated, and challenge ourselves to break down the policy proposals on both sides into their component parts. We then have to imagine how these ideas would work in the real world — and then make a choice: either to moderate our positions on policies we don’t really understand, as research suggests we will, or to try to improve our understanding. Either way, discourse would then be based on information, not illusion.

But whether or not we citizens make this effort, our leaders should. We should demand that Mr. Obama and Mr. Romney explain how in addition to why. And the last and best shot we’ll have at hearing these “hows” will come [Monday] evening, in the final presidential debate. We will all be served if Bob Schieffer, the chief Washington correspondent for CBS News and moderator for Monday’s debate, pushes both men to explain, not just assert.

Strong opinion and vigorous debate are key parts of democracy and foundational to American culture. Yet most people would agree that it is not productive to have a strong opinion about an issue that one doesn’t really understand. We have a problem in American politics: an illusion of knowledge that leads to extremism. We can start to fix it by acknowledging that we know a lot less than we think.


Steven Sloman is a professor of cognitive, linguistic and psychological sciences at Brown University. Philip M. Fernbach is an assistant professor of marketing at the University of Colorado’s Leeds School of Business.