When evidence will not make a difference: motivated reasoning

18 April 2011

Chris Mooney’s article on motivated reasoning, “The Science of Why We Don’t Believe Science”, must strike a chord with the onthinktanks audience. He writes:

Reasoning is actually suffused with emotion (or what researchers often call “affect”). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we’re aware of it. That shouldn’t be surprising: Evolution required us to react very quickly to stimuli in our environment. It’s a “basic human survival skill,” explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.

We’re not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn’t take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that’s highly biased, especially on topics we care a great deal about.

Mooney presents an interesting idea: that when we say we are reasoning, in reality we are rationalising a decision we have already made.

Our “reasoning” is a means to a predetermined end—winning our “case”—and is shot through with biases. They include “confirmation bias,” in which we give greater heed to evidence and arguments that bolster our beliefs, and “disconfirmation bias,” in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.

His analysis leads to the conclusion that values condition how one looks at and uses research, and facts. And that:

…paradoxically, you don’t lead with the facts in order to convince. You lead with the values—so as to give the facts a fighting chance.