By Ken Perrott, 30/11/2015

Why is a straightforward logical discussion so impossible?

Why do our discussion partners refuse to accept our reasoned arguments? And, if we are honest, why do we ourselves find it so difficult to accept the reasonable logic of our discussion partners?

Well, a recent article at the blog “Why We Reason” provides an answer. It is “Psychology’s Treacherous Trio: Confirmation Bias, Cognitive Dissonance, and Motivated Reasoning” and it reinforces what I have often felt – we are not really a rational species, more a rationalising one.

Beliefs dictate what and how we see

The article gets to the root of the matter – the psychological forces that fuel our conversations:

“While many like to believe that they have a special access to the truth, the reality is that we all see the world not as it is, but as we want it to be: Republicans watch Fox while Democrats watch MSNBC; creationists see fossils as evidence of God, evolutionary biologists see fossils as evidence of evolution; a mother sees abortion as the best thing for her daughter, and the church sees it as unholy and sinful. You get the point – our beliefs dictate what we see and how we see.”

The article goes on to discuss “a few psychological tendencies that when mixed together form a potent recipe for ignorance.”

Confirmation bias
Confirmation bias sticks out like a sore thumb when participants in discussion cherry-pick authorities and citations to support their arguments. Well, it sticks out like a sore thumb to the discussion partner anyway (who may also be cherry-picking to confirm an opposite bias).

“Confirmation bias is exactly what it sounds like – the propensity for people to look for what confirms their beliefs and ignore what contradicts their beliefs while not being concerned for the truth.”

Hard not to fall into that trap when discussing complex issues within the constraints of limited space and time. But, nevertheless, something we should attempt to avoid.

Cognitive dissonance
“Then there’s cognitive dissonance, which describes a “state of tension that occurs whenever a person holds two cognitions that are psychologically inconsistent.””

The article provides an example:

“Leon Festinger introduced it in 1957 after he infiltrated and studied a UFO cult convinced the world would end at midnight on December 21st, 1954. In his book When Prophecy Fails, Festinger recounts how after midnight came and went, cult members began to look for reasons why the end of the world had not come. Eventually the leader of the cult, Marian Keech, explained to her members that she received a message from automatic writing, which told her that the God of Earth decided to spare the planet from destruction. Relieved, the cult members continued to spread their doomsday ideology to non-believers.

Although Festinger’s example is extreme, all of us do this every day. Take unhealthy food; we all know that pizza is bad for us, but we still eat it. And after finishing a few slices we say “it was worth it,” or “I’ll run it off tomorrow.” Or take smokers; they know that smoking kills but continue to smoke. And after unsuccessfully quitting, they justify their failures by claiming that “smoking isn’t that bad” or that “it is worth the risk.” Whether it’s UFOs, food, or smoking we all hold inconsistent beliefs and almost always side with what is most comfortable instead of what is true.”

Motivated reasoning

[Dilbert cartoon on motivated reasoning]

The article describes this as “our tendency to accept what we want to believe with much more ease and much less analysis than what we don’t want to believe.”

I think religious apologists often provide the most obvious examples of motivated reasoning – probably because they are often trained in philosophy and logic. They will argue that their beliefs are based on reason, not faith, and seem to enjoy constructing arguments for their claims built from simple logical steps. Yet they gloss over, or ignore, the huge jumps in logic which are inevitably part of their reasoning.

Maybe a faith-based belief reinforced by motivated reasoning is the hardest to defeat because the proponent actually believes their arguments are completely rational.

The article concludes:

“So what’s the difference between confirmation bias, cognitive dissonance, and motivated reasoning? The short answer is that there really aren’t any differences. Generally speaking, they serve the same purpose, and that is to frame the world so it makes sense to us. But there are a few nuances worth mentioning. For one, motivated reasoning is like an evil twin to cognitive dissonance in that it tries to avoid it. And for another, and I quote NYU psychologist Gary Marcus who says it perfectly, “whereas confirmation bias is an automatic tendency to notice data that fit with our beliefs, motivated reasoning is the complementary tendency to scrutinize ideas more carefully if we don’t like them than if we do.””


One Response to “The problem with reasoned discussion”

  • Thanks Ken Perrott. Great article. Reminded me of the following quote:
    Kramer wrote: We express ourselves all the time, in all sorts of ways. And we listen to one another. But we do not simply, passively receive a communication. We construct the message (and even the sender!) for ourselves, using a mix of what we have heard, what we hope we did not hear, who we are, who we think the message sender is, what our values and expectations are, what our moods and contexts are, our memories of previous interactions, etc. So, misunderstanding between two people is inevitable, no matter how much they try to communicate, no matter who they are, no matter what their relationship. This situation is inevitable, and it should be accepted rather than fought. (Between Couch and Piano, 1997, pp. 34-45)