Critical Thinking Skills: Why They Are So Difficult To Acquire

Critical thinking skills are difficult to acquire because the mind is a believing machine, as this classic psychology study demonstrates.

What is the mind’s default position on critical thinking: are we naturally critical or naturally gullible?

As a species do we have a tendency to behave like Agent Mulder from the X-Files who always wanted to believe in mythical monsters and alien abductions?

Or are we like his partner Agent Scully, who applied critical thinking, generating alternative explanations and trying to understand and evaluate rationally the strange occurrences they encountered?

Do we believe what the TV, the newspapers, even blogs, tell us at first blush, or do we use critical thinking processes?

Can we ignore the claims of adverts, do we lap up what politicians tell us, do we believe our lover’s promises?

It’s not just that some people think critically and some people don’t; all our minds are built with the same first instinct, the same first reaction to new information.

But what is it: do we believe first or do we first understand, so that belief (or disbelief) comes later?

Critical thinking skills: Descartes vs. Spinoza

This argument about whether belief is automatic when we are first exposed to an idea or whether belief is a separate process that follows understanding has been going on for at least 400 years.

The French philosopher, mathematician and physicist René Descartes argued that understanding and believing are two separate processes.

First, people take in some information by paying attention to it, then they decide what to do with that information, which includes believing or disbelieving it.

Descartes’ view is intuitively attractive and seems to accord with the way our minds work, or at least the way we would like our minds to work.

The Dutch philosopher Baruch Spinoza, a contemporary of Descartes, took a quite different view.

He thought that the very act of understanding information was believing it.

We may, he thought, be able to change our minds afterwards, say when we come across evidence to the contrary, but until that time we believe everything.

Spinoza’s approach is unappealing because it suggests we have to waste our energy using critical thinking to root out falsities that other people have randomly sprayed in our direction, whether by word of mouth, TV, the internet or any other medium of communication.

So who was right, Spinoza or Descartes?

Research on critical thinking skills

Daniel Gilbert and colleagues put these two theories head-to-head in a series of experiments to test whether understanding and belief operate together or whether belief (or disbelief) comes later (Gilbert et al., 1993).

In their classic social psychology experiment on critical thinking, seventy-one participants read statements about two robberies and then recommended a jail sentence for each robber.

Some of the statements were designed to make the crime seem worse, for example the robber had a gun, and others to make it look less serious, for example the robber had starving children to feed.

The twist was that only some of the statements were true, while others were false.

Participants were told that all the true statements would be displayed in green type, while the false statements would be in red.

Here’s the clever bit: half the participants were deliberately distracted while they were reading the false statements, while the other half weren’t.

In theory, if Spinoza was correct, then participants distracted while reading the false statements wouldn’t have time to process the additional fact that a statement was written in red, and therefore untrue, and consequently would be influenced by it when handing out a jail term.

On the other hand, if Descartes was right, then the distraction should make no difference: distracted participants wouldn’t yet have decided whether to believe the false statements, so those statements shouldn’t affect the jail terms they gave.

The reason critical thinking is difficult

The results showed that when the false statements made the crime seem much worse rather than less serious, the participants who were interrupted gave the criminals almost twice as long in jail, up from about 6 years to around 11 years.

In contrast, participants who hadn’t been interrupted managed to ignore the false statements.

Consequently, there was no significant difference between jail terms depending on whether false statements made the crime seem worse or less serious.

This meant that only when given time to think about it did people behave as though the false statements were actually false.

On the other hand, without time for reflection, people simply believed what they read.

Gilbert and colleagues carried out further experiments that successfully countered some alternative explanations of their results.

These confirmed their previous findings and led them to the rather disquieting conclusion that Descartes was in error and Spinoza was right.

Believing is not a two-stage process of first understanding and then deciding whether to believe.

Instead, understanding is believing: a fraction of a second after reading something, you believe it, until some other critical faculty kicks in to change your mind.

We really do want to believe, just like Agent Mulder, and we naturally lack the critical thinking skills of Agent Scully.

Believe first, ask questions later

Not only that, but their conclusions, and those of Spinoza, also explain other behaviours that people regularly display:

  • The fundamental attribution error: this is people’s assumption that others’ behaviour reflects their personality, when really it reflects the situation.
  • Truthfulness bias: people tend to assume that others are telling the truth, even when they are lying.
  • The persuasion effect: when people are distracted it increases the persuasiveness of a message.
  • Denial-innuendo effect: people tend to positively believe in things that are being categorically denied.
  • Hypothesis testing bias: when testing a theory, instead of trying to prove it wrong people tend to look for information that confirms it. This, of course, isn’t very effective hypothesis testing!

When looked at in light of Spinoza’s claim that understanding is believing, these biases and effects could result from our tendency to believe first and ask questions later.

Take the fundamental attribution error: when meeting someone who is nervous we may assume they are a nervous person because this is the most obvious inference to make.

It only occurs to us later, when applying critical thinking skills, that they might have been worried because they were waiting for important test results.

If all this is making you feel rather uncomfortable, then you’re not alone.

Gilbert and colleagues concede that our credulous mentality seems like bad news.

It may even be an argument for limiting freedom of speech.

After all, if people automatically believe everything they see and hear, we have to be very careful about what people see and hear.

Disadvantages of too much critical thinking

Gilbert and colleagues counter this by arguing that too much critical thinking or even cynicism is not a good thing.

Minds working on a Cartesian model would only believe things for which they had hard evidence.

Everything else would be neither believed nor disbelieved, but left in a state of limbo.

The problem is that a lot of the information we are exposed to is actually true, and some of it is vital for our survival.

If we had to go around applying critical thinking to our beliefs all the time, we’d never get anything done and miss out on some great opportunities.

Minds that work on a Spinozan model, however, can happily believe as a general rule of thumb, then check out anything that seems dodgy later.

Yes, they will often believe things that aren’t true, but it’s better to believe too much and be caught out once in a while than be too cynical and fail to capitalise on the useful and beneficial information that is actually true.

Or maybe, by going along with this argument, I’m being gullible, and the harsh truth is that it’s a basic human failing that we are all too quick to take things at face value and too slow to engage our critical thinking.

I’ll leave you to ponder that one.

Author: Dr Jeremy Dean

Psychologist, Jeremy Dean, PhD is the founder and author of PsyBlog. He holds a doctorate in psychology from University College London and two other advanced degrees in psychology. He has been writing about scientific research on PsyBlog since 2004.
