Illusion of Truth Effect: Repetition Makes Lies Sound True

The illusion of truth effect in psychology is the tendency to believe false information if it is repeated often enough. 

The illusion of truth effect is very simple: the more often something is repeated to people, the more likely they are to believe it.

With repetition, the human mind finds a statement easier to process than competing ideas that have not been repeated over and over again.

Repetition is used everywhere to persuade people: in advertising, in politics and in the media. And it certainly works.

Examples of the illusion of truth effect

We see ads for the same products over and over again.

Politicians repeat the same messages endlessly (even when it has nothing to do with the question they’ve been asked).

Journalists repeat the same opinions day after day.

Can all this repetition really be persuasive?

It seems too simplistic that just repeating a persuasive message should increase its effect, but that’s exactly what psychological research finds (again and again).

Repetition is one of the easiest and most widespread methods of persuasion because of the illusion of truth effect.

In fact, it’s so obvious that we sometimes forget how powerful it is.

People rate statements they have heard just once before as more valid or true than statements they are hearing for the first time.

They even rate statements as truer when the person saying them has been repeatedly lying (Begg et al., 1992).

That is how powerful the illusion of truth effect is.

And when we think something is more true, we also tend to be more persuaded by it.

Several studies on the illusion of truth have shown that people are more swayed when they hear statements of opinion and persuasive messages more than once.

How the illusion of truth effect works

The illusion of truth effect works at least partly because familiarity breeds liking.

As we are exposed to a message again and again, it becomes more familiar.

Because of the way our minds work, what is familiar feels true: hence the illusion of truth.

Familiar things require less effort to process, and that feeling of ease unconsciously signals truth. This is called cognitive fluency.

As every politician knows, there’s not much difference between actual truth and the illusion of truth.

Since illusions are often easier to produce, why bother with the truth?

Reversing the illusion of truth

The illusion of truth effect also works in reverse.

If something is hard to think about, then people tend to believe it less.

Naturally this is very bad news for people trying to persuade others of complicated ideas in what is a very complicated world.

Some studies have even tested how many times a message should be repeated for the maximum effect of the illusion of truth.

These suggest that people have the maximum confidence in an idea after it has been repeated between 3 and 5 times (Briñol et al., 2008).

After that, repetition ceases to have the same effect and may even reverse.

Because TV adverts are repeated many more times than this, advertisers now use subtle variations in the ads to recapture our attention and avoid the illusion of truth backfiring.

This is an attempt to avoid the fact that while familiarity can breed liking, over-familiarity tends to breed contempt.

When the illusion of truth fails

Repetition is effective almost across the board when people are paying little attention, but when they are concentrating and the argument is weak, the effect disappears (Moons et al., 2008).

In other words, it’s no good repeating a weak argument to people who are listening carefully — then the illusion of truth does not operate.

But if people aren’t motivated to scrutinise your arguments carefully then repeat away with abandon—the audience will find the argument more familiar and, therefore, more persuasive.

This suggests we should remain critical while watching TV adverts or the illusion of truth effect will creep in under our defences.

You might think it’s better to let the ads wash over you, without thinking too much, but just the reverse is true.

Really we should be highly critical of the illusion of truth otherwise, before we know it, we’re singing the jingle, quoting the tag-line and buying the product.

When the argument is strong, though, it doesn’t matter whether or not the audience is concentrating hard: repetition will increase persuasion, and the illusion of truth effect works.

Unfortunately, I find it’s often people with the best arguments who don’t take advantage of the illusion of truth.

Persuading groups

When people are debating an issue together in a meeting, you can see a parallel effect.

When one person in a group repeats their opinion a few times, the other people think that person’s opinion is more representative of the whole group (see my previous article: loudest voice = majority opinion).

The same psychology is at work again: to the human mind there is little difference between appearances and truth.

What appears to be true might as well actually be true, because we tend to process the illusion as though it were the truth.

It’s a depressing enough finding about the human ability to process rational arguments, but recent research has shown an even more worrying effect.

We can effectively persuade ourselves through repetition — which takes the illusion of truth to new heights.

A study has shown that when an idea is retrieved from memory, this has just as powerful a persuasive effect on us as if it had been repeated twice (Ozubko et al., 2010).

The aspiring sceptic, therefore, should be especially alert to thoughts that come quickly and easily to mind—we can easily persuade ourselves with a single recall of a half-remembered thought.


Author: Jeremy Dean

Psychologist, Jeremy Dean, PhD is the founder and author of PsyBlog. He holds a doctorate in psychology from University College London and two other advanced degrees in psychology. He has been writing about scientific research on PsyBlog since 2004. He is also the author of the book "Making Habits, Breaking Habits" (Da Capo, 2013) and several ebooks.
