Real Leaders

Repetition Makes It True. Repetition Makes It True. Do You Believe Me Now?


Whenever you hear something repeated, it feels more real. In other words, repetition makes any statement seem more trustworthy. So anything you hear will feel more accurate each time you hear it again.

Do you see what I did there? Each of the three sentences above conveyed the same message. Yet each time you read the next sentence, it felt more and more trustworthy. Cognitive neuroscientists like myself call this the “illusory truth effect.”

Go back and recall your experience reading the first sentence. It probably felt strange and disconcerting, perhaps with a tone of outrage, as in “I don’t believe things more if they’re repeated!” 

Reading the second sentence did not inspire such a strong reaction. Your reaction to the third sentence was tame by comparison.

Why? Because of a phenomenon called “cognitive fluency,” meaning how easily we process information. Much of our vulnerability to deception in all areas of life – including misinformation – revolves around cognitive fluency in one way or another. 

Now think about how rumors spread in your organization’s grapevine. It works on the same principle. Employees hear a rumor – say, about a proposed headquarters move, like Elon Musk’s relocation of Tesla’s HQ to Texas. The rumor feeds into their fears, and fear is a highly cognitively fluent part of our minds.

They repeat the rumor, it goes around, and they keep hearing it from others. It begins to seem more and more authentic, regardless of reality. Before you know it, those who want to stay where they are start looking for another job, even though you might never have intended to move your headquarters!

Fortunately, we can learn about these mental errors, which helps us address misinformation and make our workplaces more truthful.

The Lazy Brain

Our brains are lazy. The more effort it takes to process information, the more uncomfortable we feel about it, and the more we dislike and distrust it. 

By contrast, the more we like specific data and are comfortable with it, the more we feel that it’s accurate. This intuitive feeling in our gut is what we use to judge what’s true and false. 

Yet no matter how often you’ve heard that you should trust your gut and follow your intuition, that advice is wrong. You should not trust your gut when evaluating information in areas where you lack expert-level knowledge, at least when you don’t want to screw up. Structured information-gathering and decision-making processes help us avoid the numerous errors we make when we follow our intuition. And even experts can make serious errors when they don’t rely on such decision aids.

These mistakes happen due to mental errors that scholars call “cognitive biases.” The illusory truth effect is one of these mental blindspots; there are over 100 altogether. They impact all areas of our lives, from health and politics to relationships.

Other Important Cognitive Biases

Besides illusory truth, what are some other cognitive biases you need to beware of to protect your organization from misinformation? If you’ve heard of any cognitive biases, you’ve likely heard of the “confirmation bias.” That refers to our tendency to look for and interpret information in ways that conform to our prior beliefs, intuitions, feelings, desires, and preferences, as opposed to the facts. 

Again, cognitive fluency deserves blame. It’s much easier to build neural pathways to information that we already possess, especially when we have strong emotions; it’s much more challenging to break well-established neural pathways if we need to change our minds based on new information. Consequently, we instead look for information that’s easy to accept, which fits our prior beliefs. In turn, we ignore and even actively reject information that doesn’t match our beliefs. 

Moreover, the more educated we are, the more likely we are to engage in such active rejection. After all, our smarts give us more ways of arguing against new information that counters our beliefs. That’s why research demonstrates that the more educated you are, the more polarized your beliefs will be around scientific issues that have religious or political value overtones, such as stem cell research, human evolution, and climate change. Where might you and your team be letting your smarts get in the way of the facts?

Our minds like to interpret the world through stories, meaning explanatory narratives that clearly and straightforwardly link cause and effect. Such stories are a balm to our cognitive fluency, as our mind continually looks for patterns that explain the world around us in an easy-to-process manner. That leads to the “narrative fallacy,” where we fall for convincing-sounding narratives regardless of the facts, especially if the story fits our predispositions and our emotions. 

Do you ever wonder why politicians tell so many stories? How about the advertisements you see on TV or video advertisements on websites, which tell rapid visual stories? How about salespeople or fundraisers? Sure, sometimes they cite statistics and scientific reports, but they spend much, much more time telling stories: simple, straightforward, compelling narratives that seem to make sense and tug at our heartstrings. 

Now, here’s something that’s actually true: the world doesn’t make sense. The world is not simple, clear, and compelling. The world is complex, confusing, and contradictory. Beware of simple stories! Look for complex, confusing, and contradictory scientific reports and high-quality statistics: they’re much more likely to contain the truth than the easy-to-process stories.

Fixing Our Brains

Unfortunately, knowledge only weakly protects us from cognitive biases; it’s essential but far from sufficient.

What can you do? You can use decision aid strategies that address cognitive biases and defend your organization from misinformation.

One of the most effective strategies is to help your employees and yourself build the habit of automatically considering alternative possibilities for any claim you hear, especially claims that feel comfortable. Since our lazy brain’s default setting is to avoid the hard thinking required to question claims, it helps to develop a mental practice of going against this default. Be especially suspicious of claims that are repeated without additional evidence and that make you feel comfortable: they play on the illusory truth effect and the confirmation bias combined.

Another effective strategy involves cultivating a mental habit of questioning stories in particular. Whenever you hear a story, your brain slips into listening-and-accepting mode. Remember that it’s very easy to cherry-pick stories to support whatever position the narrator wants to advance. Instead, look for specific hard numbers, statistical evidence, and peer-reviewed research to support claims.

More broadly, you can encourage employees to make a personal commitment to the twelve truth-oriented behaviors of the Pro-Truth Pledge by signing the pledge at ProTruthPledge.org. These behaviors stem from cognitive neuroscience and behavioral economics research in the field called debiasing, which refers to counterintuitive, uncomfortable, but effective strategies to protect yourself from cognitive biases. Peer-reviewed research has shown that taking the Pro-Truth Pledge effectively changes people’s behavior to be more truthful, both in their statements and in interactions with others.

These quick mental habits address the most fundamental flaws in how our minds evaluate information, and they will go a long way toward keeping misinformation out of your workplace.