The Best Disinformation Confirms What You Already Believe
We love being right so much that disinformation turns that love against us.
In 1938, Orson Welles broadcast his radio adaptation of "The War of the Worlds." Supposedly, millions of Americans fled their homes in terror, convinced that Martians had invaded Earth. In reality, almost no one panicked. Most listeners knew they were hearing a play.
Yet we keep telling this story of mass hysteria.
Why?
Because it confirms what we want to believe about how easily other people fall for lies.
The best disinformation doesn't try to convince you of something new. It confirms what you already suspect, then hands you the "evidence" to prove it.
Russian operatives understood this during the 2016 election. They didn't create America's political divisions. They just told each side exactly what it wanted to hear.
This same psychology shapes every part of our lives. Our need to feel right doesn't just make us vulnerable to false information; it blinds us closer to home.
We miss warning signs about our children because we want to believe they're fine. We ignore relationship problems because facing them is painful. We stick with failing strategies because changing course feels like failure.
The Russians didn't just manipulate an election. They revealed how easily we manipulate ourselves.
How the Confirmation Trap Works
Through fake Facebook accounts and carefully crafted ads, Russian operatives fed conservative Americans stories about liberal corruption and fed liberal Americans stories about conservative racism. Both sides shared the content eagerly because it felt true.
The technique worked because the Russians exploited what psychologists call confirmation bias, hitting it right in its sweet spot: instead of trying to change minds, they reinforced existing beliefs. The posts arrived pre-wrapped in the emotional language each tribe already used. They confirmed each group's worst fears about its enemies and best beliefs about itself.
Conservative Americans saw posts about Hillary Clinton's supposed crimes and thought, "I knew it." Liberal Americans saw posts about Trump supporters' alleged racism and thought, "I knew it." Neither side questioned the sources because the content matched what they already believed.
This isn't stupidity. It's human nature. Our brains evolved to make quick decisions with incomplete information. If you heard rustling in the bushes and assumed it was a predator, you lived. If you stopped to gather more evidence, you might become food.
The same mental wiring that kept our ancestors alive now makes us perfect targets for anyone with a lie to sell.
Why We Remember What We Want
When we encounter information that supports our views, our brains treat it differently than contradictory evidence. We remember supporting details with crystal clarity while contradictory evidence dissolves from memory.
Stanford researchers demonstrated this in the 1970s. They brought together students with strong opinions about the death penalty and showed everyone the same evidence: one study claiming capital punishment deters crime, another claiming it doesn't.
The results were striking. Supporters of the death penalty said the pro-death penalty study was solid and dismissed the other as flawed. Opponents did the reverse. After reviewing both studies, people felt even more certain they were right.
When researchers brought participants back weeks later, people remembered the details that supported their views perfectly but had forgotten the contradictory evidence. One death penalty opponent could quote statistics from the study he agreed with but vaguely recalled "some problems with the methodology" in the other study without remembering what those problems actually were.
We decide first, then find reasons why we're right. We don't carefully weigh evidence and reach conclusions. We pick our team, then our brains do the work of making us feel smart about it.
This happens everywhere. Sports fans see biased referees. Political partisans see media bias. Investors see market manipulation when their stocks fall. We're not lying to ourselves; we genuinely perceive the world through the lens of what we want to believe.
When Certainty Becomes Blindness
We love feeling right. It's more than a pleasant feeling; it's a deep-seated need. Being right reassures us that we understand the world and our place in it. It gives us a sense of control, security, and confidence.
Psychologists call it "motivated reasoning." To us, it just feels like being right. A die-hard football fan watching their team get penalized might never see a fair call. They see biased referees, corrupt officials, or evidence that the league favours their rivals. Show them instant replay from multiple angles and they'll find new reasons why the call was wrong. Their loyalty isn't just emotional; it literally changes what they perceive.
The 2008 financial crisis showed how dangerous this certainty can become. Bankers and regulators believed housing prices could only go up. When economists warned about risky lending and a housing bubble, the warnings got dismissed or explained away. The believers had too much invested in their rosy predictions. When reality finally broke through, it was too late. Confirmation bias wasn't the only cause, of course, but it turned the warning signs into background noise.
This selective memory works like a mental film editor, cutting scenes that don't support the narrative. The more we invest in a belief, the harder we defend it. We seek out sources that validate our views and avoid those that challenge them. We remember supportive details and forget contradictory ones. We twist unclear facts to fit our story.
The same mental machinery that makes us loyal fans becomes dangerous when weaponized. Nazi propagandists understood this perfectly. They didn't force beliefs on Germans. They gave people who already felt superior the "evidence" they craved.
Germans who wanted to believe in their racial superiority found confirmation everywhere: manipulated crime statistics showing Jewish criminality, staged photographs of "typical" Jewish features, carefully selected stories of German heroism. The propaganda worked because it told people what they already suspected about themselves and their enemies.
When contradictory information emerged, believers had ready explanations: enemy propaganda, Jewish lies, foreign jealousy. Each piece of disconfirming evidence became proof of how deep the conspiracy ran. The terrifying efficiency wasn't in changing minds. It was in confirming existing prejudices and giving people permission to act on them.
The same pattern repeats across centuries, just faster. Medieval Europeans blamed outsiders for plagues. Romans dismissed threats to their empire. Americans in the 1950s saw communist plots everywhere. Each generation finds the "evidence" that confirms their fears about enemies and their beliefs about themselves.
Social media hasn't created this psychology. It simply made it faster and more precise. The algorithm learns what you believe and feeds you more of it, creating what researchers call "echo chambers" where each tribe lives behind its own walls, armed with its own facts.
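To see how little machinery that loop actually takes, here is a minimal sketch, in Python, of the dynamic described above: a feed that ranks content by predicted agreement and then updates its model of the user from whatever gets clicked. The function names (`predicted_engagement`, `rank_feed`, `update_profile`), the stance scale, and the numbers are all invented for illustration; no real platform works exactly this way.

```python
# Toy sketch only: an engagement-driven feed modelled in a few lines, not any real platform's code.
# Every stance is a number from -1.0 (one tribe) to +1.0 (the other); all names are hypothetical.
import random

def predicted_engagement(post_stance: float, user_stance: float) -> float:
    """Dot-product-style score: content that agrees with the user scores high,
    and extreme agreeing content scores highest."""
    return post_stance * user_stance

def rank_feed(posts: list[float], user_stance: float, k: int = 3) -> list[float]:
    """Surface the k posts the model predicts the user will engage with most."""
    return sorted(posts, key=lambda p: predicted_engagement(p, user_stance), reverse=True)[:k]

def update_profile(user_stance: float, consumed_stance: float, lr: float = 0.2) -> float:
    """Each engagement nudges the learned profile toward the content just consumed."""
    return user_stance + lr * (consumed_stance - user_stance)

random.seed(0)
profile = 0.1  # a user with only a slight initial lean
for _ in range(30):                                    # thirty "days" of scrolling
    pool = [random.uniform(-1, 1) for _ in range(50)]  # the available content is roughly balanced...
    feed = rank_feed(pool, profile)                    # ...but the feed filters it by predicted engagement,
    profile = update_profile(profile, feed[0])         # and the top result reshapes the profile in turn
print(f"learned stance after 30 days: {profile:+.2f}")  # typically lands near +0.95
```

The point isn't the particular formula. It's the loop: ranking by predicted agreement and then learning from what gets clicked is enough, on its own, to turn a mild preference into a walled-off worldview.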
The Disinformation Sweet Spot
Disinformation exploits our confirmation biases.
The technique succeeds because disinformation hits the confirmation bias sweet spot. It arrives pre-wrapped in the emotional language your tribe already uses. It confirms your worst fears about your enemies and your best beliefs about yourself. It comes with sources that look credible and statistics that seem legitimate. By the time fact-checkers debunk it, the lie has already travelled around the world and settled into millions of mental fortresses.
Consider how anti-vaccine disinformation spreads. It doesn't spam random people with generic health misinformation. It targets parents who are already worried about loading chemicals into their children's bodies, then builds a community around that worry: an echo chamber where members share false studies linking vaccines to autism, tragic testimonials from other parents, and elaborate conspiracy theories about pharmaceutical companies. The content seems plausible because it validates fears those parents already hold and gives them an identity as vigilant parents holding out against a corrupt state.
Disinformation spreaders know this, so they craft lies specifically designed to go viral within target communities.
In 1930s Germany, pamphlets spread the same kind of lies. Today, it's videos.
The method changes but the psychology doesn't.
We don't fall for disinformation because we're stupid. We fall for it because it tells us we were right all along.
Where to now?
We can't rewire our brains, and we can't escape the need to feel right. Confirmation bias isn't a bug in human thinking. It's a feature that helped our species survive.
But we can recognize when it's being weaponized against us. When information confirms exactly what we already thought, when it makes us feel vindicated and superior, when it arrives perfectly packaged for our political tribe, that's when our alarm bells should ring loudest.
The War of the Worlds broadcast didn't fool America in 1938. But we've been fooling ourselves about it ever since, because the story confirms our biases about other people's gullibility.
Maybe the real victory isn't being right. Maybe it's being wrong a little less often than yesterday.