What Pete Buttigieg Teaches Us About Disinformation
Defeating disinformation isn’t just about fact-checking, but about style, tone, and building bridges
“Trust in our institutions and each other is the glue that holds societies together... Social media platforms have made it easier for conspiracy theories and lies to spread. We need to rebuild networks of trust, both online and offline, and that starts with honest conversations and local engagement.”
When Pete Buttigieg shared this message at the Boston Book Festival, he wasn't just stating the obvious. He was highlighting what makes fighting misinformation so difficult today.
Facts alone don't cut it anymore.
We've all seen it happen. Someone shares a completely false claim online, and people rush to correct it with facts and links. They call the original poster an idiot and silence them. Yet the falsehood spreads anyway. Why? Because we're not just fighting bad information. We're up against identity, emotion, and tribal belonging.
This isn't just a liberal or conservative problem. It's a human one.
In this article, we'll break down practical techniques for countering the information crisis regardless of political leaning.
Why Facts Alone Don't Work
In 2016, Oxford Dictionaries declared "post-truth" its word of the year. Since then, our collective struggle with information has only intensified. But why do falsehoods spread so easily, and why are they so resistant to correction? The answer is far more complex, and uglier, than anything policy changes or fact-checking alone can address.
Identity Before Accuracy
People don’t process information like neutral judges. Research from Yale’s Dan Kahan shows that we judge facts based on whether they fit our group identity. This “identity-protective cognition” means we accept ideas that match our beliefs and push back against facts that challenge them.
Ironically, the more informed or engaged someone is, the better they are at defending their side. Even if it means rejecting the truth.
The Role of Emotions
Disinformation, misinformation, and malinformation stick because they connect emotionally. Outrage, comfort, and hope are powerful motivators no matter who receives the information. Cognitive scientist Hugo Mercier notes in his research on reasoning that people often believe things because they feel right, not because the evidence is strong.
While some studies have identified a "backfire effect" where corrections strengthen false beliefs, recent research has found this effect isn't universal. As researchers like Brendan Nyhan have clarified, backfire happens under specific conditions rather than as a general rule.
Echo Chambers and Platform Dynamics
Social media and partisan news create "echo chambers" where people mostly hear views that match their own. Inside these bubbles, seeing others believe something makes us more likely to believe it too.
This is a process called social proof.
But this isn't just about psychology. Platform algorithms optimize for engagement rather than accuracy. Furthermore, economic incentives reward content that triggers emotional responses rather than the truth. As media scholar Zeynep Tufekci has documented, these structural factors amplify misinformation regardless of individual psychology.
Trust Is the Real Crisis
At the heart of disinformation is a crisis of trust. When people lose faith in institutions, they turn to sources that reflect their identity and beliefs. Harvard professor Archon Fung, who studies democratic governance and public information, puts it bluntly: "Facts don't speak for themselves. They are carried by messengers, and if we don't trust the messenger, we won't trust the message."
This means effective communication must address trust first, facts second. That's precisely what communicators like Buttigieg instinctively understand. His approach to countering false narratives is not just about stating facts. It's about building trust, meeting people where they are, and respecting their identities.
Case Study 1: Hurricanes Helene and Milton Disinformation (October 2024)
In October 2024, as Hurricanes Helene and Milton devastated communities, false claims spread rapidly on social media and partisan outlets.
One viral rumour insisted that anyone who received $750 in immediate disaster aid would be disqualified from further assistance. This message was amplified by prominent conservative political figures and fuelled public fear and distrust of the government’s relief efforts.
This rumour was particularly effective because it tapped into existing concerns about government bureaucracy. The false claim weaponized fear by suggesting that accepting immediate help would block future aid. It exploited the anxiety and uncertainty people already felt during a crisis.
The underlying narrative wasn't just about disaster assistance, it was designed to erode trust in government institutions. The suggestion that officials would penalize people for accepting initial help reinforced suspicions among those already sceptical of government intentions.
Buttigieg, as Transportation Secretary, responded swiftly and publicly. Instead of simply denying the rumour, he went on national television and focused on the real-world impact of such falsehoods.
His response demonstrated several effective techniques:
Consequence framing: Buttigieg explained the specific harm. People might avoid applying for aid they desperately needed. This made the stakes immediate and tangible rather than abstract.
Bipartisan positioning: By highlighting cooperation with officials "regardless of party," he created space for viewers across the political spectrum to accept the correction without feeling their identity threatened.
Ethical boundaries: He established clear moral lines around spreading misinformation during a crisis without demonizing those who might have shared the falsehood unknowingly.
Following coordinated messaging from Secretary Buttigieg and other federal officials, FEMA and state agencies saw a significant surge in aid applications. They also worked to dispel rumours about aid eligibility and amounts through official channels. In the end, they ensured residents received accurate information about how to apply for assistance.
This approach worked well in a crisis scenario where immediate needs overcame partisan resistance. However, these same techniques might be less effective for issues where partisan identity is more deeply entrenched, such as climate policy or election integrity. Additionally, reaching people who don't watch mainstream news channels remains a significant challenge.
Conservative commentator Guy Benson employed similar approaches when debunking viral misinformation about FEMA aid requirements during Hurricane Laura in 2020. This shows how these communication principles transcend political affiliation when applied to crisis situations.
Case Study 2: Buttigieg’s Infrastructure Messaging on Fox News’s America’s Newsroom (April 1, 2021)
In early 2021, social media and partisan commentary cast doubt on the Biden administration’s infrastructure initiatives. Critics claimed that the federal investments were not reaching communities, that the projects were overhyped, or that “infrastructure” was being redefined to hide a lack of real progress. These narratives fuelled scepticism about the government’s ability to deliver on its promises.
This type of misinformation thrives on exploiting the abstract nature of large government programs. By focusing on delays, cherry-picking setbacks, or questioning definitions, these narratives make public investments seem wasteful or ineffective.
Once people become sceptical about infrastructure progress, they're unlikely to seek evidence that would contradict their belief. Instead, confirmation bias leads them to notice only problems or delays.
On Fox News’s America’s Newsroom, Buttigieg was pressed by hosts Dana Perino and Bill Hemmer about whether Americans would truly see the benefits of the administration’s infrastructure plan. Instead of responding with abstract statistics or partisan talking points, Buttigieg localized his answers. He pointed to specific, tangible investments like grid upgrades in Texas and broadband expansion in rural areas, explaining how they directly improve daily life:
“We’re talking about roads and bridges, we’re talking about rails and transit, we’re talking about airports and ports... As you mentioned, we’re talking about things like the grid. I don’t know why anybody would say that it’s a mistake to invest in the grid after what we just witnessed in Texas... In the United States of America that is unacceptable. So yes, infrastructure includes energy infrastructure.”
Buttigieg's localization strategy works because it:
Transforms abstract debates into verifiable local realities – Moving from national statistics to "your neighborhood" makes benefits tangible.
Bypasses partisan media filters – By directing people to local officials who typically enjoy higher trust than federal figures, he creates an alternate verification path.
Frames benefits as universal – Emphasizing that "roads aren't Republican or Democratic" creates a shared identity around infrastructure that transcends partisan divides.
Evaluating the true impact of these communication strategies is challenging without controlled studies. However, Buttigieg's performance was generally seen as composed and effective. What's notable is that even outlets and commentators who typically disagree with his politics have acknowledged his communication effectiveness. This suggests his approach may indeed bridge some partisan divides in how information is received, even if it doesn't always change minds on policy.
This localization strategy offers a practical template for countering infrastructure-related disinformation.
Mike Rowe has employed similar localization techniques when discussing trade education and infrastructure needs. Rather than abstract policy discussions, Rowe consistently grounds infrastructure conversations in stories of specific communities and workers. His "Returning the Favor" program specifically highlights local infrastructure heroes, demonstrating how this communication approach works across the political spectrum.
This localization approach works less effectively for truly national infrastructure, like cybersecurity or supply chain improvements, where local benefits are harder to isolate.
Case Study 3: Countering Conspiracy Thinking
At the Boston Book Festival, Pete Buttigieg addressed the growing problem of conspiracy theories and pseudoscience. These narratives thrive online. They offer alternative “knowledge systems” that undermine trust in facts and experts.
Conspiracy theories create a sense of belonging and certainty for their followers, often by casting doubt on mainstream institutions and suggesting hidden truths.
This presents a unique challenge: direct debunking often strengthens conspiratorial thinking rather than weakening it. When believers feel their worldview is under attack, they typically retreat further into their alternative reality.
Instead of confronting conspiracy thinking with blunt fact-checks, Buttigieg focused on rebuilding trust and fostering open dialogue. He emphasized that trust is the foundation of any shared reality, stating:
“Trust in our institutions and each other is the glue that holds societies together... Social media platforms have made it easier for conspiracy theories and lies to spread. We need to rebuild networks of trust, both online and offline, and that starts with honest conversations and local engagement.”
His approach demonstrated several key techniques:
Intellectual humility: By acknowledging the limits of his own knowledge on certain topics, Buttigieg created space for dialogue rather than triggering defensive reactions.
Incremental trust-building: Rather than expecting immediate belief change, he advocated for "networks of trust" built through small, consistent interactions over time.
Questioning techniques: Instead of directly challenging beliefs with "How can you believe that?", he recommended questions like "What led you to that view?" and "What information would change your mind?"
Research across psychology and communication studies strongly supports the effectiveness of these techniques.
Recent research published in Nature Human Behaviour demonstrates that intellectual humility significantly increases perceived trustworthiness, belief in research, and willingness to follow recommendations. Across five studies with over 2,000 participants, scientists described as intellectually humble were consistently trusted more, and their findings were more likely to be accepted, regardless of their gender or race.
Similarly, experimental and organizational studies show that incremental trust-building fosters greater cooperation and psychological safety than one-off interactions do. The same applies to open-ended questions.
However, this trust-building approach requires significant time and patience. It's not a quick fix for viral misinformation, and it works best in personal relationships rather than mass media environments.
Additionally, some conspiracy beliefs are so deeply entrenched that even the most skilful communicator may have limited impact.
The Buttigieg Playbook: Techniques for Countering Disinformation
Across these case studies, consistent patterns emerge in how Buttigieg successfully counters problematic information. His approach amounts to a set of specific, replicable techniques that address how misinformation actually works in human psychology and social dynamics.
1. Stay Measured, Stay Heard
When tensions rise, keep your voice calm and steady. Say "I understand your concern" instead of matching anger with anger. This prevents the conversation from becoming a shouting match where facts get lost.
2. Find Common Ground First
Start with "I think we both want safer communities" before diving into disagreements. Establishing shared values creates a foundation for discussing difficult facts.
3. Show Real-World Consequences
Instead of saying "That's false," try "If people believe that, they might not get the help they need." Making misinformation relevant to real lives is more effective than abstract corrections.
4. Localize Big Issues
Bring national debates down to local impact. "How is this affecting our neighbourhood?" turns abstract policies into verifiable realities people can check themselves.
5. Ask Curious Questions
Replace "Where did you hear that nonsense?" with "I'm interested in where you learned about that." Genuine curiosity opens doors that confrontation slams shut.
Building Disinformation Resistance Through Connection
The battle against misinformation isn't primarily about facts. It's about identity, emotion, and tribal belonging.
Effective communicators across the political spectrum succeed by meeting people where they are, listening before correcting, and building bridges. These techniques work because they acknowledge a fundamental truth: people aren't just information processors, but social beings with identities and emotional needs.
But these approaches do have their limits. They require time and patience. They work better in personal conversations than on viral social media. And they cannot fully overcome the structural problem of platforms that amplify sensational content over nuanced truth.
Yet when we focus exclusively on technical solutions such as fact-checking, content moderation, and media literacy, we miss the deeper opportunity. The most promising path forward combines structural reforms with human connection strategies that create the psychological safety necessary for reconsidering beliefs.
Ultimately, changing minds isn't something we do to others.
It's a process we can only facilitate when people feel secure enough to reconsider their beliefs on their own terms.