Why Lying Works Systemically (And Why Everything We're Doing About It Is Wrong)
Why lying works isn't actually as obvious as people say it is.
In 2020, Facebook published transparency reports showing the platform removed over 3 billion fake accounts across the first three quarters. Mark Zuckerberg highlighted these numbers in earnings calls and public statements. Investors and policymakers treated the figures as evidence of progress.
During the same period, researchers at New York University tracked misinformation about election fraud spreading across Facebook. False claims about mail-in ballots reached millions of users. Many of these posts came from authentic accounts, not the fake ones being removed. The content spread through the same algorithmic mechanisms that surface any viral post.
So platforms invest billions in content moderation. They partner with fact-checking organisations and ban accounts that violate policies.

But misinformation persists; people still fall for lies.
The standard responses treat this as a problem of execution. Users need better media literacy to identify false claims. Platforms need better detection systems to label misleading content. Moderators need to move faster to remove violations.
Each response assumes the current approach works in principle and just needs better implementation.
But the evidence suggests otherwise. Misinformation spreads through the same infrastructure that spreads accurate information. The mechanisms that make content go viral (emotional resonance, novelty, rapid sharing) systematically favour lies over truth.
This creates structural advantages that operate faster than moderation, reach further than fact-checking, and compound beyond what individual media literacy can address.
Part of the problem is a mismatch of levels. The problem operates at the level of system design; the interventions operate at the level of individual behaviour and specific content. Media literacy teaches people to verify sources, but algorithmic amplification decides what sources they see before they can verify anything. Fact-checking corrects false claims, but viral spread gives those claims temporal primacy that makes corrections arrive too late. Content moderation removes violating posts, but the infrastructure that made them viral remains unchanged.
Understanding why lying works systemically requires examining what platforms optimise for and why that optimisation structurally advantages false information over accurate information.
The Perfect Storm With You At The Eye
Understanding why lying works requires examining three layers of vulnerability.
The first is psychological. Human cognitive architecture evolved for different information environments. The shortcuts that served small group survival become catastrophic weaknesses at internet scale.
The second is linguistic. Language creates meaning through shared context that dissolves online.
The third is systemic, and it is the focus of this article. Platforms and coordinated campaigns weaponise the first two layers with industrial efficiency.