An Incomplete Tactical Taxonomy Of Disinformation
Disinformation is as much a capability as it is a tool

"What does your platoon actually do?" I asked.
The cadet hesitated. "Attack the enemy, sir?"
I sighed. This was my third TEWT (Tactical Exercise Without Troops) with these cadets, and I'd heard enough vague plans.
"That's not specific enough," I said. "I understand you're using 'attack' as shorthand, but in a TEWT, we need precision. Tell me exactly what your platoon will do. Establish a support by fire position? Clear the position? Be specific with your tasks."
The room got quiet. The word "attack" sounded decisive but said almost nothing about what would actually happen on the ground.
This moment from my time as an Army instructor illustrates a problem we now face with disinformation. We use big, vague terms like "disinformation campaign" without getting specific about tactics or techniques. It's like saying "the enemy armour will advance" without explaining how it will be employed. Is it conducting a penetration, executing a feint, or performing reconnaissance?
Military professionals identify exactly what the enemy is doing. Are they fixing forces to hold our attention? Are they manoeuvring to exploit a gap? Each requires a different response. Specificity isn't just helpful. It's essential.
Yet when talking about disinformation, we rarely move beyond spotting it to naming exactly how it works. We lack words that clearly separate an adversary who "fabricates" content from one who "amplifies" existing content, or operations meant to "polarize" from those designed to "confuse."
The current conversation about disinformation lacks tactical precision. By creating a detailed taxonomy of specific actions, effects, and purposes, we can move beyond vague concerns to clear understanding, enabling better detection and more effective countermeasures.
This article introduces a tactical taxonomy of disinformation: the beginnings of a framework that names the specific techniques, outcomes, and objectives used in cognitive warfare. Just as military doctrine gives us precise language for physical operations, this taxonomy offers a structured vocabulary for understanding operations in the information space.
It is far from complete, but it is a start at mapping the complexity that is disinformation.
The Problem of Imprecise Language
When we hear "disinformation" in the news, what exactly does that mean? Is it a false story created from scratch? An authentic image shown out of context? A half-truth twisted to mislead? Without clear terms, we lump vastly different techniques under one vague label.
This lack of precision has real consequences. When platform moderators, security analysts, and everyday citizens can't name specific tactics, they struggle to respond effectively. You wouldn't use the same defensive measures against a sniper that you would against a bombing. Yet we often try to counter all "disinformation" with the same broad approaches.
Consider these examples: One adversary creates completely fabricated stories about vaccine side effects. Another selectively edits genuine political speeches to change their meaning. A third amplifies extreme but authentic voices to increase divisions. A fourth mimics legitimate news sites to lend credibility to false stories. All might be labelled "disinformation," but each uses distinct tactics requiring different countermeasures.
Our current language fails us. When analysts report "disinformation detected," it's like a scout reporting "enemy activity" without specifying whether it's reconnaissance, an ambush, or a major assault. The response should differ based on these details.
Just as military doctrine evolved beyond vague terms like "attack" and "defend" to precise tactical language, our approach to cognitive warfare needs similar precision. We gain defensive advantage through specific identification. You counter fabrication differently than you counter amplification.
To bring order to this confusion, we need a structured way to talk about cognitive warfare. This framework organizes disinformation tactics into three categories:
Actions (Verbs): The specific techniques adversaries employ, what they actually do (fabricate, amplify, mimic, etc.)
Effects (Nouns): The immediate outcomes these actions create in target audiences (confusion, mistrust, polarization, etc.)
Purposes: The tactical objectives these operations serve (discredit, distract, demoralize, etc.)
This classification system emerged from studying actual disinformation campaigns and identifying their common patterns. It applies military thinking, where clear distinction between actions, effects, and purposes has proven essential for effective planning and response.
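To make these categories concrete, here is a minimal sketch of how an analyst might encode the taxonomy as a data structure. Everything in it, including the names Action, Effect, Purpose, and Observation, is an illustrative assumption rather than part of any existing tool, and only a few entries from each list are shown.

```python
# A minimal sketch of the taxonomy as a data structure. All names are
# illustrative assumptions; only a few entries from each list are shown.
from dataclasses import dataclass, field
from enum import Enum


class Action(Enum):
    """What the adversary actually does (verbs)."""
    FABRICATE = "fabricate"
    AMPLIFY = "amplify"
    MIMIC = "mimic"
    # ...the remaining verbs from the Actions list below


class Effect(Enum):
    """The immediate outcome created in the target audience (nouns)."""
    CONFUSION = "confusion"
    MISTRUST = "mistrust"
    POLARIZATION = "polarization"
    # ...the remaining nouns from the Effects list below


class Purpose(Enum):
    """The tactical objective the operation serves."""
    DISCREDIT = "discredit"
    DISTRACT = "distract"
    DEMORALIZE = "demoralize"
    # ...the remaining objectives from the Purposes list below


@dataclass
class Observation:
    """One analyst record: techniques seen, outcomes produced, objectives inferred."""
    summary: str
    actions: set[Action] = field(default_factory=set)
    effects: set[Effect] = field(default_factory=set)
    purposes: set[Purpose] = field(default_factory=set)
```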
Actions: The Tactics of Disinformation
This list provides a precise vocabulary for identifying specific disinformation tactics. Each action verb captures a distinct technique, allowing analysts to move beyond vague terms like "disinformation" to name exactly what adversaries do.
To use this list, identify which specific actions you observe in suspected disinformation campaigns. For example, rather than simply noting "foreign influence," you might recognize that adversaries "fabricate" false evidence, "amplify" existing divisions, and "mimic" trusted sources. A short sketch after the list shows how naming the verb changes the response.
Adapt – To modify narratives or tactics in response to countermeasures.
Alter – To modify existing content to mislead or distort.
Amplify – To spread information widely to increase its impact.
Censor – To remove or block access to certain information.
Clone – To replicate websites, accounts, or content to mislead.
Confuse – To deliberately complicate or obscure the understanding of facts.
Coordinate – To synchronize actions across multiple actors or platforms.
Counterfeit – To create convincing yet entirely false entities, credentials, or artifacts.
Deceive – To intentionally mislead or misinform.
Deflect – To shift blame or focus away from the disinformation actor.
Deploy – To strategically place bots, trolls, or agents to propagate disinformation.
Dilute – To reduce the impact of credible information by flooding it with irrelevant or contradictory content.
Disguise – To conceal the origin or intent of the information.
Distribute – To spread disinformation across multiple platforms or channels.
Embed – To integrate false narratives into legitimate content.
Erase – To systematically remove historical or digital records that contradict the disinformation.
Exaggerate – To overstate or inflate details in a narrative to make it more compelling or alarming.
Exploit – To take advantage of crises, vulnerabilities, or preexisting biases.
Fabricate – To create entirely false information or narratives.
Flood – To overwhelm platforms or channels with high volumes of content.
Forge – To counterfeit documents, credentials, or identities.
Frame – To present information in a way that misleads or biases perception.
Hijack – To take over discussions, hashtags, or movements to redirect focus.
Imitate – To mirror the language, style, or tone of trusted sources.
Incite – To provoke specific actions, such as protests or violence.
Infiltrate – To covertly enter or join a network or community to spread false narratives.
Leak – To release confidential or sensitive information strategically.
Manipulate – To alter perceptions or emotions to achieve strategic goals.
Mimic – To imitate credible sources or formats to appear legitimate.
Monitor – To observe target reactions and adjust campaigns accordingly.
Obfuscate – To create confusion or ambiguity around facts.
Redirect – To steer conversations or attention to unrelated topics.
Repurpose – To reuse legitimate content for deceptive purposes.
Sabotage – To deliberately disrupt or damage systems or processes.
Seed – To plant narratives or ideas within a population for later growth.
Suppress – To block or silence specific narratives or dissenting voices.
Target – To focus on specific individuals, groups, or institutions for disinformation efforts.
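The payoff of naming the verb is that the response can be keyed to it. The sketch below continues the earlier one (it assumes the Action enum defined above) and pairs a few actions with hypothetical countermeasure families; the pairings are illustrative examples, not doctrine.

```python
# Hypothetical pairings of specific action verbs with countermeasure
# families. Illustrative only: the point is that a precise verb selects
# a different response than the vague label "disinformation" ever could.
COUNTERMEASURES: dict[Action, str] = {
    Action.FABRICATE: "debunk at the source and expose the fabrication chain",
    Action.AMPLIFY: "address the amplification network, not just the content",
    Action.MIMIC: "authenticate the genuine source and flag the imitation",
}


def recommend(observed: set[Action]) -> list[str]:
    """Return one tailored response per observed action verb."""
    return [COUNTERMEASURES[a] for a in observed if a in COUNTERMEASURES]
```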
Effects: The Outcomes of Disinformation
This list names the specific outcomes or effects (nouns) that disinformation tactics create in target audiences, helping analysts identify what happens after adversaries deploy their action verbs.
To use this list, identify which of these effects you observe in a target population; a worked example follows the list.
Alienation – Isolates or estranges a group or individual from a larger community.
Amplification – Magnifies small issues into significant problems, distorting perceptions.
Anger – Provokes hostility and emotional reactions that cloud judgment.
Animosity – Encourages hostility or adversarial relationships.
Apathy – Reduces motivation to act or engage due to information overload or cynicism.
Complacency – Encourages inaction by instilling a false sense of security or inevitability.
Confusion – Creates uncertainty and disrupts the ability to discern truth.
Deflection – Shifts responsibility or blame onto unintended targets.
Delegitimization – Undermines the credibility or authority of individuals, institutions, or ideas.
Dependency – Creates reliance on misleading or manipulated information.
Desensitization – Reduces emotional or moral responses to repeated exposure to disinformation.
Destabilization – Weakens the stability of systems, governments, or communities.
Distraction – Shifts attention away from critical issues or actions.
Distrust – Weakens relationships between communities, allies, or governing bodies.
Division – Splits groups into factions, reducing cohesion.
Erosion – Slowly diminishes the strength of relationships, policies, or beliefs.
Exploitation – Takes advantage of societal fears, biases, or vulnerabilities.
Fear – Instills anxiety or panic among a population.
Fracturing – Causes long-term splits within alliances, communities, or relationships.
Fragmentation – Breaks apart unified efforts or shared narratives into disjointed elements.
Insecurity – Instills doubt in personal or collective safety.
Isolation – Alienates individuals or groups from broader communities.
Misalignment – Diverts goals or objectives within a group, leading to inefficiency.
Mistrust – Erodes trust in individuals, institutions, or systems.
Normalization – Makes extreme or unethical ideas appear acceptable.
Obfuscation – Makes the truth harder to identify by creating ambiguity.
Overload – Floods the information space, making it difficult to process or prioritize facts.
Panic – Creates an acute, widespread sense of alarm that disrupts public order.
Paralysis – Renders decision-making or action impossible due to conflicting information.
Polarization – Deepens divisions within a society or group.
Stagnation – Prevents progress or productive discourse by paralyzing action.
Subversion – Gradually undermines values, norms, or societal cohesion.
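Continuing the sketch, an observed campaign can now be recorded with its techniques and their symptoms kept distinct: the verb names what the adversary did, the noun names what it produced in the audience. The campaign below is entirely hypothetical.

```python
# Hypothetical record: a campaign that mimics a credible news outlet and
# amplifies divisive content, producing mistrust and polarization.
campaign = Observation(
    summary="cloned news site pushing divisive stories",
    actions={Action.MIMIC, Action.AMPLIFY},
    effects={Effect.MISTRUST, Effect.POLARIZATION},
)
```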
Purposes: The Objectives of Disinformation
This list identifies the tactical objectives behind disinformation campaigns, explaining why adversaries deploy specific actions to create particular effects. A final sketch after the list ties all three categories together.
Amplify – To exaggerate or emphasize specific narratives or events for impact.
Confuse – To overwhelm with conflicting narratives, reducing clarity.
Deflect – To redirect blame or accountability away from the disinformation actor.
Delegitimize – To erode the perceived legitimacy of an institution or individual.
Demoralize – To erode morale, hope, or confidence in individuals or groups.
Destabilize – To undermine political, social, or economic stability.
Discredit – To tarnish the credibility of individuals, institutions, or ideas.
Disrupt – To interfere with operations, communication, or societal functioning.
Disseminate – To spread information, whether false or misleading, widely and strategically.
Distort – To alter facts or narratives to fit a specific agenda.
Distract – To shift focus away from critical issues or actions.
Divert – To intentionally redirect focus away from critical issues or objectives.
Divide – To fragment unity within populations or alliances.
Exploit – To take advantage of crises, divisions, or vulnerabilities.
Incite – To provoke actions such as violence, protests, or unrest.
Influence – To steer thoughts, beliefs, or actions toward a desired outcome.
Instil – To embed specific emotions, such as fear, mistrust, or anger, in a population.
Intimidate – To instil fear or compliance in a target population, pushing them to alter behaviour.
Isolate – To separate groups or individuals from the broader social or political fabric.
Manipulate – To influence perceptions, decisions, or actions for strategic advantage.
Normalize – To condition the acceptance of falsehoods or unethical practices.
Obfuscate – To obscure the truth or create confusion.
Polarize – To amplify divisions and create hostility between groups.
Prepare – To set conditions for future actions, such as broader campaigns or operations.
Radicalize – To push individuals or groups toward extremist ideologies or actions.
Recruit – To attract supporters or participants for ideological, political, or operational purposes.
Sabotage – To deliberately disrupt or damage systems, processes, or relationships.
Suppress – To silence dissenting voices or alternative perspectives.
Undermine – To weaken trust, authority, or functionality.
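Taken together, the three categories let a report say something far more actionable than "disinformation detected." A final sketch, again using the hypothetical record above, renders a complete observation as one precise line.

```python
def report(obs: Observation) -> str:
    """Render an observation as a single precise report line."""
    verbs = ", ".join(sorted(a.value for a in obs.actions))
    nouns = ", ".join(sorted(e.value for e in obs.effects))
    aims = ", ".join(sorted(p.value for p in obs.purposes))
    return (f"{obs.summary}: actions [{verbs}] producing effects "
            f"[{nouns}] in service of purposes [{aims}]")


campaign.purposes = {Purpose.DISCREDIT, Purpose.DISTRACT}
print(report(campaign))
# cloned news site pushing divisive stories: actions [amplify, mimic]
# producing effects [mistrust, polarization] in service of purposes
# [discredit, distract]
```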
Where To From Here?
The framework presented here is not exhaustive, but it offers a starting point for more precise analysis and more effective countermeasures. By breaking cognitive warfare down into specific actions, effects, and purposes, we gain an exact way to discuss what is happening in the information environment, and in this complex domain the ability to name and categorize techniques with precision gives defenders a crucial advantage.
This taxonomy is only a first step, and it will need refinement. After all, cognitive warfare does not belong to military commanders alone.
You are part of it too.