The Military-Attention Complex — Defense Spending On Influence Operations
The infrastructure of state-funded influence operations is more elaborate and better-funded than most civilians realize. Let's build the full picture.
The Taxonomy of State Influence Operations
It helps to separate the different instruments in play:
White operations are attributed — state-funded media that openly identifies its funding source. RT (Russia Today) carries a legal disclosure. Voice of America and Radio Free Europe/Radio Liberty are US-funded. CGTN is Chinese state media. These are legal and, in theory, transparent. The problem is that "state-funded" doesn't mean "propagandistic in an obvious way." The most effective state media does actual journalism — reporting that can be fact-checked and relied upon — so that its strategic narratives are embedded in a context of credibility.
Gray operations operate through cutouts — apparently independent media organizations that receive state funding through intermediaries that obscure the origin. A website in, say, Slovakia that appears to be a local conservative outlet but is funded through a chain of shell companies to a Russian-aligned media conglomerate is a gray operation. These are much harder to track, and they're proliferating. Dozens of such networks have been identified across Europe and the Americas.
Black operations are entirely covert — fake personas, coordinated inauthentic behavior, synthetic content, false-flag operations, and full narrative fabrication. The Internet Research Agency, the Russian troll farm exposed by the Mueller investigation, was a black operation. Its budget was approximately $25 million per year — a tiny fraction of Russian defense spending, yet it demonstrably affected the information environment in the 2016 US election cycle.
The US operates its own versions of all three, with the CIA's influence operations, State Department strategic communications, and USAID democracy programming occupying the white-to-gray spectrum, and classified programs occupying the black spectrum. The Snowden revelations documented GCHQ (British intelligence) developing specific capabilities for "influence and information operations" including the ability to "discredit, deny, disrupt, degrade, deceive" targets through online means.
The Budget Problem
Getting precise numbers on influence operation spending is inherently difficult because much of it is classified, and because the line between "public diplomacy" and "influence operations" is deliberately blurred.
What we can estimate:
The US State Department's Bureau of Global Public Affairs and associated programs: approximately $700 million annually in unclassified appropriations. The Broadcasting Board of Governors (now USAGM, US Agency for Global Media): approximately $800 million annually. This covers Voice of America, Radio Free Europe, Radio Free Asia, and related outlets. Classified military psychological operations (run through SOCOM's Civil Affairs and MISO units): unknown, but the overall SOCOM budget is approximately $13 billion annually, and information operations are a core capability.
Russia's RT network receives approximately $300–400 million annually in state subsidy. Russia's overall information operations budget, including military-run units, is estimated by NATO analysts at $500 million–$1.5 billion annually.
China's overseas influence apparatus — CGTN, Xinhua, state media placement fees paid to local outlets, United Front Work Department activities, Confucius Institutes — is harder to estimate but multiple researchers put it in the $6–10 billion range annually. China is the most aggressive investor in influence operations by budget.
The total across all state actors: plausibly $20–50 billion per year globally, with significant uncertainty in both directions. For a meaningful comparison: the UNHCR (the UN refugee agency) has an annual budget of approximately $10 billion. States are spending more on shaping what people think than on protecting people displaced by the conflicts those shaped narratives help cause.
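To see how the named figures relate to that global estimate, here is a minimal arithmetic sketch. The numbers are the rough unclassified estimates quoted above; the dictionary labels and the treatment of point estimates as degenerate ranges are my own framing, not figures from any official source.

```python
# Illustrative arithmetic only. Figures are the unclassified estimates
# quoted in this section, in billions of USD per year.
# Ranges are (low, high); single figures are treated as (x, x).
estimates = {
    "US State Dept public affairs": (0.7, 0.7),
    "USAGM (VOA, RFE/RL, RFA)":     (0.8, 0.8),
    "Russia: RT subsidy":           (0.3, 0.4),
    "Russia: broader info ops":     (0.5, 1.5),
    "China: overseas apparatus":    (6.0, 10.0),
}

low = sum(lo for lo, _ in estimates.values())
high = sum(hi for _, hi in estimates.values())
print(f"Named, unclassified programs alone: ${low:.1f}B-${high:.1f}B per year")
```

The named programs sum to roughly $8–13 billion; the gap between that and the $20–50 billion global estimate is what classified programs and unnamed state actors would have to account for.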
The Cognitive Infrastructure Being Targeted
Understanding what influence operations are actually targeting illuminates why this matters at civilizational scale.
Influence operations don't primarily target beliefs about facts. Trying to convince people of specific false facts is brittle — the facts can be checked. What they primarily target is:
Epistemic trust: confidence in institutions, media, and expertise. Undermine trust in all information sources, and people retreat to identity-based information communities. Confused people who can't evaluate information are easier to mobilize through fear than confident people who can check claims.
Social cohesion: the willingness of a population to cooperate across difference. Amplify every existing social division — racial, religious, economic, regional — and you raise the friction cost of political cooperation, making collective action harder for the target population.
Threshold of engagement: how much evidence people require before they update their beliefs. Flooding the information environment with conflicting narratives raises the cognitive cost of forming accurate beliefs. Manufactured doubt — the "firehose of falsehood" approach documented in Russian operations — doesn't require you to be believed; it requires only that you be confusing enough that people give up trying to evaluate.
These targets are cognitive. The attack is on the thinking apparatus of civilian populations. And unlike physical infrastructure, cognitive infrastructure is invisible, doesn't have a clear point of failure, and is rebuilt slowly.
The Feedback Loop With Democratic Governance
Here's the civilizational implication. Democratic systems rest on the premise that voters can form accurate beliefs about political reality and make effective choices on that basis. A democracy whose voters are systematically fed false or distorted information about the choices they face is not functioning in any meaningful sense — it's a ritual of legitimation without the substance.
Influence operations, at their most effective, don't just change individual votes. They change the information environment within which political discourse happens. They make certain facts legible and others invisible. They make certain coalitions possible and others impossible. They make certain political figures seem like legitimate options and others seem beyond the pale.
Over time, systematic influence operations against a democratic population can shift that population's politics in ways that serve the interests of foreign powers — without the population being aware that this is happening. This is the central security concern that has emerged in the post-2016 years. It's real. The empirical evidence for influence operations affecting political beliefs is contested but growing, and the mechanism is clear even if the magnitude is debated.
The Domestic Dimension
A crucial point that often gets lost: influence operations are not only directed outward, at adversaries. They are also directed inward, at domestic populations.
Every government that conducts influence operations abroad also conducts information management at home. The US military's prohibition on running domestic information operations is a legal constraint that has been regularly circumvented in practice. The Russian government's domestic information environment — primarily through control of television, which reaches 80%+ of the Russian population — is tightly managed. China's domestic information environment is the most controlled large information system in history.
But even liberal democracies engage in domestic narrative management. Government communications offices, official spokespersons, strategic declassification of intelligence to shape media coverage, pressure on platforms — these are not neutral information activities. They are information operations directed at domestic populations, with interests the government is trying to advance.
The line between "public communication" and "propaganda" is genuinely blurry, and navigating that blur honestly requires the kind of media literacy that is not widely distributed.
The Military-Attention Complex As A System
The systemic insight is this: just as Eisenhower observed that military spending creates its own perpetuation — contractors need contracts, generals need missions, Congress members need defense jobs in their districts — attention-shaping spending creates its own perpetuation.
Intelligence agencies build influence operation capabilities. Those capabilities require analysts, technologists, linguists, behavioral scientists. Those personnel have careers invested in the importance of their work. Contractors who build the platforms and tools have revenues invested in continued contracts. Policymakers who fund these capabilities have strategic doctrines invested in their value.
The result is a self-reinforcing system in which the infrastructure of influence operations grows independently of whether those operations are actually achieving strategic objectives. The classified literature on the effectiveness of influence operations is mixed at best — there's genuine uncertainty about how much they actually move the needle on population beliefs. But the spending continues to grow, because the complex that produces it has interests independent of its effectiveness.
This is the military-attention complex. It lives in every major state. It shapes the information environment of every connected person. And the knowledge that it exists — the ability to hold "who made this and what do they want me to believe?" as a live question — is one of the most important cognitive tools available to anyone trying to think clearly in the current century.
Distributing that knowledge widely is itself an act of strategic importance. A population that understands influence operations is not a population that can easily be turned against itself or manipulated into supporting its own exploitation. Those are the civilizational stakes.