How Propaganda Works: And Always Has

Ellul's Foundational Argument

Jacques Ellul's Propaganda: The Formation of Men's Attitudes (first published in French in 1962; English translation 1965) remains the most rigorous and disturbing treatment of the subject in existence. Ellul was not a media critic or political commentator — he was a sociologist and philosopher deeply concerned with how modern technological society shapes human behavior in ways that undermine genuine freedom.

Ellul's first major argument is definitional: propaganda is not defined by its content (false vs. true information) but by its function (bypassing rational judgment to produce predetermined attitudes and behaviors). This means that true information can be propaganda, and false information can be ordinary communication. The distinction is purpose and method, not factual accuracy.

His second major argument concerns the audience. Ellul argues that propaganda is not primarily aimed at the uneducated masses — that's a comforting myth. Modern propaganda targets educated, media-engaged people specifically because they consume more information, are more ideologically integrated, and have more confidence in the rationality of their conclusions. The propaganda that is most dangerous to any individual is the propaganda that matches their existing worldview — it doesn't feel like external manipulation, it feels like confirmation.

Ellul also distinguishes between agitation propaganda (designed to move people to immediate action, typically through fear and outrage) and integration propaganda (designed to maintain conformity and commitment to existing social arrangements over time). Both are present in any modern society, though integration propaganda is far less visible because it operates through the normal functioning of culture, education, and media rather than through obvious campaigns.

The Cognitive Mechanisms

Modern cognitive science has identified the specific psychological mechanisms that propaganda exploits. Understanding these is not just intellectually interesting — it's practically useful.

The Illusory Truth Effect is among the most robust findings in cognitive psychology. Repetition increases perceived truth. Hasher, Goldstein, and Toppino demonstrated this in 1977, and it has been replicated many times since. In a 2018 study by Pennycook and colleagues, even a single prior exposure to a statement increased its perceived accuracy, regardless of whether the statement was actually true. The effect persists even when participants are explicitly told that the source of repeated information is unreliable. Familiarity-based processing operates below conscious evaluation.

This is why repetition is the most fundamental propaganda technique. Not because people are fooled by any individual instance of a claim, but because the accumulated effect of encountering the same claim across multiple sources, formats, and contexts gradually shifts the baseline sense of its plausibility. By the time a propagandistic claim has been repeated hundreds of times, the original evaluation has been swamped by familiarity.

Emotional Arousal and Cognitive Narrowing — high states of emotional arousal (fear, rage, disgust, contempt) narrow attentional focus and accelerate information processing at the expense of accuracy. This is a feature of the evolved threat-response system, not a defect: in the ancestral environment, a predator doesn't give you time for deliberation. But propaganda that induces fear or outrage exploits this mechanism to produce rapid credence in threat-confirming information.

Jonah Berger's research on what makes content go viral shows that high-arousal emotions (anger, anxiety, awe) dramatically increase sharing behavior relative to low-arousal emotions (contentment, sadness). This creates a structural incentive in information environments for content that produces emotional arousal — which is also the content most likely to bypass careful evaluation.

Social Proof — Robert Cialdini's classic research shows that people use others' behavior as a guide to appropriate behavior in uncertain situations. In information contexts, this translates to: widespread apparent belief in a claim increases personal credence in it. This mechanism can be gamed by creating the appearance of consensus through bot networks, coordinated campaigns, and selective amplification of sympathetic voices.

Dehumanization — psychologist Nick Haslam's research on dehumanization distinguishes between animalistic dehumanization (denying uniquely human attributes — culture, morality, rationality) and mechanistic dehumanization (denying human nature attributes — warmth, emotion, individuality). Both lower the moral threshold for harm to the target group. The historical relationship between propaganda-driven dehumanization and mass atrocity is not coincidental — dehumanization appears to be a prerequisite for organized violence against civilian populations.

Why It Works Across History

The persistence of propaganda across radically different societies, technologies, and historical periods is not an accident. It works because it exploits cognitive and social dispositions that are not historical contingencies but structural features of human psychology.

Human beings are social animals whose survival has always depended on group membership. We are exquisitely sensitive to signals of group identity, ingroup/outgroup distinctions, and the opinions of those around us. These sensitivities made adaptive sense in small-scale social environments. In mass societies with sophisticated media, they make us vulnerable to manipulation by anyone who can credibly claim to speak for the group, define the enemy, and narrate the threat.

The technology changes. The printing press, radio, television, and now social media each created new propagandistic possibilities by expanding the reach and speed of message distribution. But the underlying cognitive targets remain constant. The Athenians used theater and public ceremony. Rome used triumphs and monuments. The Catholic Church used architecture, imagery, and liturgical repetition. The Nazis added radio and film. The current era adds algorithmic amplification, targeted micro-messaging, and real-time feedback loops. Same mechanisms, new delivery systems.

What changes with technology is primarily the scale and speed of the effect and the difficulty of identifying the source. Traditional propaganda required identifiable institutions (governments, churches, mass media organizations) that could be, in principle, held accountable. Contemporary propagandistic operations can be conducted by small teams operating pseudonymously across national borders, making attribution nearly impossible and accountability nonexistent.

Propaganda From Your Own Side

This is the part that most treatments of propaganda avoid because it's politically awkward: the propaganda you're most vulnerable to is the propaganda that comes from people who share your values and worldview.

Ellul makes this point directly: the propaganda of enemies is easy to dismiss because you're already primed to distrust them. The propaganda of allies is nearly impossible to evaluate critically because it confirms what you already believe and comes from sources you trust. The person who is most confident that they are immune to propaganda — because they've read the right books, belong to the right political tribe, and are appropriately skeptical of the mainstream — is often the most thoroughly propagandized.

This plays out in predictable patterns across the political spectrum. Every political orientation has its propaganda ecosystem — the sources that confirm and amplify the worldview, the enemies who are routinely dehumanized, the threats that are systematically inflated, the inconvenient facts that are systematically minimized. The fact that you can see this clearly on the other side and struggle to see it on your own is not a personal failing — it's the expected operation of motivated reasoning.

The historical record on this is unambiguous. Every major propagandistic campaign in the 20th century had educated, thoughtful, well-intentioned people inside its target population who were completely convinced of its legitimacy. Not because they were stupid but because the propaganda was targeting their existing beliefs, using their trusted sources, and working through the same cognitive mechanisms it works through in everyone.

What Recognition Actually Does

Knowing about propaganda doesn't make you immune. This is a hard truth that media literacy education often glosses over. Dan Kahan's research on motivated numeracy and identity-protective cognition shows that higher analytical ability and science literacy actually increase the partisan gap in beliefs on politicized empirical questions — because smarter people are better at constructing rationalizations for their preferred positions, not better at reaching accurate conclusions against their ideological preferences.

What awareness does is create a moment. A gap between the stimulus (the emotionally loaded message) and the response (credence, sharing, action). That gap is not guaranteed to produce good reasoning. But it makes good reasoning possible in a way that pure automatic response does not.

Specifically: awareness of propaganda techniques creates the possibility of catching certain signals — emotional arousal before evaluation, convenient timing, one-sided framing, discouragement of alternative sources — and using them as prompts to slow down. Not to dismiss automatically (that's just mirror-image credulity) but to hold conclusions tentatively while seeking more information and alternative perspectives.

The goal is not to become a skeptic of everything — that's its own cognitive failure mode. The goal is calibrated credence: being appropriately uncertain about things that are genuinely uncertain, appropriately confident about things that have been thoroughly examined, and maintaining awareness of your own motivated reasoning as a real and ongoing distortion.

Propaganda exploits the human desire for certainty, for clear enemies, for a coherent narrative about what's happening. The partial antidote is learning to be comfortable with complexity and uncertainty — which is itself a cognitive skill that requires development. The alternative is being a more sophisticated participant in whatever propaganda ecosystem you happen to be embedded in, convinced the whole time that you've escaped it.
