Think and Save the World

How the history of propaganda reveals what power fears most — independent thought


Neurobiological Dimensions

How propaganda exploits the brain. The brain is optimized for survival, not truth. It rapidly identifies threats and allies, notices what stands out, and trusts familiar sources. This leaves it vulnerable to:

- Pattern recognition run amok: The brain sees patterns everywhere, even where none exist. Propaganda creates false patterns: "See how these events are connected? They're all part of the conspiracy."
- Emotional arousal: Strong emotion (anger, fear, outrage) makes information stick. Propaganda emphasizes emotional stimuli; facts stick less than feelings do.
- In-group/out-group thinking: The brain is wired to favor its group and distrust outsiders. Propaganda exploits this: "Our group is good. That group is dangerous. Don't listen to them."
- Source credibility over content: The brain tends to trust who says something more than what is said. Propaganda uses trusted sources to deliver false messages.

Neuroplasticity and propaganda. Propaganda works through repeated exposure. The neural pathways encoding the propagandistic message strengthen with each repetition until the message becomes automatic and unquestioned. When propaganda is consistent and ubiquitous, it literally rewires how people think: the pathways supporting critical evaluation of the message atrophy, people stop questioning it, and it feels like truth because that is the only pathway available.

The role of dopamine in belief. Propaganda that aligns with pre-existing beliefs triggers dopamine release; confirming information feels good. Contradictory information triggers a threat response (amygdala activation). This is not a character flaw. It's brain chemistry. But it means propaganda is more effective when it confirms existing beliefs than when it tries to create new ones from scratch.

Psychological Dimensions

Identity-protective cognition. People filter information through identity. Facts that threaten identity are dismissed; facts that support it are accepted. Propaganda works by attaching itself to identity: "Believe this or you're not really one of us." "People who disagree are not like us." Once a belief becomes identity, it's nearly impossible to change through evidence, because changing the belief feels like betraying the group.

The backfire effect and belief updating. When presented with contradictory evidence, people often don't update their beliefs; sometimes they dig in deeper. This isn't stupidity. It's motivated reasoning: the brain is trying to preserve identity and group status, so a direct attack on a belief is an attack on both. Effective counter-propaganda doesn't attack the belief directly. It offers an alternative identity ("You can believe the truth and still be part of the group") or reframes the threat.

Cognitive biases in propaganda susceptibility. People susceptible to propaganda tend to share certain cognitive patterns:

- Proportionality bias: assuming big effects must have big causes. Propaganda offers a big cause (a conspiracy) for complex outcomes.
- Clustering illusion: seeing patterns in random data. Propaganda points to connections that aren't really there.
- Availability heuristic: overweighting recent, vivid information. Propaganda manufactures viral, emotional content.
- Dunning-Kruger effect: confidence in domains where one isn't expert. Propaganda exploits this by offering false expertise.

These are not flaws of stupid people. They're universal human tendencies that propaganda exploits across all intelligence levels.

Developmental Dimensions

Propaganda vulnerability across the lifespan. Children are more vulnerable to propaganda because they have less experience evaluating sources; they believe what adults tell them. Adolescents develop the capacity to evaluate claims but sometimes overestimate that ability, and they're often highly persuadable by peer-based propaganda. Adults vary widely: some develop strong resistance (through education, experience, practice), while others never do and remain vulnerable throughout life. Older adults sometimes become more vulnerable due to cognitive changes, social isolation, or reduced confidence in evaluating information.

Building resistance to propaganda. Resistance to propaganda is not innate. It's developed through practice:

- Consuming information from diverse sources
- Deliberately evaluating claims (who says this? what's their incentive?)
- Learning how propaganda works
- Regular exposure to contradictory perspectives
- Practicing changing your mind when the evidence warrants it

Children raised in environments that encourage questioning develop stronger propaganda resistance; children in authoritarian environments, where questioning is punished, develop weaker resistance.

Developmental stage and propaganda tactics. Different developmental stages are vulnerable to different tactics:

- Children: appeals to authority ("experts say"), appeals to emotion (stories)
- Adolescents: appeals to identity and peer belonging, us-vs-them framing
- Adults: appeals to experience and self-interest, false evidence presented as expert consensus
- Older adults: appeals to status threat ("things aren't like they used to be"), nostalgia

Effective propaganda targets each stage's particular vulnerabilities.

Cultural Dimensions

Propaganda and cultural variation. Different cultures are more or less vulnerable to different propaganda techniques:

- High-context cultures: more vulnerable to subtle, emotionally laden propaganda; less vulnerable to direct logical appeals
- Low-context cultures: more vulnerable to explicit false claims; less vulnerable to implied meanings
- Honor cultures: vulnerable to propaganda attacking group honor
- Dignity cultures: vulnerable to propaganda attacking individual dignity
- Cultures of trust (where authority figures have been reliable): vulnerable to propaganda from authorities
- Cultures of skepticism (where authority has been unreliable): harder to propagandize, but vulnerable to anti-authority messaging

Propaganda and inequality. Propaganda is more effective in unequal societies. When people have little power and few resources, they're more vulnerable to simple explanations for complex problems. Propaganda offers a villain ("the elites," "the immigrants," "the conspiracy") rather than requiring an understanding of complex systems. More equal societies are harder to propagandize because people have experience managing complexity and greater access to education and information.

Propaganda in colonized populations. Colonial powers used propaganda as a tool of control. They propagandized colonized people about their own inferiority, and propagandized metropolitan populations about their civilizing mission. Decolonization required counter-propaganda: alternative narratives about colonized peoples' capacities and the colonial project's harm.

Practical Dimensions

How to recognize propaganda. Propaganda typically includes:

- Emotional appeals without evidence: tries to make you feel strongly rather than think clearly
- Simplification of complexity: complex problems reduced to simple villains or solutions
- Us-vs-them framing: your group good, other group bad; no nuance
- Appeals to authority without expertise: famous people saying things outside their expertise
- Absence of contradictory information: only hearing one side
- Appeals to majority or inevitability: "Everyone believes this," "You can't stop this"
- Circular reasoning: using the claim as evidence for itself
- Ad hominem attacks: attacking people who disagree rather than addressing their arguments

The presence of one of these doesn't automatically mean something is propaganda. But multiple indicators together suggest it.

Media literacy practices. Building resistance to propaganda requires deliberate practice:

1. Source evaluation: Who is producing this? What are their incentives? Are they credible on this topic?
2. Claim evaluation: Is this claim supported by evidence? Could someone reasonably disagree?
3. Alternative perspectives: What would someone who disagrees say? Do I understand their position?
4. Emotional awareness: Am I being manipulated emotionally? What emotion is this trying to trigger?
5. Self-knowledge: What am I predisposed to believe? What would I want to believe even if it weren't true?

Counter-propaganda strategies. Countering propaganda is harder than creating it, but possible:

- Pre-bunking: teaching people about propaganda tactics before they encounter propaganda
- Inoculation: exposing people to mild propaganda and showing them how it works
- Alternative narratives: offering better stories that fit the facts
- Building trust in institutions: when people trust credible institutions, propaganda is less effective
- Empowering critical thinking: teaching people to evaluate claims themselves rather than relying on authorities

Relational Dimensions

Propaganda in relationships. On a small scale, propaganda appears in relationships as emotional manipulation, information control, and loyalty demands. A partner who demands you believe certain things and punishes questioning is using propaganda tactics. The damage is relational: you can't trust your own perception, you can't think for yourself, and you experience constant anxiety about saying the wrong thing.

Propaganda between groups. When groups compete, propaganda emerges. Each group propagandizes its members and attacks the other group, and this is particularly intense in conflict. The damage is to the inter-group relationship: each group becomes dehumanized in the other's eyes, making cooperation impossible.

Counter-propaganda through relationship. The most effective counter to propaganda is relationships with people in the other group. When you personally know someone from a group that propaganda says is evil, the propaganda loses credibility. This is why propagandists work to prevent inter-group contact, and why post-conflict reconciliation requires relationship-building.

Propaganda and community. Communities vary in their resilience to propaganda. Communities with:

- Strong internal relationships (people know and trust each other)
- Diverse information sources (not everyone getting news from one place)
- A practice of respectful disagreement (you can argue without rupture)
- Access to education and critical thinking

...are more resistant to propaganda. Communities with weak relationships, information monopolies, and conformity pressure are vulnerable to it.

Philosophical Dimensions

Propaganda and truth. Propaganda is fundamentally about preventing clear thinking about truth. It doesn't necessarily require lies; it can use true facts selectively or frame them misleadingly. The philosophical question: if propaganda uses true information misleadingly, is it still propaganda? Most would say yes. Telling the truth is not the same as communicating honestly.

Propaganda and freedom. Freedom requires the capacity to think. If your thinking has been hijacked by propaganda, are you free? This is not rhetorical. If you've been shaped since birth by propaganda to believe things that serve others' interests, did you ever actually choose those beliefs?

Epistemology and propaganda. Propaganda undermines epistemology. It teaches you not to question, to believe authorities uncritically, and to dismiss contradictory evidence as "lies by the other side." A population sophisticated about epistemology (how we know what we know, what counts as evidence) is harder to propagandize. A population that thinks naively about how knowledge works is vulnerable.

Historical Dimensions

The history of propaganda. Propaganda emerged as a systematic discipline in the 20th century. Edward Bernays pioneered propaganda theory in the 1920s. Nazi Germany industrialized it. The Cold War was fought largely through competing propaganda. Digital technology has made propaganda both easier (cheaper to distribute, able to target individuals) and harder to control (harder for a single authority to monopolize the message).

Propaganda in different regimes. Authoritarian regimes use propaganda extensively because they lack other ways to maintain control. Democratic regimes use it more subtly (through advertising and framing) but also extensively. The difference is less about quantity and more about the capacity to counter-propagandize: in democracies, competing propaganda allows for some correction; in authoritarian systems, single-source propaganda goes unchallenged.

The transformation of the information environment. Pre-digital propaganda was broadcast: the same message to millions. Digital propaganda is targeted: different messages to different people, based on data. This is both more effective (more personalized) and more pernicious (you see reinforcing messages while believing you're independently discovering the truth).

Contextual Dimensions

Propaganda in crisis. During a crisis (war, pandemic, economic collapse), propaganda becomes more effective. People are more emotional, less able to think clearly, and more willing to accept simple solutions and strong leaders. This is why governments use propaganda extensively during crises, and why resisting it requires institutions strong enough to hold up through a crisis.

Propaganda in inequality. In unequal societies, propaganda becomes more necessary for maintaining the system. If reality is perceived as unjust, propaganda must either explain why it's just or convince people they can't change it. Reducing propaganda therefore requires either reducing inequality or increasing people's access to alternative information.

Propaganda in information environments. In information-scarce environments, propaganda goes largely unchallenged. In information-abundant environments, propaganda is one voice among many and harder to make dominant. But information abundance creates new vulnerabilities: with so many sources, people don't know what to believe, which can make them more vulnerable to well-organized propaganda.

Systemic Dimensions

Propaganda ecosystems. Propaganda doesn't work alone. It operates within an ecosystem:

- Production: creating the propaganda (ad agencies, think tanks, political campaigns)
- Distribution: spreading it (media, social networks, peer-to-peer)
- Amplification: making it seem bigger than it is (bot networks, coordinated sharing, news coverage of propagandistic claims)
- Defense: protecting it from counter-propaganda (dismissing alternative sources as "fake news," dismissing critics as biased)

Weakening propaganda requires targeting all of these elements, not just one.

Institutional propaganda. Institutions propagandize. Corporations propagandize about their products. Governments propagandize about their policies. Schools propagandize about history. This is not necessarily bad (teaching people their country's perspective is nearly universal), but it requires awareness and counterbalance. An informed population knows that every institution has a perspective, and deliberately seeks out alternative perspectives.

The propaganda-truth arms race. As societies get better at detecting propaganda, propaganda gets more sophisticated; as propaganda gets more sophisticated, detection requires more sophistication. This arms race can be broken only by building populations that are inherently more resistant (through education, critical thinking, and diverse information sources), not by trying to out-propagandize the propagandists.

Integrative Dimensions

Propaganda as an integrated attack on epistemology. Propaganda doesn't just make you believe false things. It attacks your capacity to know anything reliably: it teaches you not to trust evidence, not to trust experts, not to trust your own reasoning. Once epistemology is undermined, propaganda becomes much more effective.

Counter-propaganda as epistemological restoration. Effective counter-propaganda is not counter-propaganda at all. It's education: teaching people how to evaluate evidence, how to recognize bias in themselves and in sources, and how to think systemically about complex questions. This requires long-term investment in education and culture, not short-term messaging campaigns.

Future-Oriented Dimensions

Propaganda in the digital age. AI-generated content will make propaganda both more powerful and, if people know to look for it, more obvious. Deepfakes will make verifying evidence harder. Algorithmic targeting will make propaganda more personalized. The question is whether civilization will build epistemological defenses (media literacy, source verification, institutional trust) faster than propaganda technology advances.

The civilizational stakes. A civilization awash in unchallenged propaganda cannot think clearly about its future. It will make decisions based on false beliefs about its situation, capabilities, and options. A civilization with strong epistemology and weak propaganda can think its way toward flourishing. A civilization drowning in propaganda will stumble toward crisis while believing it's making progress.

References

1. Bernays, Edward L. "Propaganda." Ig Publishing, 1928.
2. Arendt, Hannah. "The Origins of Totalitarianism." Harcourt, Brace and Co., 1951.
3. Herman, Edward S. and Noam Chomsky. "Manufacturing Consent: The Political Economy of the Mass Media." Pantheon Books, 1988.
4. Sunstein, Cass R. "Republic.com 2.0." Princeton University Press, 2007.
5. Cialdini, Robert B. "Influence: The Psychology of Persuasion." HarperBusiness, 2009.
6. Cook, John and Sander van der Linden. "Inoculating Against Misinformation." Science, 2021.
7. McIntyre, Lee. "The Liar's Dividend: Philosophers and Engineers on How to Make Climate Change Misinformation Stop Working." Columbia University Press, 2021.
8. Hosansky, David and Kate Starbird. "Propaganda as Subcommunication in Online Discussions." Journal of Information Technology & Politics, 2020.
9. Wardle, Claire and Hossein Derakhshan. "Information Disorder: Definitions, Manifestations, and Interventions." Council of Europe, 2017.
10. Vosoughi, Soroush et al. "The Spread of True and False News Online." Science, 2018.
11. Pomerantsev, Peter. "This Is Not Propaganda: Adventures in the War Against Reality." PublicAffairs, 2019.
12. Pennycook, Gordon and David G. Rand. "Fighting Misinformation on Social Media Using Crowdsourced Judgments." Proceedings of the National Academy of Sciences, 2019.
