Think and Save the World



1. Neurobiological Substrate

Cognitive capacity and system modeling. Your working memory can hold about seven items. Your attention can process roughly 40 bits per second. Together these create a hard ceiling on how complex a system you can consciously model. Complicated systems stay below this ceiling; complex systems exceed it. No amount of intelligence raises the ceiling—everyone hits it.

The pattern-detection problem. Your brain evolved to detect patterns, and it is excellent at finding structure. When you encounter a complex system, your brain generates patterns whether they're real or just noise. You see trends that aren't there. You find correlations that vanish. You generate explanations that feel true but are false. This is neurobiology, not a character flaw.

The confidence paradox. Your brain's pattern-detection system is coupled to confidence. When your brain detects a pattern, confidence fires up and you feel certain. But that confidence signal isn't calibrated to accuracy, so strong pattern detection regularly produces confident wrongness.

Stress and complication. Under stress, your capacity for complicated analysis shrinks. You shift to heuristics and pattern-matching. Problems that were manageable become overwhelming, and complexity becomes indistinguishable from complication.
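The ceiling is easy to see with a little arithmetic: the number of pairwise relationships among a system's parts grows quadratically, so even modest systems overflow a roughly seven-item working memory. A minimal sketch (the helper name is mine; seven is the figure cited above):

```python
# Number of distinct pairwise interactions among n components: n*(n-1)/2.
# Illustrative point: the count blows past a ~7-item working memory
# almost immediately.
def pairwise_interactions(n: int) -> int:
    """Count of distinct two-way relationships among n parts."""
    return n * (n - 1) // 2

for n in [4, 7, 10, 20, 50]:
    print(n, pairwise_interactions(n))
# 4 parts have 6 relationships; 50 parts already have 1225.
```

Seven items fit in working memory, but the 21 relationships among them already do not, and that is before any indirect or nonlinear effects.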

2. Psychological Mechanisms

The desire for reduction. Your mind wants to reduce everything to rules. Complexity violates this: complex systems don't follow rules consistently. Your mind experiences this as uncomfortable, so you unconsciously force-fit them into rule-based frameworks, simplifying complex systems into causal narratives that feel true even when they're false.

The expert fallacy. Experts in complicated domains (engineering, medicine, accounting) develop confidence that transfers to complex domains (policy, markets, ecosystems). Mastering a complicated domain creates false confidence about complex ones.

Prediction bias. Humans are absurdly confident about prediction. Even in domains where prediction is provably hard (weather, markets, human behavior), experts and non-experts alike make confident predictions. This isn't rational; it's a psychological need to feel in control.

The causation assumption. Your mind assumes cause and effect. When two things happen together, you assume one caused the other. This works for complicated systems but fails regularly for complex ones. You attribute market movements to news events that didn't actually cause them, and health improvements to interventions that didn't cause them. Correlation always feels like causation.
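The "correlations that vanish" point can be demonstrated directly: two random walks generated independently share no causal link, yet they frequently show strong correlation. A small sketch using only the standard library (the seeds, series length, and 0.5 threshold are arbitrary choices of mine):

```python
import random
import statistics

def random_walk(steps: int, seed: int) -> list[float]:
    """A cumulative sum of independent +1/-1 steps."""
    rng = random.Random(seed)
    pos, path = 0.0, []
    for _ in range(steps):
        pos += rng.choice([-1.0, 1.0])
        path.append(pos)
    return path

def correlation(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Each pair of walks is generated independently, so any correlation
# between them is pure noise.
strong = sum(
    abs(correlation(random_walk(200, 2 * s), random_walk(200, 2 * s + 1))) > 0.5
    for s in range(100)
)
print(f"{strong}/100 independent pairs show |r| > 0.5")
```

With nonstationary series like random walks, large spurious correlations are the norm rather than the exception, which is exactly why "the data moved together" proves nothing by itself.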

3. Developmental Unfolding

Early learning and procedure. Children learn by finding patterns and building procedures: "If I do this, that happens." This works for learning complicated systems. Teaching children requires making complicated systems seem less complicated until they have enough cognitive capacity to learn them.

Adolescence and systems thinking. Around early adolescence, the capacity for understanding larger systems emerges. You can start to see how systems contain multiple perspectives and how things connect in non-obvious ways. But this capacity is fragile; it needs cultivation or it atrophies.

The education of complication. School teaches the complication paradigm: learn rules, solve problems, pass tests. This works for complicated knowledge (math, science facts, procedural skills). It fails for complex understanding (how markets work, how societies change, how people actually think).

Adult sophisticated thinking. Mature thinkers develop the ability to hold complicated problems lightly while accepting complex ones genuinely. They neither oversimplify nor overanalyze. This is rare, and it requires years of practice noticing when complexity is being forced into complication frameworks.

4. Cultural Expressions

Western rationalism and mechanism. Western thinking privileges mechanistic, rule-based understanding. This comes from physics and engineering successes. It extends complication frameworks to complex domains. Western institutions try to solve complex problems through complicated procedures. This fails predictably.

Systems thinking traditions. Some cultural traditions developed language for complexity. Chinese philosophy, Daoist thought, and many indigenous frameworks treat reality as irreducibly complex. These frameworks were sometimes dismissed as non-rational. They were actually more realistic about complexity.

Scientific paradigms and reduction. Science works by reducing complex systems to simpler components. This is powerful for understanding mechanisms. But it can create false confidence that the system's behavior is determined by component behavior. In complex systems the whole exceeds the sum of parts in ways reduction can't capture.

Religious and mystical complexity. Religious and mystical traditions often model reality as irreducibly mysterious, not knowable through reason. This reflects something true about complex systems: there are limits to knowability.

Organizational complication bias. Organizations love procedures because they reduce uncertainty. But in complex environments, procedures fail. Organizations keep adding procedures anyway because procedures feel like control.

5. Practical Applications

The diagnostic question. Before applying any problem-solving approach, diagnose: complicated or complex?

Complicated diagnosis:
- Multiple parts with understood relationships
- Can be diagrammed
- Similar problems have similar solutions
- Expertise helps
- Procedures work

Complex diagnosis:
- Elements interact unpredictably
- Can't be fully diagrammed
- Similar situations have different outcomes
- Expertise helps but doesn't determine outcomes
- Procedures constrain flexibility

For complicated problems:
- Build expertise
- Document procedures
- Standardize what works
- Reduce variation
- Use best practices

For complex problems:
- Build sensing capacity
- Plan multiple scenarios
- Maintain flexibility
- Embrace experimentation
- Distribute decision-making
- Accept some unpredictability

The worst mistake: treating complex problems with complicated approaches. You create elaborate procedures that break against reality. You build systems optimized for control that can't adapt.

Example: pandemic response. COVID was complex. Different regions had different conditions, populations responded unpredictably, and interventions had unexpected effects. Yet governments defaulted to complicated approaches—standardized procedures, central control, rigid protocols. A better approach: sensing systems, local adaptation, multiple simultaneous strategies, rapid feedback loops.

Switching gears. The same person can solve complicated problems well and handle complex ones terribly. The challenge is knowing when to switch approaches. Most people don't switch; they apply their trusted framework to everything.
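The checklist above can be turned into a crude triage helper. This is a hypothetical sketch, not a validated instrument: the marker strings and the three-of-five threshold are my own illustrative assumptions.

```python
# Hypothetical triage sketch: score a situation against the complex-domain
# markers from the checklist. Marker wording and the threshold are
# illustrative assumptions.
COMPLEX_MARKERS = [
    "elements interact unpredictably",
    "cannot be fully diagrammed",
    "similar situations produce different outcomes",
    "expertise alone does not determine outcomes",
    "procedures constrain needed flexibility",
]

def diagnose(observed: set[str]) -> str:
    """Return 'complex' if a majority of markers match, else 'complicated'."""
    hits = sum(1 for marker in COMPLEX_MARKERS if marker in observed)
    return "complex" if hits >= 3 else "complicated"

print(diagnose({
    "elements interact unpredictably",
    "similar situations produce different outcomes",
    "procedures constrain needed flexibility",
}))  # -> complex
```

The value is not the code but the forcing function: naming which markers you actually observe before reaching for procedures or for flexibility.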

6. Relational Dimensions

Communication and clarity. Explaining complicated things requires clarity and detail; you build shared mental models. Explaining complex things requires a different approach: you sketch patterns, name uncertainties, and invite participation in sense-making.

Argument and disagreement. In complicated domains, disagreement is usually about facts or procedures. Once you get the facts right, agreement follows. In complex domains, disagreement is often about different perspectives on unpredictable systems. You can both be right from different angles.

Teaching and credibility. In complicated domains, experts can teach confidently. They have knowledge others lack, which builds legitimate authority. In complex domains, "expert" authority is limited. Lived experience counts for more, and humility is necessary.

Collaboration structures. Complicated work benefits from hierarchy and specialization: experts in different domains collaborate by following procedures. Complex work benefits from flatter structures, where local knowledge matters and adaptation requires less hierarchy.

7. Philosophical Foundations

Epistemology of complexity. What can we know about complex systems? Full prediction is impossible, but pattern recognition and scenario planning work. This requires a different epistemology than "know the rules, then predict."

Reduction and emergence. Reductionism works for complicated systems: understand the parts, understand the whole. But emergence—properties that appear at the system level and can't be explained from the parts alone—is real in complex systems.

Determinism versus possibility. Complicated systems behave deterministically: given the conditions, the outcomes follow. Complex systems have irreducible openness; multiple paths remain possible.

Knowing limits. The philosophically mature position is: some things I can know completely, some things I can understand partially, and some things exceed knowability. Wisdom lies in knowing which is which.
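The "deterministic yet unpredictable" point has a classic minimal illustration: the logistic map is a one-line deterministic rule, yet at r = 4 it is chaotic, so two starting points differing by one part in a billion soon diverge completely. A sketch (parameters are standard textbook choices, not from this article):

```python
# The logistic map x' = r * x * (1 - x) is fully deterministic, yet at
# r = 4 it is chaotic: a microscopic difference in starting conditions
# grows until the two trajectories are unrelated.
def logistic_orbit(x0: float, r: float = 4.0, steps: int = 40) -> list[float]:
    """Iterate the logistic map from x0, returning all visited points."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-9)  # perturbed by one part in a billion
gaps = [abs(x - y) for x, y in zip(a, b)]
print(f"initial gap {gaps[0]:.1e}, gap after 40 steps {gaps[-1]:.3f}")
```

Determinism here buys no practical predictability: no measurement is precise enough to pin down which of the diverging paths the real system will take.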

8. Historical Antecedents

The rise of mechanistic thinking. Newton and the scientific revolution created the framework: complex reality could be fully understood through mechanism and mathematical law. This worked brilliantly for physics and mechanics. Extended to everything else, it created systematic overconfidence.

Systems theory emergence. In the 20th century, systems theory emerged specifically to address complexity. Von Bertalanffy, Wiener, and others recognized that mechanistic models failed for complex systems. Systems theory remains relatively marginal in mainstream thinking.

Cybernetics and feedback. Cybernetics developed frameworks for understanding systems with feedback: adaptive systems, self-organizing systems, systems that respond to their own outputs. These frameworks remain underutilized in most policy and planning.

Complexity science emergence. The Santa Fe Institute and others developed complexity science as a rigorous field. It studies how patterns emerge from interaction, how systems self-organize, and how creativity emerges. It is still not widely taught or applied.

9. Contextual Factors

Technology and complication bias. Modern technology tends to be complicated: thousands of parts, intricate relationships. We've become good at complicated. Complex problems (social, ecological, behavioral) don't yield to complicated approaches. But we keep applying them anyway.

Bureaucracy and complication. Institutions create procedures, rules, hierarchies—complicated structures. These work for complicated problems but fail for complex ones. Yet institutions can't easily become less structured, because structure creates the predictability leaders need.

Speed and complexity. Quick decision-making assumes some predictability. Complex situations are hard to decide quickly. But slow institutions can't respond to complexity either.

Scale and complexity. The larger a system, the more likely it's complex. Yet we keep applying complicated-domain logic to large-scale problems.

10. Systemic Integration

Technology systems as hybrid. Most modern systems are hybrid: complicated hardware embedded in complex organizational behavior, which interacts with complex markets, which interface with complex human psychology. Design failures usually come from treating the complicated parts as if they determined the whole.

Organizations and environment. Every organization exists in an environment it doesn't fully control. As environmental complexity increases, organizations need more flexibility. But organizations tend to increase procedures, which reduces flexibility.

Knowledge fragmentation. Specialization creates deep complicated-domain knowledge, but nobody understands the complex whole. Failure follows when the parts interact unpredictably.

Feedback loops and adaptation. Complex systems require feedback loops that allow quick adaptation. Complicated systems need feedback too—but for optimization, not adaptation.
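The closing distinction between feedback for optimization and feedback for adaptation can be sketched in a few lines. Everything here (function names, gains, the shifting signal) is a hypothetical illustration of the idea, not a model from the text:

```python
# Hypothetical sketch of two feedback styles. An optimizer tunes toward
# a fixed target; an adapter also revises the target itself as the
# environment shifts. Gains and signals are illustrative assumptions.
def optimize(value: float, target: float, gain: float = 0.5) -> float:
    """One optimization step: close part of the gap to a fixed target."""
    return value + gain * (target - value)

def adapt(value: float, signal: float, memory: float, rate: float = 0.3):
    """One adaptation step: the target tracks the environment first."""
    memory = (1 - rate) * memory + rate * signal  # revise the target
    return optimize(value, memory), memory

value, memory = 0.0, 0.0
for signal in [1.0, 1.0, 4.0, 4.0, 4.0]:  # environment shifts mid-run
    value, memory = adapt(value, signal, memory)
print(f"value {value:.2f} is moving toward the shifted signal 4.0")
```

An optimizer alone would keep converging on the stale target of 1.0 after the shift; the adaptive loop notices the environment changed and re-aims.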

11. Integrative Synthesis

The distinction between complexity and complication is one of the highest-value thinking tools available. It immediately clarifies what approach will work. It explains why many well-intentioned plans fail. It suggests what to pay attention to in unfolding situations.

Using this distinction changes how you think about:
- Problem-solving: different tools for different problem types
- Learning: when to study deeply versus when to remain flexible
- Leadership: when to build procedures versus when to trust judgment
- Planning: when to detail extensively versus when to plan for surprise

The thinker who conflates these thinks they're optimizing when they're actually guaranteeing failure.

12. Future-Oriented Implications

As systems become larger and more interconnected, complexity increases. Future problems will be more complex, not less. Yet institutional capacity for complex thinking is declining. Education pushes complication. Technology enables complicated thinking. Procedures make us good at following rules.

The people who develop genuine complex-system thinking—who can sense patterns, hold multiple perspectives, adapt quickly, tolerate uncertainty—will have outsized influence on what gets solved and what doesn't.
