
Pre-Mortem Analysis: Imagining Failure Before It Happens


The Optimism Problem Is Structural

Optimism bias — the tendency to overestimate the probability of good outcomes and underestimate bad ones — is not a personality flaw. It's a feature of human cognition that appears across cultures and age groups. Tali Sharot's research at University College London documented it extensively: people consistently believe they are less likely than average to experience illness, accident, divorce, or professional failure, even after being shown the actual statistics.

This bias isn't irrational in an evolutionary sense. Optimism likely correlates with persistence, which correlates with survival. But in a complex world where poor planning has compounding consequences, optimism bias is a liability that needs to be actively managed.

The standard advice — "be more realistic," "consider the risks" — doesn't work, because it doesn't change the social dynamics that actually suppress honest risk assessment. You can tell people to surface concerns all you want; if the culture of the meeting punishes the person who raises doubts, the concerns won't surface.

Gary Klein recognized this. His background was in naturalistic decision-making — studying how experts actually make decisions under real conditions, not how they make decisions in lab settings. What he saw consistently was that planning sessions suffer from shared overconfidence: the group converges on the plan and the plan's advocates set the tone. Dissenters go quiet. The plan moves forward as though concerns didn't exist.

The pre-mortem is a process intervention, not an attitude adjustment. It changes what's rewarded in the room.

How Klein's Pre-Mortem Works

Klein's original formulation, published in the Harvard Business Review in 2007, is straightforward:

Step 1: Set up the frame. Gather the team. Tell them to assume that it's some future date — typically one to two years out — and the project has failed completely. Not partially failed. Failed spectacularly. Whatever it was trying to achieve, it didn't.

Step 2: Individual brainstorm. Each person, independently and silently, writes down every reason they can think of for why the failure occurred. The independence matters — it prevents anchoring on the first reason someone voices.

Step 3: Round-robin sharing. Go around the room. Each person shares one reason at a time. This continues until all reasons are on the table. No reason gets dismissed in this phase.

Step 4: Review and integrate. Look at the list. Which failure modes are most likely? Which are most catastrophic if they occur? Which can be designed around?

The output is not a list of reasons to abandon the project. It's a list of risks that are now explicit and therefore manageable. The team can redesign the plan to prevent the most probable failures, build monitoring systems to detect early warning signs, and make explicit contingency plans for the scenarios they can't prevent.

What Klein found in practice: pre-mortems regularly surface concerns that planning sessions never would. The temporal reframe — "this has already failed" — gives people license to say what they actually think. The person who's been quietly nervous about the vendor relationship finally says it. The team member who doubts the timeline finally voices it. The organizational assumption that everyone has been treating as settled turns out to be contested.

This is information that was always there. The pre-mortem simply creates the conditions for it to surface in the room.

The Psychology of the Temporal Shift

Why does reframing from future-hypothetical to past-certain make such a difference?

Several mechanisms are in play:

Authority shift. When you say "this might fail," you're speculating, and speculation can be dismissed. When you say "this failed," you're reporting, and reporting carries more epistemic weight even when both statements are fictional. The pre-mortem puts everyone in the role of expert analyst rather than anxious worrier.

Elaboration effect. Psychological research on counterfactual thinking shows that imagining a failure in the past produces richer and more specific failure narratives than imagining it in the future. The past has texture — it's more concrete in the imagination. This produces better analysis.

Social permission. In a forward-planning context, raising concerns is dissent. In a pre-mortem context, raising concerns is the assignment. The social valence flips. People who would have stayed quiet become the most valuable contributors.

Optimism bypass. The optimism bias operates on future projections. When you're imagining what might happen, the bias systematically pushes your estimates toward good outcomes. When you're "explaining" a past failure, you're not generating probability estimates — you're constructing a causal narrative. The optimism bias has less purchase there.

The Inversion Connection

Charlie Munger — Warren Buffett's longtime partner at Berkshire Hathaway — has a principle he calls inversion: Invert, always invert. He borrowed it from the mathematician Carl Jacobi, who advised working problems backward rather than forward.

The idea is that many problems are easier to solve when approached from the direction of failure. Instead of asking "what does success look like?" ask "what would guarantee failure, and how do I avoid that?" Instead of asking "how do I become smarter?" ask "what would make me stupider, and how do I stop doing that?"

The pre-mortem is the applied version of inversion in planning contexts. It operationalizes the question "what would cause this to fail?" in a way that produces concrete, actionable answers.

Munger's own application of this principle spans everything from business strategy to personal conduct. At Berkshire, they routinely ask "what are we doing that's stupid?" before they ask "what should we be doing?" The failure landscape is often clearer than the success landscape, because failures tend to have identifiable causes while success often involves luck that can't be engineered.

The practical insight: for any significant project or decision, spend at least as much time analyzing failure scenarios as success scenarios. Most planning processes spend 90% of their time on the success case and 10% (if that) on failure cases. Invert that ratio for your most important decisions.

Running a Personal Pre-Mortem

The team version of the pre-mortem is well-documented. The personal version gets less attention, but it's equally powerful.

You're deciding whether to take a significant personal action: change careers, move cities, end a relationship, start a business, make a major purchase. The excitement of the possibility is real. But the analysis so far has been dominated by best-case thinking.

Run the pre-mortem alone:

Sit down with a blank page. Write the date at the top — set it 12-24 months in the future. Then write:

"It's [future date]. I made the decision to [X]. It was a serious mistake. Here's what happened and why it went wrong:"

Now write. Write everything. Don't filter. Write the things that are embarrassing to admit. Write the things you've been avoiding thinking about. Write the concerns other people have raised that you've been dismissing. Write the assumptions you've been treating as certainties that might actually be wrong.

Most people can fill a page or more. The things that are hardest to write — the ones your hand slows down for — are almost always the most important.

What to do with the output:

Sort your failure scenarios by two criteria: probability and severity. High probability / high severity scenarios need to be designed around before you proceed. If you can't design around them, you need to factor them into your decision honestly.

Ask for each failure scenario: Is this a recoverable failure? Some bad outcomes, while painful, are recoverable. Others are not. The unrecoverable failures — financial ruin, permanent health damage, destroyed relationships — deserve special weight regardless of their apparent probability.

Ask: What early warning signs would tell me this failure is happening? The pre-mortem isn't just pre-decision analysis — it's also the source of your monitoring system. If you proceed, you now know what to watch for.

Finally: Does knowing this change my decision? Sometimes you do a pre-mortem and realize the risks are manageable and you proceed with clearer eyes. Sometimes you realize you'd been rationalizing a decision you'd already emotionally made and the actual picture is much darker. Both outcomes are valuable.
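The triage step above — sort by probability and severity, and give unrecoverable failures extra weight — can be sketched as a toy risk matrix. This is an illustrative sketch, not part of Klein's published method; the example scenarios, the 1-5 scoring scale, and the `recoverable` flag are all hypothetical choices.

```python
# Toy risk matrix for pre-mortem output.
# Scenarios and 1-5 scores are illustrative placeholders, not from Klein.
scenarios = [
    {"name": "savings run out before revenue", "prob": 4, "severity": 5, "recoverable": False},
    {"name": "key partner loses interest",     "prob": 3, "severity": 3, "recoverable": True},
    {"name": "timeline slips by six months",   "prob": 4, "severity": 2, "recoverable": True},
]

def priority(s):
    # Unrecoverable failures get a flat extra weight, reflecting the idea
    # that they deserve special attention regardless of apparent probability.
    weight = 2 if not s["recoverable"] else 1
    return s["prob"] * s["severity"] * weight

# Review the list from highest priority down: the top entries are the
# ones to design around (or factor into the decision) before proceeding.
for s in sorted(scenarios, key=priority, reverse=True):
    print(f'{priority(s):>3}  {s["name"]}')
```

The specific multiplier for unrecoverable failures is a judgment call; the point is only that the ranking should not be driven by probability alone.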

Integration with Red Team Thinking

The pre-mortem has a close cousin in military and intelligence analysis: the red team. A red team is a group explicitly tasked with opposing the main group's analysis — finding the flaws in the plan, challenging the assumptions, attacking the strategy.

The CIA uses red teams. Military planners use them. Well-run corporations use them. The theory is the same as the pre-mortem: the main planning group will develop shared blind spots, and you need a structure that creates permission and incentive to challenge those blind spots.

The difference: the red team is adversarial (it's against your plan), while the pre-mortem is collaborative (everyone is trying to improve the plan together). For most personal and team contexts, the pre-mortem is more accessible and creates less defensiveness. For high-stakes institutional decisions, full red-team analysis may be warranted.

If you're in a leadership role, you can build a lightweight version of red-teaming into any decision process: before finalizing any significant decision, assign someone (or a small group) the explicit role of finding everything wrong with the proposed plan. That person's job is protected and their critique is required. This changes the social dynamics enough to surface what would otherwise stay hidden.

Why This Matters at Scale

Most large-scale human disasters — organizational, political, military — share a common feature: the warning signs were there, the concerns existed, and the processes in place did not surface them in time to change the outcome.

The Challenger space shuttle disaster. The 2008 financial crisis. The Bay of Pigs invasion. Kodak's failure to respond to digital photography. In nearly every case, people inside the system had doubts. Those doubts did not reach the decision-makers in a form that altered the plan.

This is not primarily a problem of intelligence or information. It's a problem of process. Planning processes that reward confidence and penalize doubt will systematically suppress the information needed to make good decisions.

The pre-mortem is a process fix. It's cheap — it costs one meeting, or one hour of your own time. It's high-leverage — the failure modes it surfaces, caught early, can be designed around at a fraction of the cost of encountering them in reality.

Organizations that institutionalize it — that run a pre-mortem before every significant project — build something more valuable than any single analysis: a culture where thinking clearly about failure is normal rather than taboo. Where raising concerns is the assignment rather than an act of disloyalty.

That culture doesn't just improve individual projects. It makes the whole organization smarter over time.

The pre-mortem is not pessimism. It's respect for reality. And it's one of the most actionable, highest-return thinking investments you can make.
