Think and Save the World

How To Disagree With Yourself

7 min read

Why This Is Rare

Disagreeing with yourself — really doing it, not performing it — requires something that cuts against how minds work.

The brain is a prediction machine and a pattern-matcher. It forms beliefs, then processes new information through those beliefs. This is cognitively efficient: you can't evaluate every new input from scratch against all available evidence every time. Beliefs serve as compression — they let you process the world faster.

The cost is confirmation bias. Once a belief is formed, new information gets evaluated through it. Consistent evidence is accepted quickly. Contrary evidence is scrutinized more heavily, remembered less reliably, and explained away more readily. This isn't a flaw in bad thinkers. It's baseline human cognition. Smart people are often better at it — they're more sophisticated at generating plausible-sounding explanations for why the contrary evidence doesn't count.

There's also an identity dimension. Beliefs are not just propositions we evaluate. They're part of who we are. To seriously consider that a belief you hold is wrong is to seriously consider that the self who held it was mistaken — which activates social threat responses not unlike those produced by physical danger. The psychological literature on motivated reasoning (Ziva Kunda, Dan Kahan, Jonathan Haidt) consistently shows that when identity-linked beliefs are challenged, the response is often defensive rather than investigative.

This is the baseline problem. Disagreeing with yourself requires actively working against these defaults. It requires treating your own beliefs with the same skeptical rigor you apply to claims you're predisposed to reject.

Almost nobody does this reliably. The people who do are easy to identify: their positions change when evidence changes, they can accurately represent views they disagree with, they acknowledge genuine uncertainty rather than projecting confidence they don't have, and they hold their views with a kind of provisional quality — committed enough to act on them, loose enough to revise them.

What Steelmanning Actually Involves

The steelman principle — present the strongest version of an opposing view, not a weakened version — is simple to state and difficult to practice.

Its opposite, the strawman, is nearly universal in public discourse. Political commentators don't represent the best version of the opposing position. They represent the version easiest to mock or dismiss. This feels like winning an argument while actually avoiding one.

Genuine steelmanning requires three things:

Charitable interpretation. When an opponent's view can be read in a stronger or weaker form, read the stronger. If their argument has a missing premise that would make it valid, supply it. Don't exploit gaps in articulation as if they were gaps in the underlying logic.

Engagement with best sources. Your understanding of a view should be based on its most rigorous proponents, not its worst ones. If you want to understand the case for strong immigration restriction, read the best academic economists and legal scholars who make it, not talk radio. If you want to understand the case for drug decriminalization, read the public health literature, not advocacy websites. The point is not to cherry-pick favorable versions — it's to engage the view at its most developed form.

The recognition test. When you've steelmanned a position, ask: would the position's most intelligent advocates recognize your representation as fair? If they would feel it accurately represents their view, you've done it. If they would feel misrepresented, you haven't.

Applied to your own positions: steelman yourself against yourself. If you were your most intelligent, best-informed critic, what's the case you'd make?

Finding Your Weak Points

Most intellectual positions have identifiable structural weaknesses. The discipline is locating them honestly in your own positions rather than only in others'.

Empirical weak points. What evidence would, if true, change your mind? If you can't name this, you hold your belief unfalsifiably — which means it's a commitment, not a conclusion. Good empirical beliefs specify in advance what would update them.

Assumption weak points. What are you assuming that you haven't examined? Beliefs rest on premises, which rest on further premises. Tracing the chain back to its foundational assumptions often reveals points of genuine uncertainty that the surface-level belief obscures.

Selection weak points. What evidence have you not looked at? When you investigated your belief, did you look for supporting evidence, contrary evidence, or both? Most self-guided research looks primarily for confirmation. What's the most serious study or argument that cuts against your position, and what do you actually think of it?

Social origin weak points. Did you form this belief in an environment where everyone around you held it? Political beliefs formed in politically homogeneous communities, religious beliefs formed within specific faith traditions, economic beliefs formed in particular industries — these are all candidates for views that have never been genuinely tested by encounter with their strongest opposition.

Emotional investment weak points. Where do you feel certain? Not just confident — certain, defensive, heated when challenged? These high-emotional-temperature zones are exactly where the motivated reasoning is concentrated. The intensity of the feeling is evidence that something is identity-linked rather than purely epistemic.

The Devil's Advocate Applied to Your Own Views

The devil's advocate role — arguing for a position you don't hold, specifically to find its weaknesses — has a long history in intellectual practice. The Catholic Church used it formally in canonization proceedings: the promotor fidei (promoter of the faith) was charged with arguing against the proposed sainthood to ensure the case for it could withstand scrutiny.

Applied internally, devil's advocacy looks like setting aside your actual view and constructing the best possible case against it — not to be convinced, but to find the points where your position is actually weakest.

A structured version of this:

1. State your belief clearly and specifically. "I believe X."
2. List your top three reasons for believing X.
3. For each reason, construct the strongest possible counterargument — the one that most seriously challenges that reason specifically.
4. Rate your response to each counterargument: Does it fully answer the challenge? Partially? Does it not actually have a good answer?
5. Identify which elements of your position are most vulnerable. What would it take to resolve them?
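
If it helps to make the steps concrete, here's a minimal sketch of the exercise as a Python data structure. Everything in it (the class names, the example belief, the sample counterarguments) is invented for illustration; the exercise itself requires nothing more than a notebook:

```python
from dataclasses import dataclass, field
from enum import Enum


class Verdict(Enum):
    """Step 4: how well your response handles a counterargument."""
    FULLY_ANSWERED = "fully answers the challenge"
    PARTIAL = "partially answers it"
    NO_GOOD_ANSWER = "no good answer"


@dataclass
class Reason:
    claim: str            # step 2: one of your top reasons for the belief
    counterargument: str  # step 3: the strongest challenge to that reason
    verdict: Verdict      # step 4: your honest rating of your own response


@dataclass
class BeliefAudit:
    belief: str           # step 1: the belief, stated clearly and specifically
    reasons: list[Reason] = field(default_factory=list)

    def weak_points(self) -> list[Reason]:
        """Step 5: the reasons whose counterarguments you can't fully answer."""
        return [r for r in self.reasons if r.verdict is not Verdict.FULLY_ANSWERED]


# Example run-through with a deliberately mundane belief.
audit = BeliefAudit(
    belief="Remote work makes my team more productive.",
    reasons=[
        Reason(
            claim="Output metrics rose after we went remote.",
            counterargument="The rise coincided with two senior hires, so the "
                            "metric may track headcount rather than remoteness.",
            verdict=Verdict.PARTIAL,
        ),
        Reason(
            claim="People report fewer interruptions.",
            counterargument="Self-reports of focus are unreliable, and the "
                            "interruptions may have moved to async channels.",
            verdict=Verdict.NO_GOOD_ANSWER,
        ),
    ],
)

for weak in audit.weak_points():
    print(f"Vulnerable: {weak.claim} ({weak.verdict.value})")
```

The one design choice that matters is that weak_points() refuses to count a reason as safe unless its counterargument is fully answered, which mirrors the honesty the exercise asks for.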

This is not an exercise in self-undermining. It's an exercise in intellectual honesty. A belief held with unconsidered certainty and the same belief held with genuine understanding of its weak points and strongest counterarguments are very different epistemic states, even if the conclusion is the same.

The Difference Between Self-Doubt and Self-Examination

This distinction matters because conflating them makes the practice seem psychologically dangerous.

Self-doubt is an emotional state. It's the feeling of "maybe I'm wrong because I'm not smart enough, not reliable enough, not equipped to judge this." It's diffuse, undermines confidence across domains, and is triggered by social threat. It doesn't improve reasoning — it impairs it, by making the evaluator anxious and prone to either defensive rigidity or reflexive capitulation.

Self-examination is an epistemic practice. It's asking specific, answerable questions about specific beliefs. "What evidence supports this? What evidence would challenge it? Have I engaged the strongest counterarguments? What am I assuming?" It's applied to particular propositions, not to the self broadly. It strengthens reasoning by making the reasoning process visible and correctable.

Self-doubt says: I might be wrong because I'm flawed. Self-examination says: This belief might be wrong because of these specific identifiable reasons, and here's how I'd test it.

The first is psychologically destabilizing without intellectual payoff. The second is occasionally uncomfortable but produces more accurate beliefs.

People sometimes avoid self-examination because it feels like self-doubt. The solution is precision: keep the examination targeted to specific propositions and specific reasoning, not global self-evaluation.

What Changes When You Practice This

People who practice rigorous self-disagreement develop a recognizable epistemic profile:

Calibrated confidence. Their certainty tracks the strength of evidence rather than the strength of feeling. High confidence on things with strong, examined support; explicit uncertainty on things that are genuinely uncertain. (One way to put a number on this is sketched after this list.)

Revised positions over time. Their views change when evidence changes. This is not weakness — it's the correct response to new information. People who never revise are not more principled; they've committed to conclusion-first thinking.

Better at hearing criticism. When someone challenges their view, the first response isn't defensiveness but genuine inquiry: is this a challenge I've already considered, or is it pointing to something I haven't examined? If the former, they can engage it directly. If the latter, they take it seriously.

More persuasive. Paradoxically, holding views with acknowledged uncertainty and demonstrated awareness of counterarguments is more persuasive than holding them with aggressive certainty. Audiences can sense whether a person has genuinely grappled with the hard parts of their position or is just performing confidence.
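
Calibration, in particular, can be audited rather than guessed at. One standard instrument, not mentioned in this essay but well established, is the Brier score: write down a probability alongside each prediction you make, later record what actually happened, and measure the gap. A minimal sketch in Python, with an invented prediction log:

```python
def brier_score(predictions: list[tuple[float, bool]]) -> float:
    """Mean squared gap between stated confidence and what actually happened.

    `predictions` pairs a probability you assigned (0.0 to 1.0) with whether
    the claim turned out true. Lower is better; constant 50% guessing
    scores 0.25 no matter what the outcomes are.
    """
    return sum((p - float(outcome)) ** 2 for p, outcome in predictions) / len(predictions)


# A hypothetical log of recorded predictions: (confidence stated, outcome).
log = [(0.9, True), (0.8, True), (0.95, False), (0.6, True), (0.7, False)]
print(f"Brier score: {brier_score(log):.3f}")
```

A score meaningfully above 0.25 means your stated confidence is worse than admitting you don't know — exactly the "projecting confidence they don't have" failure described above.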

The Civilizational Stakes

Democracies, markets, and scientific communities all depend on a substrate of citizens and participants who can reason about contested questions in ways that track reality rather than just reinforce priors.

When that capacity erodes — when populations lose the ability to hold beliefs provisionally, engage opposing views charitably, and update when evidence demands it — the result is not just epistemic. It's political. Epistemic tribalism turns political disagreement into pure power contests, because there's no common ground of reasoning to appeal to. Each tribe has its own facts, its own sources, its own reality.

The practice of disagreeing with yourself is not just cognitive hygiene. It's the individual-level practice that makes collective reasoning possible. A society of people who can genuinely disagree with themselves has something to work with. A society of people who can't has only negotiation between impermeable positions — and eventually, force.

The stakes are exactly that high.
