Why "I was wrong" is a power move — personally and politically
The Architecture of the Ego Trap
Before you can understand why "I was wrong" is a power move, you have to understand why it feels like the opposite.
The ego's job — at the level of evolutionary psychology — is to maintain a stable sense of self. Consistent enough to function. Protected enough to survive socially. For most of human history, being caught in a serious error wasn't just embarrassing; it could mean losing status in the group, which in pre-agricultural societies meant losing access to resources, mates, and protection. The stakes of being wrong were material, physical, tribal.
That hardware hasn't been updated. Your nervous system still treats social humiliation as a cousin of physical threat. When you're wrong about something and someone challenges you, your body's threat-response activates before your prefrontal cortex can get a word in. That's why people get defensive before they even consciously choose to. The defensiveness is faster than the decision.
This means the people who successfully say "I was wrong" aren't people without an ego. They're people who've learned to work with the lag — to recognize the defensive impulse, let it pass through without acting on it, and then respond from a more deliberate place. That's not a moral achievement. It's a trained skill. Which means it can be learned.
---
What "I Was Wrong" Actually Signals
In social and political contexts, the ability to say you were wrong communicates a precise set of things:
You can distinguish between your identity and your positions. This is rarer than it sounds. Most people have fused their beliefs, opinions, and past decisions with their sense of self. To challenge their position is to challenge them. When you disaggregate — when you can hold "I believed X, X turned out to be wrong, and I am still a coherent person" — you demonstrate a level of psychological maturity that most people never reach.
You're tracking reality, not narrative. The person who can't be wrong is maintaining a narrative: the story of how everything they've done makes sense and was justified. That narrative management takes enormous cognitive and social resources. The person who admits error isn't managing a narrative — they're just watching what's true and reporting it. That's much cheaper and much more reliable.
You're not playing defense against the future. If you've never admitted a mistake, every future mistake becomes doubly threatening — it could reveal that your implied claim of infallibility was false. You're now in debt to your past self. Admitting early and often means you never accrue that debt. Each error stands alone. None of them are existential.
You're safe to disagree with. This is perhaps the most underrated consequence. When you've demonstrated that being wrong doesn't destroy you, people start telling you things. They bring you real information — including bad news — because they've seen that you won't shoot the messenger or deny the data to protect your ego. This is how good leaders actually find out what's happening in their organizations. The ones who can't be wrong only ever hear what their team thinks they want to hear.
---
The Political Case: Escalation Commitment and the Cost of "I Was Right"
In 1976, Barry Staw published a study called "Knee-deep in the Big Muddy" that formalized what people intuitively knew but couldn't quite name. He found that when decision-makers had personally committed to a failing course of action, they systematically allocated more resources to it than outside observers would — not because of strategic reasoning, but because reversing course required admitting they were wrong. The failing investment became a monument to their original judgment, and abandoning it meant acknowledging that judgment was flawed.
He called it escalation of commitment to a failing course of action. Everyone else calls it throwing good money after bad.
The Vietnam War is the canonical case. By 1967, enough evidence existed that the military strategy wasn't working that significant voices inside the U.S. government were saying so. But the presidents who'd committed to the war — who'd told the American public it was necessary and winnable — couldn't absorb the cost of saying they were wrong. So the war continued for six more years. 58,000 Americans and between one and three million Vietnamese people died in the gap between what the data said and what the leaders could bring themselves to acknowledge.
That gap has a name. It's called the inability to say "I was wrong." And it doesn't just live in Washington. It lives in boardrooms, in family systems, in international negotiations, in the decisions of every leader who has ever mistaken their ego for their constituents' welfare.
The leaders who changed history by reversing course — and owned the reversal publicly — are consistently remembered better than those who doubled down. Nixon going to China. De Klerk reversing apartheid policy. Churchill's post-war acceptance of welfare-state provisions he'd opposed. In each case, the admission that conditions had changed and that new information required new positions was received not as weakness but as statecraft.
The public, it turns out, is more sophisticated than politicians give it credit for. Voters know people change their minds. What they don't forgive easily is being lied to about it.
---
"Flip-Flopper" and the Weaponization of Consistency
The political attack of "flip-flopper" is one of the most destructive memes in modern democracy. It conflates two entirely different things: (1) opportunistic position-switching driven by polls and donor pressure, and (2) genuine updating based on new evidence or changed circumstances.
Those are not the same. One is cowardly. The other is the most basic function of intelligent governance.
The rhetorical genius of "flip-flopper" is that it collapses this distinction. It makes consistency the proxy for integrity, when integrity actually has nothing to do with consistency. Integrity is the alignment between your stated values and your actions. You can be entirely consistent — say the same thing for forty years — and be completely without integrity. You can change your position every time new evidence arrives and have enormous integrity.
The attack works because it's cheap. You don't have to evaluate whether the position change was warranted — you just label the change itself as the sin. And because most media has neither the incentive nor the bandwidth to adjudicate whether a position change was intellectually honest, the attack lands without scrutiny.
The antidote — and it exists — is leaders who pre-empt the attack by narrating the change themselves: "I used to believe X. Here is what changed my mind. Here is the evidence. Here is why the new position is correct." That takes the attack's ammunition. You can't hit someone with "you changed your mind" if they've already said it themselves, explained it clearly, and invited people to evaluate the reasoning.
That's not spin. That's transparency. And transparency is a power move.
---
The Personal Practice: Building the Muscle
Saying "I was wrong" at scale — in public, on significant things — requires a capacity that most people have to build deliberately, starting small.
The daily reckoning. At the end of each day, identify one thing you got wrong. Not catastrophically wrong — just something where your prediction, judgment, or behavior missed. Say it out loud. Or write it down. "I thought the meeting would go one way. I was wrong about that." This isn't self-flagellation; it's calibration. You're training yourself to locate error without it activating threat-response. The more ordinary error-recognition becomes, the less the big admissions feel like cliff-edges.
The public small one. Find an opportunity each week to say "I was wrong about that" in front of one other person. Not about a trivial thing, but not about a catastrophic thing either — medium. Notice what happens. In most cases, the reaction is the opposite of what the ego predicted. People don't pounce. They often respect you more. The data contradicts the threat-model. Over time, your nervous system updates.
Separating the admission from the solution. One reason people resist admitting error is the fear that admitting it means they have to immediately have the answer to what they should have done instead. But you don't. "I was wrong about this" and "I know exactly what the right answer is" are separate statements. You can say the first without the second. In fact, saying "I was wrong and I'm not sure yet what the right path is" is more honest and more trustworthy than any certain-sounding overcorrection.
The re-entry after a long avoidance. Some errors have been carried too long to address without significant conversation. If you've been wrong about something for years — a belief, a behavior, a relationship dynamic — the admission might need to be bigger and more deliberate. That's not a reason to avoid it. It's a reason to plan it. Consider: what needs to be named, to whom, in what setting? What acknowledgment of the duration of the error is warranted? The longer it's been, the more the admission needs to include something about the time — "and I've known this for longer than I've been willing to say."
---
The Asymmetry of Trust
There is a compounding dynamic worth naming explicitly.
Every time you refuse to say you were wrong when you were, you make a small withdrawal from the trust account. People notice. They don't always say anything — but they file it. They start calibrating for it: "He's not going to own this, so I have to add some buffer when I take his word for things." "She'll never say she was wrong, so when she's confident I have to discount it slightly because her confidence isn't tracking anything real."
Every time you say clearly "I was wrong," you make a deposit. And deposits earn better interest, because they're relatively rare. They stand out. They update people's model of you toward trustworthiness.
The person who has made ten deposits and zero withdrawals has a very different trust balance than the person who has made fifty withdrawals and zero deposits. The gap is enormous. And it compounds. The person who has spent years avoiding admissions has to spend enormous social capital managing the consequences of that avoidance — the incomplete trust, the gap between their self-perception and others' perception, the defensive energy required to maintain the position that they're never wrong.
The person who admits freely is operating with a surplus. They don't have to manage the gap, because there isn't one.
---
The World-Scale Stakes
Imagine a generation of political leaders — not imaginary saints, just ordinarily capable people — who had genuinely internalized that admitting error was a power move rather than a liability.
A president who, six months into a military engagement that isn't working, says: "The intelligence we used to justify this was wrong. The strategy is not achieving the stated goals. We are changing course." The domestic political cost is real but manageable. The lives saved are not abstract.
A trade negotiator who, three rounds into a failing agreement, says: "Our initial position on this clause was wrong. Here is what the evidence shows. Here is the revision." The other side, expecting the usual theater, gets something real instead. The negotiation moves.
A finance minister who, two years into an austerity program that is demonstrably increasing poverty without reducing debt, says: "The model we used was wrong. Here is what we missed. Here are the corrections." The policy changes. The damage that would have accumulated over three more years of commitment-escalation doesn't happen.
None of these require superhuman leaders. They require leaders who have practiced the ordinary human skill of saying "I was wrong" enough times that it doesn't feel like annihilation.
That's the world we're not living in yet. It's not because we lack the information or the intelligence. It's because we've built political cultures that punish admissions and reward false consistency. We've taught leaders that the only safe move is to never be seen to be wrong — and then we wonder why they double down into catastrophe.
The fix is not structural in the first instance. It's cultural. It starts with enough people, in enough rooms, demonstrating that you can say "I was wrong" and not only survive but gain trust. When that demonstration repeats enough times in enough places, the culture shifts. What is currently punished becomes respected. What is currently treated as weakness becomes recognized for what it is: a form of courage, and a prerequisite for good judgment.
That shift doesn't require legislation. It doesn't require new institutions. It requires the decision, person by person, to practice the thing.
---
Practical Exercises
Exercise 1: The Error Inventory
List five things you were wrong about in the last year — beliefs, predictions, judgments, decisions. Write a sentence for each: what you believed, what was actually true, and what evidence caused the update. Read the list back. This is not a shame exercise — it's a competence inventory. People who can generate this list quickly are people who are actually learning.
Exercise 2: The Out-Loud One
Choose one item from that list and tell someone about it. Not someone you need to apologize to — just someone in your life. "I thought X was going to happen. I was wrong. Turns out Y." No spin. No "but here's why it made sense at the time." Just the fact of the error and what's true. Notice the response. Notice what it feels like afterward.
Exercise 3: The Reframe Test
Next time you find yourself constructing a mental defense of a past decision you now doubt — notice it. Ask yourself: "If I weren't protecting my past self, what would I conclude from the available evidence?" The answer to that question is the honest position. The construction is the ego work. Practice noticing the difference.
Exercise 4: The Leadership Audit
Think of the leaders you most respect — in any domain. Count how many times you've heard them say "I was wrong," "I changed my mind," or "I should have done that differently." Then count the same for leaders you distrust or who made catastrophic errors on their watch. The pattern is not random.
Exercise 5: Modeling for Others
If you have children, employees, students, or anyone who learns from watching you: the next time you make a mistake in front of them, name it. Out loud. Clearly. "I got that wrong. Here's what I'm changing." You are not just modeling fallibility — you are modeling that fallibility is survivable and correctable. That is one of the most important things a person in a position of influence can teach.
---
If every leader on earth practiced this — if "I was wrong" stopped being political suicide and started being the mark of a serious person — we would not end all conflict. But we would end the category of conflict that exists solely because someone couldn't bring themselves to say three words. That category is larger than most people want to admit.