The Difference Between Opinions, Beliefs, and Knowledge
The Vocabulary We're Missing
Language shapes thought. When your vocabulary doesn't distinguish between types of epistemic claim, your thinking can't make those distinctions either.
Most common language treats "I think," "I believe," "I know," and "I feel" as rough synonyms. People deploy them based on emotional emphasis rather than epistemic precision. "I know the economy is going to crash" means something quite different from "I believe the economy shows signs of stress" — but in everyday speech they're indistinguishable in practice.
The result is that conversations about contested claims become strange: people with very different levels of justification for their positions use identical confidence language, making it impossible to sort stronger from weaker claims based on how they're expressed. Everyone sounds certain. Nobody can find the common ground of shared evidence because nobody is being precise about what kind of claim they're making.
Building clearer epistemic vocabulary — genuinely using different language for opinion, belief, and knowledge — changes the texture of reasoning. You start being able to say things like "I'm expressing a preference here, not a claim about what's true" and mean it distinctly.
Opinions: The Underrated Category
Opinions are underrated. This sounds counterintuitive in a culture that seems to have an excess of opinion, but hear me out.
The problem isn't that people have opinions. The problem is that people don't know they're having opinions — they mistake them for knowledge. The person who says "that policy is disastrous" usually doesn't experience themselves as expressing a preference shaped by their values, their risk tolerance, their particular circumstances, and their aesthetic sense of how society should be organized. They experience themselves as reporting a fact.
Properly understood, opinions are legitimate. You're allowed to have preferences about music, food, governance styles, relationship structures, career paths, architectural aesthetics. These preferences can be informed, they can be sophisticated, they can be better or worse — but they're primarily calibrated to what works for you, not what's objectively true.
The liberating thing about treating opinions as opinions is that you can hold them without needing to win the argument. If someone else disagrees about what makes a good novel, or a good government, or a good city, they're not wrong in the way they'd be wrong about a factual claim — they're differently positioned, with different values and experiences.
This doesn't collapse into relativism. Some opinions are more considered, more internally consistent, more grounded in broad experience. But the mode of engagement with opinions is different from beliefs: you share them, you listen to others', you update through new experience, and you don't need your opponent to concede.
Beliefs: The Contested Middle Ground
Beliefs are claims about how the world is, held with varying degrees of justification, and invested with varying amounts of identity and emotion.
The core challenge with beliefs is calibration: does the degree of conviction match the evidence? Most people's beliefs are miscalibrated — held with more certainty than the evidence warrants, and updated more slowly and less completely than rational updating would require.
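To make "calibration" concrete: one standard formalization, which the article doesn't invoke but which maps cleanly onto the idea, treats degree of conviction as a probability and uses Bayes' rule to say how far a given piece of evidence should move it. A minimal sketch, with made-up numbers:

```python
# A minimal sketch of calibrated belief updating via Bayes' rule.
# This is an illustration of the idea, not the author's formalism:
# we model "degree of conviction" as a probability and ask how much
# a piece of evidence should move it.

def bayes_update(prior: float, p_evidence_if_true: float,
                 p_evidence_if_false: float) -> float:
    """Return P(belief | evidence) given a prior and two likelihoods."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Hypothetical numbers: a belief held at 70% confidence meets evidence
# that is twice as likely to appear if the belief is false as if it is true.
prior = 0.70
posterior = bayes_update(prior, p_evidence_if_true=0.2, p_evidence_if_false=0.4)
print(f"before: {prior:.0%}  after: {posterior:.0%}")  # before: 70%  after: 54%
```

The point of the exercise isn't the arithmetic; it's that "how much should this move me?" has a disciplined answer, and gut-level updating usually moves less than the answer says it should.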
The research on belief formation is sobering. Studies on confirmation bias show that people actively seek out information that supports existing beliefs and discount or ignore information that challenges them. Studies on belief perseverance show that beliefs can survive the complete destruction of the evidence that originally created them — people who are told "the study you were shown was false" often continue to hold the belief the study generated.
What makes a belief well-founded?
Justification: there is evidence for the belief, and you can state what that evidence is. Not "everyone knows" or "it seems obvious" — actual evidence.
Proportionality: the degree of confidence matches the strength of the evidence. High-quality, replicated, peer-reviewed evidence justifies high confidence. A single anecdote, one article, secondhand information — much lower confidence.
Revisability: you know what would change your mind. This is the key test. If you can't specify any evidence that would cause you to revise the belief, it's not functioning as a belief about reality — it's functioning as an identity marker. Identity-beliefs are especially resistant to updating because challenging them isn't experienced as a factual correction but as an attack on the self.
Acknowledging uncertainty: a well-calibrated believer knows the difference between what they believe strongly (highly justified) and what they believe tentatively (weakly justified) and represents each accordingly.
The Classical Definition of Knowledge
The classical philosophical definition of knowledge is "justified true belief." Three conditions must be met:
1. It's true: the thing you know is actually the case in reality
2. You believe it: you actually hold it as true (not just performing belief)
3. You're justified in believing it: you have good reasons, not just lucky guessing
This definition held reasonably well until 1963, when Edmund Gettier published a three-page paper that produced counterexamples. The simplest of the now-famous "Gettier cases" (this particular version comes from the later literature, not Gettier's own paper): suppose you see what looks like a sheep in a field and form the justified true belief "there is a sheep in the field." Unknown to you, what you saw was actually a large white dog — but there is, in fact, a sheep hiding behind a rock. You have a justified true belief (there IS a sheep in the field), but you don't seem to have knowledge because your justification wasn't tracking the actual sheep.
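The structure of the case is easier to see spelled out. A toy encoding (the dataclass and predicate here are purely illustrative, not anything from the philosophical literature) shows that the sheep case satisfies all three conditions even though intuition denies it the status of knowledge:

```python
# A toy encoding of the classical "justified true belief" analysis,
# used only to make the Gettier counterexample's structure explicit.

from dataclasses import dataclass

@dataclass
class Claim:
    proposition: str
    is_true: bool        # condition 1: the proposition holds in reality
    is_believed: bool    # condition 2: the subject actually holds it as true
    is_justified: bool   # condition 3: the subject has good reasons

def jtb_says_knowledge(c: Claim) -> bool:
    return c.is_true and c.is_believed and c.is_justified

# Gettier-style sheep case: true (a sheep hides behind the rock),
# believed, and justified (the dog looked like a sheep). Yet the
# justification never tracked the actual sheep.
sheep_case = Claim(
    proposition="there is a sheep in the field",
    is_true=True,
    is_believed=True,
    is_justified=True,
)

print(jtb_says_knowledge(sheep_case))  # True, but intuitively not knowledge
```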
Philosophers have spent decades patching the definition, adding conditions about causal connection, "no false lemmas," sensitivity to truth, and more. No fully satisfying solution has emerged.
What this tells us practically: even our best attempts to define knowledge reveal its messiness. "Knowing" is not a clean binary. It's a complex relationship between a mind and a fact, mediated by evidence and reasoning processes that can be more or less reliable.
This should produce epistemic humility — not paralysis, not relativism, but a clear-eyed appreciation that even things we "know" could turn out to require revision.
The History of Things We "Knew"
The history of science is substantially the history of things we knew that turned out to be wrong, or right but for the wrong reasons, or right in limited domains but not universally.
We knew the Sun and the planets revolved around a stationary Earth. We knew diseases were caused by bad air. We knew stomach ulcers were caused by stress. We knew the continents were fixed. We knew sleep was primarily about rest rather than active neural processing.
These weren't fringe beliefs. They were held by educated, intelligent, well-informed people who had excellent reasons for their confidence given the evidence available to them. The lesson is not "don't trust science." The lesson is: even well-founded knowledge is provisional. Better evidence can overturn the best current understanding. This is a feature, not a bug.
The person who understands this history holds knowledge with a particular quality of confidence: firm enough to act on, humble enough to revise. Not the paralytic "I can't know anything" and not the defensive "this is settled, don't question it." Something in between.
Practical Tools for Better Epistemic Hygiene
1. Label your claims when you make them.
"My opinion is that this is a poor design" vs. "I believe our customers prefer simpler interfaces — here's the data" vs. "We know from our A/B testing that simpler interface drove 12% higher conversion." Three very different claims. Label them.
2. Ask what kind of claim you're making before arguing.
When you feel strongly about something, pause and ask: is this opinion, belief, or knowledge? For each:
- Opinion: who does it affect if others disagree? Probably nobody. Share and move on.
- Belief: what's your evidence? What would change your mind? Is your confidence calibrated to the evidence?
- Knowledge: what's the quality of your evidence? Have you checked for disconfirming evidence? Are you holding it with appropriate humility?
3. Practice the update.
When you encounter evidence against a belief, notice your response. Knee-jerk dismissal is the signal that the belief is functioning as identity, not as hypothesis. The practice is to say "that's interesting" and actually engage with whether the evidence is relevant. This is hard. It gets easier.
4. Distinguish primary from secondary sources.
Much of what people "know" is secondhand — a summary of a summary of a study, a media account of a research finding, a friend's report of an expert's opinion. Each step in that chain introduces error. The further you are from the primary source, the lower your justified confidence should be.
5. Use confidence language precisely.
"I know" should be reserved for things you have strong, direct evidence of. "I believe" for things you think are true but acknowledge are fallible. "I think" or "in my experience" for opinions. "I'm not sure but" for weak beliefs. This vocabulary discipline is not hedging — it's precision.
Why This Matters at Scale
When opinion is treated as knowledge, debate becomes pointless — two "facts" colliding rather than perspectives in dialogue. When belief is treated as knowledge, updating becomes impossible — you'd need to abandon "fact" to change your mind. When knowledge itself isn't held with appropriate humility, science becomes dogma rather than method.
A society where people are clear about what kind of claim they're making is a society where:
- Genuine disagreement (different values generating different opinions) can be recognized and respected
- Factual disputes can be resolved by going to evidence rather than rhetoric
- New knowledge can be integrated without existential threat to identity
None of this is trivial. The clarity of a thousand individual minds about what kind of claim they're making aggregates into something that looks like a more functional epistemic commons — a shared space where evidence and reasoning can do their work.
That shared space is hard to build and easy to destroy. It starts with each person being honest with themselves about whether what they're expressing is opinion, belief, or knowledge — and adjusting their confidence, their grip, and their openness to revision accordingly.