How to Find Your False Beliefs (Without Creating a Paradox)
- Spencer Greenberg

Short of time? Read the key takeaways
❓ You can be rationally confident in many beliefs while expecting some to be wrong. The Preface Paradox highlights that believing each claim is true does not require believing that all your claims are true at the same time.
🧠 Belief is not binary but graded. When you say you believe something, you usually mean you have high confidence, not certainty. Treating beliefs as degrees of confidence dissolves apparent contradictions and better reflects how evidence actually works.
🔍 Certain features make beliefs more likely to be mistaken. Incentives, emotional attachment, unclear definitions, complex arguments, and many interacting variables each raise the chance of error and justify revisiting how confident you are.
📉 Good epistemic hygiene means adjusting confidence, not flipping beliefs. By regularly testing important beliefs and recalibrating how sure you are, you can reduce false beliefs without falling into paralysis or self-contradiction.
There's a paradox from philosophy that seems to have profound consequences for our ability to be rational. It is known as The Preface Paradox, and it goes like this.
Elise is an author who has written a non-fiction book. She has meticulously researched every claim she makes in the book, and only includes claims she rationally believes are true. That means she rationally believes the first claim she makes is true, as she does the second claim, and the third, and so on. Thus, she rationally believes all the claims in the book are true. And yet, since this is not her first book, and she has witnessed the careers of her contemporaries, experience tells her that it is not plausible to publish a book of this size (with so many factual claims and interesting avenues of research!) and get everything right. So, in the preface, she writes:
I wish to thank everyone who gave valuable feedback and criticisms. The errors this book undoubtedly contains are not their fault and are present in spite of their wise counsel.
She writes this in the preface because, from experience and epistemic humility, she rationally believes not all the claims in the book are true. Since both the belief that all the claims are true and the belief that not all the claims are true appear to be rationally justified, it thus appears that the author can be rationally justified in holding contradictory beliefs!
This might seem absurd. After all, the law of non-contradiction (i.e., that it is not possible for any proposition to be both true and not true) has been considered one of the most fundamental laws of the universe since at least Aristotle, who called it “the most certain of all principles” and wrote: “it is impossible for any one to believe the same thing to be and not to be.”
But, somehow, the Preface Paradox seems to throw a wrench into that classic outlook on contradictory beliefs by telling us that not only can you believe contradictory claims, but you can be rationally justified in doing so.
We’ll tell you our preferred solution to the Preface Paradox at the end of this article. But first, we’re going to give you tools for dealing with the Preface Paradox in your own life, where it is, unfortunately, very likely to show up.
We humans are all just like the author in the paradox. By definition, we believe that each of our beliefs is true. And yet, simultaneously, we must admit that some of our beliefs must be wrong. We can’t possibly have gotten absolutely everything right - especially on complex topics where we have beliefs that go beyond our expertise or where most people disagree with us. Almost all of us have many such beliefs. So, rationally, we have to accept that we are likely wrong about many things. The trouble, though, is that we don’t know which of our many beliefs are wrong. If we knew that, we would have stopped believing them already.
So, before we offer you a solution to the paradox, here are 10 tests you can apply to any belief you hold, to help you evaluate its likelihood of being false. Applying these methods can help you reduce your error rate on topics that are important to you.
Reasons To Think Your Belief Might Be Wrong
None of the tests below will tell you by itself whether the belief in question is false, but they can each (separately and together) give you reasons to lower your confidence. Good epistemic hygiene involves applying tests like these to your important beliefs and adjusting your confidence in them accordingly.
We recommend picking an important belief of yours right now that many people disagree with you about. Then you can try applying some of these tests to that belief as you go through the rest of the article.
1. Many smart, knowledgeable people disagree with you
If you discover that your belief is something that many smart, knowledgeable people claim is false, then that (by itself) is some reason to consider reducing your confidence in it. Of course, the mere fact that people disagree with you does not tell you who is right or wrong, but the disagreement of smart, knowledgeable people provides evidence that you should be cautious about your level of confidence. Perhaps they spotted something you missed.
This raises a related issue that it is good to be careful about. When beliefs are highly polarized or tied to certain groups (e.g., being ‘pro-choice’ is associated with being politically progressive), it is common to assume by default that those who disagree with you on that issue are not smart or knowledgeable. You might think they must be stupid and uninformed precisely because they disagree with you. In reality, there are often smart and knowledgeable people on most sides of most important issues, and you will likely form more true beliefs if you resist the urge to dismiss those who disagree with you simply because they disagree, rather than on the merits of their arguments.

2. You have an incentive to believe it
When we have an incentive to think a certain way (e.g., financial benefit or fitting in socially), we are less likely to seek out or listen to evidence that contradicts this way of thinking. Through a process called ‘motivated reasoning’, we may even unwittingly marshal our argumentative skills to misinterpret evidence against our belief as actually being evidence for it!
For example, if you benefit socially from belonging to a political group, you might find yourself uncritically accepting information that flatters your group while more heavily scrutinizing or mocking the same kinds of claims from other political groups. Your reasoning processes can even feel objective and fair, despite actually being biased towards a conclusion that you are incentivized to maintain.
3. You would be psychologically disturbed to learn that the belief is not true
Our minds tend to veer away from thoughts that disturb us, making it less likely that we believe them, even when they are true. For example, if someone believes that their monogamous romantic partner does not fantasize about anyone else, they might be right, but they might also be wrong - and the discomfort of that latter possibility might make them avoid scrutinizing or evaluating the belief too closely. Of course, this may not be something you want to question - questioning it may be unhelpful. But if evidence starts to mount and it's important to you to know the truth, then the fact that you'd be disturbed by a certain answer can act as a barrier to seeing the truth.
4. You originally came to believe it for reasons that don’t have much to do with careful thinking or the weighing of evidence
Many of our beliefs are acquired long before we ever reflect on them. Maybe they’re the results of childhood experiences, habits picked up from parents or peers, emotional reactions, or one-off events. When a belief is formed in ways like these, it might be persisting simply because we haven’t yet given it careful (or perhaps any) scrutiny.
For example, if you grew up with a parent who never let you pet dogs in the street, you might come to believe that petting dogs is generally dangerous. That belief could then persist all the way into adulthood, simply because it has never been questioned carefully.
5. Your argument as to why your belief is true is long and complex
When our arguments are long and complex, it is more likely that we have made an error at some point in our thinking. That doesn't make long, complex arguments automatically wrong - they are just easier to get wrong.
For example, imagine you believe that a convicted criminal is innocent, because when you evaluate the twelve compelling-seeming pieces of evidence presented against her, you find that they each don’t hold up. In this case, the argument for innocence has many moving parts, and this increases the chance that you went wrong somewhere in your reasoning. If you were wrong about one of those pieces of evidence, it could have caused you to come to the wrong conclusion overall.
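To see how quickly errors can compound, here is a minimal sketch of that arithmetic. The 95% per-judgment reliability is a figure we picked purely for illustration, not a measurement:

```python
# Illustrative arithmetic: when an argument rests on many independent
# judgments, the chance that ALL of them are correct shrinks multiplicatively.
# The 0.95 per-judgment reliability below is an assumed figure, not a measurement.

def chance_all_correct(reliability_per_judgment: float, num_judgments: int) -> float:
    """Probability that every judgment in the chain is right, assuming independence."""
    return reliability_per_judgment ** num_judgments

# Twelve pieces of evidence, each evaluated correctly 95% of the time:
print(f"{chance_all_correct(0.95, 12):.0%}")  # ~54%: close to a coin flip
```

Even with individually quite reliable judgments, a twelve-part argument goes wrong somewhere almost half the time.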
6. There are lots of possible outcomes, and your belief is that just one of them will occur
Typically (though not always), the more possible outcomes there are, the less likely it is that any particular one of them will occur. For example, when you think a particular candidate in a large field of candidates will beat out all the others, the chances that you are right will (all other things being equal) decline the more competitors there are. So, if there are 20 different views on a topic that are plausible (and all other things are equal), you're less likely to have the right view than if there are only 2 plausible views (where even just a guess gives you a 50/50 chance).
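As a rough sketch of that baseline arithmetic (assuming, for simplicity, that every outcome is equally plausible before you weigh further evidence):

```python
# Baseline chance of backing the right outcome under a uniform prior:
# with n equally plausible possibilities, a guess is right 1 time in n.

def baseline_chance(num_plausible_outcomes: int) -> float:
    return 1 / num_plausible_outcomes

print(f"{baseline_chance(2):.0%}")   # 50% - a guess is a coin flip
print(f"{baseline_chance(20):.0%}")  # 5% - a guess is right 1 time in 20
```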
7. A large number of factors influence whether your belief will end up being true
Some things are very simple to predict. If I drop a brittle glass on a hard floor from a decent height, then I can be fairly confident that it will shatter. The mechanism is well known, and there are not many variables involved. However, when many factors influence whether or how something happens, it is really hard to be sure that you have properly taken into account all of the important ones.
For example, if you’re convinced that GDP growth will decline over the next year, then your belief depends on a great many variables interacting according to complex laws.
8. You don’t understand the arguments of those who disagree with you, or see how they could believe what they believe
When you don’t understand how an intelligent person could hold a contrary opinion, that is often an indicator that you have mainly engaged with one side of an issue, and so are less likely to have really weighed the strength of arguments on all sides.
For example, one of the authors of this article had an old philosophy professor who used to say, “When someone says ‘obviously’ or ‘clearly’ in an argument, take a moment to check whether they’ve made some big assumptions.” The same caution applies to your own beliefs: if you find yourself thinking that a claim is obviously true (such as that a fetus obviously is or is not a person, or that capitalism is or isn't a good system) and yet you struggle to understand why thoughtful people disagree with you, that difficulty is itself evidence that your evaluation of the evidence may be incomplete.
9. You can’t clearly explain what your belief means
In some ways, this point is the reverse of the previous one. Many scientists and philosophers, from Feynman to Dennett to Singer and beyond, have said something along the lines of “If you can’t explain it clearly, then you don’t really understand it.” A corollary of this is that, when we find it hard to explain what we mean by one of our beliefs, this is some evidence that the belief has not been arrived at by processes that require understanding, such as carefully considering the evidence and making a decision based on it. It may instead be the case that we have merely become attached to an idea or intuition for other reasons.
For instance, you might believe that humans have free will, but struggle to articulate what you mean by ‘free will’ and how it differs from related ideas like randomness or determinism. You wouldn’t be alone in struggling that way, but that lack of clarity can be a reason to lower confidence in your belief.
10. You become emotional when people disagree with you about a belief
Of course, it can be upsetting when people disagree with our important beliefs - whether those beliefs are true or false. And we may have good reason to be bothered by their perspective. The problem here is that strong emotions can put us in a reactive mode where we’re more invested in ‘defending’ our view or ‘defeating’ the opposing view than in finding the truth.
For example, perhaps you think that insurance companies should not cap health expenditures for illnesses that are usually terminal, and you become upset when challenged on this issue. The emotional reaction may well be completely understandable and valid based on your life experiences (and it does not mean you are wrong), but it does indicate that you might have slipped into a mode of discussion driven by incentives other than arriving at true beliefs - so your confidence in your belief should be treated with some extra caution.
How To Solve The Preface Paradox
Let’s return to the paradox from the start of this article. Here’s a quick reminder: the paradox arises out of a situation that seems to show it can be rationally justifiable for an author to believe both that:
Each of the claims in her book is true
Not all the claims in her book are true
Since these two statements appear to contradict each other, the paradox appears to show that one can be rationally justified in believing a contradiction. So, how can the paradox be defused?
There are lots of suggestions in the academic literature, and arguments about this can get quite technical. Since we don’t want to get too lost in the weeds, we’re just going to focus on our preferred solution. Then we’ll tell you how it applies to your own life too. Here it is:
Belief isn’t binary. It’s not the case that you always either totally believe something or totally do not. Instead, when people say they believe something, they’re speaking imprecisely; if you pressed them, they’d often admit that they aren’t 100% certain of the belief. Saying “I believe it” is like saying “I have a high degree of confidence in it” (how high can vary, and there is no objective threshold of confidence above which something counts as ‘a belief’). If the author in the Preface Paradox were explicit about this, stating her levels of confidence, and if those confidence levels were appropriately ‘calibrated’ based on the available evidence (meaning, for instance, that of the claims she holds with 99% confidence, about 99% turn out to be true), we’d see the paradox disappear.
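As an aside, here is a minimal sketch of what checking your own calibration could look like. The prediction records below are invented purely for illustration:

```python
# A minimal sketch of checking calibration: group past claims by the
# confidence you assigned them, then compare that stated confidence to the
# actual hit rate. The records below are invented purely for illustration.
from collections import defaultdict

# (stated confidence, whether the claim turned out to be true)
records = [
    (0.9, True), (0.9, True), (0.9, False), (0.9, True), (0.9, True),
    (0.6, True), (0.6, False), (0.6, True), (0.6, False), (0.6, False),
]

outcomes_by_confidence = defaultdict(list)
for confidence, was_true in records:
    outcomes_by_confidence[confidence].append(was_true)

for confidence, outcomes in sorted(outcomes_by_confidence.items()):
    hit_rate = sum(outcomes) / len(outcomes)
    print(f"stated {confidence:.0%} -> true {hit_rate:.0%} of the time")
```

With this made-up data, the output shows overconfidence at both levels: claims held at 60% came true 40% of the time, and claims held at 90% came true 80% of the time.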
For example, imagine the author has done such thorough research that she believes each individual claim in her book with 99% confidence (equivalent to assigning it a 0.99 probability). Now imagine there are 100 claims in the book. What should the author’s confidence be in the full combination of all 100 claims - i.e., in the belief that all 100 of them are true? Well, to combine the probabilities of independent claims, you multiply them together. So, this means we need to multiply 100 instances of 0.99 together - which is the same as raising 0.99 to the 100th power (0.99^100). But that is only about 0.37. So, if the author rationally believes each individual claim with 99% confidence, then she is rationally justified in believing the combination of all 100 claims with only about 37% confidence (the short sketch after this list checks the arithmetic). Thus, she is rationally justified in the following:
For each of the claims in the book, having 99% confidence that it is true (which, when speaking imprecisely, we might say just means believing each of the claims is true).
Having only 37% confidence that, together, all of the claims in the book are true (which, when speaking imprecisely, we might say just means not believing that all the claims are true).
Each individual claim is very likely correct, and yet, it's quite likely that not all the claims are correct.
Since these do not contradict each other, the paradox is resolved!
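If you’d like to verify the arithmetic yourself, here is the calculation from the example (assuming, as the multiplication step requires, that the 100 claims are independent):

```python
# The Preface Paradox arithmetic: high confidence in each claim is compatible
# with low confidence in the conjunction of all the claims.

per_claim_confidence = 0.99
num_claims = 100

all_claims_true = per_claim_confidence ** num_claims
print(f"Confidence in any single claim: {per_claim_confidence:.0%}")        # 99%
print(f"Confidence that all {num_claims} claims are true: {all_claims_true:.0%}")  # ~37%
print(f"Confidence that at least one is wrong: {1 - all_claims_true:.0%}")  # ~63%
```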
Why should you care?
Remember: the Preface Paradox shows up in our own lives, too. By definition, you believe each of your beliefs is true, and yet you also believe far too many things to be certain you have no false beliefs. So, it would be reasonable to also believe that not all your beliefs are true.
In the face of these facts, if you want to avoid having inconsistent, contradictory beliefs like the ones in the Preface Paradox, we recommend embracing a non-binary view of belief. Instead of thinking of all claims as things you totally believe or totally do not, think of them as things you have varying degrees of confidence in.
How confident should you be in each of your beliefs? Part of the answer comes from the 10 properties of beliefs listed above. If you want to root out your false beliefs, or engage in critical thinking practices that will tend to give you more true beliefs, you can test a given belief against those 10 properties and adjust your confidence downward when you discover that it has any of them.
If you found the topics in this newsletter interesting, you might enjoy our Nuanced Thinking mini-course. It’s free and takes about 20 minutes to teach you about 3 common binary thinking traps and the nuanced thinking techniques you can use to combat them.



