
Accepting Your Error Rate

Most people underestimate how often they are wrong. Not only is there a common human tendency to overestimate one’s own abilities, but beliefs have the property that they feel right to us when we focus on them. So even if we admit that we likely have a number of false beliefs, it’s easy to go on acting as though each of our individual beliefs is beyond serious doubt. Worse still, it’s disturbing to discover we’ve been mistaken about something important – especially when we’ve wasted time or effort because of the belief, or expressed the belief in front of others. As a result, we try to avoid that psychological discomfort and to save face. But there is a healthier way to think about wrongness: recognizing that we have an error rate.

Since we have to assume that we will be wrong sometimes, we can think of ourselves as having a frequency with which things we say are actually false (or, if we’re thinking probabilistically, a rate at which we assign high probabilities to false propositions). It may be helpful to think of this personal error rate as being context-specific: we make errors more frequently when discussing philosophy than when remarking on the weather, for instance. But if you wanted a single overall rate, you could define it, for example, as the fraction of the last 1000 claims you made that actually were not true (or were not even very nearly true). This rate will be different from, but generally quite predictive of, the fraction of your next 1000 claims that will be wrong.
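To make that definition concrete, here is a minimal sketch, in Python, of what keeping such a tally could look like. The article doesn’t prescribe any particular bookkeeping; the ErrorRateTracker class, its method names, and the 1000-claim window are illustrative assumptions, not a recommended method.

```python
from collections import deque

class ErrorRateTracker:
    """Toy bookkeeping for a personal error rate (illustrative only).

    Keeps the most recent claims (default: the last 1000, matching the
    example in the text) and reports the fraction that turned out false.
    """

    def __init__(self, window=1000):
        self.claims = deque(maxlen=window)  # (context, was_wrong) pairs

    def record(self, context, was_wrong):
        """Log one claim and whether it later proved wrong."""
        self.claims.append((context, was_wrong))

    def overall_rate(self):
        """Fraction of recorded claims that turned out to be wrong."""
        if not self.claims:
            return 0.0
        return sum(wrong for _, wrong in self.claims) / len(self.claims)

    def rate_for(self, context):
        """Context-specific rate, e.g. 'philosophy' vs. 'weather'."""
        matching = [wrong for ctx, wrong in self.claims if ctx == context]
        return sum(matching) / len(matching) if matching else None


tracker = ErrorRateTracker()
tracker.record("weather", was_wrong=False)
tracker.record("philosophy", was_wrong=True)
tracker.record("philosophy", was_wrong=False)
print(tracker.overall_rate())          # 0.333... across all claims
print(tracker.rate_for("philosophy"))  # 0.5 for philosophy claims alone
```

The context-specific rates mirror the point above: the same person might score very differently on philosophy claims than on remarks about the weather.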

Our error rate is connected to the chance that any one of our individual beliefs will be wrong, though we obviously should be much more confident in some of our beliefs than others. When evaluating the probability of a particular belief being right, there are a variety of indicators to look at. For example, we should be more skeptical of one of our beliefs if a large percentage of smart people with relevant knowledge dispute it, or if we have a strong incentive (financial or otherwise) to believe it, or if we can’t discuss the belief without feeling emotional.

Once we fully accept the fact that we have an error rate, we can think about wrongness in a new light: we can expect to be wrong with regularity, especially when reasoning about complex subjects. Once we start expecting to be wrong, it is no longer as disturbing to find that we are wrong in a particular case. These errors merely confirm our own predictions: we were right that our being wrong is a common occurrence. That way, being wrong doesn’t have to be so frightening.

Estimating our actual error rate is hard, in part because we’re wrong much more often than we notice. But nonetheless, we can benefit psychologically from remembering that we have an error rate, even if we don’t know what that rate is.

If you feel like you’re almost never wrong, you may be experiencing a serious problem: it is far more likely that you are wrong fairly regularly and are simply bad at processing that fact than it is that you’re really right all the time. Put another way: failure to detect your own wrongness doesn’t imply you’re right; it indicates you’re very likely incorrect about how often you’re wrong.

When you thoroughly accept the fact that you’re wrong with a certain error rate, it becomes easier to convert fear of being wrong into curiosity about when and how often you miss the mark. Whereas seeking out your reasoning failures may have scared you before, it may now seem dangerous to not seek them out: you already know that you’re going to be repeatedly wrong, so the responsible thing is to figure out when that wrongness is occurring.

Yet another advantage of thinking about your error rate is that it naturally leads to thinking about how to reduce this rate. This can be done by learning to employ more reliable procedures for forming beliefs and using these procedures to check what you previously believed to be true.
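As one illustration of such a procedure, the sketch below applies Bayes’ theorem (the rule taught by one of the programs listed next) to revise confidence in a belief after learning that most knowledgeable people dispute it. This is only a hypothetical example: the bayes_update function and every probability in it are invented for illustration, not drawn from the article.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(belief | evidence) via Bayes' theorem.

    prior                -- P(belief) before seeing the evidence
    p_evidence_if_true   -- P(evidence | belief is true)
    p_evidence_if_false  -- P(evidence | belief is false)
    """
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Hypothetical numbers: you were 90% sure of a claim, then learned that
# most well-informed people dispute it -- evidence you'd expect only
# 20% of the time if you were right, but 70% of the time if you were wrong.
print(bayes_update(0.90, 0.20, 0.70))  # ~0.72: still confident, but less so
```

The point of the exercise is the direction of the change: evidence that is more likely when you are wrong than when you are right should lower your confidence, even if it doesn’t flip the belief outright.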

ClearerThinking has produced a number of programs that touch on this subject, including:

  • The Belief Challenger: Use thought exercises to re-examine your deepest beliefs.

  • The Political Bias Test: Learn to spot and overcome your innate political biases.

  • Understanding Bayes' Theorem: Learn the mathematical rule that governs the way evidence functions.

  • The Explanation Freeze: Discover a simple trick that’ll help you deal better with uncertain situations.

Remember: you too have an error rate. You don’t need to fear being wrong. Instead, you should expect it.

(A version of this article was originally published on SpencerGreenberg.com.)
