
10 Times Scientists Admitted They Were Wrong, and What You Can Learn from Them


Guest post by Hashem Elassad. Edited by Travis.



Santiago Ramón y Cajal, the Nobel Prize recipient and father of modern neuroscience whose work was integral to our understanding of the neuron, did not credit his success to intelligence. He pointed out that very intelligent people, like everyone else, are prone to bias. Instead, he credited his success to being able to admit errors and change his mind.


All of us make mistakes. By being willing to admit them, we enable ourselves to improve. By doing so publicly, we model to others that making mistakes is okay. And, perhaps paradoxically, by admitting our mistakes we can often build trust, whereas adamantly denying them shows that we are not willing to learn and grow.


If we can build a healthy culture of admitting mistakes, then perhaps it will become easier for each of us to admit our own. To that end, here are 10 great cases of scientists and other experts publicly admitting their mistakes.


1. A chapter is undermined in a best-selling book


Daniel Kahneman was a recipient of the closest thing to a Nobel Prize in economics (the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel), and he revolutionized our understanding of the mind. His critically acclaimed book Thinking, Fast and Slow contains a great deal of insight, but it has not been immune to objections. The chapter about "social priming" (the idea that behavior can be affected by subtle cues that you’ve been exposed to) contains the following emphatic endorsement of the effect:


[D]isbelief is not an option. The results are not made up, nor are they statistical flukes. You have no choice but to accept that the major conclusions of these studies are true. More important, you must accept that they are true about you.

But this endorsement ended up drawing careful criticism from other researchers in a blog post. Kahneman’s response to this criticism was remarkable in that it freely admitted error and spoke candidly about how that error arose. For example, here is a snippet: 


I knew, of course, that the results of priming studies were based on small samples, that the effect sizes were perhaps implausibly large, and that no single study was conclusive on its own. What impressed me was the unanimity and coherence of the results reported by many laboratories. I concluded that priming effects are easy for skilled experimenters to induce, and that they are robust. However, I now understand that my reasoning was flawed and that I should have known better. [...] Clearly, the experimental evidence for the ideas I presented in that chapter was significantly weaker than I believed when I wrote it. This was simply an error: I knew all I needed to know to moderate my enthusiasm for the surprising and elegant findings that I cited, but I did not think it through.

Following the controversy, Kahneman went a step further by writing an open letter to the research community engaged in priming studies. In this letter, he urged them to conduct replication studies to address the shaky evidence base on which some of the field's findings relied. He expressed concern that a failure to openly address these issues would undermine the credibility of the field as a whole, and encouraged researchers to take doubts seriously, not shying away from them: 


To deal effectively with the doubts you should acknowledge their existence and confront them straight on, because a posture of defiant denial is self-defeating.

Kahneman's actions serve as an exemplar of how we can all deal with criticism constructively and advance our fields or our lives by embracing, rather than shying away from, the identification and correction of errors. His call for replication and his candor about his own errors reflect an understanding that science (and reasoning more generally) is a progressive endeavor, improved not just through breakthroughs but also through the correction of mistakes.



2. A Launch Integration Manager at NASA takes blame for the Columbia Shuttle Disaster


On February 1st, 2003, the NASA Space Shuttle Columbia disintegrated upon re-entering the Earth's atmosphere, killing all seven astronauts aboard. The tragedy was caused by a piece of foam insulation breaking off the shuttle's external tank during launch and hitting the left wing, damaging the thermal protection system. This damage allowed hot gases to penetrate the wing upon re-entry, leading to the shuttle's destruction.


A public apology is not unusual after a disaster like this. What is refreshing is how fully the Launch Integration Manager, Wayne Hale, took responsibility: no excuses given, no attempt to shift blame onto others (even though others were also at fault). This extract is from an email he sent to thousands at NASA after he heard from an employee that none of the top managers had taken responsibility:


I cannot speak for others but let me set my record straight: I am at fault. If you need a scapegoat, start with me. I had the opportunity and the information and I failed to make use of it. I don't know what an inquest or a court of law would say, but I stand condemned in the court of my own conscience to be guilty of not preventing the Columbia disaster. We could discuss the particulars: inattention, incompetence, distraction, lack of conviction, lack of understanding, a lack of backbone, laziness. The bottom line is that I failed to understand what I was being told; I failed to stand up and be counted. Therefore look no further; I am guilty of allowing Columbia to crash.

It is admirable that Hale was willing to do this. Even though he was not solely responsible for the event, he took responsibility for his role in it.


Admitting mistakes when nobody else will can be a daunting but deeply impactful action. It requires courage and integrity, especially in environments where the culture may not support such openness. Doing so can be a powerful way to lead by example, which can gradually help to shift the cultural norms within an organization or group towards greater transparency and accountability.


For a more in-depth account of the Columbia disaster, check out Hale’s blog series After Ten Years.


3. Sharing stories of medical mistakes


Brian Goldman has experienced, first-hand, the damage and shame of a culture that denies mistakes and expects perfection. Here is a snippet from his TED Talk, in which he discusses his experiences in the medical profession:


And over the course of the next hour and a half or two, she started to feel better. And I felt really good. And that's when I made my first mistake; I sent her home. I made two more mistakes. I sent her home without speaking to my attending. I didn't pick up the phone and do what I was supposed to do, which was call my attending and run the story by him so he would have a chance to see her for himself. And he knew her; he would have been able to furnish additional information about her. Maybe I did it for a good reason. Maybe I didn't want to be a high-maintenance resident. Maybe I wanted to be so successful and so able to take responsibility that I would do so and I would be able to take care of my attending patients without even having to contact him. The second mistake that I made was worse. In sending her home, I disregarded a little voice deep down inside that was trying to tell me, ‘Goldman, not a good idea. Don't do this.’ In fact, so lacking in confidence was I that I actually asked the nurse who was looking after Mrs. Drucker, "Do you think it's okay if she goes home?" And the nurse thought about it and said very matter-of-factly, ‘Yeah, I think she'll do okay…."

Unfortunately, Mrs. Drucker passed away as a result of those mistakes.


Goldman’s talk shows how the fear of admitting mistakes not only stifles learning but also perpetuates a cycle of concealment and repeated errors, which can have devastating consequences.


4. A physicist announces at a conference that a “discovered” planet is not real


In the middle of preparing for an astronomy conference, physicist Andrew Lyne discovered a huge mistake in his work: he had forgotten to adjust for the fact that the Earth's orbit is elliptical rather than circular. This error undermined a major previous “discovery” of his, a planet orbiting a pulsar; the planet he thought he had found wasn't really there. The discovery had already been published in the highly reputable journal Nature. On the conference stage, Lyne announced his mistake. The crowd responded with a standing ovation.


We often fear the results of admitting our mistakes. However, doing so shows integrity, and that integrity can be met with respect and appreciation.


5. Science communicators on YouTube list their mistakes


Tom Scott and Hank Green are two of the most popular science communicators on YouTube, with a combined audience of more than 17 million subscribers. Both recently posted videos admitting their errors. Tom Scott’s video lists everything he is aware of that his videos have gotten wrong in the last ten years, while Hank Green’s video lists a few significant lies he has believed and reflects on what he has learned from them. His reflections include the following lessons:


I will run to go fact-check something that I disagree with and I will not do that with stuff that aligns with my previous conception of the world. That's just going to be a bias that we all have to deal with and live with and work through. And, yes, sometimes I feel like: What does it matter if somebody's saying a graph is saying something that it isn't actually saying, or that this graph is making people sort of active and excited and enthusiastic about making the kinds of change I want to see in my country, even if it's not right, what does it matter?
[...] But for who I am and where I sit and what I do, I have to attempt to have a very strong alliance to the truth. And the incentives of the social Internet's content recommendation systems make that hard for all of us.  And I don't know what to do about that – except to try very hard to have an alliance to the truth. And also, in a year that is going to be bad, to touch as much grass as I possibly can.

6. An SEO giant gets SEO wrong and describes how he was corrected


Rand Fishkin is one of the biggest names in the history of search engine optimization (SEO). He had argued that keyword weight can be used to determine how important a term is in a document and to optimize the text accordingly. Another SEO expert, Dr. Garcia, responded by explaining why this doesn’t work. As a result, Fishkin wrote a blog post entitled "Admitting I was wrong," in which he explained what happened.
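For readers unfamiliar with the terminology, here is a minimal sketch of the general kind of calculation at issue: a naive keyword-frequency ("density") score contrasted with a corpus-aware term weight (a simple TF-IDF). This is a generic illustration only; the functions and toy corpus below are hypothetical, not Fishkin's or Garcia's actual methods.

```python
# Generic illustration (hypothetical example, not Fishkin's or Garcia's method):
# a naive keyword-frequency score vs. a corpus-aware TF-IDF weight.
import math

def keyword_density(term: str, document: str) -> float:
    """Fraction of the document's words that match the given term."""
    words = document.lower().split()
    return words.count(term.lower()) / len(words) if words else 0.0

def tf_idf(term: str, document: str, corpus: list[str]) -> float:
    """Term frequency weighted by how rare the term is across the corpus."""
    tf = keyword_density(term, document)
    docs_with_term = sum(1 for doc in corpus if term.lower() in doc.lower().split())
    idf = math.log(len(corpus) / (1 + docs_with_term))
    return tf * idf

corpus = [
    "cheap flights site lists cheap flights to many cities",
    "our travel blog reviews hotels and restaurants",
    "book flights and hotels in one place",
]
print(keyword_density("cheap", corpus[0]))  # high raw frequency in one page...
print(tf_idf("cheap", corpus[0], corpus))   # ...but the weight also depends on the wider corpus
```

The point of the contrast is simply that how important a term appears to be depends on which measure you use, which is exactly the kind of technical detail experts can get wrong and later correct.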


7. Researchers pay people to find errors in their work


Paying people who discover errors in one’s work is an increasingly popular (though still quite rare) technique used to improve the research process and make it more transparent. Some examples include:  


Donald Knuth, a computer scientist, was an early adopter of this practice: he has been rewarding people who find mistakes in his books with cheques since 1968. After people began posting images of their cheques online, creating financial security risks, the reward was changed to symbolic cheques from a fictional bank called “The Bank of San Serriffe”. You can find the names of the award recipients from 2006 to the present here (949 payments are listed at the time of writing, April 2023).


Stuart Ritchie is a Scottish psychologist who offers readers a financial reward for finding mistakes in his books. In one mistake that he labeled ‘major’ (and paid out £50 for), Ritchie had claimed that nobody had attempted to replicate Diederik Stapel’s findings before they were exposed as fraudulent. In reality, some researchers had attempted to replicate his work before the fraud came to light, but their failed replication wasn’t published until afterwards.


The Red Team Challenge: A group of scientists (Nicholas A. Coles, Leo Tiokhin, Ruben Arslan, Patrick Forscher, Anne Scheel, and Daniël Lakens) offered $3,000 for a fault-finding team to join their study. The team would consist of five people whose job was to find mistakes in the study. In addition, every “critical problem” found would lead to another $100 donation to a GiveWell top charity.


8. A professor critiques 57 of his own papers in a thread on X


Professor Nick Holmes created a thread on X to critique his previous papers. The most serious critique concerns his fMRI study:


Design was not optimal; was based too much on my behavioral expts[sic]. The effects are weak, the whole-brain analysis failed & it relied on region-of-interest analysis. I doubt it would replicate.

Holmes elaborates in a later interview:


My only FMRI [functional magnetic resonance imaging] paper so far—I wouldn’t mind if that was stricken from the record. It was all quite open and I’ve been happy to share the data on it. In my series of tweets, I said that I doubted it would replicate and that the effects are quite weak. And I hadn’t pre-specified the analysis I ended up doing in advance, which is not good science. I don’t think I need to retract it yet, although maybe someone will ask me to do so. But I don’t place great confidence in that result.

He also wrote an article on the lessons he learned from creating the thread, underlining the need for a culture that encourages admitting mistakes in the scientific community: I critiqued my past papers on social media — here’s what I learnt.



9. Retraction Watch: Doing the Right Thing 

 

The blog Retraction Watch has a category called “Doing the Right Thing,” where good behavior (including admitting mistakes and requesting the retraction of one’s own articles) is highlighted and celebrated.


A recent example comes from Andrew P. Anderson, a postdoctoral researcher studying evolution and sexual selection at Reed College. When another research team contacted him to point out that a finding in his recently published paper appeared to be based on a calculation error, he went through all the emotions you might expect: unhappiness at the situation, feeling like a fraud, hoping the other team had made the mistake, and so on. But once he confirmed the error was real, he thanked them and went through the process of contacting the journal, retracting the paper, correcting it, and resubmitting. This can be a difficult process, and it’s one that plenty of people find embarrassing, but Anderson has some words of wisdom that are applicable beyond science:


People make mistakes and I’m certainly no different. There are people and procedures in place for correcting a paper. While not common, errors and retract/resubmit seem to happen more than I realized as I was looking for how to communicate with the journal about my situation. I got through it by just telling myself this is how science is supposed to work. 

This is true in all walks of life. People make mistakes. Even where errors are not common, they probably happen more than you realize, and acknowledging them and correcting them is how good reasoning works.


10. The Loss of Confidence Project: encouraging psychologists to announce that they've changed their mind


The aim of this project was to encourage psychologists to admit when they no longer stand by the results of their research. The project attracted quite a bit of attention but, interestingly, there were only 13 submissions. In an academic paper on the topic, the founders of the project observe:


[T]he survey responses we received suggest that the kinds of errors disclosed in the statements are not rare. Approximately 12% of the 316 survey respondents reported losing confidence in at least one of their articles for reasons that matched our stringent submission criteria (i.e., because of mistakes that the respondent took personal responsibility for), and nearly half acknowledged a loss of confidence more generally.

This suggests that potentially hundreds, if not thousands, of researchers could have submitted loss-of-confidence statements but did not do so. There are many plausible reasons for this, including not having heard of the project. However, we think that at least partially, the small number of submitted statements [only 13 were submitted] points to a gap between researchers’ ideals and their actual behavior—that is, public self-correction is desirable in the abstract but difficult in practice.


Conclusion


Can you rise to the ideal of public self-correction? It begins with being able to admit to yourself when you are wrong, and that can be difficult. But admitting you are wrong is the first step toward avoiding similar mistakes in the future, rather than repeating them. As Richard Feynman said: “[Y]ou must not fool yourself, and you are the easiest person to fool.”


If you’d like to begin this self-correction process, why not start with our tool for learning from your mistakes:




This guest post was written by Hashem Elassad and edited by Travis (from the Clearer Thinking team). For more of Hashem's work, you can head over to his LinkedIn or check out his blog.

