
How to persuade people, ethically

  • Damon Sasi

Your success often depends on your ability to bring others around to your way of thinking. Whether you're leading a team, selling a product, or advocating for change, this guide will give you practical techniques to be convincing, derived from evidence-based principles of psychology, without resorting to manipulation or deception.


Persuasion is a skill like any other, and it is possible both to learn how to do it better and to do it in an ethical way. This article aims to serve as a good starting place for people who want to be more persuasive but also care about doing so ethically. It focuses on what we’re calling ‘cooperative contexts’, which are contexts in which you want to convince your audience of something that you think is likely to be true and avoid misleading tactics like these:


This bag of sugar features the words ‘Natural Brown Sugar’, but text on the back reveals that “Natural is only a brand name / trademark and does not represent its true nature”.
This tube of super glue has been made to look like it contains much more than it does, using deceptive packaging.

This article is structured as a set of principles, each of which highlights an aspect of what makes arguments persuasive.

Principle I. Conviction Without Overstating Confidence

Principle II. Building Bridges

Principle III. We Feel Stories More Than Facts

Principle IV. Truth-Seeking Mindset



Principle I. Conviction Without Overstating Confidence


The most important, if obvious, first step in any discussion of ethical persuasion in cooperative contexts has to be: say only what you actually believe. It might be easier to convince someone of a conclusion by lying to them. For instance, if you think your friend should break up with his partner, he might be more likely to agree if you exaggerate his partner's bad behavior - but that would be an unethical deception. To mislead someone, directly or by omission, is a hostile act, and while it might be ethically justified in certain (non-cooperative) contexts, such as when a murderer at your door asks whether your family is home, we'll assume here that you want to persuade people of things you believe are likely to be true.


Once you make sure that what you are saying is honest, the next step to being persuasive without deception is to show your conviction. This is a core attribute of every honest, persuasive speaker, even those who verbally express uncertainty, self-reflection, or vulnerability in their arguments. This doesn’t mean pretending to be infallible. 


So, what does epistemically honest confidence look like? How do you show conviction without feeling like a con artist or coming off as a zealot?


First, language matters. Imagine you're selling a product to potential investors: “We will achieve X market share” comes across much like “Our goal is to achieve X market share.” But the first is a prediction you might not have evidence to support, whereas the second is a clear statement of intent that you can make with complete conviction. It demonstrates your commitment and communicates “I'm making this my priority,” which is, at the end of the day, what the investors care about, without offering something they (should) know you can't promise in the first place.


The second aspect of epistemic confidence is, in a word: charisma. Persuasive speakers have a number of skills that allow them to be compelling, inspiring, or charming. We don't have the space here to go into the finer points of charismatic speaking, which shines through everything from word choice, to tone of voice, to body language, and often contains some mix of passion, warmth, and connection with your audience. But if it's something you'd like to improve, you could try emulating different people whose speaking styles resonate with you until you find your own ‘voice’ inspired by their approach, and practice through improv workshops or Ultraspeaking games.


With that in mind, we can move to the second principle of cooperative persuasion:



Principle II. Building Bridges


People often feel frustrated when others reject their arguments or seem unmoved by compelling facts. This frustration usually comes from a mistaken assumption: that people’s current beliefs are unmoored, ready to shift as soon as new information is received. But beliefs aren’t just isolated claims - they’re parts of an interconnected system of lived experiences, values, and assumptions about reality.


Imagine beliefs as Lego structures. Even if two people have access to the same pieces (facts), they may build completely different models depending on what they prioritize or emphasize. And if they’re missing key pieces, no amount of reassembly will produce the same outcome. Some parts of their structure may be so reinforced by other pieces that they instinctively resist any model that excludes them or attempts to shift them around.

The ethical and practical response to this is building bridges: working to understand the foundations of someone else's belief, noticing where their models of reality differ from yours, and gradually constructing a shared understanding. This approach emphasizes respect and curiosity over confrontation. Not only does it work better at changing people's minds; it also aligns with the principle that people deserve to be engaged as equals, not as ‘problems to fix.’


Example


Imagine Mike, a senior developer who wants his team to adopt test-driven development (TDD). But when he brings it up with them, he runs into resistance. Some are skeptical, pointing to failed process changes in the past. Others worry it would slow them down and interfere with tight deadlines.
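
(A quick aside for readers who haven't encountered TDD: the practice is to write a failing test before writing the code that makes it pass. Here is a minimal sketch in Python; the shopping-cart function and its tests are hypothetical, invented purely for illustration of the workflow Mike is proposing.)

```python
import unittest

# In TDD, the tests below would be written first and would fail;
# this implementation is then added to make them pass.
def cart_total(prices, discount=0.0):
    """Sum the item prices, then apply a fractional discount."""
    return sum(prices) * (1 - discount)

class TestCartTotal(unittest.TestCase):
    def test_plain_total(self):
        self.assertEqual(cart_total([2.0, 3.0]), 5.0)

    def test_discount_applied(self):
        self.assertAlmostEqual(cart_total([10.0], discount=0.2), 8.0)

if __name__ == "__main__":
    unittest.main()
```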


One version of Mike goes on the offensive. In meetings, he emphasizes that any competent engineer should already be using TDD and name-drops tech leaders who promote it. He puts together a presentation full of glowing case studies that support his points while downplaying any trade-offs or contradictory data.


Alternatively, another version of Mike sits down with the team to understand where the hesitations are coming from. He shares his own doubts from when he first encountered TDD and relates his own experiences of how it ended up saving time over the long run. He asks about their experiences with past rollouts and acknowledges their concerns. Maybe instead of a full changeover, he suggests a trial run on a small feature to see how it plays out in practice.


Even if the first form of persuasion works (and, we contend, it's less likely to), the manipulation and shaming tactics can demoralize the team and backfire if any problems occur in the rollout, quickly disillusioning people who were at first convinced and damaging trust in Mike's judgment.


The slower, more collaborative path probably doesn’t result in instant buy-in, but it builds trust and meets others where they are. Over time, team members who are wary can see for themselves whether there are benefits to their workflow.


Even if the first approach is used in a fully honest way, it may still amount to pressuring people into agreement without convincing them in a deep way that's robust to unexpected setbacks. The second approach respects the team members' autonomy, invites collaboration, and allows beliefs to grow through mutual understanding.


The bridge-building approach is both more effective long-term and more ethical, because it treats others as thinking, capable agents rather than obstacles to overcome, and takes into account what's really holding them back from agreeing with you.


In practice, doing this effectively requires:


  • Not presuming you know why someone believes what they believe. Stay curious and ask questions until you can explain their position back to them so well that they’d agree you got it right.

  • Minding your language. Words can mean different things to different people. What sounds clear or persuasive to you might sound loaded, vague, or even threatening to someone else. Instead of debating whose definition is “correct,” ask how the other person understands the terms they're using, and clarify your own usage if it may be ambiguous.

  • Predicting personal or cultural defenses. People often develop resistance to ideas that seem associated with harmful or extreme positions, even if your own views are more nuanced. Understanding these associations and the beliefs others hold against them can help immensely.


These steps don't just inform you of other people's positions; they also help you focus your arguments on what will actually matter to the person you're trying to convince, and avoid landmines you didn't know were there. Subcultures and communities often have something like an immune system against beliefs they view as ‘stupid’, dangerous, or coming from a disliked outgroup, and it can be difficult to persuade others if they associate what you say with things they've heard badly argued before, or with ‘bad people.’ Sometimes, just using the wrong phrase can close the door to a real conversation before it begins.


But when people actually understand each other, and conversations focus on the things that resonate with people’s values, new ideas have a much better chance of being heard and seriously considered.



Principle III. We Feel Stories More Than Facts


Imagine a Venn diagram, with one circle containing true statements and the second containing things your audience would find convincing. You might think the target is in the middle, but even this isn’t enough for ethical persuasion in cooperative contexts; if you have ever seen someone mislead others with a technically true but selectively edited account of an event, you know that, although truth is necessary for ethical persuasion in cooperative contexts, it is not always sufficient.


A more accurate diagram adds a third circle, ‘Misleading,’ which overlaps with both ‘Truthful’ and ‘Convincing’:


This may seem daunting—if even true statements or facts can be misleading, how do you know if you’re in the top overlap (where only "truthful" and "convincing" meet) or have drifted into the middle (where the overlap now includes "misleading")? And why isn’t the Truthful circle contained in the Convincing one? 


You've likely heard it said that we humans are "social animals.” Genetically, we’re wired to pick up languages as toddlers, to recognize facial expressions, and to learn by imitating what others do. Similarly, we’re also wired for stories; our minds are pattern-seeking by default and enjoy narrative ebbs and flows. Many heuristics and biases come from automatically presuming causal connections in disparate events, and people often spend their free time consuming stories, whether in written, audio, video, or gaming form.


And the way the facts of a story are framed, the surrounding context of each bit of information, matters. A lot.


In the left image, a gun has been cropped out, making it look like a helpful scene. In the right image, the water canteen has been cropped out, making it look like a threatening scene. The full image (in the middle) shows a more complicated story.

All of these factors contribute to making stories far more persuasive than simple lists of facts. One reason for this is that compelling stories are emotional superstimuli: stimuli exaggerated beyond anything present in our evolutionary environment (refined sugar and pornography are other examples). In our day-to-day lives, we rarely experience the emotional depths and heights that a good story can evoke in us (a two-hour film may show more emotionally intense events than we'd typically experience over years of our life), and strong emotions are signals that something important is happening, something we should pay attention to and commit to memory. There is even evidence that, outside of habits or impulses, most of our decisions are driven by emotion first and foremost, with careful reasoning serving (at best) to optimize for particular outcomes and (at worst) to generate post hoc explanations for why the decision makes sense.


If you give a young cigarette smoker a bunch of statistics showing that smoking cuts life expectancy by more than a decade, a part of their mind may file this information away, but it likely has nowhere near the weight of their lived experiences, which have daily reinforcement via positive feelings from each cigarette. Conversely, examples of the power of stories are everywhere, including major historical events; novels like Uncle Tom's Cabin were extremely influential in the abolitionist movement. People who knew intellectually what slavery represented became politically galvanized when they could vicariously experience it through the story.


If you want to persuade someone of an aspect of reality you believe is missing from their worldview, learning how to arrange facts in a narrative framework (with clear cause-and-effect and emotional experiences highlighted) is extremely effective for helping others connect with what you’re saying in both an intellectual and an emotional way.


Of course, storytelling is not only used to convince people of true things. Narratives are often used to arrange facts in misleading ways or to manipulate people into having one-sided takes on an event. But this should not dissuade you from storytelling for persuasion!


Certainly, ethical persuasion has to avoid strategically omitting facts you know would dissuade your audience. There is a fine line between arranging facts into a narrative that efficiently addresses someone's interests and leading them to a conclusion they wouldn't hold if they had all the same information as you. But even if you had infinite time to share every scrap of information, your audience's attention would still be limited, forcing you to make choices about what to include and what to omit.


It can be useful to start by asking, “How would I want to be treated if someone were attempting to persuade me?” Maybe for you, the ideal amount of narrative framing is zero. But in reality, constraints will almost always require narrowing what you share with others, so the question is not whether to omit, but what to omit. Ignoring the narrative you create by default doesn't mean there isn't one - just that you're telling an unconscious story that's not likely to address the most relevant points for your audience.



Principle IV. Truth-Seeking Mindset


A big obstacle to engaging in persuasion for morally conscientious people can be the worry that they’re being unethical if they attempt to persuade others of anything too ‘important.’ For example, common cultural norms consider it rude to argue against someone’s religious views, parenting choices, or career decisions. These taboos are often motivated by reasonable efforts to minimize judgmental, pushy, or manipulative behaviors. But there are mindsets you can adopt and techniques you can use to help avoid those traits.


The first and most important thing is the mindset you bring to the conversation. In The Scout Mindset, Julia Galef emphasizes that prioritizing discovering the truth (like a scout), rather than defending existing beliefs (like a soldier), ultimately helps us better achieve our goals.


This might seem counterintuitive to many who experience their uncertainty as something that demotivates them from pursuing their objectives. But Galef argues that, when we internalize the fact that truth-seeking serves our interests, we can become comfortable with uncertainty and naturally more curious about perspectives that differ from our own. Instead of viewing disagreements as battles to be won, we can approach them as opportunities to learn and refine our understanding.


This helps us avoid overconfidence and improve our understanding of reality, which makes it easier to achieve what we care about. When we see things more clearly, we can make better decisions, identify more effective strategies, and avoid wasting energy on approaches that won't work.


A useful skill to learn is noticing what your agenda is for any given conversation. Just asking yourself, “Am I trying to persuade them of something, or am I trying to show them why I believe what I do?” may not be enough because it can be easy to fool ourselves into thinking we’re doing the latter. But you can also spend some time imagining different outcomes.


What if you’re wrong? Do you know what it would take to convince you of that? How would you feel if you were? Can you sit with that feeling, or does it cause pain or discomfort? Can you imagine it in detail, or does your mind flinch away from it?


Noticing these things can help you avoid defaulting to a Soldier mindset, which Galef describes as being motivated to defend a pre-existing belief, or something you want to believe, against any evidence that might threaten to undermine it. Going into a conversation with an explicit goal to persuade someone is hard to avoid completely, but it's important to notice when that goal is making it harder to understand why the other person believes differently - an understanding that should inform what you focus on in your arguments.


A powerful process for this is called Double Cruxing. When you have a belief about something, there may be many supporting beliefs that feed into it. The supporting belief that most influences whether you hold the main belief is called a ‘crux.’ There isn't always just one, but it can be useful to keep digging through each of your (or someone else's) supporting beliefs until you find one that, if you learned it was false, would cause the largest shift toward changing your mind - as opposed to being only a minor part of why you believe what you do. If both people can find a single supporting belief that, once investigated and found to be true or false, would resolve their disagreement, that's a ‘double crux.’


Example


Two managers disagree on whether remote work would be a good company policy. First, they should operationalize what they mean by ‘good policy’ and even what is meant by ‘remote working,’ to ensure they’re not talking about different things while using the same words (a lot of disagreements get resolved at this step!).


To operationalize something is to define it in practice, through some observable measure. So the manager who believes in remote work might operationalize it as “working from home by default, but easily able to come in for special projects or meetings,” while the other manager’s concept is “working from another city or country.” If they realize they disagree about what they’re even arguing about, this is an important point to clarify before debating further.


If they realize they both mean employees working from home within driving distance, it might become clearer that the thing they actually disagree on is whether remote work is better or worse for employee productivity. If so, that would be the “crux” of the argument. Or maybe the crux is whether remote work is better for attracting and retaining better talent, or reducing company expenses.


If the first is the crux for both of them - that is, both agree that evidence for or against remote work improving productivity would change their mind about whether the company should adopt it - they have found a “double crux,” and can begin seeking evidence for or against it. They might decide to look into the other claims as well, but since they know those claims likely won't change anyone's mind, they're less important, and neither manager will get frustrated after spending a lot of time proving a point that doesn't change the other person's mind.
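
To make the shape of this process concrete, here is a toy sketch in Python. The supporting beliefs and weights below are invented stand-ins for how much each claim matters to each manager, not something anyone would literally compute; the point is just that a double crux is a highest-impact supporting belief that both sides share.

```python
# Toy model of crux-finding. Each supporting belief gets a weight for
# how much the top-level belief would shift if it turned out false;
# the crux is the highest-weight support, and a double crux is a crux
# both people independently arrive at. All names/weights are invented.

def find_crux(supports):
    """Return the supporting belief whose falsity would most change
    the top-level belief."""
    return max(supports, key=supports.get)

manager_a = {
    "remote work improves productivity": 0.7,
    "remote work attracts better talent": 0.2,
    "remote work reduces company expenses": 0.1,
}
manager_b = {
    "remote work improves productivity": 0.6,
    "in-person work builds team culture": 0.4,
}

crux_a, crux_b = find_crux(manager_a), find_crux(manager_b)
if crux_a == crux_b:
    print(f"Double crux found: investigate '{crux_a}' together.")
else:
    print("No shared crux yet; keep examining supporting beliefs.")
```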


In this way, Double Cruxing can be a powerful tool for orienting yourself toward truth and better understanding others and what they believe, so that you can persuade them if it turns out they're wrong… assuming you can honestly demonstrate that to them. It also demonstrates that you are open to changing your mind and to hearing what they believe is true and why, which tends to make people more receptive themselves.


This also applies to things like selling a product or pitching a company. If you believe the thing you're selling has a unique strength - something that, if it weren't true, would make you genuinely stop promoting it - that's what should be front and center in your pitch.



Conclusion: Persuasive Reasons to Persuade


Ethical persuasion is not just possible, it's important for human progress. Most ambitions benefit from it, particularly those that aim to improve the world. Large-scale change often requires coordinating with others, which in turn requires persuasion to convince people to collaborate, fund good projects, or resolve major differences in beliefs. Of course, there's a danger that you think you're persuading people of something true and good while actually persuading them of something false or harmful. But engaging in ethical persuasion helps reduce the chance of that being the case.


Examples abound of ethical persuasion being critical to the success of important changes. While people often think of the American Civil War as the key historical point in ending American slavery, it would likely not have occurred without the abolition movement, which used truthful narratives about the horrors of slavery to convince others of its immorality. Frederick Douglass's and Sojourner Truth's autobiographical writings and speeches were particularly powerful - they presented detailed, factual accounts of their experiences while connecting them to universal human values and rights.


Even science relies on persuasion to make its impact. The campaign for hand-washing in hospitals succeeded when Ignaz Semmelweis and others presented both statistical evidence and compelling narratives about preventable deaths. Charles Darwin did not simply recount the evidence from his voyages in On the Origin of Species; he also anticipated and responded to many of the objections and critiques he knew the book would provoke. Scientists throughout history have had to work hard to persuade others to overcome established, false beliefs, while maintaining their epistemic integrity.


The key is to approach persuasion in cooperative contexts as a collaborative process rather than an adversarial one. This doesn't mean every attempt to persuade must involve collaboration; not all communication is interactive. But the frame that we take into a persuasion attempt has many effects on how we go about it, both subtle and obvious. The ‘convince them at any cost’ mindset can certainly be effective, but it usually trades off against ethics to get those advantages, making it a bad deal for society.


When we recognize that our goal isn't to ‘win’ an argument, but to simply do our best to help others see why we believe what we believe, we can engage in persuasion that respects both our integrity and the integrity of those we're attempting to persuade. In this way, ethical persuasion becomes not just a tool for changing minds but a path toward reducing conflict and building trust, even if you still end up disagreeing.


To summarize, if you want to be better at persuading people ethically: 


  1. Show honest conviction, without overstating your confidence

  2. Build bridges and deepen your understanding through careful listening

  3. Craft compelling narratives that highlight relevant, non-misleading truths

  4. Use techniques that help orient you and others to a truth-seeking mindset


Effective, ethical persuasion also requires patience and humility. Change rarely happens in a single conversation. Just as we wouldn't expect someone to master a complex skill or solve a difficult problem in one sitting, we shouldn't expect beliefs that have been built up over the years to be easy or quick to shift.


But it’s a skill like any other, and the more you understand why people believe what they believe, the more you can use the truth to help yourself and others converge on it.


 
 