
Want to form some new daily habits? We ran a massive study to explore which techniques work best.

Updated: Sep 13, 2023

Chances are that you have a few positive habits you’d like to form. Maybe you’d like to start exercising daily. Or, perhaps you want to spend a little time each day learning something new and useful. While our lives can improve if we form positive habits like these, it can be hard to make them stick.

There are plenty of techniques out there that may improve your chances for having a habit stick. But it’s not clear which of these work, if any, or how well they work compared to each other. That’s why we ran a massive study to explore which habit-forming techniques are most effective. Want to know which techniques we found worked best? Then read on!

Our research into forming new habits

Our first step was to figure out what kind of habits people actually wanted to form. After all, no technique can overcome a lack of motivation.

To do this, we ran a study to get an idea of some desirable habits – we gave study participants a long list of healthy habits and had them choose their favorites. These were the daily habit goals we found were most popular:

  • Reading books or the news every day

  • Doing daily stretches

  • Practicing a skill you want to improve upon daily

  • Drinking more water every day

  • Exercising daily

  • Learning about a topic that matters to you every day

  • Cooking and eating healthy food daily

Notice that we focused solely on daily habits. This means that this study isn’t necessarily applicable to, for example, once-per-week habits or habits intended to be performed multiple times per day. We also did not focus specifically on "true" habits – repeated behaviors that occur subconsciously, with little or no effort or thought. Rather, we also included "daily routines" in our research, because many of the desirable behaviors people want to form are routines rather than true habits. For example, it's unlikely you'll ever get to the point where you go to the gym daily without even thinking about it, though you might at times brush your teeth without thinking about it.

After we got the list of desirable daily habits, our next step was to run an experiment to discover which habit formation techniques would help these habits stick. We recruited people to choose one of the above habits to try to pick up over the course of four weeks. Potential participants were screened for interest in forming one of the habits we chose to target, and for willingness to remain in the study even if the habit formation attempt failed. Ultimately, we analyzed 1,256 data points from 477 people who completed at least one follow-up survey. (There are more data points than people because we counted each survey completed by a person as one data point, and most participants completed multiple surveys.)

The screening survey also required study participants to describe in detail what practicing their habit would actually entail, so that the participants would have a clear idea of what to actually do to perform the habit. This description also served as a benchmark for success. Knowing exactly what the habit entailed allowed participants to easily tally whether they successfully performed the habit each day; it would be hard to tally a vaguely defined habit! Note though that this process of choosing a habit goal, and then clarifying precisely what success would look like, could itself be thought of as a habit-reinforcing intervention (and it may have improved habit formation on its own). However, since all participants in the study carried out these steps, we were able to control for it across participant groups.

We also did our best to make sure the enrolled participants were motivated, by applying a couple of strategies: (1) explicitly confirming that the participants were willing to devote effort to forming the habit each day, and (2) encouraging them to choose a habit that they were genuinely interested in forming. We also asked them to explicitly describe the concrete steps needed to perform the habit, and to write out how they would know if the habit was successfully completed each day. All this effort paid off, since the participants tended to be quite motivated. As you can see from the graph below showing motivation on a 1-to-5 scale (with a higher number meaning greater motivation) across the entire length of the study, the median motivation, represented by the dashed vertical line, was 4 out of 5.

Participants who passed the screening were then randomized into one of the following groups:

  • Control With Reminders (186 data points): These people weren’t trained in any technique to boost the chances of their habit sticking. They just went for it! However, this group did receive daily email reminders over the course of the first week to help them remember to do the habit. And, like all other groups, they did go through the process of picking a habit, clarifying what specifically that habit involves, and describing what success would look like.

  • Control Without Reminders (131 data points): Members of this group didn’t use any habit-boosting techniques either. They also didn’t receive any reminders. We included this group so we could compare their results to the previous control group’s, and gather evidence about whether or not reminders by themselves are helpful. For some analyses we conducted, the groups "Control With Reminders" and "Control Without Reminders" were merged together into what we will simply call the "control group."

  • WOOP (168 data points): This group was trained in a technique pioneered by the psychologist Gabriele Oettingen. “WOOP” is an acronym for “Wish, Outcome, Obstacle, Plan”. To do it (roughly speaking), you first think of a goal that you Wish to achieve. Then, you visualize getting the Outcome from it you most want. After that, you take some time to describe an Obstacle that could get in the way of the plan. Finally, you come up with an if-then Plan to attempt to overcome the obstacle (e.g., “If I run out of veggies, then I’ll immediately buy more at the store.”). This technique has been studied quite a bit and found to improve goal achievement. It also is known in the literature by the less exciting name of “mental contrasting with implementation intentions,” since it combines two often studied interventions. If you'd like to see the exact WOOP intervention we created for this study, you can try it here.

  • Choose Five (143 data points): Participants sorted into this group were asked to choose 5 out of 22 possible habit-boosting techniques to try out over the course of the study, based on one-sentence descriptions of each technique. They also were given practice in applying each technique they chose, and some guidance for how to use them. This group enabled us to look at whether people do better using techniques they have chosen, rather than techniques that were assigned to them at random.

  • Random Five (628 data points): People who were placed in this group were randomly assigned 5 out of the 22 possible habit-boosting techniques we chose for the study, and also received guidance and practice just like the group that was able to choose their own techniques. You'll note that, by design, this group was much larger than the other groups.

The participants then received three follow-up surveys: one after their first week of practicing the habit, one after their second week, and one after their fourth week. We asked several questions in each survey to help us understand the experience of study participants, but we focused on two main outcomes in order to measure success:

  1. Number of days the participant successfully practiced their habit – out of the last 7 days for the first two surveys, and out of the last 14 days for the final survey. We divided the number in the last survey by two to make it a "per week" number, mirroring the first two surveys.

  2. Whether the participants felt they were meeting their habit formation goal, measured by asking: “Did you practice your habit as often as you had planned to?” The possible answers were “yes” and “no.” These answers were then coded as 1 or 0, respectively.

We then used linear regression techniques to explore which habit-boosting techniques worked best, treating the habit-boosting techniques each participant used as the main independent variables. We counted each check-in per person as one data point in our main analysis, so each person who participated in the study generated at most three data points (if they performed all three check-ins). Our regression model controlled for the check-in time as well as the participants’ self-reported motivation.
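For readers who like to see things concretely, here is a minimal sketch (in Python, using pandas and statsmodels) of how a regression like this can be set up. The data frame, column names, technique dummies, and effect sizes below are made-up placeholders for illustration, not our actual analysis code or results:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the real dataset: one row per completed check-in,
# with made-up assignments and effect sizes.
rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "habit_reflection": rng.integers(0, 2, n),    # 1 if this technique was assigned
    "mini_habit":       rng.integers(0, 2, n),    # (in reality there were 22 such dummies)
    "week":             rng.choice([1, 2, 4], n), # which follow-up survey the row comes from
    "motivation":       rng.integers(1, 6, n),    # self-reported motivation, 1-5
})
df["days_per_week"] = (
    3 + 0.7 * df["habit_reflection"] + 0.25 * df["motivation"]
    - 0.2 * df["week"] + rng.normal(0, 1.5, n)
).clip(0, 7)                                      # outcome 1: practice days per week
df["met_goal"] = (rng.random(n) < 0.4 + 0.2 * df["habit_reflection"]).astype(int)  # outcome 2

# Outcome 1: linear regression with a dummy variable per technique,
# controlling for the check-in week and baseline motivation.
days_model = smf.ols(
    "days_per_week ~ habit_reflection + mini_habit + week + motivation", data=df
).fit()

# Outcome 2: logistic regression on the binary "did you meet your goal?" answer.
goal_model = smf.logit(
    "met_goal ~ habit_reflection + mini_habit + week + motivation", data=df
).fit()

print(days_model.params)
print(goal_model.params)
```

In a model like this, the coefficient on each technique's dummy variable estimates the extra practice days per week (or, for the logistic model, the change in log-odds of meeting the goal) associated with that technique, holding the other variables fixed.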

This allowed us to analyze which techniques worked best, along with uncovering some other interesting surprises!

What we found

We found that the interventions (when taken altogether) seemed to have a modest overall positive effect. Specifically, people who used any habit-boosting technique successfully practiced their habit 0.28 more days per week than the combined control group (p = 0.036 by t-test). Around 9% more people who were assigned interventions felt that they met their habit goals compared to people in the control group (p = 0.0038 by t-test).

But when we dug into the details of what specifically helped people form healthy habits, we were surprised by some of the things we didn't find:

  • Choosing your own habit-boosting techniques didn’t make a clear difference: We compared people who chose five habit-boosting techniques themselves to people who were randomly assigned five techniques. To our surprise, a t-test (a statistical test to check whether two groups’ means differ) found no clear difference between the two groups in the number of days the habit was practiced (p = 0.99). If taken at face value, this result means that people weren’t good at predicting which techniques would work for them, and didn't do better with techniques they got to choose than with randomly assigned ones. It could also mean that people didn’t quite understand what the interventions would involve when they chose them, and so weren’t able to execute them well – after all, they only got to read brief explanations of each technique. That being said, it's possible there was a small difference in performance for this group that we simply didn't have a large enough sample size to detect (since the bigger the sample size, the smaller the effect it is possible to reliably detect). Given our sample sizes of 143 data points in the Choose Five group and 628 data points in the Random Five group, we calculated that we’d have less than an 80% chance of detecting a mean difference of 0.57 practice days out of seven (a sketch of this kind of power calculation appears after this list). So if getting to choose your own technique has a positive effect, but that effect is substantially smaller than this, we may simply not be able to reliably detect it given a study with this number of data points.

  • Email reminders didn’t seem to make a clear difference in boosting habit performance, either: We couldn’t see a clear difference between how often people in the control group without reminders practiced their chosen habit, and how often people in the control group that got reminders did so. The mean number of habit practice days per week was 3.34 in the group without reminders (131 data points) and 3.19 with reminders (186 data points), but this wasn’t a statistically significant difference (p = 0.50). If there was no actual difference between these groups, it would imply that reminders don’t make a difference in boosting habit formation. We thought this was really surprising, given that it seems people often forget the intentions they form. Perhaps enrolling in a study makes people less likely to forget their intentions than normal, which could reduce the apparent usefulness of reminders. It also could be there was a small improvement from getting reminders that we simply wouldn't be able to reliably detect given the size of our study, as before. We calculated that we would have less than an 80% chance of detecting a difference of 0.64 days of practice out of 7 possible days on average with this sample size. So if there was a positive benefit of email reminders much smaller than that, studies like ours may not be sufficient to find it reliably.

  • WOOP didn’t work that well: The group that was assigned WOOP completed their habits more successfully during the first week than the control group did; the control group practiced their habits 3.66 out of 7 days on average, whereas the WOOP group did so 4.21 days (a somewhat promising improvement of 0.55 extra days per week). But this difference was not quite statistically significant (p = 0.08). And even this seeming benefit of WOOP faded after the first week. After that, we couldn’t find any differences between WOOP and the control group in terms of results (p = 0.90 for week 2 and p = 0.96 for weeks 3 and 4, with 3.2 mean practice days in week 2 and 2.6 in weeks 3 and 4 in both groups). The lack of a difference between the WOOP group and the control group was surprising to us, given that WOOP has a lot of evidence favoring it in the psychological literature (indeed, we viewed it as a gold standard effective intervention that we wanted to compare the other groups to). The fact that WOOP became less effective over time wasn’t surprising – we found a trend of decreased success over time across the board. As the weeks went on, participants tended to perform their habits fewer days per week on average, which is what we expected to happen. But WOOP's effectiveness, if it had any to begin with, unfortunately disappeared after the first week. It's unclear why WOOP didn't help, despite its track record of success in the academic literature. We can only speculate – it could be that it doesn't work well for habit formation in particular, that its effect size is too small to detect in our study, that our WOOP intervention wasn't long or intense enough to create a meaningful effect size, that the academic literature contains false positives, or that our WOOP instructions unintentionally deviated in some unknown way from other more effective WOOP interventions. Or it could have just been really bad luck for WOOP.
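As a side note, the "less than an 80% chance of detecting" figures quoted in the first two bullets above come from statistical power calculations. Here is a minimal sketch of that kind of calculation in Python. The standard deviation used to convert a raw difference in practice days into a standardized effect size is a placeholder assumption (not our actual estimate), and the sketch treats every data point as an independent observation, an approximation we return to in the caveats section:

```python
from statsmodels.stats.power import TTestIndPower

# Placeholder assumption for the spread of "practice days per week" (not our real estimate).
assumed_sd = 2.0

# Example: power to detect a 0.57-day difference between Choose Five (143 data points)
# and Random Five (628 data points) with a two-sided t-test at alpha = 0.05.
raw_difference = 0.57
effect_size = raw_difference / assumed_sd   # convert to a standardized (Cohen's d) effect size

power = TTestIndPower().power(
    effect_size=effect_size,
    nobs1=143,
    ratio=628 / 143,   # size of the second group relative to the first
    alpha=0.05,
)
print(f"Power to detect a {raw_difference}-day difference: {power:.2f}")
```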

But enough focusing on the negative! One positive result we found was that motivation mattered: people who were more motivated tended to perform their habit more often. Not exactly surprising, but certainly important! We asked participants to rate their motivation level (for performing the habit) on a 5-point scale at the beginning of the study, and we found that each additional point of self-reported motivation corresponded to habit performance 0.26 additional times per week on average – even when controlling for the habit-forming technique each user was assigned. The fact that motivation boosted habit performance makes intuitive sense, so it’s good to see that our experiment verified it! We can't be absolutely sure that motivation caused better habit formation (since we were merely measuring motivation, not randomizing it), but it stands to reason that it probably is a cause. So if you're going to try to form a new habit, it is probably wise to choose a habit that you really feel motivated to create.

We also found a handful of habit-boosting techniques which stood out from the selection of 22 that we tested:

  • Habit Reflection: Looking back on a previous successful habit change, identifying what factors and techniques led to your success, and then devising a way to implement those same factors and techniques for the current habit you want to form. This performed by far the best of any of the 22 techniques we tested! The impressive effectiveness of this short intervention was perhaps the most interesting and surprising finding of the whole study.

  • Mini-Habit: Coming up with a very brief version of your habit you can do when you don’t have time for the full version. This technique may help you maintain a habit success streak and make things easier when scheduling is difficult. For instance, if you don't have time to go to the gym, just do 5 pushups instead as a backup plan (a fallback that is so short that you should not have any excuse to skip it). The positive effect we found for this intervention was modest, but probably not just due to chance.

  • Home Reminders: While the automated email reminders we sent didn’t make a clear impact, writing a physical note to yourself and placing it in your home or workplace to remind you of the habit might help. The positive effect was modest, and could have been due to chance.

  • Drawing on Friends and Family: Asking friends or family to support you in your habit change efforts. The evidence for this one was among the least convincing of these five, however, and may well have been the result of chance.

  • Listing Habit Benefits: Making a list of benefits the chosen habit could provide, and pondering which benefit is most important. The evidence for this technique’s efficacy for most people in our sample isn’t super convincing, either. However, the technique was more effective for people with lower motivation. More on that below.

While the evidence suggested that these five techniques might have all worked to some degree, there was one clear winner out of the bunch: the Habit Reflection technique. The following table lists the regression coefficients for each of the five techniques (controlling for what other techniques were assigned). The higher the coefficient, the stronger the average effect:

We’ll discuss how to interpret the numbers in the table in more detail below, but for those who just want the punchline, here it is: Habit Reflection had the strongest effect, and is the habit-boosting technique we place the most confidence in. We’re less confident that the other four standout techniques worked, especially "drawing on friends and family." Furthermore, note that the other 17 interventions we tested did not fare as well at creating habit change as the five in the table above! And one of those five, Listing Habit Benefits, only seems to work for people with low motivation to form their habit – it didn’t work that well in general. But more on that in a bit. First, let’s cover how to interpret these numbers.

You can use the numbers in the above table to quantify how effective each habit-boosting technique was. The numbers in the second-to-right-hand column ("Days per week habit was completed") are straightforward: they’re the average number of additional days per week on which participants successfully completed the habit, compared to control group participants (when controlling for all other techniques used, and controlling for what week it was in the study). For example: people who used Habit Reflection performed their habit 0.70 more times per week compared to people in the control group on average.

The values of the left-hand column – "Whether people felt they were meeting their habit goals" – come from logistic regression, a technique for predicting categorical outcomes, since asking people “Did you practice your habit as often as you had planned to?” has a binary yes/no answer. (I.e., we wanted to predict which interventions were associated with people meeting their habit goal, controlling for all other techniques each person performed.) Interpreting these values requires a little math. If you raise the mathematical constant e (approximately 2.72) to the power of the value you see in the left-hand column, you get the ratio of the odds that people performing the technique felt they were meeting their goal relative to the control group. Looking at Habit Reflection again, the value reported in the left-hand column is 0.84. Raising e to the 0.84 power gives 2.3. This means that the odds of someone doing Habit Reflection meeting their habit goal were 2.3 times the odds of someone in the control group meeting their goal (when controlling for the other variables).
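If you'd like to check that arithmetic yourself, the conversion from a logistic regression coefficient to an odds ratio is a one-liner:

```python
import math

coefficient = 0.84                 # Habit Reflection's logistic regression coefficient
odds_ratio = math.exp(coefficient) # e raised to the coefficient
print(round(odds_ratio, 1))        # 2.3 – the odds of feeling "on track", relative to control
```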

We’re now in a position to make the reasoning for our punchline more clear. Bigger numbers mean an overall stronger effect, and smaller p-values mean that the results are more certain (less likely to be due to chance fluctuations). From the table you can gather that Habit Reflection was the clear winner in terms of both effect size and p-value: it improved the number of days people actually did their habits on average more than any other technique (0.70 days per week – pretty impressive for such a short intervention), and helped people feel that they had achieved their habit goals more than any other technique. It also had by far the lowest p-values of the bunch, meaning the results were the least likely to be due to sampling error (i.e., random variation due to the luck of the draw). Note that even correcting for the fact that we tested 22 techniques (i.e. the fact that we had 22 shots to find an effect, which is quite a lot!), the p-value on Habit Reflection is still impressively tiny.

Taking a close look at the rest of the table, you can see that Mini-Habits performed well when people were asked the yes/no question of whether their habit formation goals were met. However, unlike Habit Reflection, Mini-Habits only boosted the number of days its users performed their habits each week by 0.22 (with a very unimpressive p-value of p=0.20). Mini-Habits is a simplified/less sophisticated version of the "Tiny Habits" approach pioneered by BJ Fogg. In theory, the effects we found for mini-habits could be due to the nature of the technique; remember that our main measurement was asking participants how many days out of the week they performed the full version of their habit, and Mini-Habits specifically encourages participants to perform smaller versions of their habit when they don't have time to do the full one. So it might make sense that they didn’t necessarily do too well in performing full versions of the habit, since they may have been doing partial versions instead. However, it turns out that the evidence didn’t support this hypothesis; people who were assigned the Mini-Habit technique didn’t perform partial habits more days per week than those who weren’t (p = 0.50 by a t-test). It’s not clear why Mini-Habits helped people meet their habit goals without strongly influencing the number of days they practiced their habit.

The other three techniques stand on shakier ground. Home Reminders’ low p-values for both people meeting their habit goals and the increase in practice days suggest that this technique has relatively strong evidence backing its efficacy. However, its effect size is also quite small. So, it may work – just not that well. Drawing on Friends and Family may also work, but the evidence for it is weak, and could easily be the result of chance.

The same could be said for Listing Habit Benefits, except for a couple of additional interesting facts. First, when we compared everyone who used a given habit-boosting technique to everyone who didn’t use that technique, Listing Habit Benefits appeared near the top in terms of efficacy. We found this strange, since the p-values and effect sizes that came from our regression model were unimpressive. However, since our model controlled for motivation, it’s possible that Listing Habit Benefits improves habit formation simply by boosting motivation in low-motivation people. This makes sense, since boosting motivation is the entire purpose of the technique!

To test this hypothesis, we performed another regression analysis on participants with a self-reported motivation of 3 or lower out of 5. The results are shown in the table below. As you can see, the days per week the habit was completed jumped quite a bit in this subgroup. This suggests that Listing Habit Benefits could be a useful second-choice technique for people with lower motivation.

As for the other 17 techniques we tested (besides the five listed above), we think they either don't work for boosting habit formation (at least, in the form and format we tested them), or produce such small positive effects that we weren't able to detect them in this study, though perhaps a larger study could detect them. You can see a list of all these techniques at the bottom of this article. That's a shockingly large number of interventions to all fail! This suggests that boosting habit formation is a really hard task.

Why did people find Habit Reflection helpful?

The fact that Habit Reflection came out so far ahead of the pack in terms of helpfulness piqued our curiosity. We explored this by looking at some of the written answers that people who used Habit Reflection gave about their experiences. Two patterns emerged:

  1. Recall that Habit Reflection asks the participant to recall a specific habit they successfully developed in the past, identify what factors led to their success, and come up with a plan to implement those same factors and techniques for the current habit they want to form. Perhaps unsurprisingly, people found that the techniques they used successfully in the past worked for them again! Then again, this study was full of surprises, and what might seem obvious in hindsight would probably not be at all obvious before the study was conducted. Very few of the techniques we tested seemed to work, suggesting that Habit Reflection's effectiveness was far from certain.

  2. Some people who used Habit Reflection also mentioned that they found the effort of thinking through their situation and planning how they’d tackle forming their habit to be helpful.

We suspect that the processes of planning your own habit improvement intervention and identifying a tried-and-true technique that worked for a past habit were part of why Habit Reflection was helpful. It is essentially a self-customizing intervention that ends up producing a different procedure for each user. However, we’re not super confident about why exactly it worked. The majority of people who used Habit Reflection didn’t provide any qualitative information relating to that technique, so there's only so much we can infer. Also, remember that each participant used several habit-boosting techniques at once. None of the participants explicitly called out Habit Reflection as the most useful technique in their qualitative responses, so attributing each respondent’s answer specifically to Habit Reflection requires some guesswork. Fortunately, you can use Habit Reflection without having to know why exactly it works.

How you can apply these findings to your own life

Putting it all together, here’s what our research suggests you should do if you’d like to form your own positive habits.

  • Check your motivation: We found that motivation played a pretty big role in how well habits stuck. So, before you embark on your habit formation journey, it might be worth coming up with a few possible habits you could perform. Ask yourself: how motivated are you to actually do each such habit regularly? It may be worth taking some time to explore the pros and cons. If you stuck with your habit over the next few months, or even years, would your life change a lot for the better? How? If you get happy or excited about the benefits of sticking with the habit, then it’s probably something you’re motivated about! If not, it may be worth thinking a bit more about whether the possible outcomes are worth pursuing, given the investment of time and effort required. If you aren't actually that motivated to create a specific positive habit, it's probably best to pick a different positive habit that you do feel very motivated about.

  • If you’re motivated, do a Habit Reflection: If you’re motivated to form your new positive habit, then you’re ready to move on to the next step: doing a Habit Reflection of what’s worked for you in the past. We found Habit Reflection to work best, so we suggest you start with that technique first. Think about positive habits you’ve tried to form in the past, and how you made them stick. Can you glean any lessons from this past success that could help with the current habit? Come up with a plan for how you will use what worked for you in the past to help you form this new habit.

  • Do it, and track it: Now that you have a habit, a plan, and an idea of how success will look, it’s time to apply the technique you developed through Habit Reflection to form your positive habit! Also, you may find it useful to keep track each day of whether you completed the habit.

  • If it doesn’t stick, troubleshoot: If the habit doesn’t seem to be sticking, take some time to figure out why. Perhaps you can find some clear problems interfering with forming your habit.

  • Switch it up: It’s possible that the habit-boosting technique you chose may not be a good fit for you. Even though we’re less confident that the other four techniques are effective, you may want to try them (either on their own, or in conjunction with Habit Reflection). Maybe you’ve been really low on time; in that case, Mini-Habits may help you keep on track. If you simply forgot to practice your habit, Home Reminders may be just what you need. Perhaps your family and friends can help motivate you if that’s what you find yourself lacking. Or if you find that you don’t have much motivation to form your helpful habit, Listing Habit Benefits could do the trick. While we’re less confident about these four techniques, they’re all fast, free, and easy to implement, so they’re worth a shot!

The habits we keep can have a big impact on our lives over time. We hope that these tools can help you effect some positive change in your life through easier habit formation. Give the process a go and see how it works for you. Happy habiting!

Caveats

While we're excited about some of the findings in this study, this research has some significant limitations and weaknesses to keep in mind.

We brought up one major one above, but it bears repeating: the p-values for many of the techniques that stood out, while lower than those of the other techniques we tested, are still pretty high, given the number of different techniques tested – each tested technique gives us another chance at potentially finding a false positive. While Habit Reflection was clearly statistically significant with a tiny p-value even adjusting for all of the hypotheses we tested, the other four standout interventions had relatively high p-values. This lowers our confidence that the other four techniques are effective (i.e. the results could be due to chance).
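To make the "correcting for the number of techniques tested" idea concrete, here is a sketch of one standard way to do it: a Holm adjustment of all 22 p-values, which controls the chance of any false positive across the whole family of tests. The numbers below are placeholders rather than our actual p-values, except that Habit Reflection's is assumed to be very small:

```python
from statsmodels.stats.multitest import multipletests

# Placeholder raw p-values for 22 techniques (illustrative only).
raw_p_values = [0.0004] + [0.04, 0.10, 0.15, 0.20] + [0.5] * 17

# Holm's method controls the family-wise error rate across all 22 tests.
rejected, adjusted_p, _, _ = multipletests(raw_p_values, alpha=0.05, method="holm")
print(adjusted_p[:5])   # only the very small first p-value survives the 22-way correction
print(rejected[:5])
```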

It's also worth noting that our p-values aren’t 100% accurate, since the data aren’t completely independent from each other. Recall that we pooled up to three data points for each participant (since each participant was asked to respond at 1 week, 2 weeks and 4 weeks, though not every participant responded to all three of these surveys). Pooling allowed us to increase our statistical power, but in theory it makes the data not completely statistically independent, since some data points came from the same people. More generally, p-values for linear regression are almost never 100% accurate, because their calculation necessarily makes certain assumptions about the structure of the data (related to assumptions of ordinary least squares regression itself, when used for hypothesis testing). It is commonplace for scientists to look past this issue, but that doesn't mean the issue is not there. Another issue is that a lot of people did not complete the whole study, and if there are biases in who completed the whole study that differ between intervention groups, that could also be a source of inaccuracy. A total of 742 people completed the screening, but only 477 completed the first follow-up, 388 the first and the second, and 316 the first through the third.
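For readers curious what accounting for that non-independence could look like, one common approach is to cluster standard errors by participant. Here is a minimal sketch on simulated data; it illustrates the general idea rather than the analysis we actually ran, and all names and values are made up:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_people, surveys_each = 150, 3               # hypothetical panel: 3 check-ins per person

df = pd.DataFrame({
    "person": np.repeat(np.arange(n_people), surveys_each),
    "habit_reflection": np.repeat(rng.integers(0, 2, n_people), surveys_each),
    "week": np.tile([1, 2, 4], n_people),
})
# Person-level noise makes each person's check-ins correlated with one another.
person_effect = np.repeat(rng.normal(0, 1.0, n_people), surveys_each)
df["days_per_week"] = (
    3 + 0.7 * df["habit_reflection"] - 0.2 * df["week"]
    + person_effect + rng.normal(0, 1.0, len(df))
).clip(0, 7)

# Same kind of OLS fit, but standard errors are clustered on the participant,
# relaxing the assumption that every check-in is an independent observation.
model = smf.ols("days_per_week ~ habit_reflection + week", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["person"]}
)
print(model.summary())
```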

Another caveat to keep in mind is that we designed the study to test many habit-boosting techniques at once. This allowed us to cover far more techniques, but it also came at the cost of reducing the certainty one should place in our results, for a couple of reasons. The first reason is that testing so many interventions increases the chance of false positives; the more hypotheses tested, the higher the likelihood that we could have seen an effect by chance (though with Habit Reflection this pretty clearly wasn't a problem). The second reason to take these results with a grain of salt is the possibility that having each person try five interventions led to some kind of interaction between them. Remember: each person in the Choose Five and Random Five groups used five different techniques. On the other hand, we statistically controlled for the other techniques used to account for this problem, and randomized which techniques were grouped together for any given individual.

With these caveats in mind, we’re relatively confident that Habit Reflection was effective. One reason is that it came out on top for both our objective measurement of the number of days per week participants practiced their habits, and our subjective measurement of whether participants felt they’d met their goals. Consistency boosts confidence! Also, its effect size is clearly larger than those of the other four techniques, and its p-values are much smaller (a good sign!). Conversely, the other four techniques we highlighted as potentially working had effect sizes that are relatively low, their rank order isn’t consistent across our objective and subjective measurements, and their p-values are relatively high. As a result, we’re less confident that these techniques are effective. And the remaining 17 interventions probably don't work, for the reasons mentioned in the blog post.

Postscript: How doing each method compared to not doing it

If you’d like to get a better sense of how well each method performed, check out the table below. It shows how people who performed each method (or were in one of the control groups) fared against everyone who didn’t do that method. The p-values in this table differ from those in the table above because these were calculated via a t-test comparing people who did the technique against everyone who didn’t do it; the table above reports p-values for regression coefficients, which control for other variables, such as what other habit-boosting techniques a person was using (since people were assigned to use multiple techniques at once) and what week it was.
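As a rough sketch of how a comparison like this can be computed (with placeholder numbers standing in for our real data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Placeholder practice-day counts: people assigned a given technique vs. everyone else.
did_technique = rng.normal(4.0, 2.0, 150).clip(0, 7)
everyone_else = rng.normal(3.3, 2.0, 500).clip(0, 7)

# A plain two-sample t-test: unlike the regression above, this does not control for
# the other techniques a person was using or for which survey week it was.
t_stat, p_value = stats.ttest_ind(did_technique, everyone_else)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```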

A brief rundown of the techniques:

Note: this chart and glossary both use the original nicknames for the interventions that appeared in the experiment. In the discussion above and in the Daily Ritual tool, we changed the name of "spotanalysis" to Habit Reflection.

  • spotanalysis: Think of a habit you succeeded at forming in the past, think of what techniques or approaches were helpful when forming that past habit, and write down how you can apply those lessons now to your new habit.

  • homereminder: Write a physical note to yourself and place it in your home or workplace to remind you of the habit.

  • tacticbenefit: Make a list of the benefits of forming the habit to enhance motivation.

  • minihabit: Come up with a tiny version of your habit you can do when you can't do the full one, to maintain your success streak.

  • friendsorfamily: Ask friends or family to support you in your habit change efforts.

  • evidencetrio: Increase your sense of self-efficacy by naming reasons that you'll successfully form your new habit.

  • activevisual: Visualize yourself in the act of performing your new habit, like athletes do while training.

  • contract: Make a solemn promise that you will practice your habit every day.

  • environmentchange: Come up with a strategy for how you can change the environment around you to make it easier to stick to your habit.

  • identityshift: Re-imagine yourself as the type of person who always performs your new habit, no matter what.

  • sametimedailycommitment: Perform the habit at the same time daily to make it more consistent.

  • mantra: Repeat the phrase "I CANNOT FAIL" to yourself 10 times.

  • mindfulness: Learn a simple mindfulness meditation exercise to use when stress is interfering with performing your habit.

  • rewardyourself: Come up with a reward you can give yourself every time you start performing your habit.

  • goalachievement: Visualize yourself achieving a meaningful personal goal as a result of practicing your new habit.

  • badresult: Motivate yourself by considering the possible negative consequences of failing to practice your new habit.

  • image: Look at an image of someone performing the habit.

  • WOOP: Try a scientifically supported multi-part intervention (Wish, Outcome, Obstacle, Plan) that combines mental contrasting with implementation intentions.

  • motivephrase: Pick a motivational phrase and apply it when you lack motivation.

  • visualizer: Visualize yourself after you have already successfully achieved the habit.

  • socialdeclaration: Post on social media to create a public commitment to forming your new habit.

  • failureplanning: Come up with a strategy for how to recover if you should lapse in practicing your habit.

  • pastgoal: Reflect on your success with a past goal in order to increase your motivation.
