Thinking in Bets: Making Smarter Decisions When You Don’t Have All the Facts

We live in a world of proclamations. “I trust him with my life.” “She’ll never let me down.” “This investment is a sure thing.” These absolute statements flow easily from our lips, shaping our decisions and relationships. But what happens when we add two simple words to these declarations? “Wanna bet?”

The question cuts through the fog of certainty like a knife. Suddenly, we’re forced to examine our confidence in measurable terms. Would we stake $10,000 on that trusted friend never betraying our confidence? Would we bet our life savings on that “sure thing” investment? The mere act of quantifying our beliefs transforms them from comfortable absolutes into careful probabilities.

This transformation isn’t just a mental exercise – it’s a fundamental shift in how we process reality. Every day, we navigate through a maze of uncertainties, yet our minds crave the comfort of certainty. We build elaborate narratives about people, events, and possibilities, often forgetting that our knowledge is inherently incomplete. We forget that trust isn’t binary but exists on a spectrum, that relationships aren’t static but evolving probabilities, that even our most firmly held beliefs should be open to revision.

Consider how we update our views about people. When we first meet someone, we might assign them a basic level of trust based on initial impressions. Each interaction then becomes a data point, slightly adjusting our probability assessment of their reliability. A friend who consistently shows up on time gradually earns a higher trust probability in matters of punctuality. A colleague who occasionally overpromises and underdelivers makes us adjust our confidence in their future commitments downward.

This is Bayesian thinking in action – the continuous updating of beliefs based on new evidence. As Sharon Bertsch McGrayne explains:

Knowledge is indeed highly subjective, but we can quantify it with a bet. The amount we wager shows how much we believe in something.

Sharon Bertsch McGrayne - The Theory That Would Not Die

When we frame our beliefs in terms of bets, we’re forced to acknowledge the stakes involved in our decisions and the limits of our knowledge.
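To make that concrete, here is a minimal sketch of this kind of Bayesian updating in code. It is my own illustration, not something from Duke or McGrayne: trust in a friend's punctuality is modeled as a Beta distribution that shifts a little with each observed interaction, and the specific numbers are assumptions chosen only for the example.

```python
# Illustrative sketch (not from the book): Bayesian updating of a trust belief.
# "How likely is my friend to show up on time?" is modeled as a Beta distribution
# that gets nudged by every new interaction.

class TrustBelief:
    def __init__(self, prior_on_time: float = 1.0, prior_late: float = 1.0):
        # Beta(1, 1) is a flat prior: we start out genuinely unsure.
        self.on_time = prior_on_time
        self.late = prior_late

    def update(self, arrived_on_time: bool) -> None:
        # Each interaction is one more data point adjusting the assessment.
        if arrived_on_time:
            self.on_time += 1
        else:
            self.late += 1

    @property
    def probability(self) -> float:
        # Posterior mean: our current best estimate, never 0% or 100%.
        return self.on_time / (self.on_time + self.late)


belief = TrustBelief()
for outcome in [True, True, True, False, True]:  # five meetings, one late arrival
    belief.update(outcome)

print(f"Estimated punctuality: {belief.probability:.0%}")  # ~71%, not "always" or "never"
```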

I found myself thinking about these ideas while reading Annie Duke’s “Thinking in Bets.” Unlike Nassim Taleb’s meandering explorations of uncertainty in “Fooled by Randomness,” Duke gets straight to the point. She takes these big ideas about probability and decision-making and turns them into something we can actually use.

What Did I Get Out of It?

Reading “Thinking in Bets” reminded me of Shane Parrish’s Decision by Design course. Both explore how to make better decisions in an uncertain world, but Duke brings her unique perspective from the poker table. While Parrish teaches us to create mental models and decision journals, Duke shows us how to think probabilistically about everyday choices.

The beauty of Duke’s approach is its simplicity. She takes complex ideas about probability and uncertainty and turns them into practical tools you can use immediately. No complex frameworks or elaborate processes – just a new way of thinking about the decisions we make every day.

Here’s what I learned:

The Fundamental Insight: Decisions as Bets

The core idea that changed my thinking was remarkably simple: every decision is a bet on a future we can’t see. When I choose a career path, pick a place to live, or even decide who to trust, I’m betting on an uncertain future. As Duke explains:

A bet really is: a decision about an uncertain future. The implications of treating decisions as bets made it possible for me to find learning opportunities in uncertain environments.

This isn’t just clever wordplay. Thinking about decisions as bets changes how we approach them. It forces us to acknowledge that even our most confident choices involve uncertainty. Two things determine how our lives turn out:

Thinking in bets starts with recognizing that there are exactly two things that determine how our lives turn out: the quality of our decisions and luck. Learning to recognize the difference between the two is what thinking in bets is all about.

The power of this framework becomes clear when we look at how most of us make decisions. We often think we’re not betting when we choose not to act. But Duke points out:

Every decision commits us to some course of action that, by definition, eliminates acting on other alternatives. Not placing a bet on something is, itself, a bet.

This insight changed how I think about seemingly simple choices. When I decide to stay in my current job, I’m betting against all other potential career paths. When I choose to live in a particular city, I’m betting against all other possible locations. Even in relationships:

Whenever we make a parenting choice (about discipline, nutrition, school, parenting philosophy, where to live, etc.), we are betting that our choice will achieve the future we want for our children more than any other choice we might make given the constraints of the limited resources we have to allocate—our time, our money, our attention.

But perhaps the most valuable lesson is about separating decision quality from outcomes. Poker players have a word for the mistake of judging decisions by their results:

Pete Carroll was a victim of our tendency to equate the quality of a decision with the quality of its outcome. Poker players have a word for this: “resulting.” When I started playing poker, more experienced players warned me about the dangers of resulting, cautioning me to resist the temptation to change my strategy just because a few hands didn’t turn out well in the short run.

This concept of “resulting” helps explain why learning from experience is so tricky. Life isn’t like chess, where bad outcomes always mean bad decisions:

Chess contains no hidden information and very little luck. The pieces are all there for both players to see… If you lose at a game of chess, it must be because there were better moves that you didn’t make or didn’t see.

Real life is more like poker:

Poker, in contrast, is a game of incomplete information. It is a game of decision-making under conditions of uncertainty over time… You could make the best possible decision at every point and still lose the hand, because you don’t know what new cards will be dealt and revealed.

Understanding this distinction between poker and chess helped me become more comfortable with uncertainty. It taught me that a good decision process matters more than any single outcome:

What makes a decision great is not that it has a great outcome. A great decision is the result of a good process, and that process must include an attempt to accurately represent our own state of knowledge. That state of knowledge, in turn, is some variation of “I’m not sure.”

Breaking Free from “Right vs. Wrong” Thinking

Most of us are trapped in binary thinking. We see decisions as either right or wrong, people as either trustworthy or not, investments as either good or bad. Duke shows us why this mindset holds us back:

When we think of beliefs as only 100% right or 100% wrong, when confronting new information that might contradict our belief, we have only two options: (a) make the massive shift in our opinion of ourselves from 100% right to 100% wrong, or (b) ignore or discredit the new information.

The solution isn’t to become uncertain about everything, but to embrace probability in our thinking. As Duke explains:

Any prediction that is not 0% or 100% can’t be wrong solely because the most likely future doesn’t unfold. Long shots hit some of the time.

This shift from certainty to probability changes everything. Instead of saying “I trust him completely” or “This investment will definitely work,” we learn to think in ranges and probabilities. Duke puts it perfectly:

When we move away from a world where there are only two opposing and discrete boxes that decisions can be put in—right or wrong—we start living in the continuum between the extremes. Making better decisions stops being about wrong or right but about calibrating among all the shades of grey.

The most powerful part of this approach is how it changes our relationship with uncertainty. Instead of pretending we’re sure about things, we can acknowledge what we don’t know – accepting, as Duke says, that an accurate state of knowledge is usually some variation of “I’m not sure.”

The Belief Formation Problem

We like to think we’re rational beings who carefully evaluate information before forming beliefs. Duke bursts this bubble with a surprising insight:

This is how we think we form abstract beliefs: we hear something; we think about it and vet it, determining whether it is true or false; only after that, we form our belief.

It turns out, though, that we actually form abstract beliefs this way: we hear something; we believe it to be true; only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.

This revelation gets even more interesting when we consider our intelligence. Being smart doesn’t protect us from this problem – it might make it worse:

Being smart can actually make bias worse. Let me give you a different intuitive frame: the smarter you are, the better you are at constructing a narrative that supports your beliefs, rationalizing and framing the data to fit your argument or point of view.

The challenge runs deeper than just initial belief formation. Once we believe something, that belief becomes incredibly stubborn:

Once a belief is lodged, it becomes difficult to dislodge. It takes on a life of its own, leading us to notice and seek out evidence confirming our belief, rarely challenge the validity of confirming evidence, and ignore or work hard to actively discredit information contradicting the belief.

The solution? Duke suggests treating our beliefs like bets:

Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information. We might think of ourselves as open-minded and capable of updating our beliefs based on new information, but the research conclusively shows otherwise. Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.

The Challenge of Self-Serving Bias

Learning from experience should be straightforward. We make decisions, see what happens, and adjust. But Duke shows us why it rarely works this way:

The way we field outcomes is predictably patterned: we take credit for the good stuff and blame the bad stuff on luck so it won’t be our fault. The result is that we don’t learn from experience well.

This “self-serving bias” creates two problems. First:

Blaming the bulk of our bad outcomes on luck means we miss opportunities to examine our decisions to see where we can do better.

And second:

Taking credit for the good stuff means we will often reinforce decisions that shouldn’t be reinforced and miss opportunities to see where we could have done better.

This bias runs deeper than just protecting our ego. We’re wired to view ourselves in competition with others:

Our genes are competitive. As Richard Dawkins points out, natural selection proceeds by competition among the phenotypes of genes so we literally evolved to compete, a drive that allowed our species to survive… If someone we view as a peer is winning, we feel like we’re losing by comparison. We benchmark ourselves to them.

We cannot eliminate these biases. Instead, Duke suggests treating our interpretation of outcomes as a bet:

Treating outcome fielding as a bet can accomplish the mindset shift necessary to reshape habit. If someone challenged us to a meaningful bet on how we fielded an outcome, we would find ourselves quickly moving beyond self-serving bias… If we wanted to win that bet, we wouldn’t reflexively field bad outcomes as all luck or good ones as all skill.

Creating Better Decision-Making Habits

Changing how we think isn’t easy. Duke offers a practical approach based on group accountability:

Groups can improve the thinking of individual decision-makers when the individuals are accountable to a group whose interest is in accuracy.

These groups need three key elements to work:

A focus on accuracy (over confirmation), which includes rewarding truthseeking, objectivity, and open-mindedness within the group; Accountability, for which members have advance notice; and Openness to a diversity of ideas.

The power of these groups comes from how they change our relationship with uncertainty:

We win bets by relentlessly striving to calibrate our beliefs and predictions about the future to more accurately represent the world. In the long run, the more objective person will win against the more biased person.

When we’re part of such a group, we start asking better questions:

When we think in bets, we run through a series of questions to examine the accuracy of our beliefs: Why might my belief not be true? What other evidence might be out there bearing on my belief? Are there similar areas I can look toward to gauge whether similar beliefs to mine are true? What sources of information could I have missed or minimized on the way to reaching my belief? What are the reasons someone else could have a different belief, what’s their support, and why might they be right instead of me?

The betting framework makes people more willing to engage:

People are more willing to offer their opinion when the goal is to win a bet rather than get along with people in a room.

The Power of Perspective

Duke introduces skepticism as a fundamental tool for better thinking:

Skepticism is about approaching the world by asking why things might not be true rather than why they are true. It’s a recognition that, while there is an objective truth, everything we believe about the world is not true.

This skepticism becomes particularly important when information confirms our beliefs:

And we need to be particularly skeptical of information that agrees with us because we know that we are biased to just accept and applaud confirming evidence. If we don’t “lean over backwards” (as Richard Feynman famously said) to figure out where we could be wrong, we are going to make some pretty bad bets.

Thinking in bets helps cultivate this skepticism:

Thinking in bets embodies skepticism by encouraging us to examine what we do and don’t know and what our level of confidence is in our beliefs and predictions. This moves us closer to what is objectively true.

The goal isn’t just to be skeptical, but to actively seek different perspectives:

Agreeing to be open-minded to those who disagree with us, giving credit where it’s due, and taking responsibility where it’s appropriate, even (and especially) when it makes us uncomfortable.

This approach helps us avoid a common trap:

Telling someone how a story ends encourages them to be resulters, to interpret the details to fit that outcome.

Time Travel as a Decision Tool

Annie Duke offers an interesting perspective on regret. Instead of seeing it as something that happens after decisions, we can use it before:

The problem isn’t so much whether regret is an unproductive emotion. It’s that regret occurs after the fact, instead of before. As Nietzsche points out, regret can do nothing to change what has already happened.

She suggests we can make regret useful:

We just wallow in remorse about something over which we no longer have any control. But if regret occurred before a decision instead of after, the experience of regret might get us to change a choice likely to result in a bad outcome.

One practical way to do this is the 10-10-10 rule:

“Every 10-10-10 process starts with a question… [W]hat are the consequences of each of my options in ten minutes? In ten months? In ten years?” This set of questions triggers mental time travel.

This approach helps us avoid a common mistake:

Our problem is that we’re ticker watchers of our own lives. Happiness (however we individually define it) is not best measured by looking at the ticker, zooming in and magnifying moment-by-moment or day-by-day movements.

Instead:

We would be better off thinking about our happiness as a long-term stock holding. We would do well to view our happiness through a wide-angle lens, striving for a long, sustaining upward trend in our happiness stock.

Managing Uncertainty

Duke offers two powerful tools for managing uncertainty: backcasting and premortems. With backcasting, we start at the end and work backward:

We’d be better off imagining ourselves looking back from the destination and figuring how we got there. When it comes to advance thinking, standing at the end and looking backward is much more effective than looking forward from the beginning.

This reverse engineering works because:

Our decision-making improves when we can more vividly imagine the future, free of the distortions of the present. By working backward from the goal, we plan our decision tree in more depth, because we start at the end.

Then comes the premortem – imagining why we failed:

There has been a massive amount written about visualizing success as a way to achieve our goals. Because that’s such a common element in self-help strategies, conducting a premortem (with its negative visualization) may seem like a counterproductive way to succeed.

Counter-intuitively:

Negative visualization makes us more likely to achieve our goals… we need to have positive goals, but we are more likely to execute on those goals if we think about the negative futures.

Using both techniques together creates a complete picture:

Imagining both positive and negative futures helps us build a more realistic vision of the future, allowing us to plan and prepare for a wider variety of challenges, than backcasting alone.

This combination helps us maintain perspective:

When we see how much negative space there really is, we shrink down the positive space to a size that more accurately reflects reality and less reflects our naturally optimistic nature.

The power of this approach is that it forces us to consider both success and failure paths, making our planning more robust and realistic.

Learning from Experience

Duke explains why learning from experience is harder than it seems:

The way our lives turn out is the result of two things: the influence of skill and the influence of luck. The quality of our decision-making was the main influence over how things turned out. If, however, an outcome occurs because of things that we can’t control (like the actions of others, the weather, or our genes), the result would be due to luck.

This uncertainty complicates learning:

Introduction of uncertainty drastically slows learning.

The challenge is that outcomes don’t come with clear explanations:

The challenge is that any single outcome can happen for multiple reasons. The unfolding future is a big data dump that we have to sort and interpret… the world doesn’t connect the dots for us between outcomes and causes.

Duke suggests we need to be more deliberate about learning:

If we determine our decisions drove the outcome, we can feed the data we get following those decisions back into belief formation and updating, creating a learning loop… Ideally, our beliefs and our bets improve with time as we learn from experience. Ideally, the more information we have, the better we get at making decisions about which possible future to bet on.

Who Is This For?

There’s no shortage of writing about decision-making. From academic papers to blog posts, everyone (including me) seems to have advice about how to make better choices. What sets Duke’s book apart is how it bridges theory and practice.

Most discussions of probabilistic thinking remain theoretical. We’re told to assign probabilities to our beliefs, but this feels abstract and impractical. Duke’s “wanna bet” framework changes this. When we ask ourselves how much we’d actually wager on a belief, probability becomes tangible. The question isn’t “what probability would you assign?” but “how much would you bet?” This simple shift makes probabilistic thinking practical.
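One way to see why the wager question works, using my own illustration rather than anything from the book: the terms of a bet you would accept imply a minimum probability. If you would risk a stake to win a payout, breaking even requires your belief to be at least stake / (stake + payout). A short sketch, with made-up dollar amounts:

```python
# Illustrative only: the odds you'd accept on a bet imply a probability.
# If you'd risk `stake` to win `payout`, breaking even requires
# p * payout - (1 - p) * stake >= 0, i.e. p >= stake / (stake + payout).

def implied_probability(stake: float, payout: float) -> float:
    """Minimum belief that makes risking `stake` to win `payout` break even."""
    return stake / (stake + payout)

# "I trust him completely" -- but would you risk $10,000 to win $100 on it?
print(f"{implied_probability(10_000, 100):.1%}")  # requires ~99% confidence
# A modest bet of $50 to win $100 only requires about 33% confidence.
print(f"{implied_probability(50, 100):.1%}")
```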

The book is particularly valuable for anyone who makes decisions under uncertainty – which is all of us. But it’s especially useful for those who’ve read works like Nassim Taleb’s “Fooled by Randomness” and want a more practical framework for applying those insights. While Taleb explains why we need probabilistic thinking, Duke shows us how to actually do it.

If you’re looking for complex decision-making frameworks or elaborate planning tools, this isn’t your book. But if you want to improve how you think about uncertainty and make better real-world decisions, Duke’s approach is invaluable. She takes sophisticated ideas about probability and uncertainty and makes them accessible without oversimplifying them.