Amazing point: Would your belief in something stand up to the question, “Wanna bet?” If you had to gamble significant money on that belief, would you still feel 100% sure of it? Or, more honestly, 60%? This question creates healthy skepticism, encouraging you to seek the best information instead of just defending your belief. Objective accuracy wins instead of argument.
Combine with “The Biggest Bluff” by Maria Konnikova.
A bet is a decision about an uncertain future.
Two things determine how your life turns out: the quality of your decisions, and luck. Recognize the difference between the two.
“Resulting” = when you equate the quality of a decision with the quality of its outcome.
A bad result does not mean a bad decision.
Hindsight bias is the tendency, after an outcome is known, to see the outcome as having been inevitable.
When a decision doesn’t work out, don't treat that result as if it were an inevitable consequence rather than a probabilistic one. Don't focus only on the poor outcome.
Real life consists of bluffing, of little tactics of deception, of asking yourself what is the other person going to think I mean to do.
The decisions you make in your life - in business, saving and spending, health and lifestyle choices, raising your children, and relationships - are “real games”.
Life is not like chess. Chess contains no hidden information and very little luck.
Life is more like poker. You could make the smartest, most careful decision and still have it blow up in your face.
You have to learn from the results of your decisions.
Saying “I don’t know” is a necessary step toward enlightenment.
It is a prelude to every great decision.
A great decision is the result of a good process - not that it has a great outcome.
Good decision-makers try to figure out how unsure they are, making their best guess at the chances that different outcomes will occur.
Any prediction that is not 0% or 100% can’t be wrong solely because the most likely future doesn’t unfold.
Long shots hit some of the time.
When you think probabilistically, you are less likely to use adverse results alone as proof that you made a decision error, because you recognize the possibility that the decision might have been good but luck and/or incomplete information (and a sample size of one) intervened.
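This point can be made concrete with a quick simulation. A minimal sketch (the payoff and probabilities are invented for illustration): a bet with a clearly positive expected value still loses most of the time, so a run of bad results alone cannot prove the decision was bad.

```python
import random

random.seed(0)

# Hypothetical bet: 40% chance to win $200, 60% chance to lose $100.
# Expected value = 0.4 * 200 - 0.6 * 100 = +$20 per bet, so taking it
# is a good decision -- yet any single trial loses more often than it wins.
def play_once():
    return 200 if random.random() < 0.4 else -100

trials = [play_once() for _ in range(100_000)]
losses = sum(1 for t in trials if t < 0)

print(f"average result per bet: ${sum(trials) / len(trials):.2f}")  # near +$20
print(f"fraction of bets that lost: {losses / len(trials):.2%}")    # near 60%
```

Judged one outcome at a time ("resulting"), this bet looks bad 60% of the time; judged over the long run, it is obviously worth taking.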
Making better decisions is not about being wrong or right, but about calibrating among all the shades of grey.
Every decision commits you to some course of action that, by definition, eliminates acting on other alternatives. Not placing a bet on something is, itself, a bet.
Most bets are bets against yourself.
You are betting against all the future versions of yourself that you are not choosing.
Be a better belief calibrator, using experience and information to more objectively update your beliefs to more accurately represent the world.
This is how you *think* you form beliefs:
(1) You hear something.
(2) You think about it and vet it, determining whether it is true or false.
(3) Only after that, you form your belief.
It turns out, though, that you *actually* form abstract beliefs this way:
(1) You hear something.
(2) You believe it to be true.
(3) Only sometimes, later, if you have the time or the inclination, do you think about it and vet it, determining whether it is, in fact, true or false.
How you form beliefs was shaped by the evolutionary push toward efficiency rather than accuracy.
Our ancestors could form new beliefs only through what they directly experienced.
It’s reasonable to presume our senses aren’t lying.
We didn’t develop a high degree of skepticism when our beliefs were about things we directly experienced, especially when our lives were at stake.
The system we already had was:
(1) experience it
(2) believe it to be true
(3) maybe, and rarely, question it later.
Our older system is still in charge.
“Wanna bet?” triggers you to engage in that third step that you only sometimes get to.
You might think of yourself as updating your beliefs based on new information, but you do the opposite, altering your interpretation of that information to fit your beliefs.
If you think of beliefs as only 100% right or 100% wrong, when confronting new information that might contradict your belief, you have only two options:
(a) make the massive shift in your opinion of yourself from 100% right to 100% wrong
(b) ignore or discredit the new information.
It feels bad to be wrong, so you choose (b).
Being smart can actually make bias worse: blind-spot bias is worse the smarter you are.
Being smart and aware of your capacity for irrationality alone doesn’t help you refrain from biased reasoning.
People are better at recognizing biased reasoning in others but are blind to bias in themselves.
You are wired to protect your beliefs even when your goal is to truthseek.
Offering a wager brings the risk out in the open.
The more you recognize that you are betting on your beliefs (with your happiness, attention, health, money, or time), the more you are likely to temper your statements, getting closer to the truth as you acknowledge the risk inherent in what you believe.
Then you recognize that there is always a degree of uncertainty, that you are generally less sure than you thought you were, that what you believe is almost never 100% or 0% accurate.
What if, in addition to expressing what you believe, you also rated your level of confidence about the accuracy of your belief on a scale of zero to ten?
“I’m 60% confident that Citizen Kane won best picture” reflects that your knowledge is incomplete.
You can express how confident you are by thinking about the number of plausible alternatives and declaring that range.
The less you know about a topic or the more luck involved, the wider your range.
Acknowledging uncertainty is the first step in measuring and narrowing it.
It creates open-mindedness, and a more objective stance.
When confronted with new evidence, it is a very different narrative to say, “I was 58% but now I’m 46%.”
That doesn’t feel nearly as bad as “I thought I was right but now I’m wrong.”
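The shift from 58% to 46% is exactly what a Bayesian update looks like. A minimal sketch, with illustrative likelihoods chosen purely so the numbers match the example:

```python
def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Posterior P(belief is true | new evidence) via Bayes' rule."""
    numerator = prior * p_e_given_h
    return numerator / (numerator + (1 - prior) * p_e_given_not_h)

# Illustrative numbers only: you were 58% sure, and the new evidence is
# somewhat more likely in the world where your belief is false.
posterior = bayes_update(prior=0.58, p_e_given_h=0.37, p_e_given_not_h=0.60)
print(f"updated confidence: {posterior:.0%}")  # about 46%
```

The evidence moves the belief a measured amount; it never forces a jump straight from "right" to "wrong."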
Declaring your uncertainty in your beliefs to others makes you a more credible communicator.
By saying, “I’m 80%” and thereby communicating you aren’t sure, you open the door for others to tell you what they know. They realize they can contribute without having to confront you by saying or implying, “You’re wrong.”
Expressing uncertainty signals to your listeners that this needs further vetting.
Experience can be an effective teacher, but only some students listen to their teachers.
Experience is not what happens to a man; it is what a man does with what happens to him.
Identify when the outcomes of your decisions have something to teach you and what that lesson might be.
Any decision is a bet on what will likely create the most favorable future for you.
The future you have bet on unfolds as a series of outcomes.
Why did something happen the way it did? Treat figuring out what you should learn from an outcome as a bet in itself.
If you determine your decisions drove the outcome, you can feed the data you get following those decisions back into belief formation and updating, creating a learning loop.
The challenge is that any single outcome can happen for multiple reasons.
Uncertainty drastically slows learning.
The way you field outcomes is predictably patterned: you take credit for the good stuff and blame the bad stuff on luck so it won’t be your fault. The result is that you don’t learn well from experience. This pattern is called “self-serving bias.”
Blaming the bulk of your bad outcomes on luck means you miss opportunities to examine your decisions to see where you can do better. Taking credit for the good stuff means you will often reinforce decisions that shouldn’t be reinforced and miss opportunities to see where you could have done better.
An experienced player will choose to play only about 20% of the hands they are dealt, forfeiting the other 80% before even getting past the first round of betting. That means about 80% of the time is spent just watching other people play. All that learning from watching others, though, is just as fraught with bias.
If someone you view as a peer is winning, you feel like you’re losing by comparison. You benchmark yourself to them.
To change a habit, you must keep the old cue, and deliver the old reward, but insert a new routine.
Treating outcome fielding as a bet can accomplish the mindset shift necessary to reshape the habit: you wouldn’t reflexively field bad outcomes as all luck or good ones as all skill.
It’s easy to win a bet against someone who takes extreme positions.
Thinking in bets triggers a more open-minded exploration of alternative hypotheses, of reasons supporting conclusions opposite to the routine of self-serving bias. You are more likely to explore the opposite side of an argument more often and more seriously, and that will move you closer to the truth.
Agree to be open-minded to those who disagree with you, give credit where it’s due, and take responsibility where it’s appropriate.
Other people can spot your errors better than you can.
Groups can improve the thinking of individual decision-makers when the individuals are accountable to a group whose interest is in accuracy.
As long as there are three people in the group (two to disagree and one to referee), the truthseeking group can be stable and productive.
Encourage a diversity of perspectives.
You win bets by relentlessly striving to calibrate your beliefs and predictions about the future to more accurately represent the world.
Betting is a form of accountability to accuracy.
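One standard way to hold probabilistic predictions accountable to accuracy is the Brier score, which penalizes confident misses much more heavily than hedged ones. A small sketch with made-up forecasts and outcomes:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and 0/1 outcomes.
    0.0 is perfect; 0.25 is what always guessing 50% scores."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical track record: stated confidence vs. what actually happened.
forecasts = [0.9, 0.6, 0.8, 0.3, 0.7]
outcomes  = [1,   0,   1,   0,   1]
print(f"Brier score: {brier_score(forecasts, outcomes):.3f}")  # 0.118
```

Tracking a score like this over many predictions is the numeric version of “Wanna bet?”: vague confidence becomes a measurable track record.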
I started moaning to him about my bad luck in losing a big hand. He said, “There’s not much purpose in a poker story if the point is about something you had no control over, like bad luck.”
Where the challenge of a bet is always looming, evidence that might contradict a belief you hold is no longer viewed through as hurtful a frame. Rather, it is viewed as helpful because it can improve your chances of making a better bet.
To view yourself in a more realistic way, you need different perspectives.
Why might my belief not be true?
What other evidence might be out there bearing on my belief?
Are there similar areas I can look toward to gauge whether similar beliefs to mine are true?
What sources of information could I have missed or minimized on the way to reaching my belief?
What are the reasons someone else could have a different belief, what’s their support, and why might they be right instead of me?
What other perspectives are there as to why things turned out the way they did?
People are more willing to offer their opinion when the goal is to win a bet rather than get along with people in a room.
Telling someone how a story ends encourages them to be resulters, to interpret the details to fit that outcome.
Skepticism is about approaching the world by asking why things might not be true rather than why they are true.
Thinking in bets embodies skepticism by encouraging you to examine what you do and don’t know and what your level of confidence is in your beliefs and predictions.
It’s a recognition that not everything you believe about the world is true.
You need to be particularly skeptical of information that agrees with you, because you know that you are biased to just accept confirming evidence.
When you think about the past and the future, you engage your deliberative mind, improving your ability to make a more rational decision.
If regret occurred before a decision instead of after, the experience of regret might get you to change a choice likely to result in a bad outcome.
Ask yourself a set of simple questions at the moment of the decision designed to get future-you and past-you involved.
What are the consequences of each of my options in ten minutes? In ten months? In ten years?
How would I feel today if I had made this decision ten minutes ago? Ten months ago? Ten years ago?
Think about your happiness as a long-term stock holding, not focused on day-by-day movements.
Strive for a long, sustaining upward trend in your happiness stock.
Consider a broad range of possibilities for how the future might unfold to make your best guess at the probability of each of those futures occurring.
You’re already making a prediction about the future every time you make a decision, so you’re better off if you make that explicit.
Imagine yourself looking back from the destination and figure out how you got there.
Remembering the future is a better way to plan for it.
Your decision-making improves when you can more vividly imagine the future, free of the distortions of the present.
Negative visualization makes you more likely to achieve your goals.
Imagine obstacles in the way of reaching your goals.
Start a premortem by imagining that you failed to reach your goal. Then imagine why. All those reasons why you didn’t achieve your goal help you anticipate potential obstacles and improve your likelihood of succeeding.
The likelihood of positive and negative futures must add up to 100%.
When you see how much negative space there really is, you shrink the positive space down to a size that more accurately reflects reality, rather than your naturally optimistic nature.