Derek Sivers
How Minds Change - by David McRaney

ISBN: 0593190297
Date read: 2023-07-30
How strongly I recommend it: 9/10
(See my list of 360+ books, for more.)

Go to the Amazon page for details and reviews.

The psychology of how and why we believe. How we learn and change. The craft of doubt and persuasion. Insights into the social definition of truth. Epistemology. Certainty is a feeling not based on facts. Well-written with a nice balance of story-telling and deeper dives.

my notes

Reality itself, as we experience it, isn’t a perfect one-to-one account of the world around us.
The world, as you experience it, is a simulation running inside your skull.
Each animal assumes that what it can perceive is all that can be perceived.
Objective reality can never be fully experienced by any one creature.

All reality is virtual.
Consensus realities are mostly the result of geography.
People who grow up in similar environments around similar people tend to have similar brains and thus similar virtual realities.

When people who have been deaf for life receive implants that allow them to hear, they at first experience only static.
With older people, the bombardment of new sensory experiences can feel unwelcome.
They’ve been making sense of the world without sound for so long that they sometimes return the implants so they may return to the silent, yet manageable, reality they’ve always known.
For brains, everything is noise at first.
Then brains notice the patterns in the static, and they move up a level, noticing patterns in how those patterns interact.
Then they move up another level, noticing patterns in how sets of interacting patterns interact with other sets, and on and on.
When novel information arrives via the senses, it doesn’t get added to subjective reality right away.
It remains noise if it doesn’t seem to match a pattern in that layered archive of prediction.
The brain needs some repeated experience with it, like the horizontal lines for those sensory-deprived cats and the shapes and colors for the previously blind patients.

When the truth is uncertain, our brains resolve that uncertainty without our knowledge by creating the most likely reality they can imagine based on our prior experiences.

Naive realism: the belief that you perceive the world as it truly is, free from assumption, interpretation, bias, or the limitations of your senses.
We believe we arrived at our beliefs, attitudes, and values after careful, rational analysis through unmediated thoughts and perceptions.
People find it difficult to understand how the other side could possibly see things differently when the evidence seems obvious they should.
Since subjectivity feels like objectivity, naive realism makes it seem as though the way to change people’s minds is to show them the facts that support your view.

“Cognitive empathy”: an understanding that what others experience as the truth arrives in their minds unconsciously.

When experiences don’t match our expectations, a spike in dopamine lasting about a millisecond motivates us to stop whatever we were doing and pay attention.
After the surprise, we become motivated to learn from the new experience so we can be less wrong in the future.
Surprises encourage us to update our behaviors.
They change our minds without us noticing as the brain quietly updates our predictive schemas.

In philosophy, “knowing” something doesn’t mean merely believing that you know it.
It means believing something that also happens to be true.

We can’t be in a post-truth world if we never lived in a truth-filled paradise to begin with.

Epistemology is the study of knowledge itself - facts, fictions, rationalizations, justifications, rationality, logic - all of it.
Epistemology is a framework for sorting out what is true.

Knowledge comes in roughly two forms.
We can know THAT: pudding exists and trees are tall, yesterday it rained, tomorrow is Sunday. This is declarative knowledge.
We can also know HOW: the method behind break dancing or changing a tire. This is procedural knowledge.

A proposition is neither true nor false on its face, just a claim that could go either way.
Someone uses a sentence to state something that could be true.
Then that proposition is challenged by asking for its justification.
If, by the standards of propositions, it is not justified,
then the claim may still be believed, but it doesn’t count as knowledge.

If you claim that all swans are white, and if your justification is that all the swans you’ve ever seen have been that color, all it would take is one black swan to prove your claim false.
In this case, your claim would still be a belief, one you might hold in high confidence, but since there might be a black swan out there you’ve never seen, it doesn’t count as knowledge.

Epistemology is about translating evidence into confidence.
By taking what we believe, and then sorting through some kind of system for arranging, organizing, and classifying it against the available evidence, our certainty in a truth should go up or down.
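The book doesn’t give a formula, but this evidence-to-confidence update is what Bayes’ rule describes. A minimal sketch in Python, using the moon-and-tides claim from the next note; the prior and likelihood numbers are illustrative, not from the book:

```python
def update_confidence(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' rule: revise confidence in a claim after seeing one piece of evidence."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Claim: "the moon controls the tides." Start at 50% confidence.
confidence = 0.5
# Each high tide that tracks the moon's position is evidence we'd very
# likely see if the claim is true, and less likely see if it is false.
for _ in range(3):
    confidence = update_confidence(confidence, 0.9, 0.2)
print(round(confidence, 3))  # climbs toward certainty: 0.989
```

If the evidence were equally likely either way (0.5 and 0.5), confidence wouldn’t move at all, which is the point: certainty should go up or down only when evidence actually discriminates between the claim being true and false.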

We might reach a degree of confidence that suggests the moon controls the tides.
We might grow ever more certain it controls our dreams.

The epistemology called science seems to have won out.
In science, you treat all your conclusions as maybes.

You reach consensus on what is empirically true among what is observable and measurable.
We should always work to disconfirm our conclusions and those of others instead of confirming them, which is what we would usually rather do.

Conditions allow us to create rules for what isn’t true.

Until we know we are wrong, being wrong feels exactly like being right.

Since the brain doesn’t know what it doesn’t know, when it constructs causal narratives it fills holes in reality with provisional explanations.
The problem is that when a group of brains all use the same placeholder, good-enough-for-now construal to plug such a hole, over time that shared provisional explanation can turn into consensus - a common sense of what is and is not true.
This tendency has led to a lot of strange shared beliefs over the centuries, consensus realities that today seem preposterous.
For instance, for a very long time most people believed that geese grew on trees.
People believed that rotting meat gave birth to flies, and that piles of dirty rags could transform into mice, and that burning logs created salamanders.

When we first suspect we may be wrong, when expectations don’t match experience, we feel viscerally uncomfortable and resist accommodation by trying to apply our current models of reality to the situation.
It’s only when the brain accepts that its existing models will never resolve the incongruences that it updates the model itself by creating a new layer of abstraction to accommodate the novelty.
The result is an epiphany, and like all epiphanies it is the conscious realization that our minds have changed that startles us, not the change itself.

When we get that “I might be wrong” feeling, we initially try to explain it away, interpreting novelty as confirmation, looking for evidence that our models are still correct.

Bistable visual illusions like the duck-rabbit:
When we update, it isn’t the evidence that changes, but our interpretation of it.

Mind change is a sort of Ship of Theseus, replacing things bit by bit while at sea so as to never risk sinking the vessel.
It’s integration, not replacement.
No matter how novel, it’s never at first totally independent of previous knowledge.
It is only a reorganization, adjustment, correction, or addition.

Surviving extreme change trauma leads to an adaptive spiral of positive development, an awakening of a new self through posttraumatic growth.
A musician who became permanently paralyzed; lives turned upside down by plane crashes, house fires, the loss of limbs, and so on:
They all say it was “the best thing that ever happened to me.”
After being diagnosed with terminal cancer, after losing a child, after a crushing divorce, after surviving a car accident or a war or a heart attack, people routinely report that the inescapable negative circumstances they endured left them better people.
They shed a slew of outdated assumptions that, until the trauma, they never had any reason to question, and thus never knew were wrong.
People report that it feels like unexplored spaces inside their minds have opened up, ready to be filled with new knowledge derived from new experiences.
Like the reconstruction that follows an earthquake.
Anything reduced to rubble won’t be rebuilt in the same, unreliable way again.
Akin to a new worldview that is “far more resistant to being shattered.”

A collection of conjectures that don’t feel like conjectures.
Our “assumptive world”, inherited and internalized from our culture:
a set of knowledge, beliefs, and attitudes that guides our actions, helps us understand the causes and reasons for what happens.
It puts the immediate present into context.
A library of if–then statements tells us what will happen in the future when we interact with the world.
It allows us to create plans.
It tells us how we ought to behave if we want to keep our friends and family members close.

After a sudden, far-reaching challenge to the accuracy of your assumptive world:
You enter a state of active learning in which you immediately and constantly consider other perspectives.
So many of the facts, beliefs, and attitudes that populated your old models of reality are replaced that your very self changes.
When you can’t escape the upending of your identity, it forces you to be somebody else, the next viable you - a stripped-down whole other clear-eyed person.

When confronted with new information that seems inconsistent with our existing priors, cognitive dissonance draws our attention to the fact that those priors may need an update.

The brain prefers to assimilate, to incorporate new information into its prior understanding of the world.
In other words, the solution to “I might be wrong” is often “but I’m probably not.”
The brain switches back and forth between overwriting old information and conserving what it already holds.

In science, if an experiment delivers a result that doesn’t match expectations, researchers place the anomalies aside, in a holding bucket.
They can continue to work on problems using the tools that the current model affords and come back to the bucket of abeyance later, should the anomalies accumulate and overflow.
But if it grows too heavy to ignore:
Rules of thumb are no longer useful.
Exceptions cease to prove the rule.
Variation and nuance reveal stereotypes for what they are.

How minds update:
Once we learn that something is incorrect, we also learn that the source from which we learned that thing can be incorrect, which opens us up to the idea that maybe the sources we trust could be wrong about a lot of other things.
Once we consider a reference group trustworthy, questioning any of their accepted beliefs or attitudes questions all of them.

We value being good members of our groups much more than we value being right.
We will choose to be wrong if it keeps us in good standing with our peers.
Social death is more frightening than physical death.

The average person will never be in a position where their beliefs on gun control or climate change or the death penalty will affect their daily lives.
The only useful reason to hold any sort of beliefs on those issues, to argue about them, or share them with others is to convey group allegiance.
For issues about which your tribe has formed a consensus, others will use your agreement as a measure of how much they can trust you.

Scientists, doctors, and academics are not immune.
But lucky for them, in their tribes, openness to change and a willingness to question one’s beliefs or to pick apart those of others also signals one’s loyalty to the group.
Their belonging goals are met by pursuing accuracy goals.

When we are fearful, we are constantly attempting to reduce the chaos and complexity of an uncertain world into something manageable and tangible, something we can fight.

He was free to question his beliefs because he was free of the fear of ostracism.
If we feel affirmed, accepting challenging evidence or considering new perspectives poses less of a threat.
And that affirmation grows stronger if we’re reminded that we belong to several tribes.
It is rational to resist facts when one has no social safety net.

If you are interacting mostly through text, you don’t see people crying.

Truth is social.
Conspiracy theorists thrive in social groups that are not social.
Conspiracy theorists and fringe groups may hold individually coherent theories, but there is no true consensus, just the assumption of consensus.
If they hung out together, they might catch on to that, but since they rarely do, they can each keep their individual theories and still assume they have the backing of a tribe.
They never get a chance to argue face-to-face, so there is no evolution of ideas.

A real doctor won’t try and defend everything about medicine.
They’ll say, ‘Oh some of this must be wrong, but we don’t know which bits.’
Whereas a conspiracy theorist has to defend all of it.

Reason is a slave of the emotions.

We are unaware of how unaware we are.
We are the unreliable narrators in the stories of our own lives.
We observe our own behavior and contemplate our own thoughts the way an observer would another person, and then we create rationalizations and justifications for what we think, feel, and believe.
The more intelligent you are, and the more educated, the more data at your disposal, the better you become at rationalizing and justifying your existing beliefs and attitudes, regardless of their accuracy or harmfulness.

When motivated to find supporting evidence, that’s all we look for.
When the bathroom scale gives us bad news, we reweigh ourselves a few times to make sure.
When it gives us good news, we step off and go about our day.

Reasoning isn’t logic.
Reasoning is often confused with reason.
Reasoning is coming up with arguments - plausible justifications for what you think, feel, and believe.

With no one to tell you that there are other points of view to consider, no one to poke holes in your theories, reveal the weakness in your reasoning, produce counterarguments, reveal potential harm, or threaten sanction for violating a norm, you will spin in an epistemic hamster wheel.
In short, when you argue with yourself, you win.

The function of reasoning is to argue your case in a group setting.

We are social animals first and individual reasoners second.

As part of a group that can communicate, every perspective has value, even if it is wrong.
Each person needs to contribute a strongly biased perspective to the pool.
We expect to off-load the cognitive effort to a group process.
The group will be smarter than any one person thanks to the division of cognitive labor.
People almost always get the wrong answers when reasoning alone.
In groups, however, they tend to settle on the correct answers in seconds.

If people couldn’t change their minds, argumentation would have long ago been tossed into the evolutionary dustbin.

Why does social media feel like soul poison?
If people are insulated from essential group dynamics, from outside perspectives, then individuals will essentially argue with themselves.
Groups who form because of shared attitudes tend to become more adamant and polarized over time.
When we wish to see ourselves as centrists but learn that others in our group take a much more extreme position, we realize that to take the middle position, we must shift our attitude in the direction of the extreme.
In response, people who wish to take extreme positions must shift further in that direction to distance themselves from the center.
This comparison-to-others feedback loop causes the group as a whole to become more polarized over time, and as consensus builds, individuals become less likely to contradict it.

The internet makes it easier to form groups around our biased and lazy reasoning; but it also exposes us to the arguments of those outside of our groups.
With all the arguing and all the bad ideas fighting one another, even if you remain silent, someone will voice something that resembles your private opinion, and someone will argue with them.
Even as spectators, we can realize when the weaknesses of our justifications have been exposed.

Psychology defines beliefs as propositions we consider to be true.
Attitudes are evaluations, feelings that arise about anything and thus influence our motivations.
Together, beliefs and attitudes form our values and goals.

The same message that persuaded one person would discourage another.
If elaboration leads to a positive evaluation of the reasoning behind an argument, persuasion will succeed.
If it leads to neutral or negative evaluation, the persuasion will fail.

For the unmotivated, more arguments, of any kind, even bad ones, made them more likely to be persuaded.
Instead of paying attention to the content of the arguments, they paid attention to their number.

When there’s a handy heuristic available, we will fall back on it.
We prefer to use simple cues to get the gist of any message, including persuasive ones.

Yes, people want to be correct.
It’s just that they think they already are.

It takes fewer calories to assume that if everyone around me says something is true, it is.
If I read it three times, it’s likely.
If it feels good, keep doing it.

In evaluating trustworthiness:
We ask ourselves if the speaker is an expert.
We look to see if the speaker is trying to trick us in some way.
And we look to see if the speaker agrees with the groups with which we identify.

But even from an untrustworthy source, if the argument is compelling, it will linger in your mind.
If we hear the same information in other formats or from other speakers, our reasons for discounting the ideas wash away, but the persuasiveness of the message remains.
Initial rejection is often followed by a ghostly increase in agreement over time.

A message becomes more impactful when paired with popular counterarguments.
Presenting your opponent’s arguments before they do demonstrates trustworthiness by revealing you respect the audience’s intelligence.
Which side should come first? The argument most in line with the audience’s current attitude.
“I know you don’t want to go to sleep, but you have to go to school in the morning”
... is far more effective than ...
“You have to go to school in the morning, so you better go to sleep.”

Frame messages as rhetorical questions.
“Wouldn’t it be nice if (X) was legal?” encourages people to produce explanations and justifications for their attitudes.
“Do you think (X) should be legal?” merely primes people to express them as conclusions.

Face-to-face messaging is by far the most effective channel.
We are biologically hardwired to respond to the human face.

Anthony Magnabosco practices a persuasion technique so effective that over the years he has gained an audience of thousands.
As people walk by, he asks if they would be open to challenging their deeply held beliefs.
Street epistemology is where you ask questions to explore a claim someone makes because they think it is true.
Pick a claim that motivates you to behave.

It could be devastating to think something that was a rock of comfort was an illusion.

You must condition yourself to believe, to feel better, to function in life.

Any downsides to thinking something is true if you don’t have good reasons?
I question myself daily, and I’d like to not do that.

Book by philosopher Peter Boghossian about how to question people using the Socratic method.

His YouTube page where he had uploaded hundreds of conversations:
As he uploaded his videos, people started giving feedback.
‘Try this.’ ‘Do that.’ ‘Why did you ask that?’
How can I take that and improve?
A five-minute conversation would become a week’s worth of discussion about how it could have gone better.

Encouraging a person to change their epistemology was changing their minds.
It’s changing something in their brains deeper than beliefs, attitudes, and values.
He doesn’t set out to change people’s conclusions about what is and is not true, or moral, or important.

Establish rapport. Assure the other person you aren’t out to shame them, and then ask for consent to explore their reasoning.
Ask for a claim.
Confirm the claim by repeating it back in your own words.
Ask if you’ve done a good job summarizing.
Repeat until they are satisfied.
Clarify their definitions.
Use those definitions, not yours.
Ask for a numerical measure of confidence in their claim.
Ask what reasons they have to hold that level of confidence.
Ask what method they’ve used to judge the quality of their reasons.
Focus on that method for the rest of the conversation.
Listen, summarize, repeat.
Wrap up and wish them well.

Guided metacognition:
To encourage a person to think about their own thinking, but only after they’ve already used their own reasoning to produce a claim and presented its justification.
They can assess their own methods, question their reasons, and evaluate the merits of their own arguments.

Anything that can be misconstrued as “you should be embarrassed for thinking that way” will be met with anger.

Empirical, fact-based claims.

We often aren’t actually arguing, because our definitions of the terms aren’t the same as theirs.
A suitcase word like ‘true’ or ‘faith.’ You must unpack it.

Ask them to put a number on their feeling of confidence, from zero to one hundred, so they can begin to step backward into their processing and ask themselves how sure they are about that feeling.

The other person should feel exhilarated for a chance to explain themselves.

The reason they give might be just the thing that came to their brain.
But it may not actually be the real reason.
There’s something else propping up their view.

Ask: ‘What if you discovered that was not a good reason for holding a high degree of confidence in what you think is true?’
And if they say, ‘My confidence wouldn’t change even if I discovered that reason wasn’t good’, you’ve now basically eliminated that reason as being part of the mix, and you can just sort of rinse and repeat that as many times as necessary until you hit the real reason.

Encourage people to test the reliability of the method they typically use to “judge the quality of their reasons.”
Then ask if, using that method, those reasons still support their current level of confidence.
“Could your method also be used to arrive at completely different and competing conclusions?”
and if so:
“What does that say about the quality of the method you’re using to arrive at your belief?”

Imagine someone has looked at the same evidence and reached a different conclusion, and now a third person is looking at both their arguments.
How would that third person determine which conclusion was true?

What was the definitive event that caused you to be as certain as you are today?
Do you use the same standards for counterevidence?

Stay in their head and out of yours.
Move at their speed.
Allow for pauses.
Use their meanings and their reasoning.
Stay in their head and out of yours.

Building rapport is the most important.
No one will become amenable to learning if us-versus-them feelings abound.

When the Space Shuttle Challenger exploded in 1986, psychologist Ulric Neisser had his class of 106 people write down how they heard, where they were, what they were doing, and what they felt.
Two and a half years later, he asked them these questions again, and only 10 percent got them all right.
But the interesting thing is not that their memories were faulty; it’s that they refused to accept that their memories could be faulty.
Even after looking at their own journals and seeing the truth, one student told Neisser, “That’s my handwriting, but that’s not what happened.”
When neurologist Robert Burton read about that study, he became fascinated with the very idea of certainty itself.
He wrote a book on the topic titled On Being Certain.

Beliefs and doubts are better thought of as processes, not possessions. Neurons delivering a sensation of confidence.

When canvassers employed deep canvassing without sharing their personal narratives, it no longer had any impact, while identical conversations that included storytelling still worked incredibly well.
It didn’t matter if the stories shared about the topic were your own or someone else’s, only that they involved someone affected by the issue at hand.
Even sharing a video of someone else telling a story was effective.
But removing this exchange from deep canvassing also removed all of its persuasive power.

Narrative transport is that feeling when you become so fully immersed in a story that you forget yourself for a moment.
For narrative transport to take place, a story must contain three features:
1. a component that keeps your attention from wandering
2. a component that consistently evokes strong emotional reactions
3. a component that evokes mental imagery

Why does transport persuade us so? Because it can eliminate counterarguing.
When we’re engaged with a story, we don’t prepare a rebuttal, because we feel swept up.
A story isn’t trying to change your mind.

“If I could produce a device right now, something like a button under a glass case, and I told you that if you flipped open the glass and pressed the button, you would change your mind on this subject, would you press it?”
“No. I would not.”

Always start by asking yourself why you want to change the other person’s mind.
Ask yourself, “Why is this important to you?”
Whatever your answer, ask again, and again.
Then share your answers with the other person.

The only way to win a debate is to avoid changing one’s own mind.
Only the “loser” of a debate learns anything new.

In 1939 men ranked “mutual attraction and love” as the fourth most desirable trait in a wife, and women ranked it fifth in a husband.
The most important trait? Women said they wanted “dependable character” in their partners, and men said they wanted “emotional stability” in theirs.
When that same research was repeated in 1977, “mutual attraction and love” had risen to number one for both men and women.

Imagine a group of people who need to get into a college classroom.
The classroom is empty, but the first person who shows up decides not to attempt to open the door and check.
The next person who shows up doesn’t want to make small talk with a stranger or make a fool of himself, so he also waits.
Now a third person shows up. Normally she’d have no problem checking to see if the classroom was empty, but since there are two people already waiting, she assumes they know something she doesn’t, and bases her behavior on their behavior.
She ignores her internal signal thanks to the strength of the external one.
Now we have a cascade beginning to unfold.
Each person decides to adopt the behavior of the others depending on their personal threshold for conformity.
With each additional person, the gumption required to break the cascade increases.
To break the cascade, someone must arrive who is braver and more willing to look silly by checking the door than anyone else in the crowd so far.

Social gatherings like parties often empty out without any group coordination.
One person who is tired or bored leaves.
But if there are other tired people with low thresholds, an early leaver can encourage them to do the same.

Now imagine the psychology behind classrooms and house parties (the two previous examples) taking place within clusters of people throughout a culture, persuading groups of friends and coworkers to quit smoking.
A cascade of change can spread culture-wide.
Innovation researcher Greg Satell, in his book Cascades:

For an idea to spread across a network in such a way that it flips almost everyone from thinking in one way to thinking in another, you don’t need thought leaders or elites.
The crucial factor is the susceptibility of the network.
If there are enough connected people with low thresholds across groups, any shock - any person - can start a cascade that will flip the majority of the population.
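The classroom, party, and smoking examples all follow a simple threshold model of cascades. A toy Python sketch (the thresholds are made up for illustration): each person joins once enough others already have, and whether the whole group flips depends entirely on whether someone with a low enough threshold is present.

```python
def cascade_size(thresholds):
    """Threshold model: each person joins once at least `threshold`
    others have already joined. Returns how many join in total."""
    joined = 0
    while True:
        # Count everyone whose threshold is met by the current crowd.
        now = sum(1 for t in thresholds if t <= joined)
        if now == joined:   # no one new joined this round; cascade stops
            return joined
        joined = now

# One brave person (threshold 0) can flip the whole group...
print(cascade_size([0, 1, 2, 3, 4]))   # -> 5
# ...but without them nothing happens: the network isn't susceptible.
print(cascade_size([1, 2, 3, 4, 5]))   # -> 0
```

Note that the same shock can fizzle partway: with thresholds [0, 0, 3, 3, 3], two people act and then the cascade stalls, which is why susceptibility of the network, not the size of the shock, is the crucial factor.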

Most people strongly reject the notion of randomness.
They’d rather believe the world turned out the way it did because that’s the way it was supposed to turn out.

For science to work, you must be willing to say you are wrong.
This is why we invented science in the first place.
Science is smarter than scientists, and the method is what delivers results over time.