Robert Bogue
September 29, 2025
There are some friends who, when they suggest a book, you read it. Such was the case with How Minds Change: The Surprising Science of Belief, Opinion, and Persuasion. I wasn’t disappointed. For context, one of the most challenging problems I believe we face today is our inability to stay in a conversation until we can understand each other. It’s particularly difficult with people who hold views radically different from ours – but it seems challenging even when our perspectives aren’t that far apart.
Quoting Hugo Mercier from The Enigma of Reason, David McRaney explains that we evolved to reach consensus. However, a long list of authors has shared how this fundamental feature of our evolution is under attack. Cass Sunstein in Going to Extremes explains how we’re moving to more divisive beliefs. Ezra Klein explains Why We’re Polarized. Buster Benson asks Why Are We Yelling? Harriet Lerner questions Why Won’t You Apologize? Douglas Stone, Bruce Patton, and Sheila Heen try to help us learn how to have Difficult Conversations.
Forms of Verbal Communication
There has been a lot of work on changing people’s minds. Therapists try to do this every day to serve their patients. The Heart and Soul of Change illuminates how some of what therapists do works and some doesn’t. Science and Pseudoscience in Clinical Psychology similarly separates what is supported by science from what isn’t. Motivational Interviewing explains how an effective tool for encouraging personal change works.
There are other guides for parents, including How to Talk So Kids Will Listen & Listen So Kids Will Talk and Parent Effectiveness Training. John Gottman in The Science of Trust speaks about how to create conversations that foster continued connection and influence in an intimate relationship context. Among other challenges, he cautions against a harsh startup.
Consistently, a curious stance helps us engage in conversation and keep the discussion going. Even in negotiations or debate, a soft, open, and friendly approach is generally called for at first.
The best possible case for face-to-face communication is Dialogue, which William Isaacs describes as an elevated state. Too often, we find ourselves convinced of our own righteousness to the point that we can’t create the conditions that encourage the changing of minds. Entering from an open, curious stance is a good starting point for helping change minds – whether it’s our mind or someone else’s.
Post-Truth
1+1=2. 1+2=3. These and other truths are ones I learned in elementary school or even earlier. There were some bedrock truths that we all believed in. However, in the world today, we are willing to believe whatever we want. We’ll accept our feelings as facts even when they contradict reality. I tend to use the Flat Earth Society and its members as my prototypical example of believing the ridiculous. I didn’t know at the time that David McRaney (the book’s author) had contributed to the documentary Behind the Curve. In the documentary, flat Earthers run experiments that prove the Earth is not flat – yet they continue to believe that it is.
It’s not that they don’t believe in the scientific method – they use it themselves to conclusively prove the Earth isn’t flat. They distrust the organizations that claim the Earth isn’t flat. While we describe the era we’re in as “post-truth,” the truth is that we’re in an era of post-trust.
Chuck Underwood, in America’s Generations, makes the point that Generation X grew up in a time when it became apparent that institutions couldn’t be trusted. From church and scout sex scandals to corporate collusion, the sense that institutions could be trusted collapsed into a pile of rubble around our feet. Even secrets we expect, like hidden bunkers to protect our representatives in the federal government, hit us like a ton of concrete below a hotel in West Virginia. (See The Cold War Experience.) Since then, the foundations of our trust in institutions, relationships, and even ourselves have eroded to the point of being unable to protect against the onslaught of opinions.
Our brains have been overwhelmed with information for decades. We’ve relied on institutions to help us cope by filtering, summarizing, and condensing information into little packets we could handle. Daniel Levitin in The Organized Mind, Clay Johnson in The Information Diet, and Laura van Dernoot Lipsky in The Age of Overwhelm, among other works, tell us that we’re consuming more information in a month than our grandparents did in their lifetimes – and we’ve not evolved for that.
Trust is complicated. That’s why I wrote two master posts, Trust => Vulnerability => Intimacy and Trust => Vulnerability => Intimacy, Revisited. It’s unlikely we’re going to regain trust in institutions easily or soon. It’s more likely that our trust in institutions has been permanently damaged. It’s like we’re walking between bombed-out buildings that once held simple answers. Now, things are complicated in ways that we have no desire to comprehend.
Goose Trees
“Nature texts going back to the 1100s describe mysterious goose trees with odd fruits from which, they said, birds would form, hatch, dangle, detach, and fly away.” What, to us, seems patently ridiculous was a popular belief for hundreds of years. Why? On the expert side, no one understood migration, and they had no better explanation for how a bird came to be when they never saw a nest. They simply assumed there had to be a tree somewhere. The general public had no reason to doubt the experts. It didn’t really matter; it didn’t impact their daily lives.
The goose trees represent a belief – but not knowledge. Beliefs aren’t necessarily true, but knowledge should be. If we want to change someone’s mind, we’ve got to help them see that while their beliefs may be true, there are also reasons why they may not be.
The real problem is that most people don’t “know” they’re right. They “feel” they’re right. Just as Lisa Feldman Barrett mistook illness for love, we can mistake a feeling of knowing for actual knowledge. (See How Emotions Are Made for her story.) In my review of The Enigma of Reason, I explained how people believe they understand automobiles until you start asking detailed questions. We can feel we know more, or are better, than we really are, just as Thomas Gilovich explains in How We Know What Isn’t So. Robert Burton in On Being Certain directly tackles the issue that what we know and what we think we know are very different.
Dogma and Distortions
Dogma can be anything. It can be the belief that terrorism doesn’t happen on US soil. These beliefs can be very difficult to confront because they’re undiscussable – and undiscussable items are relatively impervious to change. Discussing terrorism on US soil on September 10, 2001, would have been difficult. Nothing like it had ever happened. On September 12, there was no question. The Black Swan event happened, and in doing so, it destroyed the dogma. However, we don’t want tragedies to be what it takes for people to change their dogma.
The whole point of How Minds Change is to understand not how weakly held beliefs change but how persistent beliefs change. It explains that it’s often the most deeply held beliefs that change most rapidly. By looking at different approaches that seem to change these beliefs, we can find commonalities and tools that can be used to drive change regardless of the situation or how deep the beliefs go.
One of the possible explanations for why dogmatic beliefs change so quickly is that dogmatic beliefs normally distort the perception of events around them. We accept experiences that confirm the beliefs – and the related beliefs – while rejecting experiences that disagree, until we can no longer do so. This is, as I described in my review of Going to Extremes, “Mount Must.” The view from there is different.
We can no longer accept views that aren’t true. We can no longer ignore our literal blind spots. (See Incognito for more.) The distortions melt away when we can accept core truths. However, getting to the core truths isn’t easy, as A Manual for Creating Atheists explains.
Self-Sealing Arguments
Anosognosia is real. It’s a condition that hides awareness of the condition itself. The clinical accounts are striking. From the mild explanations generated by Michael Gazzaniga’s subjects with a severed corpus callosum to more intense experiences, it’s a real phenomenon. (See Noise, Incognito, and The Honest Truth About Dishonesty for more about Gazzaniga’s experiments.) The distressing phantom limb syndrome is another variant of this problem, where people struggle with sensations from limbs that are missing. (See The Tell-Tale Brain, Descartes’ Error, and Capture for more.)
One argument used to imprison people in psychiatric hospitals is that they’re unable to understand their own illness because of anosognosia. (See Insane Consequences and Your Consent Is Not Required.) The problem is that often there is no evidence presented for anosognosia. An “expert’s” testimony is all it takes for a judge or a medical professional to decide that the anosognosia is real, and therefore the person isn’t able to make decisions for their own care.
Self-sealing arguments are often caused by dogmatic beliefs and the distortion that surrounds them, but that’s not always the case. Sometimes, the arguments are self-sealing simply because that’s how the person has learned to construct arguments.
Protecting Our Psychological Selves
We protect ourselves instinctively. We’ll take actions to protect our physical selves, like bracing for impact, in ways that happen too fast for conscious thought. However, these defenses operate not just for our physical selves but for our psychological selves as well. In The Ego and Its Defenses, we learn of 22 major and 26 minor psychological defenses. Others in the psychoanalytical tradition, including Anna Freud, came up with different lists and counts, but the multiplicity of our defenses and the automatic nature with which we deploy them is universal. Leadership and Self-Deception calls it being “in the box,” and our defenses are the tendency to bring others into the box.
When we threaten beliefs that are dogma, we can activate these defenses. Dogmatic beliefs have no basis in direct evidence; we believe them because others believe them.
Persuasion
Much of helping people change their minds depends on context. If you’re working through mass media, The Hidden Persuaders speaks of marketing approaches – as do books like Demand and Guerrilla Marketing. Robert Cialdini in Pre-Suasion and Influence and the authors of Split-Second Persuasion, Changing Minds, and Nudge speak about techniques to change people’s minds – but their focus isn’t on deep-seated beliefs. It’s on preferences and things that are more malleable to gentle persuasion.
Sometimes, the belief or behavior is a bit more stubborn. Motivational Interviewing is a tool used by substance abuse counselors to help people break free from harmful addiction. This technique forms the underpinning of the approaches McRaney recommends – along with the work of another researcher.
Leon Festinger’s A Theory of Cognitive Dissonance moves into the world of disconnects between attitudes and behaviors – the place where deeply held beliefs sometimes live. (See also On Being Certain and Beliefs, Attitudes, and Values.) Festinger provides techniques and approaches to place a wedge between the belief and the behavior.
Comfortably Uncomfortable
There’s a delicate balance that we need to find to help people change their minds. We’ve got to get them comfortable enough to be open to listening. That means psychological safety, like Amy Edmondson describes in The Fearless Organization. This enables them to move from precontemplation to contemplation in the transtheoretical model. (See Changing for Good and the Stages of Change model.) However, a certain degree of discomfort is needed to induce change. This is the challenge that Marsha Linehan directly confronted in the development of dialectical behavior therapy. (See Cognitive Behavioral Treatment of Borderline Personality Disorder.)
Approaches
McRaney studied and recommends three strategies that work for deeply held beliefs.
Street Epistemology
Street epistemology is the study of why people believe what they do – often with a desire to change those beliefs. Though this work started with Peter Boghossian, McRaney didn’t learn it from him directly. I mention this because Boghossian is the author of A Manual for Creating Atheists and co-author of How to Have Impossible Conversations. Boghossian and the community seem to have diverged: while he’s acknowledged as its founder, the community appears to have evolved in a different direction.
So, what are the steps of street epistemology (SE)? They are:
- Establish Genuine Rapport
- Identify a Specific Claim to Explore
- Gauge Confidence
- Explore Reasons
- Examine Quality of Reasoning
You’ll notice that there’s no explicit challenge to the person’s belief. The objective is to plant a seed of doubt around a core belief rather than to directly convert someone – even if conversion is the real objective. Deep canvassing (DC) takes SE to the next level.
Deep Canvassing
With SE, a claim is picked based on the interests of the person you’re speaking with, since the goal is simply to understand their thinking. DC has a specific, targeted issue it wants to address – and thus the issue itself is set. DC is also focused on moving the person’s position on that issue if possible.
The focus allows for the development of a targeted, personal story that lays out an emotional reason that supports the desired point of view. The story is then used as an icebreaker to see if they’re willing to move their perspective.
The basic framework for DC is:
- Establish rapport.
- Ask how strongly they feel about an issue on a scale of one to ten.
- Share a story.
- Ask a second time how strongly they feel. If the number moved, ask why.
- Ask, “Why does that number feel right to you?”
- Once they’ve offered their reasons, reflect back your summary and ask if you got it right. Repeat until they are satisfied.
- Ask if there was a time in their life before they felt that way, and if so, what led to their current attitude.
- Listen, summarize, repeat.
- Briefly share your personal story of how you reached your position, but do not argue.
- Ask for their rating a final time, then wrap up and wish them well.
As you can see, it has many of the same key concepts as SE, but it has more details because of the explicit focus on changing perceptions.
Smart Politics
Started as a framework for conversations between liberals and conservatives, Smart Politics follows a familiar pattern. Using the example of vaccine hesitancy, the steps are:
- Build rapport. Assure the other person you aren’t out to shame them and then ask for consent to explore their reasoning.
- Ask: On a scale of one to ten, how likely are they to vaccinate?
- If one, ask: Why would other people, who aren’t hesitant, be higher on that scale? If above one, ask: Why not lower?
- Once they’ve offered their reasons, repeat them back in your own words. Ask if you’ve done a good job summarizing. Repeat until they are satisfied.
As with SE and DC, the goal is to build rapport, ask for a rating on an issue, and then get the person to progressively explain – and thereby expose any logical inconsistencies and resolve them. It’s like Leon Festinger’s A Theory of Cognitive Dissonance meets James Pennebaker’s work on writing things out. (See Opening Up.) When we listen empathetically, people often start to become more open to change – and that is How Minds Change.