This article first appeared on Freakonomics Radio.

The interesting takeaways for LGBTI campaigners:

  • Information gets filtered through people’s existing views: the same piece of information, explainer video, real-life story, etc. will be used to REINFORCE attitudes, including negative ones, rather than to challenge them. It was already pretty clear that information alone doesn’t change people; this article suggests it might even be counter-productive!
  • People live in closed social circles with people who are like them, so influencing others is increasingly difficult. OK, we all know this. But while it will be near impossible to get homo/transphobic people into an LGBTI-supportive group, it is much easier to get them into an unrelated group (say, on fashion, make-up, cooking, or traditional handicraft) that will be LGBTI-supportive when the time comes. It’s much more beneficial to invest in finding these groups than in trying to get your target group to come to your “obvious” platform.
  • “The equivalent of 10-year-old lab rats hate broccoli as much as 10-year-old humans do. In late adolescence, early adulthood, there’s this sudden craving for novelty … And then by the time you’re a middle-aged adult rat, you’re never going to try anything new for the rest of your life.” Don’t waste your time; concentrate on the period in life when we’re hard-wired to explore new territory (physical or mental).
  • One of the key barriers to change is people’s overconfidence in their own opinions. Rather than avoiding the conversation or “busting the myths” by supplying the “correct information” yourself, ask people to explain their own attitude: they will start losing confidence, because chances are they won’t be able to come up with anything totally convincing. And once they are off balance, change can happen.

Full article:

Here’s an interesting fact: legislators in several Republican-controlled states are pushing to eliminate the death penalty. Why is that interesting? Because most Republicans have typically been in favor of the death penalty. They’ve said it’s a deterrent against the most horrific crimes and a fitting penalty when such crimes do occur.

But a lot of Republicans have come to believe the death penalty does not deter crime — which happens to be an argument we offered evidence for in Freakonomics. They also say the lengthy legal appeals on death-penalty cases are too costly for taxpayers. Some Republicans also cite moral concerns with the death penalty. So, a lot of them have changed their minds.

We’ve all changed our minds at some point, about something. Maybe you were a cat person and became a dog person. Maybe you decided the place you lived, or the person you loved, or the religion you followed just wasn’t working for you anymore. But changing your mind is rarely easy. Although if you’re like most people, you would very much like other people to change their minds, to think more like you. Because, as you see it, it’s impossible for the world to progress, to improve unless some people are willing to change their minds.

On this week’s episode of Freakonomics Radio: how to change minds, or at least try to.


Robert Sapolsky is a professor of neuroscience at Stanford University. He describes himself as half-neurobiologist and half-primatologist; he studies both neurons in petri dishes and wild baboons in East Africa. Sapolsky has a lot of experience with changing his mind. He was raised as an Orthodox Jew before he decided, at age 14, that “[t]here’s no God, there’s no free will, there is no purpose.” He used to be a classical music snob; then he married a musical-theater fanatic and director. Today, he often serves as rehearsal pianist for his wife’s productions.

Sapolsky has noticed something about mind-changing: it’s easier to do when you’re younger. In a survey he put together to look at people’s preferences in food, music, and so on, Sapolsky found that people do indeed become less open to novelty as they get older. Someone who hasn’t eaten sushi by age 35, for example, likely never will. He also found that humans are not the only animals that exhibit this behavioral pattern.

“[Y]ou take a lab rat and you look at when in its life it’s willing to try a novel type of food — and it’s the exact same curve!” Sapolsky says. “The equivalent of 10-year-old lab rats hate broccoli as much as 10-year-old humans do. In late adolescence, early adulthood, there’s this sudden craving for novelty … And then by the time you’re a middle-aged adult rat, you’re never going to try anything new for the rest of your life.”

There are a lot of reasons why it may be easier to change your mind when you’re younger. It could be the fact that your brain is simply more plastic then — something scientists assumed for a long time but now are starting to question. Or it could be that your positions are less entrenched, so it’s less costly to change them.

Or it could be that the stakes are lower: the fate of the world doesn’t hinge on whether you are pro-broccoli or anti-broccoli. But as life goes on, as the stakes rise, changing your mind can get more costly.

Several years before the United States invaded Iraq, the political scientist Francis Fukuyama signed onto a letter in support of such a move. At the time, Fukuyama was well established as a prominent political thinker. In addition to writing a landmark book, he had done two stints in the State Department. So his views on the Iraq War were taken seriously.

But as the invasion drew near, Fukuyama started to have second thoughts.

“My main concern was whether the United States was ready to actually stay in Iraq and convert it into a kind of stable, decent country,” Fukuyama says. “But even I was astonished at how bad the planning had been, and how faulty the assumptions were, that we were going to be greeted as liberators and that there would be a rapid transition just like in Eastern Europe to something that looked like democracy.”

In February of 2004, Fukuyama attended a dinner at the American Enterprise Institute, a conservative think tank in Washington, D.C. The featured speaker was Dick Cheney. The crowd greeted the then-vice president with a big round of applause.

“And I just looked around at the people at my table and I said, ‘Why are these people clapping?’” Fukuyama says. “Because clearly this thing is turning into a huge fiasco. And that’s the moment that I decided, you know, these people are really nuts. I mean, they’re so invested in seeing this as a success that they can’t see this reality that’s just growing right in front of their eyes.”

Fukuyama paid a heavy price for his change of heart on the Iraq War. He was seen as having abandoned the neoconservative movement and lost close friends in the process. But to this day, he is surprised that so many of the war’s supporters remain unwilling to admit it was a mistake.


There’s another factor that may contribute to our reluctance to change our minds: overconfidence — our own belief that we are right, even in the absence of evidence. Just how much unearned confidence is floating around out there?

Consider a recent study by Julia Shvets, an economist at Christ’s College, Cambridge, who studies decision-making. She and some colleagues surveyed over 200 managers at a British restaurant chain. The managers averaged more than two years on the job, and their compensation was strongly tied to a quarterly performance bonus. The managers were asked to recall their past performance and to predict their future performance.

Shvets found that only about 35% of the managers could correctly say whether they fell in the top 20% of all managers, the bottom 20%, or one of the 20% blocks somewhere in the middle. Forty-seven percent of managers were overconfident about their standing.

And these were people who had detailed feedback about their performance every quarter, which is a lot more than most employees get. How could this be? This is where memory comes into play, or maybe you’d call it optimism — or delusion.

“People who did worse in the previous competition tended to remember slightly better outcomes. People seem to be exaggerating their own past performance in their head when this performance is bad,” Shvets explains. “So what we conclude from this is that people, when given information about their past performance, use memory selectively. They remember good outcomes and they tend to forget bad ones.”

So maybe it’s not so much that people refuse to change their minds — or refuse to “update their priors,” as economists like to say. Maybe they just have self-enhancing selective memories.
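
For readers unfamiliar with the economists’ jargon, “updating a prior” is just Bayes’ rule: you start with a belief, weigh it against new evidence, and end up with a revised belief. The toy sketch below, a hypothetical coin example not drawn from any of the studies in this article, shows the mechanics in code, and also why people who see the same evidence can still end up in very different places if they start from different priors.

```python
# A hypothetical illustration of "updating a prior" with Bayes' rule.
# Question: is a coin biased toward heads (p = 0.7) or fair (p = 0.5)?
# Three observers start with different prior beliefs that it is biased,
# then all see the same evidence: 7 heads in 10 flips.

from math import comb

def likelihood(heads, flips, p):
    """Probability of seeing `heads` heads in `flips` flips when the
    per-flip heads probability is `p` (binomial likelihood)."""
    return comb(flips, heads) * p**heads * (1 - p)**(flips - heads)

def posterior_biased(prior_biased, heads=7, flips=10):
    """Posterior probability that the coin is biased (p = 0.7),
    given a prior belief and the shared evidence."""
    like_biased = likelihood(heads, flips, 0.7)
    like_fair = likelihood(heads, flips, 0.5)
    numerator = prior_biased * like_biased
    denominator = numerator + (1 - prior_biased) * like_fair
    return numerator / denominator

for prior in (0.1, 0.5, 0.9):
    print(f"prior {prior:.1f} -> posterior {posterior_biased(prior):.2f}")
```

Run it and the sceptic (prior 0.1) lands at roughly 0.20, while the believer (prior 0.9) lands at roughly 0.95: the same seven-out-of-ten evidence nudges everyone in the same direction, but nowhere near to the same place.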


So there are a lot of reasons why a given person might be reluctant to change their mind about a given thing: selective memory, overconfidence, or the cost of losing family or friends. But let’s say you remain committed to changing minds — your own or someone else’s. How do you get that done? The secret may lie not in a grand theoretical framework, but in small, mundane objects like toilets, zippers, and ballpoint pens.

Steven Sloman, a psychology professor at Brown, conducted an experiment asking people to explain — not to reason about, but to actually explain, at the nuts-and-bolts level — how something works.

Chances are, you probably can’t explain very well how a toilet or a zipper or a ballpoint pen works. But before you were asked the question, you would have thought you could. This gap between what you know and what you think you know is called the “illusion of explanatory depth.” It was first demonstrated by the psychologists Leonid Rozenblit and Frank Keil.

“[P]eople fail to distinguish what they know from what others know,” Sloman says. “We’re constantly depending on other people, and the actual processing that goes on is distributed among people in our community.”

In other words, someone knows how a toilet works: the plumber. And you know the plumber; or, even if you don’t know the plumber, you know how to find a plumber.

You can see how the illusion of explanatory depth could be helpful in some scenarios: you don’t need to know everything for yourself, as long as you know someone who knows someone who knows something. But you could also imagine scenarios in which the illusion could be problematic, such as in the political domain.

Sloman and his collaborator Philip Fernbach basically repeated the Rozenblit and Keil experiment, but instead of toilets and zippers, they asked people about climate change and gun control. Unsurprisingly, most people weren’t able to explain climate change policies in much detail. But here’s what’s interesting: people’s level of confidence in their understanding of issues — which participants were asked to report at the start of the experiment — was drastically reduced after they tried, and failed, to demonstrate their understanding.

“It reduced the extremity of their confidence that they were right,” Sloman says. “In other words, asking people to explain depolarized the group.”


Matthew Jackson, an economist at Stanford who studies social and economic networks, used to believe that different people, given the same kind of information, would make decisions the same way, regardless of past experiences and influences.

That, however, is not what Jackson’s research suggests. In one experiment, Jackson had a bunch of research subjects read the same batch of abstracts from scientific articles about climate change. He found that people reading the same articles could interpret the articles very differently, depending on their initial positions.

In fact, information, far from being a solution, can actually be weaponized.

“There was a group of about a quarter to a third of the subjects who actually became more polarized, who interpreted the information heavily in the direction of their priors, and actually ended up with more extreme positions after the experiment than before,” Jackson says.

In other words, a person’s priors — which are shaped by previous experiences, influences, and social networks — play a big role in shaping current beliefs and decision-making processes. Steven Sloman, the Brown professor, thinks that the third factor, social networks, is particularly important.

“[W]e believe what we do because the people around us believe what they do,” Sloman says. “This is the way humanity evolved. We depend on other people.”

So if our beliefs are shaped by the people around us, one antidote to inflexible thinking is, simply, balance. Unfortunately, a great many of us are quite bad at creating diverse, well-balanced networks. People are prone to surrounding themselves with people just like them.

“We end up talking to people most of the time who have very similar past experiences and similar views of the world, and we tend to underestimate that,” Matthew Jackson says. “People don’t realize how isolated their world is. You know, people wake up after an election and are quite surprised that anybody could have elected a candidate that has a different view than them.”

You can find the full Freakonomics Radio episode, “How to Change Your Mind,” at Freakonomics.com. You can also listen on Stitcher, Apple Podcasts, or any other podcast platform.

WRITTEN BY

Stephen J. Dubner / Freakonomics Radio

Stephen J. Dubner is co-author of the Freakonomics books and host of Freakonomics Radio.