When town hall participant Karl Becker asked the closing question in the second presidential debate, I was thrilled: it was the same question I had submitted to openquestionforum.org the week before. Some viewers may have laughed at the naiveté of our kindergarten-level suggestion to name something positive about your opponent. But not behavioral scientists.
It often feels impossible to change people’s minds on an issue. Most attempts to persuade backfire, widening the gulf between opposing groups into a chasm filled with toxic verbal sewage. You’d think educated 21st-century humans would consider evidence and adjust their views accordingly. But behavioral science shows that the more facts and evidence you bring to an argument, the more adversarial most people become, and the further off you push any reconciliation.
There are many names for this phenomenon: confirmation bias, motivated reasoning, and the backfire effect. Some of the earliest research into motivated reasoning demonstrated that two rival groups watching the same video can come away with opposing conclusions.
But what Karl and I were getting at is a tactic known as “affirmation.” It may be one of the only ways to begin to melt rigid opinions just enough to enable some flexible discussion. Here’s what it is and how it works — whether in politics, or at the office, or in negotiations.
When we hold a point of view on an issue, it is rarely a purely academic position devoid of emotion or meaning. Usually it helps define who we are, what we believe, and which group we belong to. When someone confronts us or challenges that belief, we feel at a subconscious level that they are challenging our identity, and our brains ready us for an assault on our self-esteem.
Two academics who have long studied this effect, Brendan Nyhan at Dartmouth College and Jason Reifler at the University of Exeter, discovered that if you tell people something positive about themselves, they become more amenable to changing their views on an issue. In their latest experiments, Nyhan and Reifler again find that “affirmation can make it easier to cope with dissonant information that one has already encountered about controversial misperceptions, relaxing people’s need to reject facts that could otherwise be threatening.”
When Drew Westen and his team of scientists looked into what is happening in the brain when you are challenged by evidence you may be incorrect, they discovered heightened activity in brain centers related to emotion, conflict, moral judgments, and reward and pleasure, but little activity in the area of the brain most closely associated with rational thought.
So our identity and beliefs are threatened, and our brain signals that the new, conflicting evidence may be painful. Meanwhile the brain sends a reward when we suppress this threatening evidence. These are powerful inhibitors to changing our minds.
Dan Kahan, who runs the Cultural Cognition Project at Yale, also studies this phenomenon and suggests that the only way forward is a kind of “disentanglement.” Cultural cognition maps people onto a grid of beliefs and worldviews. When an outsider’s point of view challenges your worldview, you reject it immediately. It is only when you can disentangle the evidence from your identity that you may make headway.
Evidence is beginning to point to modest success in bridging opposing groups by first affirming them (saying something nice, or at least priming them positively) and then disentangling their identity from the issue itself.
Kindergarten questions can be rather powerful, after all.
Christopher Graves is a recent Rockefeller Foundation Bellagio Resident honoree for behavioral science, Global Chairman of OPR, and chair of the PR Council.
This post originally appeared in the Harvard Business Review.