How to Counter Your Audience’s Preexisting Beliefs
Confirmation bias is the universal tendency of human beings to hang on to what they already believe in the face of evidence to the contrary. I’m not talking about intentional bias—consciously building a biased case in hopes of winning an argument. Confirmation bias is unintentional. It’s how we win our internal arguments, how we convince ourselves we’re right.

Since this is a risk communication column, I want to focus here on the implications of confirmation bias for risk communicators. Your audience members are sure to filter your warnings and reassurances through their own preexisting opinions about what’s safe and what isn’t, resisting anything you say that tries to change their views. How should this fact affect your messaging?
There’s a second part to this column, postponed to a later issue, on ways to minimize your own confirmation bias—ways to be more open to challenging information.
Confirmation bias is a system of defenses aimed at protecting you—and me, and everybody—from uncomfortable information. Here are some of its key components.
Selective exposure and selective attention are our first lines of defense against information we don’t want to know about. We try not to encounter messages that we disagree with, and if we run into them by accident we try to tune them out.
When we fail to tune out messages we don’t agree with, selective perception is a key unconscious strategy for avoiding their meaning. We simply misperceive them.
Closely related to selective perception is framing. We see new information through the frame of what we know or believe already. When a U.S. public health official states that she expects “local outbreaks” of Zika virus disease, for example, some in the audience picture widespread, devastating outbreaks, while others picture a few very small ones. Their prior opinions about how much Zika we’re likely to experience are the frame through which they perceive what the official meant by “local outbreaks.”
Selective interpretation is more conscious than selective perception and framing. If we possibly can, we find a way to interpret—opponents would say “misinterpret”—messages so they don’t challenge our preconceptions. Suppose you asked Trump and Clinton supporters to listen to a speech by either candidate, and then to tell you what they heard. In addition to a lot of selective perception, their answers would reflect a lot of selective interpretation as well. “What he (or she) really meant was….”
Our final defense is selective retention. If we can’t avoid or ignore or misperceive or misinterpret the messages that tell us we’re wrong about something, we forget them or misremember them. Try sitting down with someone you had a fight with last week to reconstruct who said what. Your recollections will be quite different, and both will be self-serving.
Not every risk message encounters confirmation bias. Sometimes your audience has absolutely no preexisting opinions, attitudes, values, or expectations relevant to your topic. You’re talking to the proverbial tabula rasa (“blank slate”), and the main barrier to the audience absorbing your message is probably apathy.
When your message is actually of interest to your audience, on the other hand, they’re likely to be testing that message against what they already know and believe and feel—and confirmation bias will rear its ugly head. Employees whose years of accident-free work have taught them they don’t need to adhere to safety procedures will deploy these defenses against your safety messaging. Neighbors whose outrage about your facility’s emissions tells them the facility is causing cancer in the community will deploy them against your reassurances. How can you overcome their confirmation bias?
The most important implication of confirmation bias for risk communicators—or any communicators—is this: if you possibly can, confirm something. Don’t challenge your audience any more than you absolutely have to.
I’m not saying you should change your core message. It is what it is. But look for ways to reframe your core message so it is more compatible with your audience’s preexisting opinions, attitudes, values, and expectations.
Decades ago, many construction workers and others in hazardous occupations resisted wearing hardhats, in part because they were proud of their courage and competence, and felt the hardhat requirement called both into question—needlessly so, since their experience had taught them they were capable of avoiding injury without any head protection. Strict rules forbidding the use of hardhats in low-risk parts of the worksite turned the meaning of the safety gear on its head (so to speak). The new meaning: only workers skilled enough and brave enough to work in dangerous places wear hardhats.
In precaution advocacy—urging people to take a risk more seriously—the single most basic principle is to find something in your audience that already predisposes them to do what you want them to do … and build on that. If you’re lucky, you’ll find something substantively relevant to build on. But it’s better to build on preexisting opinions, attitudes, values, and expectations that are only obliquely relevant or even totally irrelevant than to build on nothing at all.
If you can’t convince parents to vaccinate their children because infectious diseases are a lot more dangerous than vaccines, maybe you can convince them to do so because they don’t want to offend their neighbors. Or because they don’t want to be different from everybody else. Or even just because a rock star they admire is a big vaccination proponent.
The substantively relevant reasoning often comes later. You get me to vaccinate my kids because I want to be like my favorite musician. Understandably, I feel a little weird about having made such an important parenting decision for such an irrational reason. (This weird feeling is called cognitive dissonance.) So I start looking for evidence that vaccination is good. I’m still under the sway of confirmation bias. But what I want to confirm has changed. Now instead of seeking to confirm my former opinion that vaccines are dangerous, I’m motivated to confirm that I was right to vaccinate my kids last week. So your pro-vaccination messages are no longer messages I’m trying to avoid, ignore, misperceive, misinterpret, or misremember. They have become messages that can help me feel better about myself.
When you’re trying to calm people who are excessively upset about a risk (outrage management, in my jargon), your confirmation bias problem is pretty much the same. Suppose I think your factory threatens my children’s health. You’re confident I’m wrong on the merits. But you can’t just tell me that “the science” says I’m being stupid.
Knowing that confirmation bias is more powerful when people are in the grip of strong emotions, you should make time to listen to me vent, so I get a little calmer and a little more willing to hear what you’ve got to say. Then when it’s finally your turn to speak, you should validate my valid concerns. If you can’t agree with me that your dimethylmeatloaf emissions are killing my kids (because you’re convinced they’re not), you can find other things to agree with me about—perhaps that your company shouldn’t have stonewalled my demands to know how much dimethylmeatloaf you emit.
One of the core confirmation bias lessons for outrage management: use two-sided rather than one-sided messaging. One-sided messages are fine, at least in the short term, when your audience is uninterested and uninformed, and likely to remain so. But outraged audiences are by definition highly interested, even obsessed. And they’re highly informed, though their information (thanks to confirmation bias) has been cherry-picked to favor their outraged conviction. So you need to use two-sided messaging. You need to acknowledge everything they’re right about—their half, or their ten percent, or even their one-tenth of one percent of the truth.
For both precaution advocacy and outrage management, here’s the confirmation bias bottom line: don’t disagree more than you have to. My clients pick a lot of unnecessary fights with their stakeholders, arousing confirmation bias defenses they didn’t need to arouse.