Posted: December 29, 2001
Article Summary: Accustomed to naturally occurring diseases, the U.S. Centers for Disease Control and Prevention (CDC) had a difficult time coping with the anthrax bioattacks of late 2001. Since risk communication was one of its core problems, it asked me to come to Atlanta and help. This four-part “column” grew out of my Atlanta notes. If the CDC was adjusting to bioterrorism, so was I. I put aside my usual outrage management recommendations and developed 26 recommendations specifically on the anthrax crisis. These became the basis for my (sadly) expanding work in crisis communication, and for the crisis communication CD-ROM and DVD Jody Lanard and I ultimately put out in 2004. This column was my first extended discussion of most of these crisis communication recommendations, and it is my only published assessment of the CDC’s anthrax communication efforts.

Anthrax, Bioterrorism, and
Risk Communication:
Guidelines for Action

(Page 2 of 4)

4. Share dilemmas.

Dilemma-sharing is to the future what acknowledging uncertainty is to the past and present. It’s acknowledging that you’re not sure what to do.

This is not the same as claiming to have no idea what to do, or claiming not to have considered what to do. Of course you’ve worked on the problem, and of course you have some ideas. But if you’re not sure, say you’re not sure – and say why. “We’re trying to decide between X and Y. X has these advantages and disadvantages. Y has those advantages and disadvantages. We ruled out Z easily for the following reasons – but the choice between X and Y is a tough choice.”

You can do this when you haven’t decided – and ask for guidance from your stakeholders. But dilemma-sharing is just as important when you had to make the decision and you did – to make it clear that it was a tough decision, that you know the choice you didn’t make has merit. This has several advantages: (a) Those who favor the losing choice feel respected; their preferred option got its due consideration. (b) Those who favor the losing choice can’t easily pretend that they are obviously right, when you’re saying it’s not obvious at all who’s right. (c) Those who want to depend entirely on your judgment now and blame you later if you were wrong are forced to recognize that you’re not God and not claiming to be God; that you’re not sure.

In terms of the seesaw, dilemma-sharing is a way of moving to the fulcrum.

Some dilemmas will always be dilemmas; ethical questions, for example, don’t ever get answered. But scientific dilemmas do eventually get answered. In these cases, sharing the dilemma can mean predicting that you will make some mistakes: “We’re going to do X rather than Y for the following reasons. We may turn out wrong.” Or it can mean refusing to decide until the evidence is better: “At this point X and Y look about equally justified. The data aren’t clear enough to choose between them. Patients should consult with their own physicians and make their own choices.” Dilemma-sharing goes down a bit easier in the first scenario than in the second. It is often possible to state a preference, even while explicitly insisting that the science isn’t there to justify it. Or you can offer an algorithm: “If people are more worried about A, they’ll probably want to do X, but those who are more worried about B will probably want to do Y.” The purest form of dilemma-sharing – describing the situation as a toss-up and providing no guidance whatever – is also the most painful.

Among the dilemmas CDC had to face during the 2001 anthrax crisis were these: deciding how strenuously to discourage the public from stockpiling antibiotics; deciding which individuals to test, which to medicate, when to stop; deciding which buildings to test, which to close, how to clean, when to reopen. In the early days, I think, CDC tended to come across as a little more confident than it should have been about these decisions. When the agency got around to dilemma-sharing, it sounded like a change in position – even when it wasn’t.

Here’s a nice example of dilemma-sharing from the November 6, 2001 New York Times. I don’t know who is quoted in this paragraph, but the subject is cleaning the Hart Building: “‘It’s a totally new paradigm and so we’re a bit panicked about it until we develop solutions,’ said a senior federal health official. Ultimately, the official said, the potential for such microbial assaults and subsequent spread of spores should decline.” The reassuring second sentence is all the more reassuring coming as it does from an official who is comfortable confessing that he’s “a bit panicked” trying to figure out how to get rid of the spores.

Dilemma-sharing is hard because it goes against everybody’s grain. Scientific sources usually would rather reach their best judgment, however tentative, and then claim confidence. And the audience usually would rather be told what to do by a confident scientific source. We all tend to get overdependent in a medical emergency, whether the crisis is personal or a matter of public health. Unfortunately, that doesn’t keep us from wreaking vengeance if an expert gives us overconfident advice that turns out badly. For the expert, therefore, the choice is clear: Irritate your audience now by acknowledging uncertainty and sharing the dilemma; or claim omniscience now and risk paying a far higher cost later in outrage and lost credibility.

A month or so after I presented these ideas at CDC, the agency faced a classic opportunity for dilemma-sharing – and took it in its purest and most difficult form. (As always, I have no idea how much of this was my influence; there were plenty of other factors to credit or blame.) The issue was what to do with the cohort of individuals who were just finishing their sixty-day course of antibiotics after possibly being exposed to anthrax. CDC had based the original sixty-day regimen on the only research it had – decades-old research on natural anthrax and healthy patients. But there was animal research suggesting that anthrax spores might conceivably survive in the lungs for longer than sixty days. And CDC scientists had learned to expect the unexpected. So CDC offered patients a choice among three options: (a) Stop at sixty days, and watch for the remote possibility of illness; if it happens, get to a doctor fast. (b) Take another forty days of antibiotics, trading the additional risk of antibiotic side-effects for the additional protection against any anthrax spores that might still be lurking. (c) Take the anthrax vaccine – which has possible side-effects of its own, and has never been tested on people already exposed to the disease. CDC made no recommendation. Any of the three options, it said, had low, uncertain, non-zero risk; given the sorry state of the science, any of the three might be best.

The reaction from the media, from politicians, and (at least as quoted in the media) from the patients themselves was uniformly negative. CDC’s clear statement about the unclear nature of the science was described as “muddled” and “confused”; CDC was repeatedly characterized as having “admitted” that it wasn’t sure what patients should do, as if being sure, or sounding sure, were its obvious responsibility. In a congratulations-and-condolences email to CDC officials, I wrote: “There is this consolation: The criticism that comes to an agency that refuses to give advice when it lacks a scientific basis is nothing compared to the criticism that comes to an agency that guesses wrong. And this further consolation: Once we get used to an agency that refuses to guess, we will be more tolerant of its uncertainty when it is uncertain, and much, much more trusting of its advice when it decides that advice is merited.”

5. Do anticipatory guidance.

Anticipatory guidance is a common strategy in clinical medicine: Tell the patient how the illness is likely to progress, how the medication is likely to feel, etc. It’s especially important with respect to negative future contingencies: “Some patients experience this or that side-effect. You may be tempted to stop taking the medicine. Instead, try….”

In a bioterrorism attack, there are many occasions to offer anticipatory guidance:

  • The antibiotics will have side-effects. Some people will be tempted to go off them prematurely.
  • There may be new cases after it looks like everything is over, resulting from the long incubation period.
  • Many of the people on antibiotics may be taken off once we determine that their exposure was small enough that it’s safe to take them off – and they may understandably feel insufficiently protected when that happens. “We’re putting you on the meds just to be sure. We hope to establish that your exposure was minimal, so we can take you off ASAP.”
  • We may never find some answers – the source of some people’s infections, for example.
  • We’re recommending closing this building. These are the conditions under which we will recommend reopening it….
  • Our interim standards will change as we learn more. People who don’t like the new standard may resent the changes. People who were managed under the old standard may also resent the changes.

The main benefit of anticipatory guidance about likely negative outcomes is that it reduces the dispiriting impact of those outcomes. Here is Michael Osterholm again, explaining in the October 28, 2001 New York Times how anticipatory guidance works:

The explanations have to include bad news along with the good, said Michael Osterholm…. Mr. Osterholm said he gained hard experience during a 1995 outbreak of meningitis in Mankato, Minn., where he oversaw vaccinations for 30,000 residents in just four days. At the outset he was careful to warn townspeople that one out of seven people who were infected would probably die. Less than a week into the outbreak, a patient died; the news, he said, was accepted without “fueling the fire,” because “people had anticipated it could happen.”

Anticipatory guidance has another benefit that can be even more important in a bioterrorism crisis. Telling people that they are likely to react in a particular way to some future event prepares them to overrule that reaction if appropriate. Assume, for example, that you want to discourage people from trying to get their own antibiotic supplies to have on hand for a possible attack. (I will leave aside for now my questions about the wisdom of this goal.) There are two schools of thought about how to proceed. The don’t-put-foolish-ideas-into-their-heads school advises that you ignore this possibility until people actually start calling their doctors; then you point out why they are wrong to do so. The anticipatory guidance school recommends that you tell people in advance that they may feel tempted to build their own antibiotic stockpiles, and that the temptation is natural but should be resisted, in your judgment, for the following reasons…. In favoring the latter approach (could you tell?), I am making three empirical claims: that many people will find it easier to resist the temptation when they have been forewarned to expect it; that few people will be lured into experiencing the temptation by the forewarning; and that both those who end up stockpiling and those who do not will feel a stronger alliance with the authorities if they have been forewarned than if the subject hasn’t been raised.

Like much of risk communication, anticipatory guidance requires that you have confidence in people’s ability to bear difficult situations. But it beats the alternative, which is surprising them with difficult situations.

6. Acknowledge the sins of the past.

I routinely advise my clients to acknowledge anything negative about their own performance that the audience already knows, or that critics know and will tell the audience when they see fit. In fact, I advise clients to “wallow” in the negatives until their stakeholders – not just the clients themselves – are ready to move on.

In public relations, as opposed to stakeholder relations, this is not sensible advice. At most, PR professionals recommend acknowledging negative information briefly before transitioning to something more positive. Why wallow in a bad piece of news that most of the audience hasn’t even found out about? But stakeholders are assumed to be interested enough, and resistant enough, that they are bound to learn the bad news anyway. So wallowing in it makes sense.

Try to imagine Exxon talking about its environmental record to a roomful of environmentalists without mentioning the Valdez spill. The audience is sitting there waiting to see if the spokesperson is going to mention Valdez – and until he or she does, we are only half-listening. A wise Exxon environmental communicator would therefore have an “acknowledgement macro” for the spill; push just one button and out it comes: “As the company responsible for the Valdez disaster….” In the early 1990s, by contrast, I went to the Exxon pavilion at Disney World’s Epcot Center, which featured a wonderful show on Exxon’s record of environmental protection – with not a word about Valdez. It was a great icebreaker; perfect strangers were murmuring to each other about Exxon’s gall in ignoring Valdez. Nothing the company could have said about the accident would have been as damaging as saying nothing.

Whether to acknowledge negatives that nobody knows and nobody is likely to find out – to blow the whistle on your own dirty secrets – is a tougher call. Leaving aside questions of law and ethics, risk communicators estimate that bad news does about twenty times as much damage if you try to keep it secret and fail as it does if you own up to it forthrightly. It follows that secrecy pays for itself only if an organization can achieve a 95% success rate at keeping secrets. If you fall short of 95%, as I think most organizations do, then blowing the whistle on yourself is cost-effective. But while secrecy is usually a bad risk, it isn’t crazy. What’s crazy is to reveal the secret and then behave as if it were still a secret.
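To make the arithmetic behind that 95% figure explicit, here is a back-of-the-envelope sketch. The twenty-fold multiplier is the estimate just cited; the notation (D for the damage from forthright disclosure, s for the probability a secret stays kept) is mine, added for illustration:

$$
\underbrace{(1-s)\cdot 20D}_{\text{expected damage from secrecy}} \;<\; \underbrace{D}_{\text{damage from disclosure}}
\quad\Longleftrightarrow\quad
s \;>\; 1 - \frac{1}{20} \;=\; 95\%
$$

Below a 95% success rate, the expected damage from attempting secrecy exceeds the damage from owning up, so forthright disclosure is the cheaper strategy on average.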

In the 2001 anthrax attacks, CDC had one piece of negative information that was especially important to acknowledge: the fact that in the early days of the attack the agency was in error about whether anthrax spores could escape a sealed envelope to threaten … and kill … postal workers. This wasn’t a secret; the only question was how often CDC spokespeople chose to mention it. My advice to CDC: The more often you do so, the better. I think some CDC officials saw this advice as unfair or unfeeling. I didn’t mean it to be. I realize both how difficult it is to guess right about new risks and how painful it is to have guessed wrong. For a science-based organization like CDC, the need to act on incomplete data is itself painful – but in a bioterrorism crisis the need is unavoidable, and so error is unavoidable as well. There was discussion at CDC about whether it was appropriate to call this a mistake. I suggested that CDC neither call it a mistake nor object when others called it a mistake. What was essential, I said, was for CDC to refer to it often, so the rest of us didn’t feel compelled to do so ourselves.

I actually went further. When asked for its judgment about other matters where the science is unsettled, I suggested, CDC should remind us that its judgment had been fatally flawed before. As far as I know, nobody took this advice. And perhaps it went too far. Certainly in traditional public relations, a source that incessantly reminded reporters of prior errors might well provoke them to look for a more confident source. Nonetheless, I think the risk of dwelling too much on your past sins is a small risk, both in terms of its probability and in terms of its magnitude. The big risk is that you will mention them too seldom.

In addition to acknowledging the sins you have committed, it is important also to acknowledge the sins you have been accused of but have not committed – that is, to acknowledge what critics have said, and why it is understandable that they feel that way. (Defending yourself against mistaken charges works in proportion to how visibly you concede valid charges.) The claim that class or race underlay the difference in how anthrax was handled in Congressional buildings versus how it was handled in postal facilities was the sort of charge that needed to be acknowledged and sympathetically rebutted … not ignored. “Sympathetic rebuttal” isn’t easy, partly because it isn’t conventional (the very word “rebuttal” suggests a hard-edged debate) and partly because self-esteem gets in the way (it is much more appealing to cream the purveyors of false charges than to acknowledge that their error is an understandable one). Nor will it help to replace forthright anger with condescension. If sympathetic rebuttal is beyond you, at least you can manage irritated rebuttal; any acknowledgment is better than none at all. Even in public relations, “I wouldn’t dignify that with an answer” isn’t much of an answer. In stakeholder relations it’s a nonstarter.

Make a list of things that you have been, are being, or even might be criticized for. Talk about all of them. Acknowledge the germ of truth in as many as have a germ of truth; the more self-critical you are, the less critical we will be. If there isn’t even a germ of truth you can cop to, try as you might, at least acknowledge that people think or may think there is. And remember that this is what you do when you are dealing with stakeholders.

7. Be contrite or at least regretful, not defensive.

It’s not enough to acknowledge your prior misdeeds; you have to do so in a way that shows you know they are misdeeds. Exxon’s handling of the Valdez spill is a good example. Corporate officials did in fact acknowledge what had gone wrong, but in a tone that oscillated between pride and defensiveness; they often sounded like they thought they were the victims of the spill rather than its perpetrators. No way were they going to show contrition. The company was ultimately assessed billions of dollars in punitive damages chiefly as a result of its failure to show contrition.

Or consider the 1993 food poisoning outbreak at Jack-in-the-Box fast-food restaurants in the western United States. The hamburger patty processor had sent Jack-in-the-Box contaminated patties, which happens sometimes, and Jack-in-the-Box had cooked them insufficiently to kill the pathogens. Jack-in-the-Box managers did say they felt “responsible” for the safety of all customers, but they denied they were “responsible” for the outbreak, blaming it almost entirely on the supplier and inadequacies in the government meat inspection program. This did not feel like contrition to most consumers. The failure to show contrition, in turn, contributed significantly to the loss in sales.

The problem is what to say when you don’t feel contrite. Yes, there was a bad outcome that was in some sense your responsibility. Maybe you made a decision that in hindsight was unwise; or at least it turned out wrong. Maybe you relied on someone else (the ship captain had been drinking; the supplier sent you contaminated burgers). You wish you’d acted differently, but you don’t feel you did anything wrong – and your attorneys are at your elbow to make sure you don’t say anything that implies otherwise.

If contrite goes too far, aim for regretful. Unfortunately, the word “regret” no longer conveys regret; it sounds more lawyerly than apologetic. “Sorry” does the job well. Or “We feel terrible that….” Ride the seesaw of blame. Give us the information that shows you did your best, you couldn’t have helped it, it wasn’t really your fault, etc. But put this information in a subordinate clause, while in the main clause you regretfully blame yourself.

There are many real-world examples of the seesaw of blame. In the famous case of the Tylenol poisonings, several people died after someone added cyanide to random Tylenol capsules. The CEO of Johnson & Johnson held a video news conference in which he took moral responsibility for the poisonings, insisting that it was J&J’s job to have tamper-proof packaging. Millions of people who watched the clip on the news that night undoubtedly said to themselves, “It’s not his fault, it was some madman.” The Tylenol brand recovered.

But my favorite example is hypothetical. Let’s suppose your 12-year-old got into a fight at school and was sent to the principal’s office. Now your child has to tell you what happened. Consider two scenarios.

Scenario One:
Child: Mom, Dad, I really messed up in school today. I hit Johnny, and the teacher had to send me to the principal’s office.
You: Why did you hit Johnny?
Child: Well, he called me a dirty name, and I lost my temper and hit him. But I shouldn’t have. I should have kept my temper. It was my fault.

At this point you are very much on your child’s side, perhaps even a little proud. Probably there will be no further punishment at home.

Scenario Two:
Child: You won’t believe what that fool of a teacher did to me today. Johnny called me a dirty name, so of course I slugged him, and that jerky teacher had the nerve to send me to the principal’s office!

Your child has an attitude problem and you are likely to provide some at-home “attitude adjustment.”

Note that both scenarios involve the same “data.” If your child feels contrite, you feel forgiving. If your child feels unrepentant, you feel punitive.

Public officials managing a bioterrorist attack may feel they have little in common with Exxon, Jack-in-the-Box, Johnson & Johnson, or a kid who got into a fight in school. What you all have in common is very simple: Something went wrong on your watch. Maybe it wasn’t your fault, maybe it was unavoidable, but something went wrong on your watch. You need to tell us you’re sorry.

8. Ride the preparedness seesaw.

In a bioterrorist attack, one crucial “sin” of the past that must be regretfully acknowledged is insufficient preparedness.

It is important to recognize that the entire society feels it was insufficiently prepared for September 11 and its aftermath – or aftermaths, if it turns out that way. We are almost ashamed of having ignored the early warnings; we didn’t pay any attention to Islamic fundamentalism; we didn’t pay attention to all those reports about what might happen someday and how to improve homeland security. Shame normally gets projected. So CDC (and government in general) is at risk of getting blamed for the blame we all share. To avoid excessive blame, paradoxically, you must take your share readily. This is of course the seesaw again. If CDC blames itself for not having been sufficiently prepared, the rest of us blame it less; we notice the ways in which it was prepared more; we acknowledge our own lack of preparedness more; and we are more supportive of funding for improved preparedness. There is cold logic here as well as the seesaw. If you were well-prepared this time, then this is presumably the best we can expect from you next time. If you were not so well-prepared, we can hope for better in the future.

People don’t want to hear that CDC, the FBI, FEMA, and the other response agencies were well prepared before the anthrax attacks of 2001. They know they themselves weren’t prepared, and they suspect you weren’t either – so claiming you were just saps the credibility from your post-attack gearing up. “We are ready, just as we have always been” is not a strong message. Better to focus on the gearing up: “The last few months have shown us where we are strong and where we need work. We managed – even managed well – but at some points we were stretched pretty thin. A bigger attack would have stretched us worse. And we need to be ready for a bigger attack. So here’s what we are doing to become better prepared.”

What about current preparedness for future attacks? Any claim that the United States is “ready for the big one” (a major bioterrorist attack) absolutely requires a reference to Dark Winter and the TopOff series of simulations, which tended to suggest that preparedness was inadequate. In fact, virtually everything that was written about bioterrorism before September 11 argued aggressively that our country was insufficiently prepared. It is only since the threat became real that the people defending against it have felt compelled to say they are ready. At least that’s what they feel compelled to say if critics assert that they are not ready. What’s happening here is that some bioterrorism authorities are letting their critics determine their seat on the seesaw: When critics “accuse” them of being unprepared, they forget that they agree, or agree in part; instead, they reflexively climb onto the yes-we-are-too-prepared! seat. Defensiveness about preparedness, of course, backfires into stories about lack of preparedness.

Despite what I’ve said about uncertainty, I’m actually pretty confident about the previous recommendations. They are grounded in the well-established dynamics of stakeholder relations, especially the paradox of the seesaw. Let me move now, for a while, to some recommendations that are grounded in the special characteristics of bioterrorist attacks, where I am less confident.

9. Acknowledge and legitimate people’s fears.

When people are afraid, the worst thing to do is pretend they’re not; second worst is to tell them they shouldn’t be. Both of these responses leave people alone with their fears. (Mishandling people’s fears keeps company with over-reassurance, but it’s conceptually different: “Everything is under control” versus “Don’t worry.”)

Even when the fear is totally unjustified, it doesn’t respond well to being ignored – nor does it respond well to criticism, mockery, or statistics. If your child thinks there are monsters in the closet, a smart parent doesn’t shrug off the fear or insist that monsters are very rare. Instead, you turn on all the lights, take your child by the hand, and check the closet together.

When the fear has some basis, ignoring it and disparaging it are even less effective approaches. I’m not telling you to tell people they are right that X is deadly if you’re pretty sure they’re wrong. But don’t emphasize that they’re wrong either. Emphasize that it is normal, human, damn near universal to be frightened of X, even though…. Then, in that subordinate clause, give your reasons why the risk is low. You can acknowledge and legitimate people’s fears even while you are giving them the information they need to put those fears into context.

Timothy Paustian at the University of Wisconsin has a page about anthrax on his web site. My wife, Dr. Jody Lanard, happened on the site in early November 2001, and sent Dr. Paustian some unsolicited risk communication advice. He changed the site. Here is one before-and-after comparison. (The breezy tone isn’t an addition; the original had the same tone.)

Before:
However, it will be very unlikely that you will receive one of these letters. Think about how many pieces of mail go out and how many people there are. Your chances are very low.
After:
You know it’s unlikely that you will receive one of these letters, but you’re still scared. You know how many pieces of mail go out, and how many people there are, but you can’t completely shake that inner worry. You know your chances are very low, but you find yourself reaching cautiously for the envelope, and you feel … just a little nuts. Welcome to the human race.

By giving people permission to be excessively alarmed about their mail, while still telling them why they needn’t worry, the revised version is far likelier to actually reassure.

I have worked on a wide range of risks of unknown seriousness, some of which later turned out serious and some of which did not: mad cow disease, Three Mile Island, global warming, silicone breast implants, AIDS. (I remember when CDC was trying to decide if AIDS was going to be widespread or not.) None of the concerns on this list benefitted from the pretense that people weren’t afraid, or that they shouldn’t be.

This pretense is especially damaging for people whose fear has escalated into denial. Assume I am fearful and don’t know it – which is what denial means. If you pretend I am not fearful, you’re allying with the denial and strengthening it. And if you tell me not to be fearful, you are challenging the denial … which also strengthens it. Nor will it help to label me. If I am in denial, “You’re afraid” (even “You’re right to be afraid”) is too direct, and will yield a seesaw response you don’t want: “No I’m not!” “You’re in denial” will yield the same response.

The task is to acknowledge and legitimate people’s fears, but only indirectly, in a deflected form. “It is only natural for many people to feel….” “I have talked to a lot of people who feel….” “Even though I keep telling myself all the statistical reasons why I shouldn’t be too concerned, even I sometimes feel….” These formulations make the person who is fearful and even the person who is in denial feel understood (but not exposed); they therefore ameliorate the fear and the denial.

In his superb November 8, 2001 speech from Atlanta, President Bush did an extraordinary job of acknowledging a fear I never thought I would see a U.S. president acknowledge: the fear that we won’t be able to bear it. Wisely, he put the fear into the past tense, but it was there:

The moment the second plane hit the second building, when we knew it was a terrorist attack, many felt that our lives would never be the same. What we couldn’t be sure of then and what the terrorists never expected was that America would emerge stronger, with a renewed spirit of pride and patriotism.

Copyright © 2001 by Peter M. Sandman
