Posted: October 2003
Article Summary: In October 2003, the WHO included social scientists (including me) on its SARS-fighting team for the first time. This invited paper lists 24 risk communication principles relevant to a possible second SARS outbreak or to any infectious disease outbreak; it also lists SARS-related risk communication research needs and includes a short bibliography.

Risk Communication Recommendations
for Infectious Disease Outbreaks

Prepared for the World Health Organization
SARS Scientific Research Advisory Committee
Geneva, Switzerland — October 20–21, 2003

Scholars in risk communication sometimes distinguish three paradigms. (1) When people are insufficiently alarmed about a serious hazard, the task is to increase their concern and motivate them to take appropriate actions. (2) When people are excessively alarmed about a small hazard, the task is to diminish their concern and deter them from unnecessary and potentially harmful actions. (3) When people are justifiably alarmed about a serious hazard, the task is to harness their concern and guide their actions.

Information plays an important role in all three risk communication paradigms. But it is a well-established error to believe that accurate information is all people need to respond appropriately to the situation.

In general, public health professionals are most comfortable and most skilled in the first of these three paradigms. An example within the infectious disease realm would be efforts to persuade people to get vaccinated against influenza.

The SARS outbreaks of 2003 in Beijing, Guangdong, Hong Kong, Singapore, Toronto, and elsewhere best fit the third paradigm, as does the prospect of additional SARS outbreaks in the next several years. Indeed, any outbreak of a novel and little-understood infectious disease is likely to be characterized by appropriately high levels of public concern that public health professionals will be challenged to harness and guide.

Nor can this challenge be safely left to communications staff – partly because it is public health professionals (and political leaders) who set policy and address the public in health emergencies, and partly because even communications specialists may be better trained in public relations and health education than in risk communication and crisis communication. People’s willingness and ability to cope with risk, to bear anxiety, to follow instructions, to help their neighbors, and to recover when the crisis is over will all depend to a significant extent on our success in integrating risk communication expertise into public health planning and policy.

This brief paper will offer risk communication recommendations for serious, and seriously alarming, infectious disease outbreaks. It will focus on recommendations that are counter-intuitive, where public health professionals without access to specialized risk communication expertise are likeliest to err.

The paper is divided into four sections, as follows:

  • Consensus risk communication recommendations. These are recommendations that we believe virtually all risk communication specialists would accept, but that are nonetheless often not followed by public health professionals.
  • Debatable risk communication recommendations. The recommendations on this second list, though grounded in research and experience, are not universally accepted by risk communication specialists.
  • Risk communication research agenda. The research needs associated with the first list have to do with determining how best to disseminate this knowledge to public health professionals. The recommendations on the second list, by contrast, would benefit from research to assess their validity.
  • Bibliography. Much of the most relevant writing (including much of our own writing) is exhortative or anecdotal, but some more rigorous experimental research is also on point.

Consensus risk communication recommendations

  • 1.  Don’t over-reassure. When people are unsure or ambivalent about how worried they should be, they often become (paradoxically) more alarmed when officials seem too reassuring. This can lead to anger and skepticism as well, and to loss of essential credibility if the truth turns out to be more serious than predicted. (See #13.)
  • 2.  Put reassuring information in subordinate clauses. When giving reassuring information to frightened or ambivalent people, it is helpful to de-emphasize the fact that it is reassuring. “Even though we haven’t seen a new case in 18 days, it is too soon to say we’re out of the woods yet.”
  • 3.  Acknowledge uncertainty. Sounding more certain than you are rings false, sets you up to turn out wrong, and provokes adversarial debate with those who disagree. Say what you know, what you don’t know, and what you are doing to learn more. Show you can bear your uncertainty and still take action. (See #s 14, 15, 16.)
  • 4.  Don’t overdiagnose or overplan for panic. Panic is a relatively rare (though extremely damaging) response to crisis. Efforts to avoid panic – for example, by withholding bad news and making over-reassuring statements – can actually make panic likelier instead. Do not mistake tolerable levels of fear or disobedient precaution-taking for panic. (See #17.)
  • 5.  Don’t ridicule the public’s emotions. Expressions of contempt for people’s fears and other emotions almost always backfire. Terms to avoid include “panic,” “hysteria,” and “irrational.” Even when discouraging harmful behavior (such as stigmatization), it is important to do so with sympathy rather than ridicule. (See #s 18, 19.)
  • 6.  Establish your own humanity. Express your own feelings, and show you can bear them. Express your wishes and hopes and fears. Tell a few stories about your past, your family, your personal reactions to the crisis.
  • 7.  Tell people what to expect. “Anticipatory guidance” – telling people what to expect – is especially useful (and difficult) in two situations: When it’s about uncertainty and possible error (“we will learn things in the coming weeks that everyone will wish we had known when we started”); and when it’s about bad outcomes (“the statistical mortality figure will probably go up as some of these very sick patients die”).
  • 8.  Offer people things to do. Self-protective action helps mitigate fear; victim-aid action helps mitigate misery. All action helps us bear our emotions, and thus helps prevent them from escalating into panic, flipping into denial, or declining into hopeless apathy. (See #s 20, 21.)
  • 9.  Acknowledge errors, deficiencies, and misbehaviors. People tend to be more critical of authorities who ignore things that have gone wrong than they are of authorities who acknowledge those things. Of course it takes something like saintliness to acknowledge negatives that the public will never know unless you acknowledge them. At least acknowledge those that the public does know, or is likely to find out. (See #22.)
  • 10.  Be explicit about “anchoring frames.” People have trouble learning information that conflicts with their prior knowledge, experience, or intuition. The pre-existing information provides an “anchoring frame” [Kahneman and Tversky] that impedes acquisition of the new information. It helps to be explicit about the change – first justifying the prior view (why it was right, or seemed right; why it was widespread), then explaining the new (what changed, or what was learned). (See #23.)
  • 11.  Don’t lie, and don’t tell half-truths. Among the most thoroughly demonstrated principles of risk communication is the irreparable damage done to credibility by false or misleading statements. In a public health crisis, moreover, the credibility of those in charge may well translate into people’s trust in official information and compliance with official instructions, and thus into lives saved or lost. More is at stake than your reputation. (See #24.)
  • 12.  Be careful with risk comparisons. There are many reasons why certain risks are more alarming than others. The statistical seriousness of the risk (magnitude × probability) is relevant, but so are trust, dread, familiarity, and control. Efforts to reassure people by comparing improbable but alarming risks to more probable but less alarming ones feel patronizing and tend to backfire. (Such comparisons may also be technically deficient. Mortality figures aside, SARS has forced hospitals to close and healthy people to go into quarantine; most years influenza has not.)

Debatable risk communication recommendations

  • 13.  Err on the alarming side. While it is obviously ideal to estimate risk correctly, if you have to get it wrong, it is wiser to err on the alarming side. In a fluid situation, the first communications should be the most alarming; “it’s better than we feared” is a far more tolerable day-two story than “it’s worse than we thought.” (See #1.)
  • 14.  Share dilemmas. When it is not obvious what to do, say so – even after you have decided what to do. Humanize your organization and its decision-making process by letting the pain of difficult decisions show. Explain the case for and against alternative options. (See #3.)
  • 15.  Acknowledge opinion diversity. Help the public learn that not all decisions are unanimous. Show that you can bear these differences within and between agencies, so they do not appear to be alarming fractures in your ability to cope with the crisis. Message consistency is still the ideal, but only if it reflects genuine unanimity. Message diversity is not harmful unless officials seem unaware of discrepancies or contemptuous of what other officials are saying. (See #3.)
  • 16.  Be willing to speculate. Refusing to speculate is better than speculating over-confidently and over-optimistically. But in a crisis you can’t just say you’ll have a report out next month; the information vacuum demands to be filled now. So take the risk of being misquoted or turning out wrong, and speculate … but always tentatively, and with due focus on the worst case. (See #3.)
  • 17.  Do not aim for zero fear. In a crisis, people are right to feel fearful (and miserable). A fearless public that leaves officials and experts alone to manage the problem simply is not achievable. Nor is it desirable, ultimately; vigilance and precaution-taking depend on sufficient fear. Note that humans are hard-wired to experience and tolerate fear; our fearfulness attaches and reattaches to various objects, or takes the form of free-floating anxiety. Officials need to reconsider their own “fear of fear.” (See #4.)
  • 18.  Legitimize people’s fears. Instead of leaving people alone with their fears, help them bear their fears by legitimizing them, and even sharing some of your own. Even technically inaccurate fears can be legitimized as natural, understandable, and widespread: “Despite the evidence that the health risk is very small, even I feel some hesitation when I hear someone coughing on a bus.” (See #5.)
  • 19.  Tolerate early over-reactions. One of the main ways people absorb new risks is by taking precautions that are unnecessary or premature. This is a form of rehearsal, both logistical and emotional; it also helps personalize the risk. Unless these over-reactions cause significant harm, they should be tolerated; they will settle soon enough into the “new normal.” (See #5.)
  • 20.  Let people choose their own actions. Offering people a choice of actions recruits not just their ability to act, but also their ability to decide. Ideally, bracket your recommendations with less and more extreme options, so people who are less concerned or more concerned than you wish they were do not need to define themselves as rebels; you have recommendations for them too. (See #8.)
  • 21.  Ask more of people. In a crisis, pro-social, resilient impulses vie for dominance with less desirable impulses: panic, passivity, selfishness. Ally with the former against the latter by asking more of people. Plan for volunteers; solicit them and welcome them, and use them wisely. Ask more of people emotionally as well. Give us permission to find the situation unbearable, but expect us to be able to bear it, and help us bear it. (See #8.)
  • 22.  Apologize often for errors, deficiencies, and misbehaviors. Forgiveness requires apology, even frequent apology. “Wallowing” in what went wrong is (paradoxically) a good way to persuade the rest of us to move on. And there is no better time to start apologizing than in mid-crisis, while we are too busy depending on you to want to blame you; this will help post-crisis when the recriminations start. (See #9.)
  • 23.  Be explicit about changes in official opinion, prediction, or policy. In emerging health crises, authorities are likely to learn things that justify changes in official opinions, predictions, or policies – for example, changes in the SARS case definition or in official recommendations concerning the use of masks. Announcing the new doctrine without reminding the public that it deviates from the old, though tempting, slows learning and fosters confusion. (See #10.)
  • 24.  Aim for total candor and transparency. There are always good reasons to withhold information – from fear of provoking panic to fear of turning out wrong. These valid rationales become excuses … and too much gets withheld, rarely too little. You probably shouldn’t achieve total candor and transparency, but you can safely aim for it. (See #11.)

Risk communication research agenda

Survey public health leaders on their attitudes and opinions regarding risk communication in infectious disease outbreaks. Which of the recommendations listed above do they accept, and which do they dispute? More importantly, what are the “mental models” – the understandings and misunderstandings – that underlie these attitudes and opinions?
  • What risk communication principles do they endorse?
  • What risk communication dilemmas do they recognize?
  • What risk communication issues have they yet to consider?
  • To what extent do they consider these matters relevant to their mission, and part of their responsibility?
Survey public health leaders on the barriers they see to implementing improved risk communication in infectious disease outbreaks. What are their staffing and resource needs in this area? Their training needs? What cultural or political barriers do they see? What are their recommendations for a process to improve their agencies’ risk communication effectiveness? What priority do they put on this task?
Collect and/or conduct case studies of successful and less successful risk communication efforts during infectious disease outbreaks. Particularly needed, in our judgment, is a detailed case study of Singapore’s risk communication during the SARS outbreak of Spring 2003 – a notable success compared to efforts during the same period in Beijing, Hong Kong, and Toronto. Case studies of other sorts of public health crises (BSE in the United Kingdom, for example) would also prove useful. These case studies should be analyzed especially for the light they shed on the two lists of risk communication recommendations:
  • What recommendations do they confirm?
  • What recommendations do they call into question?
  • What additional recommendations do they suggest?
Assess what changes, qualifications, or additions are needed to adapt the recommendations listed above to particular circumstances:
  • What should change to make the recommendations more relevant to multi-focal epidemics and worldwide pandemics?
  • What should change to make the recommendations more relevant to specific cultural situations – in particular, are the recommendations more appropriate in European and North American contexts than in Asia or the developing world?

Our judgment is that these variables (the sort of crisis and the culture in which it occurs) have surprisingly little impact on what risk communication approach is optimal. But this judgment is subject to dispute, and should be tested.

Test some or all of the second list of recommendations – those we have identified as “debatable” rather than “consensus.” In addition to anecdotal and case study data, consider which recommendations should be tested more rigorously. Consider also an expert consensus-building process to harvest the judgments of a diverse group of risk communication specialists.
Analyze the relevance of mass media systems to the conduct of risk communication during infectious disease outbreaks. How can public health officials best collaborate with the news media? What opportunities and what pitfalls are posed by the roles of journalists? What use should be made of non-journalistic media, such as web sites, advertising, and telephones? How is all this affected by national differences in media arrangements?
Develop a protocol for risk communication efforts before an infectious disease outbreak. This involves two quite different questions:
  • What should public health agencies do now to prepare themselves to cope with the risk communication needs an outbreak would create?
  • What risk communication should public health agencies be doing now, to prepare their publics to cope with such an outbreak?
Investigate how the World Health Organization can best help the world’s public health professionals improve their risk communication preparedness and skill – including ways the World Health Organization can improve its own risk communication efforts.

Bibliography

“CDCynergy Emergency Risk Communication: Your Guide to Effective Emergency Risk Communication Planning,” Atlanta, GA: U.S. Centers for Disease Control and Prevention, 2003.

Covello V.T., Peters R., Wojtecki J., and Hyde R., “Risk Communication, the West Nile Virus Epidemic, and Bioterrorism: Responding to the Communication Challenges Posed by the Intentional or Unintentional Release of a Pathogen in an Urban Setting.” Journal of Urban Health: Bulletin of the New York Academy of Medicine, 2001: 78(2)(June):382–391.

Covello V.T., and Sandman P.M., “Risk Communication: Evolution and Revolution.” In Wolbarst A. (ed.), Solutions to an Environment in Peril. Baltimore, MD: Johns Hopkins University Press, 2001:164–178. (www.phli.org/riskcommunication/article.htm)

Kahneman D., and Tversky A., “Prospect Theory: An Analysis of Decision under Risk.” Econometrica, 1979: 47(2):263–291.

Lanard, J., and Sandman, P.M. “SARS Communication: What Singapore Is Doing Right.” The Straits Times (Singapore), May 6, 2003; also The Toronto Star (Canada), May 9, 2003. (http://www.psandman.com/articles/sars-2.htm)
(A longer, unpublished version: www.psandman.com/articles/sars-3.htm.)

Morgan G., and Fischhoff B., Risk Communication: A Mental Models Approach. Cambridge University Press, 2001.

Powell D., and Leiss W., Mad Cows and Mother’s Milk: The Perils of Poor Risk Communication. Montreal, Canada: McGill-Queen’s University Press, 1997.

Renn, O., and Rohrmann, B. (eds.), Cross-Cultural Risk Perception: A Survey of Empirical Studies. Dordrecht, The Netherlands: Kluwer Academic Publishers, 2000.

Sandman, P.M., “Anthrax, Bioterrorism, and Risk Communication: Guidelines for Action.” The Peter Sandman Risk Communication Website, 2001. (www.psandman.com/col/part1.htm)

Sandman, P.M., “Smallpox Vaccination: Some Risk Communication Linchpins.” The Peter Sandman Risk Communication Website, 2002. (http://www.psandman.com/col/smallpox.htm)

Sandman, P.M., “Dilemmas in Emergency Communication Policy.” In “CDCynergy Emergency Risk Communication.” Atlanta, GA: U.S. Centers for Disease Control and Prevention, 2003. (www.psandman.com/articles/dilemmas.pdf)

Sandman, P.M., “Beyond Panic Prevention: Addressing Emotions in Emergency Risk Communication.” In “CDCynergy Emergency Risk Communication.” Atlanta, GA: U.S. Centers for Disease Control and Prevention, 2003. (www.psandman.com/articles/beyond.pdf)

Sandman, P.M., “Four Kinds of Risk Communication.” The Synergist (American Industrial Hygiene Association), April 2003, pp. 26–27. (www.psandman.com/col/4kind-1.htm)

Sandman, P.M., and Lanard, J., “‘Fear Is Spreading Faster than SARS’ – And So It Should!” The Peter Sandman Risk Communication Website, 2003. (www.psandman.com/col/SARS-1.htm)

Sandman, P.M., and Lanard, J., “Fear of Fear: The Role of Fear in Preparedness … and Why It Terrifies Officials.” The Peter Sandman Risk Communication Website, 2003. (www.psandman.com/col/fear.htm)

Slovic P., “Perception of Risk.” Science, 1987; 236:280–285.

Tversky, A., and Kahneman, D., “Judgment under Uncertainty: Heuristics and Biases.” Science, 1974; 185:1124–1131.

Copyright © 2003 by Peter M. Sandman and Jody Lanard
