For roughly twenty years I defined myself as a specialist in risk communication, not crisis communication. The distinction, I kept telling people, is that risk communicators deal with what might happen, while crisis communicators deal with what just happened or is still happening.
Of course the distinction was always pretty arbitrary. Some of my earliest risk communication work focused on the 1979 Three Mile Island nuclear accident. That certainly felt like a crisis, no matter whether the discussion centered on what had gone wrong at the power plant (a crisis communication issue) or on how much radiation might be released (a risk communication issue).
Still, I felt that crisis communicators needed expertise I didn’t really have – running evacuations, coordinating emergency responders, fielding thousands of simultaneous telephone calls, etc. I was comfortable working on “reputational crises,” controversies that felt like crises to my clients, but were simply hot issues to their stakeholders and the media. I was comfortable advising on potential future crises (worst case scenarios) and even past crises (recriminations). But real crises? I wasn’t sure I had much to say.
This changed after September 11, 2001. Like just about everyone else, I desperately wanted to help. After dozens of false starts, I wrote a long “column” for this website on “Risk Communication and the War Against Terrorism.” Soon afterwards the Centers for Disease Control and Prevention asked me to help with anthrax communication, and then with smallpox and other bioterrorism communication issues. Other terrorism and crisis communication clients started calling. I found I did have things to say after all.
Roughly a third of my professional work is now focused on terrorism, disease outbreaks, and other emergencies.
My work on pandemics and other infectious disease outbreaks, in fact, has become voluminous enough to require its own index. Only a little of that work is listed in this index as well.
Topical Sections in Crisis Communication
The Basics in One 2004 Video
-
Crisis Communication: Guidelines for Action
Produced by the American Industrial Hygiene Association, Fairfax VA, 2004
Posted: January 29, 2012
This 166-minute video, produced by the American Industrial Hygiene Association in 2004, covers 25 crisis communication recommendations, focusing chiefly on the most difficult messaging challenges that even experienced crisis communicators may get wrong. AIHA stopped distributing the video in January 2012, so now it’s available for free on Vimeo (video) and on this site (audio). Unlike many of my videos, this one was professionally produced in a studio, with multiple cameras and an actual set – and it features not just me but also my wife and colleague Jody Lanard. Although some of the examples may be dated – there’s a lot of SARS and bird flu throughout the video – the recommendations themselves haven’t changed. A complete set of handouts to accompany this video is available.
- Part One (51:59)
Part One introduces where Jody and I think crisis communication fits in risk communication (high hazard, high outrage), and then discusses the first six of our 25 crisis communication recommendations:
- Don’t over-reassure.
- Put reassuring information in subordinate clauses.
- Err on the alarming side.
- Acknowledge uncertainty.
- Share dilemmas.
- Acknowledge opinion diversity.
- Part Two (57:11)
-
Part Two covers numbers 7 through 16 of the 25 crisis communication recommendations discussed in the video:
- Be willing to speculate.
- Don’t overdiagnose or overplan for panic.
- Don’t aim for zero fear.
- Don’t forget emotions other than fear.
- Don’t ridicule the public’s emotions.
- Legitimize people’s fears.
- Tolerate early over-reactions.
- Establish your own humanity.
- Tell people what to expect.
- Offer people things to do.
- Part Three (57:10)
-
Part Three covers numbers 17 through 25 of the 25 crisis communication recommendations:
- Let people choose their own actions.
- Ask more of people.
- Acknowledge errors, deficiencies, and misbehaviors.
- Apologize often for errors, deficiencies, and misbehaviors.
- Be explicit about “anchoring frames.”
- Be explicit about changes in official opinion, prediction, or policy.
- Don’t lie, and don’t tell half-truths.
- Aim for total candor and transparency.
- Be careful with risk comparisons.
Especially Important to Read
-
Effective COVID-19 Crisis Communication
Posted on the website of the Center for Infectious Disease Research and Policy, May 6, 2020
In late April 2020, some three months into the COVID-19 pandemic, the University of Minnesota Center for Infectious Disease Research and Policy decided to produce a series of reports under the collective title “COVID-19: The CIDRAP Viewpoint.” CIDRAP head Michael Osterholm asked Jody and me to write the second entry in the series, on COVID-19 crisis communication. We focused on six crisis communication basics: Don’t over-reassure; proclaim uncertainty; validate emotions; give people things to do; admit and apologize for errors; and share dilemmas. Throughout the report, we emphasized the most glaring problem of COVID-19 communication in the U.S. so far – nurturing the dangerous myth that COVID-19 will be a one-peak pandemic that’s about halfway over already. We also stressed that crisis communication is a field of study and practice, and that it’s past time for officials and experts to learn the basics.
-
Ebola Risk Communication: Talking about Ebola in Dallas, West Africa, and the World
Posted: October 6, 2014
In early October, I started getting media inquiries about Ebola risk communication. Three such inquiries led me to write emails (two of them jointly with my wife and colleague Jody Lanard) that collectively summarized most of our thinking about how U.S. sources and the U.S. media were handling Ebola – the first U.S. case in Dallas, the disastrous epidemic in West Africa, and the global pandemic risk. Included in this column are: (a) Our October 3 response to Sharon Begley of Reuters; (b) Our much shorter October 5 response to Kai Kupferschmidt of Science; and (c) My short October 6 response to Paul Farhi of the Washington Post. The articles that Sharon, Kai, and Paul wrote are referenced and linked at the very end of this column, or will be once they’re published. (Note that I had some follow-up communications with Kai, by phone and email, that he relied on in his article but are not included in the column.)
-
Pre-Crisis Communication: Talking about What-Ifs
Posted: September 13, 2013
This column is devoted not to crisis communication but to pre-crisis communication: how to talk to people about a possible future emergency. In building the case for communicating before you absolutely have to, the column examines four principal responses to pre-crisis communication: (1) People who were worried already are usually relieved that the issue is on the table. (2) People who have too many other things to worry about are usually apathetic and hard to reach. (3) People who were already too worried to bear it are usually in denial and hard to reach in a completely different way. (4) People who are hearing the scary news for the first time usually go through an adjustment reaction, a temporary and useful overreaction. If the crisis is actually coming, the column argues, pre-crisis communication has considerable upside and no downside. The column ends with recommendations for minimizing the downside of warning about a possible crisis that fizzles.
-
Posted: August 14, 2011
Together with my wife and colleague Jody Lanard, I have long advised clients to release risk information early – and since early information is almost always uncertain, to acknowledge the uncertainty. But even when clients (and non-clients) do what we consider a pretty decent job of acknowledging uncertainty, they often end up in reputational trouble when they turn out wrong, largely because journalists and the public misperceive and misremember their statements as having been far more confident than they actually were. So we have come to believe that it’s not enough to acknowledge uncertainty; you have to proclaim uncertainty, repeatedly and emphatically. This long column uses a severe German E. coli food poisoning outbreak in 2011 to explore the complexities of proclaiming uncertainty: the myriad ways government agencies and industry spokespeople get it wrong, and some recommendations for getting it right … or at least righter. Proclaiming uncertainty is important in all kinds of risk communication – outrage management as much as precaution advocacy and crisis communication. But our focus here is mostly on how to warn people about an imminent, uncertain risk: in this case, how to tell people which foods not to eat because you think they might be contaminated and deadly.
-
Risk Communication Lessons from the BP Spill
Posted: September 13, 2010
This is my eighth (but probably not my last) website commentary on the April 2010 BP oil spill in the Gulf of Mexico. This one also appeared in the September 2010 issue of The Synergist, published by the American Industrial Hygiene Association. So it starts with two risk communication lessons especially appropriate to industrial hygienists: Don’t believe your own propaganda about safety and emergency preparedness, and work to build a safety culture where employees are willing to blow the whistle about unsafe conditions. Then it discusses four more traditional crisis communication lessons: Don’t over-reassure; make contrition credible; make compassion and determination credible; and say how stupid you feel.
An Italian translation was published in Darwin, November–December 2010, pp. 26–31.
-
Empathic Communication in High-Stress Situations
Posted: June 8, 2010
These are the notes I developed for a multinational management consulting firm that asked me to help give empathy training to its top consultant-managers. Though applied (as best I could) to a management consulting context, these notes are based largely on my 2007 column “Empathy in Risk Communication,” supplemented with such risk communication basics as the “donkey” game, the risk communication seesaw, and acknowledging uncertainty.
-
Climate Change Risk Communication: The Problem of Psychological Denial
Posted: February 11, 2009
Arousing apathetic people to care enough about global warming that they’re actually willing to do something about it is a difficult precaution advocacy challenge. Activists are chipping away at that task with slow but significant success. But there’s another audience for climate change risk communication that I think activists aren’t paying nearly enough attention to: people who are in denial about the crisis because it threatens the way they see the world or because it arouses intolerable levels of fear, guilt, sadness, hopelessness, or other emotions. For people in or near denial, outrage is high, not low; the risk communication paradigm is crisis communication, not precaution advocacy. This long column builds a case that global warming denial is a growing problem, and that messaging designed to work on apathetic audiences can easily backfire on audiences in denial. The column focuses on six common activist messages that need to be rethought in terms of their likely negative impact on people who are in or near global warming denial: fear-mongering; guilt-tripping; excessive hostility to narrow technological solutions; unwillingness to pay attention to climate change adaptation; over-reliance on depressing information and imagery; and one-sided contempt for contrarian arguments.
-
Managing Justified Outrage: Outrage Management When Your Opponents Are Substantively Right
Posted: November 19, 2008
This long column tries to correct a serious oversimplification in my previous writing about risk communication. Outrage management isn’t just for calming people down when they mistakenly believe they have substantive reasons to oppose you. It is also for calming people down when they rightly believe they have substantive reasons to oppose you. Converting justified opposition that’s outraged into justified opposition that’s calm doesn’t (and shouldn’t) eliminate the opposition, but it does accomplish several things: It lowers the level of passion; it opens people up to the possibility of altruism; it gets them in a mood to negotiate; and it enables them to be more realistic in defeat or more generous in victory. While all the usual outrage management strategies apply, two strategies are particularly crucial when your critics are substantively right: acknowledging that they are right, and being candid about the distribution of power. The column also has an important “postscript” on the role of outrage management in a genuine high-hazard, high-outrage crisis.
-
Empathy in Risk Communication
Posted: July 30, 2007
Everyone knows risk communicators need to be empathic, but all too often empathy gets operationalized as telling people you know how they feel – or, worse yet, telling them how they feel. This long column argues that the essence of empathy is “sort-of acknowledgment,” finding a middle ground between obliviousness and intrusiveness. The column goes on to discuss ten elements of empathic communication. Some are pretty obvious (listening and echoing, for example); some are easy-to-learn tactics (such as suggesting that “some people” might feel a particular way instead of accusing your stakeholders of feeling that way); some are complicated and counterintuitive. The most complicated and counter-intuitive ones are grounded in the work of psychiatrist Leston Havens.
-
“Speak with One Voice” — Why I Disagree
Posted: July 27, 2006
This column dissects an issue – one of the few – on which I disagree with most risk communication and crisis communication professionals: what to do when there are differences of opinion within your organization. The conventional advice is to “speak with one voice” – that is, to paper over the disagreements. I urge my clients to let the disagreements show. The column distinguishes the ways of showing opinion diversity that really do undermine public confidence from the ways that (in my judgment) do not, and identifies many reasons why it is beneficial to let the public know that you’re not all on the same page about every issue. Perhaps most importantly, it details what tends to go wrong when organizations muzzle their staff in order to speak with one voice.
-
Katrina: Hurricanes, Catastrophes, and Risk Communication
Posted: September 8, 2005
When I wrote this column shortly after Hurricane Katrina struck, I didn’t realize that it would still be an ongoing disaster years later. So the column focuses on risk communication failures before the hurricane reached New Orleans (especially the failure to scare people sufficiently) and immediately after the hurricane reached New Orleans (especially the failure to acknowledge emergency response inadequacies and to communicate with victims desperate for information). I saw these failures not as unique to Katrina but as warnings relevant to the next big earthquake or infectious disease outbreak. This perspective may have led me to go too easy on the specific defects of Katrina response.
-
Bird Flu: Communicating the Risk
Published in Perspectives in Health (Pan American Health Organization), vol. 10, no. 2, 2005, pp. 2–9
PAHO asked us to combine a primer on risk communication with a primer on avian influenza. The resulting article talks about the challenge of alerting the public to bird flu risks, then offers ten risk communication principles, each illustrated with bird flu examples. The PDF file also includes the cover, an editor’s note entitled “Communication: risky business,” and the contents page.
There is an online version (same text, but easier to read than a PDF file) posted on the PAHO website. The entire issue is also there.
Spanish translation available
Traducción en Español: La gripe aviar: cómo comunicar el riesgo
-
Pandemic Influenza Risk Communication: The Teachable Moment
Posted: December 4, 2004
This is the first column Jody Lanard and I wrote about pandemic preparedness. We wrote it when many experts believed a devastating H5N1 flu pandemic might be just around the corner – and so we thought so too. (We still think the risk is serious, but there’s much less sense of imminence as I write this blurb in mid-2008.) The thrust of this long column is how to sound the alarm. After a primer on why H5N1 is “not your garden variety flu,” the column proposes a list of pre-crisis pandemic talking points. Then it assesses how well experts and officials were addressing those points as of late 2004. The experts, we wrote, were doing their best to arouse the public. But governments and international agencies were undermining the sense of urgency with grossly over-optimistic claims about pharmaceutical solutions.
-
Posted: August 28, 2004
Most of this long column is addressed to risk communicators whose goal is to keep their audience unconcerned. So naturally they’d rather not talk about awful but unlikely worst case scenarios. The column details their reluctance even to mention worst case scenarios, and their tendency when they finally get around to discussing them to do so over-reassuringly. It explains why this is unwise – why people (especially outraged people) tend to overreact to worst case scenarios when the available information is scanty or over-reassuring. Then the column lists 25 guidelines for explaining worst case scenarios properly. Finally, a postscript addresses the opposite problem. Suppose you’re not trying to reassure people about worst case scenarios; you’re trying to warn them. How can you do that more effectively?
-
Three Mile Island – 25 Years Later
Published in safety AT WORK, April 24, 2004, pp. 7–11
When the Three Mile Island nuclear accident began in late March of 1979, I was asked by the Columbia Journalism Review to go to the scene and “cover the coverage.” The resulting article, “At Three Mile Island,” was written jointly with Mary Paden. This new article focuses on some of the crisis communication lessons I learned at Three Mile Island – lessons many corporate and government crisis managers have yet to learn.
In March 2006, this article was reprinted by the International Atomic Energy Agency in its IAEA Bulletin (vol. 47, no. 2, pp. 9–13) under the title “Tell It Like It Is: 7 Lessons from TMI.” The IAEA version is available online in several languages.
-
Crisis Communication: A Very Quick Introduction
Posted: April 15, 2004
This short column is made up of two lists. First comes a list of six “focus areas” of crisis communication – including the one I consider most in need of improvement: metamessaging. (This jargony word is the best I can come up with to describe all the content of crisis communications other than information content: how reassuring to be, how confident to sound, how to address emotion, etc.) The rest of the column is a list of 25 crisis communication recommendations – most of them about metamessaging. The 25 recommendations are discussed in more detail in my crisis communication handouts. But this column lists them all conveniently on one page.
Spanish translation available
Traducción en Español: Comunicación de crisis: una introducción muy rápida
-
Fear of Fear: The Role of Fear in Preparedness ... and Why It Terrifies Officials
Posted: September 8, 2003
My government clients often tell me they want to persuade the public to take precautions against some risk … but not if they have to frighten anybody. Jody Lanard and I wrote this long column not just to argue the necessity for warnings to be frightening, but also to analyze the widespread official “fear of fear.” We explore its origins in officials’ justified concern that they will be criticized for frightening people, and in their unjustified concern that the people they frighten will find the experience permanent and unbearable. We also investigate a closely allied phenomenon, “panic panic” – the panicky feelings officials experience when they wrongly judge that the public is about to panic, and the unwise crisis management strategies they typically attempt in order to “allay” the public’s panic.
-
“Fear Is Spreading Faster than SARS” — And So It Should!
Posted: April 28, 2003
Until it turned out less contagious than initially thought, SARS looked to many experts like it might very well be the devastating pandemic they had spent decades fearfully awaiting. When Jody Lanard and I wrote this column in April 2003, that was still an open question. The public’s SARS fears were entirely justifiable – yet many governments, experts, and even journalists were working overtime to dampen those fears. The column describes this “soft cover-up” of SARS over-optimism, tries to explain why so many officials were seduced by it, and offers both good examples of guiding the public’s fear and bad examples of trying to allay that fear. The column concludes with a list of 18 specific risk communication recommendations for talking about SARS.
-
Posted: February 20, 2003
One of the things the U.S. government got wrong after 9/11 was its failure to offer people things to do. So when it started listing some steps ordinary people could take to help prepare for the possibility of more terrorist attacks, Jody Lanard and I noted with interest the widespread disdainful response, most of it linked to the inclusion of duct tape on the government’s list of items a prepared citizen ought to have on hand. In this column, we analyze the reasons for this weird response, which we liken to a similar cynicism about anti-nuclear precautions in the 1950s. The column ends with suggestions for improving the U.S. government’s post-9/11 risk communication, starting with the need to ask more of people.
-
Beyond Panic Prevention: Addressing Emotion in Emergency Communication
In Emergency Risk Communication CDCynergy (CD-ROM), Centers for Disease Control and Prevention, U.S. Department of Health and Human Services, February 2003
This is one of three articles I wrote for the CDC’s CD-ROM on emergency risk communication. This one deals with the likely emotional impacts of terrorism (and other major emergencies), and how communicators can best help the public cope with these emotions. The focus is especially on denial and misery as more common emotional reactions than panic – reactions that may be mishandled if the communicator is over-worried about panic prevention instead.
This project was supported in part by an appointment to the Research Participation Program for the Office of Communication, Centers for Disease Control and Prevention, administered by the Oak Ridge Institute for Science and Education through an agreement between the Department of Energy and CDC. The entire CD-ROM is available at http://emergency.cdc.gov/erc/erc.asp.
-
Dilemmas in Emergency Communication Policy
In Emergency Risk Communication CDCynergy (CD-ROM), Centers for Disease Control and Prevention, U.S. Department of Health and Human Services, February 2003
This is one of three articles I wrote for the CDC’s CD-ROM on emergency risk communication. Based partly on my earlier Anthrax, Bioterrorism, and Risk Communication: Guidelines for Action, this one deals with ten “dilemmas” facing emergency communication planners:
- Candor versus secrecy
- Speculation versus refusal to speculate
- Tentativeness versus confidence
- Being alarming versus being reassuring
- Being human versus being professional
- Being apologetic versus being defensive
- Decentralization versus centralization
- Democracy and individual control versus expert decision-making
- Planning for denial and misery versus planning for panic
- Erring on the side of caution versus taking chances
For each of the ten dilemmas, my own position leans toward the first of the two poles – and the natural instinct of communicators in mid-emergency leans toward the second.
This project was supported in part by an appointment to the Research Participation Program for the Office of Communication, Centers for Disease Control and Prevention, administered by the Oak Ridge Institute for Science and Education through an agreement between the Department of Energy and CDC. The entire CD-ROM is available at http://emergency.cdc.gov/erc/erc.asp.
-
Obvious or Suspected, Here or Elsewhere, Now or Then: Paradigms of Emergency Events
In Emergency Risk Communication CDCynergy (CD-ROM), Centers for Disease Control and Prevention, U.S. Department of Health and Human Services, February 2003
This is one of three articles I wrote for the CDC’s CD-ROM on emergency risk communication. The usual paradigm for emergency communication is the obviously horrific event that is happening right here, right now. This article focuses on communication strategies to address six other paradigms:
- Obvious/here/future
- Obvious/here/past
- Obvious/elsewhere/now
- Suspected/here/now
- Suspected/here/future
- Suspected/here/past
Among the topics covered are worst case scenarios, uncertainty, and dilemma-sharing.
This project was supported in part by an appointment to the Research Participation Program for the Office of Communication, Centers for Disease Control and Prevention, administered by the Oak Ridge Institute for Science and Education through an agreement between the Department of Energy and CDC. The entire CD-ROM is available at http://emergency.cdc.gov/erc/erc.asp.
-
Smallpox Vaccination: Some Risk Communication Linchpins
Public Health Outrage and Smallpox Vaccination: An Afterthought
Posted: December 30, 2002 and January 19, 2003
In December 2002, I was asked to help plan and run a meeting on risk communication recommendations for the U.S. program to vaccinate healthcare workers and emergency responders against smallpox. The first column is an edited version of my introductory remarks. It addresses some familiar “risk communication linchpins” – paying attention to outrage, doing anticipatory guidance, expressing wishes and feelings, tolerating uncertainty, sharing dilemmas, riding the seesaw, etc. – all customized for the controversies I thought likeliest to emerge over smallpox vaccination. What I learned from the meeting was that most of the public health professionals implementing the smallpox vaccination program were themselves outraged that it even existed. So I wrote an “afterthought” on the sources of that outrage, and the need to deal with it lest it undermine the program … which, in my judgment, it later did.
-
Anthrax, Bioterrorism, and Risk Communication: Guidelines for Action
Posted: December 29, 2001
Accustomed to naturally occurring diseases, the U.S. Centers for Disease Control and Prevention (CDC) had a difficult time coping with the anthrax bioattacks of late 2001. Since risk communication was one of its core problems, it asked me to come to Atlanta and help. This four-part “column” grew out of my Atlanta notes. If the CDC was adjusting to bioterrorism, so was I. I put aside my usual outrage management recommendations and developed 26 recommendations specifically on the anthrax crisis. These became the basis for my (sadly) expanding work in crisis communication, and for the crisis communication CD-ROM and DVD Jody Lanard and I ultimately put out in 2004. This column was my first extended discussion of most of these crisis communication recommendations, and it is my only published assessment of the CDC’s anthrax communication efforts.
-
Risk Communication and the War Against Terrorism: High Hazard, High Outrage
Posted: October 22, 2001
It took more than a month after the 9/11 attacks for me to decide that I had relevant expertise to offer. (This column was first posted on October 22, 2001, revised and reposted on November 10.) My wife Jody Lanard crystallized it for me when she said, “You’ve been doing high-hazard, low-outrage risk communication and low-hazard, high-outrage risk communication for years. This time it’s high-hazard, high-outrage risk communication.” But this column isn’t my first crack at a list of generic recommendations for communicating in high-hazard, high-outrage situations; that didn’t come till my anthrax column a couple of months later. This one is more a meditation on the risk communication significance of 9/11, a very tentative first effort to consider how best to talk to people in the wake of that still-shocking event.
Other Articles by Peter M. Sandman
Why So Much COVID-19 Crisis Communication Has Failed: An Expert Explains
Email responses by Peter M. Sandman to questions posed by Eric Lebowitz of Critical Mention, posted verbatim as a Critical Mention “eBook,” July 1, 2020
Jody Lanard and I posted an article on “Effective COVID-19 Crisis Communication” on May 6, 2020. On May 12, Eric Lebowitz of Critical Mention emailed me three follow-up questions, focusing on why the crisis communication principles Jody and I had emphasized were so seldom followed. The answers I sent him on June 5 covered some familiar ground with new COVID-19 examples, including the case for admitting mistakes instead of trying to hide them. But I also included information on two topics I hadn’t written about in so much detail previously: arguments I used to use when trying to convince my consulting clients to avoid over-reassurance and overconfidence; and why traditional public relations paradigms make PR people bad crisis communicators unless they have reoriented their approach. Critical Mention reformatted my answers as a short “eBook” for their clients, with permission to post the eBook on this website as well.
-
Why Do Risk Communication When Nobody’s Endangered and Nobody’s Upset (Yet)?
Posted: April 19, 2018
Years ago I distinguished three paradigms of risk communication: precaution advocacy when hazard is high and outrage is low; outrage management when hazard is low and outrage is high; and crisis communication when both are high. But I have endlessly claimed that there’s no risk communication to be done when hazard and outrage are both low. That’s true when you’re pretty sure hazard and outrage will remain low. This column is a primer on what to do when one or the other is expected to climb: pre-precaution advocacy when hazard is likely to climb; pre-outrage management when outrage is likely to climb; and pre-crisis communication when both are likely to climb. There’s also an introductory section on how to surveil for increasing outrage. (Hazard surveillance isn’t my field.)
-
An Ebola Empathy Exercise (pure speculation, based on hypothetical what-ifs)
Posted: October 3, 2014
Throughout August and September, my wife and colleague Jody Lanard and I obsessed over Ebola. We wrote part or all of several Ebola risk communication columns, only to have our thinking overtaken by events. This short column, completed in one day, focuses on a very narrow question: What might have happened at Texas Health Presbyterian Hospital Dallas on September 25–26, 2014, when Thomas Eric Duncan came to the emergency room with fever and abdominal pain, said he was visiting from Liberia (the heart of West Africa’s Ebola hot zone), and was nonetheless sent home? Two days later, days in which he might have infected other people, Duncan was brought back to Texas Health Dallas by ambulance. That time Ebola was suspected, and later confirmed, making Duncan the first Ebola patient to be diagnosed outside Africa. Commentary has been understandably hostile to both Duncan and the hospital staff for what may turn out to have been a tragic miscommunication. Jody and I felt that anger too. We have tried to temper it with this Ebola empathy exercise, a purely speculative effort to look at a ghastly mistake without assuming reckless irresponsibility on either side. As more facts come out, our speculations may well be proven entirely false. Even so, the need for people to respond empathically to Ebola will not go away. Empathy is needed for the horrific conditions West Africans are enduring; for the threat to the rest of us; for the ways people at overwhelming risk may resort to denial, while people whose risk is much smaller may temporarily overreact; even for the officials who yield to the temptation to oversimplify or over-reassure. The column isn’t about all that, though. It’s just an attempt to imagine empathically what might have happened in that Dallas emergency room.
-
Posted on the Discovery News website, May 1, 2014
When a Ship Captain Abandons Ship Prematurely
Email to Sheila M. Eldred, April 30, 2014
On April 29, 2014, reporter Sheila M. Eldred of the Discovery News website emailed me about an article she was writing on “why captains abandon ship” in the wake of the April 16 ferry disaster in South Korea. My brief email in response stressed that a captain who abandons ship prematurely isn’t panicking, but is simply failing to be a hero in a situation where duty demands heroism. The temptation afterwards, I wrote, is to self-justify instead of admitting as much. Sheila’s story used a lot of my email.
-
Posted on the website of Adweek, March 23, 2014
Data Breaches: Managing Reputational Impact
Email to David Gianatasio, March 4, 2014 (with two March 16 emails interpolated)
David Gianatasio of Adweek emailed me in mid-February 2014 about an article he was writing on “how data breaches and security concerns might impact brands such as Target” (which had announced a huge data breach two months earlier) and “how companies can handle the fallout.” In the weeks that followed, Dave sent me more specific questions. My answers stressed the importance of addressing the concerns of affected stakeholders as opposed to the general public; and of focusing on negative reputation as opposed to positive reputation. The reputational impact of a data breach, I argued, depends mostly on two factors: how competently a company was protecting customer data before the breach, and how empathically it responded after the breach. Very little in my answers is unique to data breaches. Similar advice can be found, for example, in “After the Disaster: Communicating with the Public,” my response to a different journalist’s questions about an April 17, 2013 explosion at a fertilizer facility in West, Texas. Dave ended up focusing more on the specifics of the Target breach than on what companies should do about breaches, but he did find room in the last half of his March 23 article for several snippets from my answers.
-
Other People’s Crisis: Talking to Bystanders
Posted: March 5, 2014
During a health or safety crisis, the most important audience is obviously the people who are at risk. But what about the people who aren’t at risk, but merely bystanders? This column argues that bystanders are also an important crisis communication audience, for six reasons: (1) They may find out about your crisis some other way if you don’t tell them, which could cause them to over-react. (2) They may feel at risk even if they’re not and intellectually know they’re not. (3) They may be feeling miserable about your crisis and what it’s doing to other people. (4) They may not actually be bystanders, but affected in some way you’re not noticing. (5) They may want to help – and helping may be psychologically important for them. (6) Other people’s crisis is a teachable moment, an opportunity to convince them to take seriously the possibility that it could happen to them too someday.
Il processo dell’Aquila agli scienziati dei terremoti e il rischio della fuga
Published in Corriere della Sera, October 22, 2012
Convicting and Maybe Imprisoning Scientists for Bad Risk Communication: Italy’s L’Aquila Earthquake
Emails to Anna Meldolesi, October 16 and October 22, 2012
In April 2009, a powerful earthquake devastated the Italian city of L’Aquila and surrounding villages. The quake had been preceded by a “swarm” of tremors, which many townspeople interpreted as a warning. So a panel of experts was invited to L’Aquila to assess the evidence and try to reassure the populace. The news conference that concluded the panel’s deliberations was indeed reassuring – excessively reassuring. As a result, six scientists and one government official were tried for manslaughter after the quake, and in October 2012 they were convicted – a rare and perhaps unprecedented case of imposing prison sentences on scientists for doing bad risk communication. In response to emails from Anna Meldolesi of Corriere della Sera, my wife and colleague Jody Lanard and I wrote two sets of comments on the case, some of which Anna used in her October 22 story. Both Anna’s story and our emails to her are linked above.
-
Posted: December 8, 2011
The flooding that began in northern Thailand in late July 2011 has been Thailand’s worst flood in at least five decades. This column assesses the Thai government’s crisis communication at the height of the flood, especially its tendency to over-reassure. The column puts this performance into context by reviewing other examples of Thai over-reassurance from our files, and speculates on whether and why over-reassuring the public during emergencies might be more characteristic of Thai crisis communication than of crisis communication in other countries. A final section addresses how the Thai government (or any government or company) might begin to dig itself out from such a history – that is, what to do when your audience has learned to expect dishonest over-reassurance from you.
-
Solicited letter to the editor, London Evening Standard, June 4, 2010
On June 3, the London Evening Standard published a piece by City Editor Chris Blackhurst urging everybody to “Stop putting the boot into BP – we need it to survive.” The editors asked me to write a response for the next day’s paper. So I wrote one, agreeing with Blackhurst that vilifying BP is unwise and in some ways unfair, then pointing out some other ways I think the vilification is justified. I don’t know if the response was published (the Evening Standard website doesn’t include letters), but here it is.
Communicating about the BP Oil Spill: What to Say; Who Should Talk
Posted on Daily Kos, May 30, 2010
On May 29, one of the editors of the popular left-leaning blog Daily Kos, who goes under the nom-de-Web “DemFromCT,” wrote to ask my views on two risk communication aspects of the oil spill disaster in the Gulf of Mexico: What should the sources be saying about the likely future course of the spill, and who should do the talking. He quoted liberally from my response in his May 30 post, entitled “Risk Communication and Disasters: Just Tell the Truth.” He also posted my whole response at the end of his piece. I focused mostly on telling the whole truth, avoiding over-reassurance, and letting everybody talk instead of trying to “speak with one voice.”
-
BBC Radio 4 interview with Peter M. Sandman, broadcast on the “PM” newscast, May 3, 2010
On May 3 I did a brief interview with BBC Radio on risk communication aspects of the BP oil spill in the Gulf of Mexico. Although the interview was prerecorded, to my surprise they used the whole thing. This page has the link to the MP3 file with the interview. It also has a summary of what I said and what else I’d have liked to say.
-
Three Paradigms of Radiological Risk Communication: Alerting, Reassuring, Guiding
Presented to the National Public Health Information Coalition, Miami Beach FL, October 21, 2009
Posted: January 2, 2010
Although this six-hour seminar was entitled “Three Paradigms of Radiological Risk Communication,” NPHIC asked me to go easy on the “radiological” part and give participants a broad introduction to my approach to risk communication, mentioning radiation issues from time to time. So that’s what I did.
Fair warning: These are not professional videos. NPHIC member Joe Rebele put a camera in the back of the room and let it run. You won’t lose much listening to the MP3 audio files on this site instead.
- Part One
-
Part One is an introduction to the hazard-versus-outrage distinction and the three paradigms of risk communication.
- Part Two
-
Part Two discusses the seesaw and other risk communication games (thus completing the introductory segment), then spends a little over an hour each on some key strategies of precaution advocacy and outrage management.
- Part Three
-
Part Three is a rundown on some key crisis communication strategies.
See especially Part Three.
-
Containment as Signal: Swine Flu Risk Miscommunication
Posted: June 29, 2009
The swine flu pandemic started in North America, and by the time the virus was identified it was already widely seeded in the U.S. So the experts judged that it was too late to try to “contain” its U.S. spread; from Day One, the U.S. was focused mostly on coping with the disease, not stopping or even slowing it. Outside North America, on the other hand, an initial containment strategy made public health sense. But containment isn’t just a public health strategy. It is also a risk communication signal of enormous importance. Containment sends a signal that the pandemic can be contained and that it must be contained – that it is stoppable and severe. Instead of countering these misleading signals, the governments of many countries have issued misleading messages to match. This is doing significant damage to the world’s preparedness to cope with the unstoppable (and soon to be pervasive) but so far mild pandemic that is just beginning.
-
Posted: April 29, 2009
When I started criticizing the government for talking about swine flu as if there were nothing for the public to do but watch and practice good hygiene, we were at WHO Pandemic Phase 3. When I started this column (this morning) we were at Phase 4. When I finished the column (this evening), it was already Phase 5. The focus of this column is why the U.S. government is reluctant to urge the public to prepare now for a possibly imminent pandemic, and why I think the government should overcome its reluctance and do it! If you’re skeptical about advising people to imagine The Big One, get used to that knot in their stomachs, and then get started on preparedness, read this column. If you’re not skeptical and want to know what I think the important messages for right now are, skip this column and instead read “What to Say When a Pandemic Looks Imminent: Messaging for WHO Phases Four and Five.”
On May 21, 2009, Nature published a major abridgment and minor updating of this column under the title “Pandemics: good hygiene is not enough.” (An Adobe Acrobat file (707-kB pdf) of the complete article is also available.)
-
Handling explosive emotions demands five acts of empathy
Published in ISHN (Industrial Safety & Hygiene News), May 2008, pp. 1, 24, 26
Dave Johnson, the editor of ISHN, admired my website column on “Empathy in Risk Communication.” But of course it was much too long for him to republish. So he excerpted the less complicated sections, made a few editing and formatting changes, and came up with a shorter, more accessible article.
-
Media Sensationalism and Risk: Talking to Stakeholders with Reporters in the Room
Posted: September 6, 2006
This short column discusses seven principles for understanding and coping with the media’s entirely appropriate inclination to focus on the most newsworthy things you say – an inclination often labeled sensationalism. Of particular importance is the problem this raises for outrage management. The very same meeting at which you hope to say responsive, apologetic things in order to help reduce the outrage of angry stakeholders will also be attended by journalists, who will naturally convey your revealing admissions to readers and viewers who might otherwise never know. Managing a controversy well, in other words, is in some ways antithetical to managing the news clips well. You have to decide which task is more important. The column recommends managing the controversy.
-
Crisis Communication Best Practices: Some Quibbles and Additions
Published in the Journal of Applied Communication Research, vol. 34, no. 3, August 2006, pp. 257–262
Since 2004 I have been working with the U.S. Government-funded National Center for Food Protection and Defense at the University of Minnesota. One of the projects I worked on was an effort to develop a set of consensus “best practices” in crisis communication. Matthew Seeger wrote up the results in an article entitled “Best Practices in Crisis Communication: An Expert Panel Process.” I got a second bite of the apple when I was asked to write one of four commentaries on Seeger's article. My commentary, posted here with permission, focuses on some things I think the group missed or got wrong: the importance of fear and other emotions, the need to trust and respect the public, and the over-emphasis on message consistency. Seeger's article and the other commentaries are available online from the publisher, but only if you pay. A much less detailed PowerPoint on the ten best practices is available without charge.
-
How Safe Is Safe Enough: Sharing the Dilemma
Posted: April 20, 2006
This short column has two goals. It introduces readers to the invaluable risk communication strategy of dilemma sharing – telling people you’re torn between options and not sure what to do. This strategy is fundamental to both crisis communication and outrage management, but it is seldom utilized, largely because it threatens management egos. The second goal of the column is to apply the dilemma-sharing approach to the specific problem of “how safe is safe enough.” Risk managers have no choice but to prioritize precautions and decide which ones they can implement. The claim to be taking “every possible precaution” is always a lie. Risk managers who don’t want to lie can use dilemma sharing to explain why they have chosen not to take some possible precautions.
-
Risk Communications During a Terrorist Attack or Other Public Health Emergency
Published in Terrorism and Other Public Health Emergencies: A Reference Guide for the Media (Washington, DC: U.S. Department of Health & Human Services, 2005), Chapter 11, pp. 184–193
I have a two-page “essay” in this chapter (pp. 190–191) entitled “Public Reactions to Crisis Situations and Communication Implications,” which covers yet again material that is presented in more detail in “Beyond Panic Prevention: Addressing Emotion in Emergency Communication.” The rest of the chapter (on which I collaborated) is worth reading for its advice to journalists on how the public and the official sources are likely to cope with a terrorism crisis. The rest of the manual, no longer available online, is mostly about biological, chemical, and radiological threats and the government agencies that try to address them.
-
Superb Flu Pandemic Risk Communication: A Role Model from Australia
Posted: July 6, 2005
On May 2, 2005, Australian Health Minister Tony Abbott gave a speech on pandemic preparedness. It wasn’t especially earthshaking; in fact, it attracted fairly little media attention. But Jody Lanard and I thought it was terrific – candid, alarming, tentative, all the things most official pandemic presentations were not (and are not). So we sat down to annotate the speech in terms of 25 crisis communication recommendations we had published previously. If you just read the speech, you’ll discover that good risk communication can sound just as ordinary as bad risk communication. If you read the column’s annotations, you’ll discover how extraordinary this particular speech really was.
-
Posted: June 16, 2005
This column argues that western society has a blind spot for bad guys – that our vision of an actionable emergency is an accident, not an attack. It discusses several examples, from the resistance to evidence that the 1984 Bhopal “accident” was probably sabotage to the opposition of the U.S. public health profession to the possibility that smallpox might constitute a weapon of mass destruction that could justify a vaccination program. The best example – detailed in the column – happened in April 2005, when it was learned that an infectious disease testing company had mistakenly sent samples of a potentially pandemic strain of influenza to labs all over the world. So a fax went out to all the labs telling them so, and asking them to destroy the sample – thus converting a small accident risk into a much larger terrorism risk. The facts were public at the time, but a society with a blind spot for bad guys simply ignored their implications.
-
Public Reactions and Teachable Moments
Published in Homeland Protection Professional, May 2005, vol. 4, no. 4, pp. 14–16
This article quickly covers some of the emotional reactions to crisis situations – ground covered in more detail in “Beyond Panic Prevention: Addressing Emotion in Emergency Communication” and “Adjustment Reactions: The Teachable Moment in Crisis Communication.” Some minor editorial changes made by the magazine’s staff have not been replicated here (only the ones I liked).
-
Talking about Dead Bodies: Risk Communication after a Catastrophe
Posted: February 8, 2005
After nearly every natural disaster (earthquake, flood, etc.), the survivors feel an urgent need to bury the dead, often in mass graves that later complicate everything from mourning to inheritance. Yet with some exceptions, the bodies of natural disaster victims are not a significant disease threat to the living, and burying them should therefore have a lower priority than other rescue and recovery tasks. International emergency response agencies do their best to convince local officials and local populations that this is so – but more often than not they fail. In this column, Jody Lanard and I discuss the reasons why the impulse to bury the bodies is so powerful, and offer some empathic ways to counter that impulse, rather than simply explaining the scientific data.
-
Adjustment Reactions: The Teachable Moment in Crisis Communication
Posted: January 17, 2005
When people first learn about a new risk, they go through a temporary over-reaction that is natural, healthy, and useful. Psychiatrists call this the “adjustment reaction.” Having one is virtually a prerequisite to crisis preparedness. This short column outlines the characteristics of adjustment reactions. It advises crisis communicators to guide the public through its pre-crisis or early-crisis adjustment reaction, rather than trying to persuade people to skip this essential step toward being ready to cope.
Spanish translation available
Traducción en Español: Reacciones de ajuste: el momento enseñable en la comunicación de Crisis
-
Tsunami Risk Communication: Warnings and the Myth of Panic
Posted: January 6, 2005
One of the reasons the Thai government neglected to warn its people in advance of the devastating December 2004 tsunami was a fear of panicking them. As if to prove this deadly decision right, a few days later there were warnings that aftershocks might produce another tsunami. Thousands of seaside residents fled to higher ground – and when it turned out there was no second tsunami, media reports said they had panicked. Jody Lanard and I had written before about the tendency of officials and journalists to misdiagnose caution as panic, but the tsunami impelled us to revisit the issue. This time we focused more narrowly on how rare panic is in response to natural disasters – or to warnings about natural disasters.
-
Posted: November 11, 2004
This column is in two parts. Part One lists some basic tips for overcoming the universal temptation to sound overconfident; it’s a primer on how to sound uncertain instead. Part Two goes into detail on the toughest part of acknowledging uncertainty: deciding just how uncertain you ought to sound, and then coming up with words (or numbers) that capture the right level of uncertainty. It assesses five biases that tend to distort our judgments about how uncertain to sound, even after we have accepted the principle that we should acknowledge our uncertainty. Compare “I can’t guarantee that it’s safe” with “I don’t know if it’s safe.” Both acknowledge uncertainty – but very different levels of uncertainty. Which of the two is likelier to get said when the other would have been closer to the truth?
-
Leading during Bioattacks and Epidemics with the Public’s Trust and Help
Biosecurity and Bioterrorism, 2004, vol. 2, no. 1, pp. 25–40
I was part of a 30-person “Working Group” that developed a report urging leaders to treat the public more as an ally and less as a problem in crisis situations. The report was drafted by Monica Schoch-Spana of the Center for Biosecurity of the University of Pittsburgh Medical Center, who later redrafted it as an article for the Center’s Biosecurity and Bioterrorism journal. The writing is a little academic for my taste, but I think the recommendations are wonderful ... and the footnotes are invaluable.
-
Emergency Risk Communication CDCynergy video clips
Posted: February 26, 2011
The “Emergency Risk Communication CDCynergy” CD-ROM from which these video clips were taken was originally produced in 2003 by the Centers for Disease Control and Prevention (Office of Communication), the Agency for Toxic Substances and Disease Registry, the Prospect Center of the American Institutes for Research, and the Oak Ridge Institute for Science and Education. The complete CD-ROM can be ordered. Much of the CD-ROM is also available without charge online, but many of the online links no longer work.
I was one of a number of risk communication experts who contributed to the CD-ROM. Three of my written contributions have long been posted on this website; they are listed earlier in this index.
The following short video clips on various aspects of crisis communication were part of the CD-ROM but no longer load in the online version. So I have posted them here, converted to Flash videos. (Don’t have a Flash player? Download one of these (both free): Adobe Flash Player or FLV Player.)
- “Move in the Uncomfortable Direction” (1:46)
-
Crisis communication strategies have a side that practitioners find comfortable and a side they find uncomfortable – withholding information versus total candor, for example. Best practice is somewhere in the middle. To get there, practitioners need to move in the uncomfortable direction.
- “Give People Things to Do” (3:34)
-
Giving people things to do – and better yet, choices among things to do – helps them cope with the fear and other feelings that crises arouse.
- “Manage the Risk Communication Seesaw” (2:31)
-
Seesaws prevail in crisis communication as they do in most of risk communication, and practitioners need to climb onto the side they don’t want the public on. To get blamed less by others, for example, it helps to blame yourself more.
- “Be Willing to Speculate” (2:17)
-
Crisis communication absolutely requires speculation; you can’t confine what you say to things that are certain. The trick is to avoid speculating overconfidently or over-reassuringly.
- “Here’s How to Speculate” (2:16)
-
Tell what you know and what you don’t. Sound as sure and as unsure as you actually are. Focus on both likeliest scenarios and worst case scenarios – and keep the distinction clear.
- “Let Your Feelings Show” (0:41)
-
To be an effective role model for others, you have to show that you’re feeling what they’re feeling (fear, anger, etc.). Watching you control your feelings helps people control theirs.
- “Tell Stories” (0:42)
-
Telling stories about yourself helps humanize you, which helps people bear the crisis better.
- “Choose the Best Spokesperson” (2:39)
-
You want someone who has communication skill, risk communication training, and technical expertise. You also want someone who likes the job, is willing to simplify, and has enough stature in your organization to make decisions and keep promises.
- “Find a Crisis Communicator” (1:09)
You want someone who knows how to guide people who are rightly upset. That isn’t necessarily the same communicator who’s good at arousing concern in people who are unwisely apathetic.
-
Practicing for The Big One: Pennsylvania’s Hepatitis A Outbreak and Risk Communication
Posted: December 4, 2003
In late 2003, an outbreak of hepatitis A in Western Pennsylvania provided a neat case study of pretty good risk communication (not perfect, but not bad) about a pretty serious problem (not huge, but not tiny). In this column, Jody Lanard and I use Pennsylvania’s hepatitis outbreak to illustrate four basic dilemmas in crisis communication – dilemmas that are sure to come up in bigger emergencies: preoccupation with panic; trust and secrecy; over-reassurance; and anticipatory guidance.
-
Risk Communication Recommendations for Infectious Disease Outbreaks
Presented to the World Health Organization SARS Scientific Research Advisory Committee, Geneva Switzerland, October 20, 2003
In October 2003, the WHO included social scientists (including me) on its SARS-fighting team for the first time. This invited paper has a list of 24 risk communication principles relevant to a possible second SARS outbreak or to any infectious disease outbreak; it also lists SARS-related risk communication research needs and has a short bibliography.
-
It Is Never Too Soon to Speculate
Posted: September 17, 2003
Risk communication and crisis communication professionals sometimes urge their clients not to speculate. But they can’t mean it literally. Speculation is talking about things you’re not sure about … and that’s pretty much what risk communication is. In this short column, Jody Lanard and I make the case on behalf of responsible speculation – that is, speculation that sounds suitably speculative (that isn’t overconfident); speculation that pays sufficient attention to dire scenarios (that isn’t over-optimistic); and speculation that explicitly addresses the difficult question of which precautions are appropriate even while the information is still uncertain and which precautions should await less speculative knowledge.
-
Review: The Mad Cow Crisis: Health and the Public Good
Journal of Health Psychology, January 2000
This review of a book on England’s “mad cow disease” crisis is relevant for its discussion of how to communicate about a small problem that threatens to become a big problem. Think anthrax. As always, over-reassurance turns out to be the wrong approach.
-
The Synergist, April 1995
This short column deals with sabotage – and the important possibility that outraged employees can pose a hazard to everyone else. It was written (obviously) before 9/11, but resonates even more powerfully now.
-
Scared stiff – or scared into action
Bulletin of the Atomic Scientists, January 1986, pp. 12–16
This 1986 article aimed at helping peace activists develop communication strategies that wouldn’t deepen people’s “psychic numbing” about nuclear weapons. Though its political content is out of date, its prescription – anger, love, hope, and action – is relevant today to coping with public denial about terrorism. (For terrorism I would want to add to the prescription the need to acknowledge and share the underlying fears – what people are “really” afraid of.)
-
Washington DC: U.S. Government Printing Office, October 1979 (#052-003-00734-7)
After the Three Mile Island nuclear accident in March 1979, President Jimmy Carter appointed an investigative commission. One of the commission’s mandates was to look at public communications, both what sources told the media and what the media told the public. Having just published “At Three Mile Island” in the Columbia Journalism Review, I was asked to work with the commission’s Public Right to Information Task Force, chaired by David M. Rubin. The task force report, with 14 authors, has been out of print for decades, but was recently put online as part of a huge Three Mile Island “Resource Center” run by Dickinson College. The report constitutes a wonderful blow-by-blow account of a pretty typical example of corporate, government, and media crisis communication.
-
At Three Mile Island
Columbia Journalism Review, July/August 1979
This was the first “crisis” I watched unfold. The themes have since become familiar (they are playing out now in the terrorism crisis): The sources minimize the risk and over-reassure the audience; the experts are uncertain and disagree with each other; the public doesn’t panic but everyone keeps thinking it will; the color stories overpower the technical stories. (Writing this article in 1979 changed my career; see “Muddling My Way into Risk Communication.”) Includes two sidebar articles: “The local media feel the heat” and “The Inquirer goes for broke.” For a more recent perspective, see “Three Mile Island – 25 Years Later.”
A Macedonian translation by John Obri was posted in March 2012 on http://webhostinggeeks.com/science/.
Handout Sets
Handouts from 2004 Crisis Communication CD-ROM
Entire set of handouts (Note that some handouts are in both sets.)
Terrorism and Crisis Communication (High Hazard, High Outrage)
Anthrax, Bioterrorism, and Risk Communication: Guidelines for Action (No. 2)
Beyond Panic Prevention: Addressing Emotion in Emergency Communication (No. 8)
Crisis Communication: Six “Easy” Strategies (No. 11)
Crisis Communication: Six “Harder” Strategies (No. 12)
Crisis Communication I: How Bad Is It? How Sure Are You? (No. 12a)
Crisis Communication II: Coping with the Emotional Side of the Crisis (No. 12b)
Crisis Communication III: Involving the Public (No. 12c)
Crisis Communication IV: Errors, Misimpressions, and Half-Truths (No. 12d)
Dilemmas in Emergency Communication Policy (No. 14)
Obvious or Suspected, Here or Elsewhere, Now or Then: Paradigms of Emergency Events (No. 32)
Outrage Management in Mid-Crisis (No. 35)
Smallpox Vaccination: Some Risk Communication Linchpins (No. 53)
Talking about Worst Case Scenarios: Eight Principal Strategies (No. 54)
Talking about Worst Case Scenarios: Twenty Additional Suggestions (No. 55)
The Three Kinds of Crisis Communication and Their Relationship to Risk Communication (No. 57)
Interviews, Summaries, etc.
-
Part Two of a two-part interview with Peter M. Sandman by George Whitney of Complete EM, July 22, 2016.
George Whitney runs an emergency management consulting company called Complete EM. His website features a blog and a podcast series. On July 22, 2016, he interviewed me by phone for nearly two hours. He edited the interview into two podcasts, which he entitled “Dr. Peter Sandman – Risk Communication” and “Dr. Peter Sandman – Crisis Communication.” I have given them new titles.
This interview segment, George’s Part Two, ranges broadly. After distinguishing crisis communication from pre-crisis communication, I focused first on some crisis communication basics: don’t over-reassure, don’t be over-confident, don’t think people are panicking when they’re not. Then in response to George’s questions I addressed an assortment of additional topics: civil unrest; crisis planning; the L’Aquila earthquake communication controversy; crisis mnemonics like “Run – Hide – Fight”; how emergency management professionals can use social media; and the pros and cons of going public in a crisis before you have come up to speed. (Part One is “Scaring People: The Uses and Limitations of Fear Appeals.”)
Homeland Security: An Emotional Response Plan
Published in Security Management, October 2014
[Note: Available in the print edition only]
The Role of Emotion in Crisis Communication
Email responding to questions from Lilly Chapa, July 5, 2014
On June 30, 2014, reporter Lilly Chapa emailed me about an article she was writing for Security Management, a monthly industry trade publication, on “taking emotional response into consideration when building a crisis response plan.” Instead of an interview, we agreed that she would email me questions, which she did. I responded on July 5, but promised not to post the Q&A until her article was published. The article ran in the October issue under the title “Homeland Security: An Emotional Response Plan.” It is available in the print edition only, but the Q&A is on this site. As it turned out, Lilly’s questions got me going on two other topics besides emotional response: the distinctions among pre-crisis, crisis, and post-crisis communication; and the errors organizations most typically make in their crisis response planning. Finally we got to what I thought would be her focus, planning for the likely emotional reactions of victims, bystanders, and emergency responders themselves.
-
How to Lead during Times of Trouble (transcript)
A roundtable discussion at “The Public as an Asset, Not a Problem: A Summit on Leadership during Bioterrorism,” Johns Hopkins University Center for Civilian Biodefense Strategies, Washington DC, February 2003
In early February of 2003, I attended a wonderful conference on bioterrorism, focused on “the public as an asset, not a problem.” The panel I participated in was about how to lead a community during times of trouble. Most of the panelists had actually led their communities through various crises, from the 2001 anthrax attacks to Oklahoma City’s bombing; I was added, along with the Washington Post’s Sally Quinn, so there would be at least two panelists whose experience was observing rather than doing.
-
Planning for Bioterrorism Communication
Minnesota Community Health Conference, September 2002
On September 12, 2002, I gave a half-day presentation on “Planning for Bioterrorism Communication” in Breezy Point MN, at the annual Community Health Conference sponsored by the Minnesota Department of Health (MDH). The presentation was based on three chapters I was writing for an emergency communication manual soon to be published by the Centers for Disease Control and Prevention (CDC). The three chapters have now been posted, but this much shorter (nine-page) summary by MDH’s risk communication specialist Buddy Ferguson is still useful.
-
Terrorism, Transparency, and Employee Sabotage
Canadian Chemical Producers’ Association, June 2002
After I gave a June 2002 presentation to the Canadian Chemical Producers’ Association on risk communication aspects of the chemical industry’s Responsible Care program, the CCPA’s Harvey Chartrand interviewed me for the organization’s members-only website. This excerpt from the interview deals with my views on how September 11 should affect chemical industry transparency, and on the relationship between terrorism and employee sabotage.
-
CDC Responds: Risk Communication and Bioterrorism
December 6, 2001
This CDC webcast includes excerpts from a November 2001 presentation I made to the CDC in Atlanta, Georgia, plus live discussion of the issue by a panel of other risk communicators. For a longer written version of my CDC presentation, see my December 29, 2001 column “Anthrax, Bioterrorism, and Risk Communication: Guidelines for Action.”
-
An interview published in safety AT WORK, 30 October 2001
This very brief interview discusses the events of 9/11. For a much (MUCH) longer treatment of the same topic, see my October 22, 2001 column “Risk Communication and the War Against Terrorism: High Hazard, High Outrage.”
-
Nukes, the Freeze, and Public Opinion
An interview by Mary Jones, Matrix (Rutgers University), Spring 1984, pp. 9–12
In 1983–84, I took a sabbatical from my professorship at Rutgers University and worked on communication for the nuclear freeze movement. This interview was published in the Rutgers alumni magazine during my sabbatical. It talks about people’s fear of nuclear war and their reluctance to get involved in the peace movement. I have to say that both the world and my political values have changed some – though I do like the young man who gave this interview. What hasn’t changed much is my analysis of nuclear denial, which resonates today for current issues like the fear of terrorism. (The book I said I was writing, by the way, never got written; I still have a hundred or so pages of draft somewhere.)
Articles by Others
Is This the Poster Food for a Radiation Menace?
Published in The New York Times, April 12, 2011, p. D5
Denise called me on April 8 to ask why nearly every nuclear expert she interviewed about the risk of eating food contaminated with radiation from the Fukushima power plants kept talking about bananas. The point of such comparisons, I told her, is to belittle people’s fears about low levels of radioactivity by picking a comparator that will make them feel stupid for worrying – something we eat routinely without knowing it has radioactive ingredients. I emphasized that minimizing comparisons usually backfire, especially in the middle of a crisis – not just because they’re pompous and condescending, but also because people can sense that the source’s goal is to (over)reassure them rather than to inform and guide them.
In Defense of Iodine Snatchers
Posted on the Turnstyle website, April 1, 2011
On March 23, Charlie Foster posted an article for “Turnstyle” (an online information service by and for adults 18–34) on some of the ways the Tokyo Electric Power Company and the Japanese government have fostered mistrust in their handling of the Fukushima nuclear power plant crisis. It was based entirely on his phone interview with me. On April 1, Charlie posted this second article, based on the same interview. This one focuses on why I think people who seek out potassium iodide shouldn’t be belittled as stupid, as hoarders, or as panicking.
In Japan, a New Legacy of Mistrust
Posted on the Turnstyle website, March 23, 2011
I did only a handful of media interviews on Japan’s Fukushima nuclear crisis, and most of them ended up contributing just a paragraph or two to the final story – not really worth posting here. But I like Charlie Foster’s article for “Turnstyle” (an online information service by and for adults 18–34). It is based entirely on his phone interview with me, and focuses on some of the ways the Tokyo Electric Power Company and the Japanese government have fostered mistrust. I particularly like Charlie’s use of my Three Mile Island story: At TMI, the Pennsylvania government warned consumers that local milk might end up tainted with radioactive iodine, whereas the Japanese government tested Fukushima-area milk secretly and said nothing until the day it announced that the milk (along with water and vegetables) was radioactive.
-
Posted on Peter Martin’s blog, January 20, 2009
A different version was published in The Age, January 20, 2009, under the title “Bad news must be told.”
The current economic meltdown is surely a crisis (high hazard, high outrage). And so the principles of crisis communication apply, including the crucial principle of leveling with people about bad news rather than feeding them over-reassuring half-truths. Unfortunately, most of those charged with responding to the world economic crisis don’t quite realize that crisis communication is a field. They’re making it up as they go along, and they rarely get it right. (I admit I’m not quite sure economics is a field; they seem to be making that up as they go along too.) It was a pleasure to see Peter Martin cite some of my work on behalf of candor about recession bad news – both the candor of the Australian government and his own candor in his economics writing for The Age, one of Australia’s leading newspapers.
-
Crisis Communications to the Public: A Missing Link
Chapter 5C.6 of Learning from SARS — Renewal of Public Health in Canada: A Report of the National Advisory Committee on SARS and Public Health (the “Naylor Report”), October 2003
One small section of the official Canadian government report on the lessons of SARS addresses public communication – and leans predominantly on the “scathing” assessment of Sandman and Lanard.
-
Published in The Toronto Star, May 30, 2003
This is an almost shockingly lighthearted piece on Toronto’s SARS epidemic. It starts out with a weird focus on the question of whether SARS is God’s punishment, but winds up making some fairly solid points.
-
SARS: How Singapore outmanaged the others
Published in Asia Times, Hong Kong, April 9, 2003
I thought Singapore handled SARS risk communication a lot better than China, Hong Kong, or Canada. But I never expected to be explaining why in a Hong Kong newspaper.
-
Weighing Your Risks of Becoming a Terror Victim
Published in The New York Times Week in Review, March 23, 2003
I’m quoted here on just one point, but it’s an important one: the need to get people accustomed to their fear of terrorism, to show them how to cope with that fear rather than trying to relieve them of it. (There, now you don’t have to read the article.)
-
Published in The BSCS Newsletter [Biological Sciences Curriculum Study], Fall 2002
How should teachers talk to kids about terrorism? This short article has my views and the views of others.
-
Published in The Trenton Times, July 12, 2002
I think this is my wife and colleague Jody Lanard’s first risk communication publication, a newspaper op-ed urging that people who want to be vaccinated against smallpox get sent to “vaccination camp.”
Selected Guestbook
Comments and Responses
2018
2017
Possible kratom ban: what kind of risk communication? (February 2017)
2015
Don’t tell people not to panic. Especially don’t tell them not to panic yet. (February 2015)
2014
Assigning blame for Toledo’s water emergency (August 2014)
Explaining a voluntary, very partial evacuation of an angry town with a coal mine fire (March 2014)
2012
Warning the world about yet another possible catastrophe: solar flares (July 2012)
The widespread insistence that sources should “speak with one voice” (March 2012)
Deepwater Horizon in perspective: the dynamics of blame (March 2012)
“Panic buying” in crisis situations: China’s Fukushima run on salt (March 2012)
2011
Validating the adjustment reaction: “Of course you’re upset….” (December 2011)
Layoffs as a risk communication challenge (October 2011)
Hurricane Irene risk communication: public service or weather porn? (September 2011)
Cultural differences regarding Fukushima crisis communication (April 2011)
Fukushima mistrust (April 2011)
More on Fukushima crisis communication: The failure to speculate (April 2011)
Japan’s nuclear crisis: The need to talk more candidly about worst case scenarios (March 2011)
Unempathic over-reassurance re Japan’s nuclear power plants (March 2011)
Restoring confidence after the Christchurch earthquakes (March 2011)
2010
Optimism, “vision,” and crisis communication (September 2010)
President Obama’s handling of the Deepwater Horizon oil spill (August 2010)
WHO: Hyping the pandemic or helping the world prepare? (June 2010)
Meeting the needs of relatives of disaster victims (April 2010)
Talking about uncertainty when hazard levels are unclear (February 2010)
Making pandemic communications (and all crisis communications) provisional (February 2010)
2009
How do you engage people in mid-crisis long-term planning? Is it even possible? (June 2009; also available in French)
WHO’s “Outbreak Communication Guidelines” – and calling a pandemic a pandemic (May 2009)
Credit default swaps, financial meltdown, and risk communication (March 2009)
2008
Social media and source coordination in pre-crisis and crisis communication (December 2008)
Which media work best in different kinds of risk communication? (October 2008)
Should you tell bystanders about a crisis (or a controversy)? (September 2008)
2007
Managing outrage about the release of a convicted rapist (December 2007)
Origins of the risk communication seesaw principle (December 2007)
Role of leadership in homeland security crisis communication (September 2007)
Asking people to wait in line for medicine in a crisis (July 2007)
What’s unique about “counterterror risk communication”? (July 2007)
Are empathy and compassion really what matters in mid-emergency? (February 2007)
2006
Risk communication and the legitimacy of counterterrorism (November 2006)
Is emergency preparedness getting too much attention? (October 2006)
What does it mean to “manage” terrorism — and the fear of terrorism? (August 2006)
Motivating disaster preparedness (August 2006)
Notes from the Beirut evacuation (July 2006)
2005
Homeland Security's color coding as an excuse not to warn people about bird flu (July 2005)
2004
Getting out preparedness information before a crisis (August 2004)
2003
Informing the public versus informing terrorists and criminals (July 2003)
Scaring people about terrorism (July 2003)
Emergency how-to warnings (February 2003)
Evacuation feasibility — the attractions of fatalism (February 2003)
Why the sudden interest in smallpox? (February 2003)
Smallpox vaccination: Can we trust the government? (January 2003)
2002
Communication now about possible future terrorism (March 2002)
Crisis communication versus risk communication (March 2002)
Anthrax, politicians, and PR (February 2002)
Risk communication for government emergency responders (January 2002)
2001
What did Rudy Giuliani do right? (December 2001)
Bioterrorism and anthrax — candor (even about “what-ifs”) reduces panic (October 2001)
Copyright © 2020 by Peter M. Sandman