Posted: February 18, 2010
Article Summary: Honest mistakes turn into culpable deceptions when organizations hesitate to come clean. This column outlines ten key recommendations for telling people you got it wrong. It starts with the basics: “Don’t stick to your guns.” “Don’t think that quietly publishing the data protects you.” “Don’t expect misleading ambiguities to save you.” Then it works its way to more complicated advice: “Explain what happened.” “Explain what’s going to happen.” “Explain what else might need to be rethought.” Of course it’s also important to acknowledge uncertainty from the outset. As the column points out, “It’s a lot easier to tell people you got it wrong if you didn’t sound cocksure in the first place.”

Telling People You Got It Wrong

This is the twentieth in a series of risk communication columns I have been asked to write for The Synergist, the journal of the American Industrial Hygiene Association. The columns appear both in the journal and on this website. A significantly abridged version of this one was published in the April 2010 issue of The Synergist, pp. 27–29.

The following is a real situation, but I have changed the details to disguise the client.

Your factory’s emissions have been a problem for years, and you’ve worked hard to protect employees from the potential health effects. You have focused especially on workers in the refinery building, where the risk is usually highest. This year a new problem emerged that raised a lot of concern in your workforce. The early data seemed to show that workers in the polishing unit were more at risk this time, so that’s what you said and that’s where you focused your precautionary efforts. You told refinery employees their risk from the newly discovered pollutant was pretty low.

But new evidence is showing you were mistaken. It’s true that the new pollutant is more serious for polishing unit employees and less serious for refinery employees than your other emissions. But that pollutant is nonetheless more serious for refinery employees than for those in polishing. You’ve been urging precautions on the wrong group.

This is a fairly common situation, and it’s not confined to factory managements. You could just as well be a public health agency that thought a new infectious disease was going to be most dangerous to children, and now it’s turning out even deadlier to adults and seniors.

The problem is how (and whether) to tell people you got it wrong. Here are some do’s and don’ts – starting with the don’ts.

1. Don’t stick to your guns.

You may be tempted to keep saying the polishing unit employees are most at risk, ignoring the emerging data that the risk is actually higher for the people in refining. There is a chance that you’ll get away with it. But if you’re in a low-trust, high-outrage environment (for example, if you work for a polluting corporation), the odds are against you. You have enemies who are motivated to root out your misstatements and expose them.

Even if you’re in a comparatively high-trust, low-outrage environment (for example, if you work for a public health agency), sticking to your guns is profoundly unwise. You’re likelier to get away with it, perhaps. But you also have more to lose. Your hard-won credibility is at stake.

In either environment – low-trust/high-outrage or high-trust/low-outrage – sticking to your guns will do harm to the people you fail to warn. So it is ethically wrong.

It’s also dangerous to your organization. You’ll take a small hit if you announce your error – versus a big hit if others prove later that you discovered your error and decided to hide it. That’s especially true if there are decisions still being made that could be made differently if everyone knew the new information – for example, if it’s not too late for your refinery workers to take precautions. The certain small hit is a much better choice than the big gamble.

2. Don’t think that quietly publishing the data protects you.

My clients – both corporate and government – typically produce huge amounts of technical paperwork documenting the risk-related situations they’re managing. After nearly 40 years as a risk communication consultant, I am convinced that the paperwork is usually honest. But my clients’ public statements about the paperwork are frequently not so honest.

Telling the truth where no one will notice doesn’t protect you from stakeholders’ outrage, if and when they eventually notice that you said something quite different in more visible venues. In fact, this is arguably the worst of both worlds. If there’s technical paperwork that proves you knew better, you’re likelier to get caught in the lie – especially in a world where Google and the Freedom of Information Act make the paperwork easy to find. And people (especially journalists) will be irritated with themselves for having trusted you instead of digging out the paperwork – and they will project that irritation onto you.

The longer they trusted you instead of digging out the paperwork, the angrier they will be when your credibility gap finally emerges.

3. Don’t expect misleading ambiguities to save you.

When you started out saying the polishing workers were “more at risk” from this newly discovered pollutant, you meant they were more at risk than other workers at the factory. And that’s what everybody thought you meant. It’s what your earliest data showed.

But now you know it’s not true. Can you hide behind a different interpretation of “more at risk,” one that’s still true: that polishing unit employees are more at risk from the new pollutant than from your factory’s other, more familiar pollutants? If you choose your words carefully, you can keep giving the impression you now know to be false without actually lying … and without having to admit you were wrong.

That kind of artfully misleading ambiguity might help in a courtroom, but not in the court of public opinion. In hindsight, people can tell when they have been intentionally misled. They’re even more outraged if you found a clever way to mislead them without lying.

4. Don’t just change the message.

When I point out to my clients how much their public messaging diverges from their publicly available technical paperwork, they face two problems, not just one. The first problem is whether to align their messaging with their data. The second problem is whether to own up to their previous failure to do so.

My clients sometimes offer this halfway bargain: “Why don’t we just transition our messaging from X to Y? We don’t actually have to point out that we wrongly said X, do we?”

Yeah, you do – for two reasons.

If you simply stop saying X and start saying Y instead, some people won’t notice. They’ve already “learned” (mislearned) X – and they won’t unlearn it until you explicitly tell them that you were wrong about it before … or until somebody else tells them so. That’s the first reason: You can’t readily correct an error (and get credit for having corrected it) unless you acknowledge that you made the error. Once refinery workers got it into their heads that the new pollutant isn’t much of a problem for them, they stopped paying attention to your communications about that issue. To regain their attention, you have to tell them explicitly that you were wrong.

The second reason is that some people will notice that you’ve changed your tune. But since you haven’t explained why, they’ll try to figure out why themselves. Is it because you’re confused? Are you trying to get them confused? Were you right the first time and now you’re trying to deemphasize the risk? Were you wrong the first time and now you’re afraid to admit it? You can’t help people interpret the X versus Y discrepancy unless you explain it yourself, candidly and vividly.

Repeat after me: “We were wrong when we thought the risk would be highest for the people in polishing. It’s turning out even higher for the people in refining.”

5. Try to make uncertain predictions sound uncertain.

It’s a lot easier to tell people you got it wrong if you didn’t sound cocksure in the first place. That’s why one of the core principles of risk and crisis communication is to replicate your own level of uncertainty in the minds of your stakeholders.

Often you are obliged to act – and to speak – before you are certain about a situation. The best you can do is make sure everybody knows that you’re not claiming certainty, and that you promise to be forthright about course corrections as your knowledge increases or the situation changes.

If you were really, really sure at the start, then you were right to sound sure – and if you turned out wrong anyhow, you’ll simply have to take the hit by acknowledging that you turned out wrong. But if you sounded a lot more confident at the start than you felt, then your problem is largely of your own making. If you told your boss or your Board that you could easily be wrong, that’s what you should have told everybody else too.

It’s often hard to make yourself acknowledge uncertainty. And sometimes even if you do, it’s hard to get your stakeholders to register the uncertainty (or journalists to report it). The trick is to emphasize your uncertainty a lot more than probably feels necessary. Try telling people what your backup plan is in case you end up wrong about who is most at risk. Or if it will be too late for a backup plan, tell people how awful you’ll feel if you learn later that you made the wrong call. Or make a big point out of how unpredictable the whole situation is.

6. Correct yourself ASAP.

The longer you propagate false or misleading information, the higher the price when you correct yourself … and thus the more tempting it will be to dig your hole even deeper. So correct your error as early as you can.

That means you need to collect and analyze the data as early as possible, so you’ll get the earliest possible indication that things may be trending in an unexpected direction. And you need to be straight with yourself about your data quality assurance. It’s tempting to dismiss preliminary indications that you might have been wrong as unreliable, while grabbing confidently onto equally preliminary indications that you got it right. Make sure your standards for assessing your data don’t depend on which side you bet on originally.

Above all, you need to tell people what you know as soon as you know it. “People” here means people you gave the wrong information to earlier, plus people who have use for the right information now, plus people who are likely to feel blindsided if they find out later from somebody else. If you told your boss or the regulator, but waited to tell the workforce or the public, you’re setting yourself up for a fall. You need to share preliminary information too, especially preliminary indications that you might have been wrong. The “blame clock” starts ticking as soon as you get your first warning signs and don’t say anything.

7. Apologize.

You’re sorry you got it wrong. Yes, I know, it wasn’t your fault; you took your best shot in an uncertain situation. Nonetheless, you got it wrong, and now you need to be sorry.

You’re sorry you sounded too confident. How you phrase this depends on what actually happened. “I was really, really confident, and now I’m really, really surprised – and really, really sorry.” Or: “I knew the situation was uncertain, but I didn’t stress my uncertainty enough, and I gave many people the mistaken impression that I was sure.”

And you’re sorry you kept misinforming people for so long. This is the toughest and most damaging admission. But if it took you longer than it should have to find out you were wrong, or longer than it should have to say so once you found out, then that’s going to be the core of your stakeholders’ outrage … and it needs to be the core of your apology.

8. Explain what happened.

An explanation is no replacement for an apology. In fact, in the absence of apologies, explanations tend to sound like excuses. So make sure you have apologized sufficiently first.

Then explain why you got it wrong. “Most of the experts thought X was likelier than Y.” “X is what has happened the last few times in similar situations.” “The early data suggested that it was going to be X again.” “The situation changed a few weeks ago when….”

9. Explain what’s going to happen.

The biggest substantive question in people’s minds when you tell them the situation is turning out different than you expected is: What changes as a result? Are you asking stakeholders to do anything different? Are you going to do anything different yourself? Are you predicting different outcomes now?

Don’t skip too quickly to this question. Oftentimes people aren’t ready to hear about the implications of your new assessment until after you’ve apologized for getting it wrong in the first place and explained how the mistake happened. There’s a risk communication seesaw at work here. If you focus prematurely on the substantive implications of your mistake, people will want to linger on the mistake itself and how sorry you should be. It’s better if you linger a bit on the mistake and how sorry you are, and let them push you to move on to the substantive implications.

Other times lives are at stake and you must tell people immediately what they need to do differently. “Stop sheltering in place. Evacuate NOW. I’ll explain why later.”

When you do turn to what’s going to change as a result of the new information, make sure you pay sufficient attention to what isn’t going to change that people might expect to change. “If we could start over knowing what we know now, we’d do things differently. But we don’t think it makes sense to switch now, and here’s why….” Or: “We had several reasons to focus our precautions on the polishing unit. The new data show that one of those reasons turned out mistaken. But we still think that’s the right course, and here’s why….” Or: “The situation is still fluid. We’re not really a lot more confident about the new conclusion than we were about the previous conclusion. Changing precautionary strategies in midstream is probably not the best option, and here’s why….”

10. Explain what else might need to be rethought.

You were wrong about whether the new pollutant would turn out riskiest for the people in refining or the people in polishing. Now that we all know it, one of the questions in people’s minds is going to be what else you might be wrong about.

Make sure it’s clear that that question is in your mind too.

Tell people what other tentative judgments about this situation are still unclear and might also turn out wrong – that is, what else you want us to realize isn’t as certain as we might have supposed. (You can also tell us about some tentative judgments that are turning out right.)

And tell people what you are doing to check those other tentative judgments. Do you have procedures in place to find out as quickly as possible which ones are borne out and which are not? Do you promise to tell us as soon as you have some indication one way or the other? What will change if those other judgments turn out to have been mistaken too?

Do you really have to do all this? Is it ever wise to just hang onto your original messaging and hope nobody notices the gap between your technical data and your public communications?

That’s never wise, in my judgment. But I have to admit it is – unfortunately – sometimes successful, at least in the short term. The client that inspired this column has yet to change its messaging. And so far, few people have noticed and even fewer have objected.

By failing to update its messaging to match its evolving information, it is taking at least three risks:

  • The risk that people will be misled by its communications into taking, or continuing to take, precautionary actions that are less than optimal – that is, the risk that the organization’s failure to change its messaging could end up harming or killing some people.
  • The risk that people will find out about the organization’s ongoing miscommunications, and their trust in the organization will therefore decline – not mostly because it guessed wrong originally but because it didn’t own up to the error in its public communications.
  • The risk that even without finding out, people will “smell a rat,” and their trust in the organization will decline anyhow.

I hope the first risk doesn’t materialize. But I hope the second or the third does. It’s not good for untrustworthy organizations to be trusted. And over the long term, I fervently believe, people usually do find out or smell a rat; over the long term, untrustworthy organizations lose our trust.

Copyright © 2010 by Peter M. Sandman
