Have you ever wondered why the public and the media seem to have it “in” for Hanford? Why is it that perfectly good science and technical planning get torn apart at public meetings?
Peter Sandman has some answers. A consultant who specializes in risk communication in the environmental field, Sandman is conducting an ongoing series of day-long workshops for Hanford contractors and regulators. Many of his workshop participants have been impressed by his knowledge and insights, delivered in a provocative, pull-no-punches style.
Sandman is on the faculty of Rutgers University and runs a private speaking and consulting business from an office in Newton Centre, Mass. Peter Bengtson of Westinghouse Hanford Company External Communications spoke with Sandman about the role of risk communication and public involvement at Hanford. What follows are excerpts from that interview.
Reach: What do you mean by risk communication?
Sandman: Risk communication is the effort to reduce the gap between risk as the experts see it and risk as the public sees it.
Sometimes the experts think a hazard is really serious, but people are inclined to be apathetic. Then the job of risk communication is to get their attention, shake them by their collective lapels and persuade them to take the risk seriously. This could be anything from trying to get people to put smoke alarms into their homes to persuading them to wear seatbelts.
The other side of risk communication is remedying the gap when experts think the hazard is modest and the public is overreacting. Then the goal is to calm people down and persuade them the risk is smaller than they think.
Good risk communication, though, is not so much trying to educate the public or convince them they’re wrong. It’s more trying to figure out what makes them wrong. At Hanford, for example, what is leading people to overreact?
People who think risk communication is mostly correcting misperceptions by telling people “the truth” haven’t got it right. It’s more important to figure out why people are misperceiving your technical truth – and that means figuring out what non-technical truth they have right and you are missing.
The issue may be trust. It may be a relationship issue. It may be control, or accountability, or courtesy. But it’s not random. It’s not just that they don’t understand. Something is making them not want to understand. Figuring out what that is and what to do about it is really the essence of risk communication.
Reach: Why do you think it’s important for people here to understand risk communication?
Sandman: One of the core problems at Hanford is a lot of what I call “outrage” running through the system – several kinds of outrage. There is outrage from some stakeholder publics at the Hanford enterprise itself. People feel abused or misled. People feel the site should never have been started, that it certainly should have been managed more carefully, and that the cleanup is irresponsible, unresponsive or inadequate in some way.
There are also lots of internal publics that experience outrage. There are employees, for example, who feel they are not being permitted to do good work, that they’re subject to endless second-guessing and excessive regulations. They feel they could solve the problems they’ve been asked to solve a lot more efficiently and cheaply if everyone would get off their backs. Of course, everyone won’t because the external publics are outraged as well. And stakeholder outrage translates into political and regulatory pressure.
The dominant characteristic of the environment at Hanford is that everyone is angry – everybody is unhappy with everybody else. That’s something of an exaggeration, but it has a lot of truth to it.
In that kind of environment, the effort to do sound technical work is doomed unless people are sensitive to and responsive to all the outrage that surrounds them. That’s precisely what risk communication helps you do.
If you try to do good technical work while ignoring how people feel, ignoring issues of trust, issues of process, or issues of control, you wind up doing terrible technical work. Having ignored everything except technical considerations, all the things you’re ignoring sneak around behind you and interfere with your plan.
It’s easy to find examples of sound technical judgment going no place at the cost of hundreds of millions of dollars because nobody was exercising any non-technical judgment. That is, nobody was looking at what I call the outrage.
If you pay attention to the outrage, if you pay attention to the non-technical issues as risk communication teaches you to do, you have a much better chance of coming up with reasonable compromises between the technical and non-technical concerns. In other words, you wind up doing paradoxically better technical work if you think like a risk communicator than if you think like an engineer.
Reach: The communication you’re describing could then eliminate, or at least reduce, the chances of a situation like the controversy that developed over plans to use grout as a low-level waste form.
Sandman: Grout is a very good example of technical people pursuing a technical agenda, getting hints that various publics were dissatisfied, and paying very little attention to those hints because, after all, the plan for grout was technically sound. And so the agenda eventually wound up dead in the water.
I suspect that much earlier efforts to integrate technical and non-technical concerns could have prevented that problem and could prevent future problems like it.
Reach: Public involvement has become increasingly important to decision-making at Hanford. Some people are catching on to this, yet others are not. What are the biggest stumbling blocks you’ve seen to public involvement at Hanford?
Sandman: The biggest stumbling block is the custom of doing public involvement after decisions have been made.
The tendency at Hanford is to first get our ducks in a row technically, so we know what we’re going to do. Then we say to the public, “We’d like your comments.”
By then, it’s a no-win situation. The public feels its comments are not really welcome, or else they would have been solicited much earlier, so people tend to respond with destructive comments. And even a constructive comment is destructive when it comes too late in the process.
We have the situation again and again at Hanford where reasonable plans are unlikely to reach completion because the public isn’t drawn into the decision-making process early enough. That problem’s on its way to being solved, but I think it’s still the number-one problem – especially with projects in the pipeline from the days before public involvement.
The number-two problem is that technical people don’t like compromising their technical concerns with other people’s non-technical concerns. They’re perfectly good at compromising, but they’re not good at compromising technical with non-technical considerations. They’re not accustomed to it, and there’s a tendency to think that the purpose of public involvement is primarily to tell the public what they propose to do.
I think engineers with integrity are always open to good suggestions, even if they come from the outside. But often technical types tend to be less open to good non-engineering suggestions, which they see as irrelevant.
A third major problem is that the people who are being asked to be involved, the various stakeholders, are not necessarily any more skilled at giving input than the Hanford folks are at taking input.
Reach: What should be the role of senior management in helping Hanford better understand risk communication and public involvement?
Sandman: So much of what I’m suggesting can only happen if it happens at a high level. In an organization like Hanford, where morale is a problem, internal trust is not that high and the marching orders keep changing, there’s a tendency to be very skeptical about anything new. I think middle managers are looking for signals as to whether public involvement and risk communication are innovations that top management really wants them to invest in.
This is particularly important, because the short-term effects of risk communication are an increase in tension. The most immediately observable result is that people have more opportunity to share their grievances, and they take that opportunity.
Initially, it slows things down, and you can’t afford to do that unless your management (and your customer!) wants you to do it and understands that the payoff takes a while. In the long term, risk communication smooths things over and ultimately speeds things up – but not right away.
Since the essence of risk communication has to do with openness, cooperation and tolerance of conflict, you can’t do better risk communication externally than you’re permitted to do internally. If there’s not much openness internally – if Hanford’s top management is not very tolerant of conflict inside the organization – there’s no way you’ll be able to do it outside.
One other thing that ought to be said: Top management sometimes undertakes efforts that preempt and interfere with the risk communication that might otherwise develop organically at a lower level. I’m thinking particularly about the Hanford Advisory Board, which is essentially an advisory board to top management.
That’s a good thing. But you can create a situation – and I’ve seen it happen elsewhere – where instead of being a model and a channel of increased public involvement, your advisory board becomes a filter or a barrier.
Here are some of the danger signals: the Hanford Advisory Board starts saying that no one is to do public involvement unless the board has approved it first, or top management starts saying nobody is to communicate with the Hanford Advisory Board unless management approves it first.
Copyright © 1994 by The Hanford Reach