The Mad Cow Crisis: Health and the Public Good
edited by Scott C. Ratzan
New York University Press, 1998, 247 pp.
U.S.$17.50 (pbk.); ISBN 0–8147–8511–X.
U.S.$55.00 (hbk.); ISBN 0–8147–5101.
IT MUST BE HARD to produce a book on the policy and communication implications of a potential health crisis before it is clear whether or not the crisis was real. A few of Scott C. Ratzan’s contributors assume or assert that the UK’s mad cow crisis of 1996 was a genuine threat to health; most of them assume or assert that it was not. In an “Afterword” dated October 1997, Ratzan himself leaves no doubt that he is in the latter camp.
For those who missed or do not recall this crisis, the key scientific question is whether the agent responsible for bovine spongiform encephalopathy (BSE) in cows can be passed to humans, presumably through ingestion, and cause a new variant of Creutzfeldt–Jakob disease (vCJD), a previously (and still so far) rare disease much like BSE. The key medical question is how widely the agent had already been transmitted before the 1996 crisis, and whether hundreds, thousands, or even millions of mostly British victims are fated to emerge in the years to come as vCJD’s (unknown) latency runs its course. And the key policy question is whether the (quite different) precautions taken so far in the UK, the EU, the USA, and elsewhere are sufficient to ensure that no or virtually no additional transmissions are even now occurring.
Ratzan and most of his contributors are confident that the answers to at least the second and third of these questions are reassuring. Their key question is how society could so overreact to a speculative medical contention. With the hindsight of an additional 15 months (I write in January 1999), I can report that there has still been no upsurge in reported cases of vCJD in the UK. On the other hand, a brief Internet search reveals plenty of credentialed experts still unconvinced that the danger is over.
Mad cow disease is obviously a wonderful case study of public communication and public policy about a new health controversy – one with huge economic and political repercussions (the crisis devastated the British beef industry and destabilized the UK’s relationship with the EU). In a few years we will know whether or not Ratzan is right that the lessons to be drawn are about how to minimize media sensationalism, public hysteria, and political exploitation.
Or, actually, we won’t. Even if mad cow disease turns out to be a non-problem for humans, that in itself will not demonstrate that this was knowable in 1997; the concern may have been wrong without having been wrongheaded. As contributor Paul Anand notes in one of the few chapters that assumes no answers to the key questions, “uncertainty is almost a defining characteristic of the beginning of any public health scare” (p. 51). Or, we might add, any environmental scare. Consider swine flu, silicone breast implants, power-line electromagnetic fields, global warming, and AIDS. There is a large class of risks where the data are initially sparse, the consequences of Type I and Type II errors are radically different, and the need for action is either urgent or non-existent (you can’t tell which yet). Anand offers some helpful suggestions for communication and policy making under these difficult circumstances.
There is much enlightenment in the rest of the book as well, once you correct for not having the answers the authors think they have. Most of the book consists of overviews of medical and scientific findings, policy developments, and media coverage. I found the media content analyses less convincing than the technical and policy chapters, but this is doubtless because I am a media specialist much more than a technical or policy specialist. Methodological quibbles aside, the media chapters, like the others, seem to be solid descriptions of what happened. (While the UK media are accused of hyping the story, the US media played it largely for humor; the “mad cows and Englishmen” leitmotif got more attention than the possibility of a similar risk to American cattle and American consumers.) It is more the authors’ recommendations – the last few chapters, and the last few paragraphs of most chapters – that require a correction factor if you are not convinced that the crisis was unjustified. To Ratzan’s credit, the ammunition is here to support other views.
The chapters that I found most useful were written by Ratzan’s contrarians, not because I think they are necessarily right, but because I think a possible problem has more in common with a real problem than with a non-problem. What to do when you don’t know is not a scientific question, after all: it is trans-scientific, a question of values. Individual precautions should be up to the individual (such as not eating beef, as Oprah Winfrey and millions of others at least temporarily decided); societal precautions, in a democracy, should be up to the political process. This suggests that there should be the same kind of openness and candor about a possible problem as is called for about a problem known to be serious.
Interestingly, those who work on non-problems have generally concluded that these, too, call for openness and candor. As I read this book, I kept thinking about the difference between the fields of health communication and risk communication. Ratzan and most other health communicators spend most of their time trying to figure out how to warn people about serious hazards. I and most risk communicators spend most of our time trying to figure out how to reassure people about modest hazards. The two fields not only rely on different theories, literatures, and journals; they also assume a different default. Though Ratzan is an MD and knows enormously more than I do about BSE and CJD, if he is right that their link is not a serious problem, then he is writing in my field.
What are some of the risk communication lessons of the mad cow crisis?
- Do not over-reassure. In the seesaw of risk communication, if the source sounds insufficiently concerned, the audience will reliably become excessively concerned.
- Do not mock your public. Frightened people need compassion and understanding, not contempt.
- Release the bad news immediately. The crises that are most likely to get out of hand are those that seem to keep getting worse.
- Involve your opponents. Critics and mavericks are least responsible and most credible when they have been excluded.
- Recognize that some concerns are inevitable even if they are not technically accurate. Avoidance of spoiled meat may well be hard-wired; avoidance of any agent that can rot your brain sounds pretty normal.
- Acknowledge that the science is uncertain, and that uncertainty is alarming. Avoiding an agent that merely might rot your brain is also normal.
- Acknowledge misbehavior and apologize. Being technically right is no excuse for being humanly wrong, administratively incompetent, or self-protectively secretive.
These sorts of lessons are implicit and occasionally even explicit in The Mad Cow Crisis, but they are not its focus.
In short, even if I assume with Ratzan that the mad cow crisis was and is a minor health threat, I must still dissent from his view that the main defect in how the crisis was handled was the failure to present the scientific case for calm aggressively enough. What risk communicators have learned over the past two decades is that aggressive scientific reassurance is not reassuring. Continuing to urge such a strategy, therefore, is not scientific.
Copyright © 2000 by Peter M. Sandman