Posted: March 19, 2012
Article Summary: The H5N1 (“bird flu”) virus is incredibly deadly to humans, but almost never transmits from human to human – at least until late 2011, when two teams of scientists bioengineered H5N1 to make it transmissible in mammals. Now a battle rages over whether the two papers detailing this work should be published, whether the work itself should continue, and whether the concerns of the general public should be considered in making these decisions. When I was quoted in Nature urging proponents to dialogue with critics rather than merely trying to “educate” them, Genetic Engineering & Biotechnology News asked me to write a brief opinion piece expanding on my view. I submitted a short version, plus a somewhat longer version with a little more background on the controversy. They decided to publish the short one; this is the longer one.

Talking to the Public about
H5N1 Biotech Research

(submitted to Genetic Engineering & Biotechnology News, March 18, 2012)

A shorter version of this article, scheduled for publication in the April 15, 2012 issue of Genetic Engineering & Biotechnology News, is also online as a PDF.

See also my two earlier commentaries on this controversy:


Two research papers about a bioengineered influenza virus, already accepted by the prestigious journals Nature and Science, have been sitting in limbo since December 2011, when a little-known U.S. government advisory committee recommended that their methodology sections should be gutted before publication.

The result has been a research moratorium reminiscent of the biotechnology research moratorium that preceded the Asilomar conference of 1975, as well as an intense intramural battle over whether the research should continue and in what form, if any, the two papers should be published.

One issue in the battle: what role the public should play in these decisions.

Necessary Background

It’s all about the flu strain H5N1, better known to nonscientists as “bird flu.” Since it first emerged in Hong Kong in 1997, H5N1 has proved extremely deadly not just to birds but to humans as well. It has killed a terrifying 59% of the people known to have caught it.

The only thing preventing an unprecedentedly catastrophic H5N1 pandemic is the fact that this strain isn’t adapted to human hosts. Only about 600 people are known to have caught it, nearly all of them from birds, not from each other.

But last year two research teams – one run by Dr. Ron Fouchier at the Erasmus Medical Center in Rotterdam, the other by Dr. Yoshihiro Kawaoka at the University of Wisconsin – reported that they might have cracked the human transmissibility barrier. Neither team did their research on humans, of course; both used ferrets, widely considered the best animal model for human flu transmission. Kawaoka’s team bioengineered a reassortment of H5N1 with a less deadly flu strain that circulates routinely in humans, and created a new hybrid strain that could infect one ferret when another coughed in a nearby cage. Fouchier’s team bioengineered an H5N1 strain that possessed several mutations considered favorable to mammal-to-mammal transmission; then it passed the new strain through ten generations of ferrets via nasal swab, until it became an H5N1 that could be transmitted via aerosol (cough) from ferret to ferret.

It isn’t known how transmissible – or how virulent – these bioengineered flu strains would be in humans.

Arguably the two papers may point the way to further research that could help assess the risk of a catastrophic pandemic and help guide H5N1 surveillance efforts. At the same time, publishing the two papers and extending the research could increase the risk of a catastrophic H5N1 pandemic that isn’t natural, one resulting from laboratory accident or human malevolence.

On both sides the stakes are almost inconceivably high. A pandemic that sickened 30% of the world population (typical of flu pandemics in the first year) and killed 59% of them (H5N1’s record so far) would have a death toll of more than a billion – that’s billion with a b. Anyone focusing on this battle in terms of its significance as a precedent is missing the point. It’s a one-off.
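
To make the arithmetic explicit (taking the world population of roughly 7 billion in 2012 as the baseline): a 30% attack rate means about 2.1 billion people infected, and a 59% case fatality rate among them means roughly 1.2 billion deaths.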

The two papers were submitted to the U.S. government’s National Science Advisory Board for Biosecurity (NSABB) mainly because of bioterrorism fears. The NSABB recommended that the papers be redacted before publication, the National Institutes of Health endorsed the recommendation, the two journals reluctantly assented, and the outraged debate began.

Reassuring the Public

Fouchier and others responded to the furor by organizing a research moratorium, aimed at calming the waters and buying time to make the case for unfettered research and publication. Soon after, the World Health Organization convened a meeting in Geneva, mostly of influenza researchers, which predictably concluded that research and publication should be unfettered. But the group acknowledged that a time-out was needed. In a press briefing after the meeting, WHO’s Keiji Fukuda said one key reason for the pause was to allow time to reassure the public.

The “public” following these events is tiny. Despite editorials in The New York Times and a satirical monologue by Jon Stewart – and continuing poultry outbreaks and human cases – most people aren’t worried about an H5N1 lab accident or terrorist attack. They’re even less worried about an H5N1 natural pandemic. Convincing people that the H5N1 natural pandemic risk is alarming is a tougher and more important task than convincing them that the H5N1 terrorism risk is less so.

Like the run-up to Asilomar nearly 40 years ago, the H5N1 research controversy is much more a battle within the scientific community than a battle between the scientific community and the public.

But let’s take WHO’s Fukuda at his word and assume the job is to reassure the public that it’s okay to publish the Fouchier and Kawaoka papers and okay to resume H5N1 bioengineering research. What are proponents of this position doing wrong in their effort to reassure the public? I’ll focus on just two (of many) issues: education and contempt.

Education Won’t Do the Job

I worry that advocates of unfettered H5N1 research and publication want to “educate” the public out of its concerns. That almost never works. In the risk communication literature and the planning literature, this strategy is called “decide – announce – defend”: Figure out what to do; then tell the world that’s what you’re going to do; then rebut any and all objections with a mix of technical data and dismissive rhetoric. This is a thoroughly discredited approach.

Decide – announce – defend is especially unlikely to work when serious risks are involved. “How safe is safe enough” is a values question for society, not a science question for experts who have a horse in the race.

The dangers of concocting a potentially deadly pandemic virus in the lab are obvious. The benefits of doing so are less obvious. (Phrases like “mad scientist” come easily to mind.) So the burden of proof is on those who wish to assert that this is a sensible thing to do. Before making their case, they must first “own” the burden of proof, listen respectfully to people’s concerns, and join in a collaborative search for a potential compromise. Arrogant and self-serving rants about “censorship” won’t help.

H5N1 bioengineering researchers are essentially supplicants, asking everyone else for permission to carry out work with huge (but unquantifiable) potential risks and huge (but unquantifiable) potential benefits. I doubt that’s how they will address public concerns – as supplicants – but it’s how they should.

Some of my corporate clients use the term “social license to operate” to capture their hard-won realization that they can’t do what they want to do if the public doesn’t want them to (and that that’s how it should be). Science, too, needs a social license to operate. The first step in securing your social license is acknowledging that you need it: supplicant, not educator.

Contempt Makes It Worse

Experts understandably have a hard time being respectful of interfering laypeople. But scientists’ visible contempt for the public’s concerns actually increases the risk of such interference.

Everyone (including me) agrees that it was a good move when H5N1 researchers declared a moratorium. But even the Nature letter announcing the moratorium dripped with disdain.

Consider this over-reassuring sentence:

Responsible research on influenza virus transmission using different animal models is conducted by multiple laboratories in the world using the highest international standards of biosafety and biosecurity practices that effectively prevent the release of transmissible viruses from the laboratory.

Nothing can go wrong … go wrong … go wrong…. I don’t have space to document all the lab accidents that have released transmissible viruses. A 1977 lab accident is thought to have released the human H1N1 flu virus, which had not circulated since 1957; it spread globally for the next 32 years. Nor can I review the research establishment’s internal debate over whether BSL3+ labs like Fouchier’s and Kawaoka’s are sufficient for H5N1 research. As for the risk of an intentional release – and the systematic underestimation of that risk inside the flu world – see my article on “A Blind Spot for Bad Guys.”

Here’s a worse example from the same Nature letter (whose senior author was Fouchier):

Despite the positive public-health benefits these studies sought to provide, a perceived fear that the ferret-transmissible H5 HA viruses may escape from the laboratories has generated intense public debate in the media on the benefits and potential harm of this type of research.

Note the extraordinary lack of parallelism. We usually contrast benefits with risks – or if you prefer, potential benefits with potential risks, or even perceived benefits with perceived risks. These are all parallel formulations. But Fouchier et al. do not contrast their confidently asserted “positive public-health benefits” with risks … or with concerns about those risks … or even with fears about those risks … but with something much more ephemeral: a mere “perceived fear.”

As used by these scientists, public “perceptions” are misperceptions, and public “fears” are unjustified fears. If the insult here escapes you, think of a risk you take seriously and imagine someone labeling it a perceived fear.

Paradoxically, this contempt for public concerns might actually provoke stricter regulation of science. If scientists are nasty and myopic enough when claiming that only scientists’ opinions matter regarding what they do and what they publish, society might rebel against such unbridled scientific autonomy. It’s unlikely. Most people have a strong conviction that governments don’t know how to regulate scientists and we’re better off leaving them alone. That autonomy has nurtured a lot of scientific arrogance, but the arrogance hasn’t yet undermined the autonomy, and odds are it won’t this time either.

But if there’s a threat to scientific autonomy, it’s not coming from the NSABB recommendations. It is coming from the arrogant, scientifically dishonest, risk-insensitive way some scientists are responding to the NSABB recommendations.

What I’d Say

The H5N1 debate isn’t a monologue. Especially for the side that wants to publish the two papers and carry on, listening is more important than talking. Validating the other side’s concerns is more important than talking. Implementing some of the other side’s recommendations for additional biosafety and biosecurity measures (and giving them credit for the improvements) is more important than talking.

But when the time comes for talking, here’s what I’d say:

This is uniquely dangerous research, so much so that it has stimulated an extremely unusual push to regulate scientific research and publication. If we’re going to do such research at all, we need to prove that we’re taking safety and security seriously, and we need to implement more precautions, and we need information about those precautions (and all infractions) to be publicly available. Moreover, we need to prove that the research is important enough to justify taking the sizable risks.

This isn’t about research autonomy generally. It’s about whether it makes sense to create a possible monster in our labs in order to do research that might (or might not) have huge payoffs in preventing or fighting the natural monster that could emerge at any time. The research we’re proposing to do is only part of a coherent agenda to address the risk of a potentially catastrophic H5N1 pandemic, an agenda that includes the following other priorities….

Copyright © 2012 by Peter M. Sandman
