Should the government allow scientists to create new super-viruses?

Let's suppose a bunch of scientists proposed to take one of the most infectious human viruses—influenza, say—and turn it into a super-bug. Is this a good idea?

Or to put it another way: should scientists be artificially mutating viruses so that they have the potential to become a worldwide pandemic?

Right about now you might be asking: is anyone actually doing this, and if so, what on earth are they thinking?

And yet, several of the world's most prominent influenza researchers have been engaged in exactly this enterprise for several years now. They call their work "gain of function" experiments, because they manipulate viruses to give them new (and very dangerous) functions.

I wrote about this last year, after a group led by Ron Fouchier at Erasmus Medical Center in the Netherlands and Yoshihiro Kawaoka of the University of Wisconsin announced, in a letter to Nature, that they were going to create a new strain of H7N9 influenza virus that had the potential to turn into a human pandemic. Sure enough, just a few months later, Fouchier published results showing they had done just that, although they reported that their newly engineered strain had only "limited" transmissibility between ferrets (the animal they used for all their experiments).

Fouchier and Kawaoka had already done the same thing with the deadly H5N1 "bird flu" virus, causing a huge outcry among scientists and the public. As reported in Science magazine almost three years ago, Fouchier admitted that his artificially mutated H5N1 was "probably one of the most dangerous viruses you can make."

And yet he did it anyway—and then did it again, with H7N9.

Many other scientists were and are extremely concerned about these experiments, which some of us consider dangerous and irresponsible. This past July, a large group of scientists known as the Cambridge Working Group (of which I am a member) released a statement calling for a hiatus, saying:
"Experiments involving the creation of potential pandemic pathogens should be curtailed until there has been a quantitative, objective and credible assessment of the risks, potential benefits, and opportunities for risk mitigation, as well as comparison against safer experimental approaches."
Just two days ago, the U.S. government responded, announcing that it was going to take a serious look at whether creating these superbugs is a good idea. The Office of Science and Technology Policy (OSTP) is creating two committees to "assess the potential risks and benefits" of these experiments, particularly those involving the influenza, SARS, and MERS viruses.

Until the committees come up with recommendations, the government is halting any new funding for these experiments and asking for a voluntary "pause" on existing work.

Not surprisingly, Fouchier and his colleagues have argued that their work has benefits; that it has "contributed to our understanding of host adaptation by influenza viruses, the development of vaccines and therapeutics, and improved surveillance." Yet these arguments are tenuous at best. Fouchier and company have failed to show that the mutations they found in ferret experiments are likely to occur in the natural course of human outbreaks, which means that using their viruses for vaccine development would be a huge mistake.

And to claim that creating super-viruses in the lab will lead to "improved surveillance" is, frankly, laughable. Surveillance means getting out in the field and collecting samples from sick people. Gain-of-function laboratory experiments have basically nothing to do with surveillance.

Harvard's Marc Lipsitch has been one of the prominent voices arguing against this line of research, writing just last week that the scientific benefits of these experiments are very limited, for reasons detailed in his article. Lipsitch is also one of the founding members of the Cambridge Working Group.

According to the White House announcement, the first committee to evaluate the merit of these experiments will meet in just a few days, on 22 October. Meetings will continue throughout the winter, with recommendations expected sometime in the spring of 2015.

We have enough problems with influenza, and now with Ebola too, without scientists creating incredibly deadly new viruses that might accidentally escape their labs. Let's hope that the OSTP does the right thing and shuts down these experiments permanently.

3 comments:

  1. What I don't understand in this whole debate is why almost everyone
    talks about the scientists doing these experiments and almost
    no one about possible evildoers doing or repeating them,
    using the available science that we have developed during the
    last decades. We have developed, and are still working on,
    methods to understand and genetically manipulate organisms,
    including potentially dangerous pathogens, and we are close to
    being able to easily create pandemics of any desired virulence.
    And it's not just Kawaoka and Fouchier.
    And it's not just influenza.

    I mean, why are you so concerned that they are doing this and not
    concerned that they are publishing it, thus helping others
    to create pandemic viruses?

    Maybe you can write an article about that risk too, which I consider
    much bigger, although it's rarely talked about.
    The activities that you criticize must be seen in this context:
    we need a scientific advantage over the possible evildoers.
    Strangely, even the GOF people rarely mention this argument.

    BTW, as you show a big picture of the 1918 pandemic in the
    Forbes article, where were you when the 1918 sequences were published?
    Was that "good"?


    gsgs

  2. Who said I'm not concerned about their publishing this? I'm very concerned and have said so on multiple occasions. I wouldn't support them doing the work in the first place, or publishing it in the second.

  3. Currently the discussion is all about preventing them from doing it (FVR etc.),
    because it could escape their labs despite the high security.

    But not doing it in the first place leaves us unprepared when others do it.
    I mean, you can't stop it anyway - so better be prepared.
    As science proceeds, more and more labs will be able to do it ever cheaper
    and with ever fewer safety measures...
    People will understand it better and design new viruses with computer help,
    and reverse genetics lets them create them.
    For influenza you can just automate and analyze lots of reassortment experiments.
