Gain-of-function research needs to be banned, but we need to define it properly


I’ve been writing about dangerous gain-of-function research on viruses for years, originally on the flu virus and more recently on the Covid-19 virus. Many people are deeply concerned about this research, which might have caused the Covid-19 pandemic, and yet there are still no real regulations controlling it, either in the U.S. or anywhere else.

I can already hear the objections: oh, but what about the new rules that NIH put in place in 2017, after a 3-year “pause” in some gain-of-function (GoF) research? Those rules were utterly ineffective, but I’ll get to that in a minute.

Despite my arguments, and the concerns of many other scientists, which have been expressed in various forums and articles for at least a decade now, the virology community continues to insist that any limits on GoF are unnecessary, and that GoF is wonderfully beneficial. A group of 156 virologists even wrote an opinion piece, published in the Journal of Virology, making this very point.

I’ve tried to convince some of my colleagues in the infectious disease world that GoF should be banned, and I’ve discovered that many of them–even some non-virologists–are opposed to any government regulation of GoF research.

They are wrong. However, they do raise one important concern that I think is valid, and that I will address in this column. Their concern is that any government regulation will be ham-handed, and will likely end up limiting or preventing a range of very useful experiments that have the potential to lead to beneficial new drugs and vaccines.

I get it. When the government tries to regulate science, it can write rules that are far too broad, or that get misinterpreted even if well-written, and unintended consequences follow. So let’s not do that: below I’ll explain what I think needs to be banned.

But let’s not forget why we are having this debate right now: there is a very real possibility that the Covid-19 pandemic started in a lab that was doing GoF research on coronaviruses. We know that the Wuhan Institute of Virology (WIV) was doing this kind of research–that fact is not under dispute. We don’t know (and we may never know) if the original Covid-19 virus first appeared as a result of a lab leak, but it might have. That’s why we’re asking whether such research is worth the risk.

Before I explain what I think the rules should be, let’s look at the current NIH rules, which I mentioned above. First, though, let’s remember that NIH rules only apply to research funded by NIH. Research that is funded privately, or by any other part of the government, is unaffected by these rules and remains entirely unregulated.

So: back in 2017, when the NIH lifted the 3-year funding pause, they put in place some rules (detailed here) for work on “potential pandemic pathogens,” or PPPs. (The government loves acronyms.)

The pause itself was prompted by work on avian influenza, led by virologists Ron Fouchier and Yoshihiro Kawaoka, that was designed to turn some deadly bird flu viruses into human flu viruses. The work was successful: the researchers did indeed create viruses that had the potential to infect humans. These results were really alarming to many scientists: I wrote about it at the time, and other scientists also raised the alarm. Those concerns are what led to the funding pause.

Since 2017, then, the NIH has regulated (but not banned) research on PPPs that are both:

  1. “likely highly transmissible and likely capable of wide and uncontrollable spread in human populations, and
  2. likely highly virulent and likely to cause significant morbidity and/or mortality in humans.”

One of the first things to notice about this definition is that the avian influenza work–the very research that prompted the new rules–isn’t really covered.

Another thing to notice is that work on coronaviruses in bats–the GoF work that was apparently going on in the Wuhan Institute of Virology, and that may have caused the Covid-19 pandemic–wouldn’t have been covered either. Those bat viruses would not have been considered “likely highly transmissible” in humans, at least not before the pandemic.

Of course, we all know differently now.

In any case, the rules that NIH introduced in 2017 only applied to a very narrow class of work, and as far as I can tell, they didn’t restrict anything. On the contrary: the NIH resumed funding the GoF work on avian influenza by Fouchier and Kawaoka soon after lifting the funding pause. And let’s not forget that NIH rules aren’t a ban: it remains perfectly legal to do any kind of GoF work.

So how can we put in place intelligent restrictions that will prevent dangerous GoF research in the future?

First, rather than rejecting any restrictions whatsoever, as some virologists have done, scientists should work with the government to craft a thoughtful set of limitations. For starters:

  1. Research that creates new strains of the Covid virus (SARS-CoV-2) that might have greater virulence or transmissibility should be entirely banned.
  2. Research that takes non-human viruses, including avian flu viruses and bat coronaviruses, and gives them the ability to infect any new animal species, should be banned.

To scientists who can’t agree even to these restrictions, I would say: it appears you oppose any restrictions whatsoever. If that’s your position, then the government might step in and impose far broader bans, which are not likely to be good. If you’ll agree to these two restrictions, perhaps we can broaden them slightly to cover other types of highly risky GoF work.

Finally, let me return to a point I’ve made before, but that bears repeating: the supposed benefits of GoF research are essentially zero. The claims that GoF research making a virus more deadly will help us “understand pathogenicity” or “be prepared for the next pandemic” are just hand-waving. I wrote a whole column just last month explaining why these claims are fundamentally wrong, so I won’t repeat that here.

If we do ban some GoF research, with carefully crafted rules, we won’t lose anything. Instead, we’ll gain at least two things: first, virologists can apply their expertise to truly beneficial virology work, and second, the scientific community will regain some of the trust it has lost during the pandemic. That would seem like a good thing.
