5-14-20 The original SARS virus has not reemerged from the wild since 2003, but it has escaped from three different labs: one in Taiwan, one in Singapore, and one at China’s National Institute of Virology in Beijing, where two researchers were infected. The researchers mistakenly believed they were handling a version of the virus that had been inactivated. One researcher at the NIV passed the infection to her mother, who eventually died, and to a nurse, who passed the disease to five other people.
As dangerous as it is to culture deadly natural pathogens, the most troubling research involves engineering pathogens to be even deadlier. Concerns over this so-called “gain-of-function” research flared up in 2011, when two different teams showed how an extremely deadly strain of avian influenza, which kills approximately 60 percent of its victims but is not easily transmissible between humans, could be mutated to become readily transmissible through the air.
The scientists argued that such experiments allow us to learn how viruses might evolve to be more infectious or lethal, and many others agreed. Gain-of-function studies help “to inform the influenza vaccine strategy for pandemic preparedness, from selection of candidate vaccine viruses and development of high-yield seeds to manufacture of safe vaccines for the global community,” 23 scientists wrote in a guest editorial in mBio, the journal of the American Society for Microbiology.
But others believed the risks dwarfed the benefits. The biosecurity expert Lynn Klotz, together with science journalist Edward J. Sylvester, surveyed the CDC’s lab accident data and conservatively estimated the chance of a pandemic pathogen escaping a lab at just 0.3 percent per year, meaning there would be an 80 percent chance of an escape from a single lab over 536 years of work. Perhaps that would be acceptable, but they quickly counted 42 labs known to be working with live SARS, influenza, or smallpox, which translated to an 80 percent chance of an escape every 12.8 years. And that was in 2012, when such work was far less commonplace than it is now. The two later estimated the likelihood of an escaped virus seeding “the very pandemic the researchers claim they are trying to prevent…as high as 27%, a risk too dangerous to live with.” They wrote, “There is a substantial probability that a pandemic with over 100-million fatalities could be seeded from an undetected lab-acquired infection (LAI), if a single infected lab worker spreads infection as he moves about in the community.”...
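Klotz and Sylvester’s figures are consistent with a simple compounding-probability model, in which each lab-year is an independent trial with a fixed 0.3 percent chance of an escape (a sketch of the arithmetic under that assumption; their actual methodology may have differed):

```python
import math

# Assumed per-lab, per-year escape probability from Klotz and Sylvester's estimate.
p_escape_per_year = 0.003  # 0.3%

def years_to_reach(prob_target, p_annual):
    """Years of operation until the cumulative escape probability reaches prob_target,
    treating each year as an independent trial."""
    return math.log(1 - prob_target) / math.log(1 - p_annual)

# A single lab: roughly 536 years of work to reach an 80% cumulative chance of escape.
single_lab_years = years_to_reach(0.80, p_escape_per_year)  # ~536

# With 42 labs operating in parallel, the same cumulative risk accrues
# 42 times faster: roughly 12.8 calendar years.
parallel_years = single_lab_years / 42  # ~12.8
```

The key point the model makes vivid is that small annual risks compound: the danger is driven less by any one lab than by the total number of lab-years accumulating across the field.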
In 2015, at the University of North Carolina, bioengineers working with researchers from the Wuhan Institute of Virology added a new spike protein to a wild coronavirus, giving it the ability to infect human cells—eerily foreshadowing COVID-19. The argument for the work was that it would help us learn how to treat a novel SARS-like coronavirus, but many watchdogs objected, including Richard Ebright. “The only impact of this work is the creation, in a lab, of a new non-natural risk,” he told Nature at the time.
Writing in the Bulletin of the Atomic Scientists in 2014, the bioweapon historian Martin Furmanski argued strongly that our safety precautions have not been commensurate with the risk. “It is hardly reassuring that, despite stepwise technical improvements in containment facilities and increased policy demands for rigorous biosecurity procedures in the handling of dangerous pathogens, potentially high consequence breaches of biocontainment occur nearly daily: in 2010, 244 unintended releases of bioweapon candidate ‘select agents’ were reported. Looking at the problem pragmatically, the question is not if such escapes will result in a major civilian outbreak, but rather what the pathogen will be and how such an escape may be contained, if indeed it can be contained at all.”
“GOF research is important in helping us identify, understand, and develop strategies and effective countermeasures against rapidly evolving pathogens that pose a threat to public health,” announced Francis Collins, the NIH director. Some scientists strenuously objected, such as Johns Hopkins’ Steven Salzberg, who wrote, “I can’t allow this to go unchallenged. This research is so potentially harmful, and offers such little benefit to society, that I fear that NIH is endangering the trust that Congress places in it.”
Megan Palmer, a biotechnology and security expert at Stanford University, told me she is also deeply concerned by some of the research done in high-security biolabs, but that evaluating the dangers can be difficult. “The problem is that in most cases, we don’t actually know how risky or beneficial the research will be.” To get a better handle on the science, she says, “We need much more sophisticated systems for understanding and managing risk. We should be collecting incidents and analyzing them and then sharing that information and trying to draw lessons for improvement.”
...Palmer says of Trump’s biosecurity research plans. “We say these things are important, and then we don’t follow through.” https://webcache.googleusercontent.com/search?q=cache:n_yX1457W8wJ:https://www.motherjones.com/politics/2020/05/the-non-paranoid-persons-guide-to-viruses-escaping-from-labs/+&cd=1&hl=en&ct=clnk&gl=us&client=safari