In America’s biolabs, hundreds of accidents have gone undisclosed to the public.
It started with a bold idea. “Someone finally convinced me to do something really, really stupid,” virologist Ron Fouchier told Scientific American in 2011. Fouchier, of Erasmus Medical Center in Rotterdam, and another scientist, Yoshihiro Kawaoka of the University of Wisconsin–Madison, had separately tweaked the H5N1 virus — an influenza that primarily infects birds — in a way that made it spread more easily in ferrets. H5N1 is a prime pandemic candidate, and ferrets are often used as proxies for humans in flu experiments. When word got out that the two scientists were planning to publish papers detailing their experiments, making a blueprint available to the world, the outcry was extreme. The scientists were trying to better understand H5N1 in order to prevent a pandemic, but critics worried that their experiments could instead cause one — or provide would-be bioterrorists with an outbreak manufacturing guide.
The New York Times ran an editorial titled “An Engineered Doomsday.” The backlash was so severe that in 2012, Kawaoka, Fouchier, and other prominent flu scientists voluntarily agreed to pause the transmissibility work. The debacle prompted an overhaul of policies, now being reconsidered in the wake of the coronavirus pandemic, governing work with so-called gain-of-function research of concern.
The story is well known. And yet, what happened next has never been reported in its entirety.
Early on, Fouchier told Science that he had created “probably one of the most dangerous viruses you can make.” But after controversy broke out, as the science communicator Peter Sandman has written, Fouchier and his supporters shifted to downplaying the danger. In early 2013, flu scientists ended their voluntary pause, arguing that when the research was done at enhanced biosafety level 3, or BSL3+, the benefits outweighed the risks. Kawaoka, who was normally the more taciturn of the two, hosted journalists in his lab, where he explained his safety procedures. “The influenza virus is sensitive to detergent,” he reportedly said while explaining the process of showering out. “They die.” A biosafety staffer at the University of Wisconsin got up before a university audience to dispel what she called myths about lab oversight. The address was broadcast on local television.
Then, months later, Kawaoka’s lab saw two accidents involving lab-generated flu viruses, just one week apart.
The accidents, a spill and a needle prick, carried a low risk of infection. Flu viruses are typically transferred through respiratory droplets, not skin contact or injection. Nonetheless, in letters obtained by The Intercept, staff at a funding agency accused the university of shirking biosafety precautions that Kawaoka had promised to adopt. They also demanded changes to the University of Wisconsin–Madison’s protocol for accidental lab exposures. Of particular concern was a plan to quarantine all researchers exposed to modified H5N1 at home, even if they were at high risk of infection — an approach that the funding agency administrators found so alarming that they threatened to end the lab’s grant unless the university changed course.
At the center of the debacle was the National Institutes of Health, whose National Institute of Allergy and Infectious Diseases had funded both Kawaoka’s and Fouchier’s labs. (Fouchier was a sub-awardee on a grant to a U.S. institution.) The agency oversees biosafety protocols on the same research it funds, and its oversight arm has a reputation for being timid, generally resolving issues through polite dialogue. “We want to be cautious about when we use that stick,” said Jessica Tucker, acting deputy director of NIH’s Office of Science Policy, referring to the threat of termination.
Under NIH’s guidelines on research with recombinant DNA, home quarantine is acceptable for low-risk H5N1 exposures, like the two 2013 accidents, but not for high-risk ones in which a scientist has potentially inhaled the virus. The guidelines say that lab workers exposed through their respiratory tract or mucous membranes need to be isolated in a dedicated facility, like a hospital.
With pathogens like modified H5N1, quarantining an exposed lab worker in such a facility is “a prudent precaution and reduces the risk to the worker’s family and community if they do become infected,” wrote Gregory Koblentz, director of the Biodefense Graduate Program at George Mason University’s Schar School of Policy and Government, in an email.
The University of Wisconsin–Madison did not have such a plan in place, according to the documents. In a letter to NIH, a university vice chancellor wrote that after consulting health care providers and Wisconsin health officials, administrators had determined that “a home quarantine was appropriate for all exotic influenza viruses.” Rebecca Moritz, who was with the University of Wisconsin–Madison’s Office of Biological Safety at the time, told The Intercept that the outside health experts were concerned that quarantining researchers in the hospital would put medical staff at risk and unnecessarily take up an isolation bed.
NIH alleged in a letter that university officials also worried about “the stress [hospital quarantine] would place on the laboratory worker.”
“That is not a persuasive argument,” said Richard Ebright, a molecular biologist at Rutgers University who sits on his university’s institutional biosafety committee. “Most major hospitals have an infectious disease isolation ward with rooms that are expressly designed to reduce transmission. No homes do.” In a hospital, he added, “[Quarantine] is supervised, which is not happening for a person in a home.”
Although the scientific community was debating how to oversee gain-of-function research, and the accidents would have been relevant to that debate, the dispute was handled quietly. One incident came to light 18 months later, the second emerged only last year, and the full story has gone untold until now.
Descriptions of the two incidents, along with the agency’s responses, appear in a trove of documents obtained by The Intercept, Edward Hammond, and Lynn Klotz detailing lab accidents reported to NIH’s Office of Science Policy over a span of 18 years. The documents, which number over 5,500 pages, cover the years 2004–2021 and paint a picture of various animal bites, escapes, needlesticks, equipment malfunctions, and even some human infections. Hammond, former director of the now-defunct transparency group Sunshine Project, and Klotz, senior science fellow at the Center for Arms Control and Non-Proliferation, requested some of the documents under the Freedom of Information Act; The Intercept requested others directly.
When the University of Wisconsin reported the two incidents to NIH’s Office of Science Policy, as required of institutions that use NIH funding to research certain pathogens, it set off a flurry of heated conference calls and sternly worded letters involving high-level administrators on both sides. The spat lasted six weeks.
Kawaoka and the university had assured the world that his research was safe, but NIH alleged that they were not adhering to all regulations, and even in some cases to the university’s own policies. “NIH has significant concerns regarding the biosafety practices associated with both of the recent incidents,” two agency officials wrote in one letter. They threatened to “institute enforcement action(s),” including suspending or terminating Kawaoka’s grant.
The agency’s reaction to the accidents was more extreme than in any other instance examined by The Intercept. After many other accidents, including some involving potential pandemic pathogens, the same bureaucrats responded with brief thank-you emails. In a few cases, they asked for corrective actions. In no other instance did they threaten to withhold funding.
Withholding or terminating funding “remains the agency’s last measure for compliance and thus the agency tries to prioritize the other tools at our disposal to achieve our policy goals as a first measure,” wrote Ryan Bayha, a spokesperson for NIH’s Office of Science Policy, in an email. “These include working with researchers and other institutional officials to help bring the researcher back into compliance. In our experience, this has been a successful approach.” (Bayha was previously an analyst with the office, and the initial report of the needle prick went to him, along with other staffers.)
In some instances, the University of Wisconsin–Madison biosafety practices singled out for scrutiny by the agency aren’t clearly delineated in agency guidelines. The dispute highlights a lack of clear standards in how to respond to exposures in high-risk labs — a gray area that, critics argue, could put the public in danger. “It shouldn’t be up to people in the moment of a disaster,” said Rocco Casagrande, managing director of Gryphon Scientific, a biosafety advisory firm. “Someone needs to step in and say, ‘This is how it should be.’”
The needle prick was previously reported by USA Today, as part of a larger investigation into U.S. biolabs. Klotz wrote about the two accidents in an article for the Bulletin of the Atomic Scientists last year. The Intercept is publishing the full correspondence between NIH and the University of Wisconsin–Madison about the breaches, along with additional details and reporting, for the first time. (The Intercept has omitted one document to protect the lab’s security.)
“The Influenza Research Institute has never experienced an event where public health or safety has been put at risk,” wrote Andrea Ladd, director of biological safety at the University of Wisconsin–Madison, in an email to The Intercept. “This does not mean incidents do not occur, but when they do, there are protocols and systems in place to ensure that risk is mitigated and our researchers, community, and environment are protected from harm.”
“No one that is currently at UW-Madison was involved in those conversations with the NIH and therefore we cannot confirm details of those conversations,” she added. But she wrote that before NIH intervened, home quarantine was the university’s policy “in most cases” following exposure to highly pathogenic avian flu viruses like H5N1: “Examples of when quarantine would have been at a location other than a personnel residence were not specified in the UW Exposure Control Plan prior to December 2013.” The university currently quarantines high-risk exposures in an isolation room at a local hospital, she said.
Ladd wrote that unlike in the bombshell avian influenza studies, the two strains at issue in the accidents were “not known to be mammalian transmissible.”
Critics counter that Kawaoka’s research entailed stitching genes from H5N1 onto human flu strains and adding progressively more mutations until the hybrid viruses became transmissible, and that, while risk is hard to predict, strains along that continuum could also be concerning. “If it is a version that is on the pathway toward mammalian transmission, more than strains that circulate in nature, then it is a subject of high concern,” said Ebright. According to the documents, one of the strains had a mutation in the receptor binding site, which is critical to infection.
Fouchier declined to comment, writing in an email, “I have commented many times in the past.” Kawaoka confirmed the accuracy of Ladd’s responses but declined to respond to questions. Shortly after publication, he wrote in an email that “on rare occasions, humans become infected with avian influenza viruses, usually following close or prolonged contact with infected birds. The mutation in question was found in a patient sample and is not known to be mammalian transmissible. We did not test the transmissibility of the virus in question or use this virus for any animal experiments.”
“Dr. Kawaoka is one of most compliant, if not the most compliant PIs [principal investigators] I have ever worked with,” said Moritz. “He takes safety and security incredibly seriously and works very, very well with people like me to figure out how to mitigate risk.” She added: “One of the things that I find most disheartening about this entire debate is that we’re debating the ethics of a set of experiments. That’s what we’re ultimately debating.”
Others agree that the accidents were not unusual or reckless but contend that when it comes to experiments with a small but real chance of ravaging the population, safety and ethics are inextricable. “We should be brave enough to say that some experiments should not be done,” said Simon Wain-Hobson, a virologist at the Pasteur Institute in Paris who supported restrictions on controversial gain-of-function research in the wake of the 2011 studies. Because such work accounts for a small proportion of biomedical science, he argued, “This is not an attack on the scientific system. It is about protecting the integrity of the scientific system and society as a whole.”
When flu viruses reassort, or swap gene segments, in nature, the hemagglutinin gene often plays a critical role. For the controversial 2011 experiments, Kawaoka’s team had combined a mutated version of the hemagglutinin gene from an avian H5N1 strain with gene segments from a human H1N1 flu strain. They used a similar approach to generate one of the viruses at issue in the 2013 accident reports, though with a different strain of H5N1 and fewer mutations in the hemagglutinin gene. On a Saturday evening that November, a researcher in Kawaoka’s lab was using a needle to draw up liquid containing the virus when he pricked his hand. The needle punctured his glove, sinking into his finger.
The researcher dialed the on-call lab manager, who gave detailed instructions on what to do next: Squeeze blood out of the wound, run his hand under water for 15 minutes, put on new gloves, clean up, and shower out. After notifying health care providers and other staff, the lab manager gave the researcher Tamiflu, an N95 mask, and a new glove to cover his injured hand; the laboratory manager drove the researcher home, instructing him to quarantine there for a week. A colleague had called ahead and told the researcher’s family members to pack their things so they could be moved to a hotel before he arrived. A university employee alerted city and state health officials. Once home, the researcher started taking his temperature, and the next morning the lab manager collected throat and nose swabs for testing. Soon after, a biosafety officer informed NIH about the accident, boasting: “This has been an exceptional response.”
Administrators at the Office of Science Policy disagreed. A week earlier, a researcher in the same lab carrying a stack of tissue culture plates containing a different modified H5N1 strain dropped a plate, spilling a small amount of virus onto the lab floor. Some of it splashed onto his Tyvek suit, just below the knee. From there, the suit extended down his legs and then stopped at his ankles, leaving patches of bare skin. The researcher cleaned up the accident; doused his arms, legs, and some lab equipment in an ethanol solution; stuffed all of the waste into a biohazard bag; and phoned the on-call scientist to report what had happened. After consulting a doctor and getting him a prescription for Tamiflu, biosafety staff discharged the researcher, telling him to monitor his body temperature. After he left the lab, a second researcher went in to dispose of the waste.
As NIH staff pried into the University of Wisconsin’s policies for research on avian influenza viruses, they learned that the institution planned to quarantine exposed researchers at home in all cases, no matter the risk level. “An individual’s permanent residence is not appropriate due to the fact that many residences are in buildings with high occupancy that share air exchange and other infrastructure,” wrote Jacqueline Corrigan-Curay, an official in the Office of Science Policy, in a December 2013 letter to the university. She pointed out that in a research plan sent to NIH earlier that year, Kawaoka had said he had access to a “designated quarantine apartment” for researchers who were at high risk of infection. (Ladd told The Intercept that Kawaoka’s statement about the apartment was caused by a “misunderstanding” between him and the university on where researchers would quarantine after high-risk exposures.)
Corrigan-Curay ordered the university to find a dedicated quarantine facility, noting, “An isolation room in a hospital would be appropriate.”
NIH also noted that the exposed researcher had been using the needle for an unauthorized purpose; the laboratory’s standard operating procedure did not allow needles to be used for drawing up tissue culture supernatant, the liquid the researcher had targeted. (Ladd said that the policy has since been “revised for improved clarity” and that the lab workers were retrained.)
In the spill, NIH took issue with the researcher’s exposed ankles. Agency officials contended that bare skin violated the agency’s guidelines covering research with recombinant DNA.
On a phone call, university representatives disagreed. According to a note about the call in the correspondence, someone said that the lab had recently been inspected by the Select Agent Program, which is jointly administered by the U.S. Department of Agriculture and the Centers for Disease Control and Prevention, and that the report from the inspection did not mention any such restrictions on bare skin.
NIH shot back that the agency had consulted staff at the Select Agent Program. “They are in agreement that bare skin is unacceptable at this level of containment,” wrote Corrigan-Curay. “The University must take immediate action to ensure that, in the future, no workers in this or any other high containment laboratories have exposed skin.”
The dispute over bare ankles illustrates a lack of clear and consistent standards. In Canada and select other countries, research on pathogens is centrally regulated. The United States has a jumble of policies, and biosafety training can vary widely from one lab to the next. After the uproar over the 2011 avian influenza studies, NIH adopted additional biosafety guidelines for research with H5N1 strains that are transmissible in mammals, but even those are not comprehensive.
“They don’t have good guidelines about when things are mitigated enough,” said Casagrande, whose firm has advised NIH on the risks and benefits of gain-of-function research. “They can have one response that is guns blazing, and another that is very muted — and why? What’s the standard?”
“Clearly, [oversight] only happens in extraordinary cases,” said Koblentz, the biosafety scholar. “But really it should be the routine.”
The exchange may have been particularly heated because the accident occurred at a fraught moment for high-risk viral research. H5N1 belongs to a group of what are called “potential pandemic pathogens”: bacteria, viruses, and other microorganisms that, either through handling or through modification, could set off another pandemic. Policies governing research with such pathogens were established in the wake of Kawaoka’s and Fouchier’s controversial papers, which were published with some revisions by Nature and Science, respectively, in 2012. (Klotz, who provided The Intercept with the University of Wisconsin incident reports, coined the term “potential pandemic pathogen” with Edward Sylvester of Arizona State University.)
In 2014, the U.S. government adopted a moratorium on funding for gain-of-function research that could spark a pandemic. Three years later, the pause was lifted and the Department of Health and Human Services, NIH’s parent agency, shifted to a framework called P3CO, under which research that involves modifying potential pandemic pathogens or is “reasonably anticipated” to create them has to undergo a special review process in order to get funding.
Neither policy has been evenly or transparently implemented. The Intercept has reported that in 2016, National Institute of Allergy and Infectious Diseases administrators flagged a proposal by EcoHealth Alliance, a U.S. nonprofit that worked with the Wuhan Institute of Virology on bat coronavirus research, as potentially being covered by the moratorium. But instead of insisting on modifications that would have made the research safer, they let the organization craft an unusual rule to govern its own work. Since the P3CO policy was adopted in 2017, according to Health and Human Services, only three projects have undergone special review. In a detailed analysis of NIH’s grant database last year, the Washington Post identified a total of eight projects that appear to have warranted review. And just last month, in articles from Stat and Science, it emerged that two more risky experiments had not undergone review. In the first instance, at Boston University’s National Emerging Infectious Diseases Laboratory, scientists created a hybrid version of SARS-CoV-2, the virus that causes Covid-19. The National Institute of Allergy and Infectious Diseases alleged that they had not sought approval for the work, prompting scientists not connected with the experiments to point out that the guidance on when to seek approval is unclear.
Moritz, the former University of Wisconsin–Madison biosafety staffer, contended that NIH’s response to the 2013 accidents was overblown and driven by the gain-of-function controversy. “You need to look at the timeframe and the context of what was going on politically,” said Moritz, who is now biosafety director at Colorado State University and the incoming president of ABSA International, a biosafety professional association. “That’s why the reaction was the way it was.”
“These decisions are not made politically,” said Tucker, the Office of Science Policy acting deputy director. “They’re made in terms of the best response and working with institutions to come into compliance.”
After NIH threatened to terminate Kawaoka’s grant, the documents show, the University of Wisconsin overhauled its policies, agreeing to adopt new guidelines on quarantining and on exposed ankles. The university sent the agency copies of new training slides. One slide conveyed a mixed message to lab workers. “Cannot have more accidents,” it read. “But MUST report any incidents, even the most minor.”
The researcher who had spilled the plate containing modified H5N1 got up in front of his fellow lab workers and reenacted the accident. Staff peppered him with questions. “Did you ever have a moment when you panicked?” asked one. “What was your worst fear? Quarantine?” asked another. The university sent notes on the meeting to NIH.
Finally, the two sides reached an agreement. On Christmas Eve in 2013, NIH wrote in a letter to the University of Wisconsin associate vice chancellor that the university had complied with its demands. Kawaoka’s lab could resume the controversial work.
Today, research with potential pandemic pathogens is again in the spotlight. In February, NIH charged a committee called the National Science Advisory Board for Biosecurity with reconsidering the P3CO policy, along with a policy on dual-use research. The NSABB has a fraught history, and its members are appointed by NIH itself. “There is an inherent conflict of interest in having a group appointed by an agency to review that agency’s work,” said Koblentz.
At NSABB meetings in April and September, tensions ran high. At stake is not just the future of gain-of-function research, some participants stressed, but the safety of the world. And yet, lost in the discussion is the fact that one of the labs that set off the gain-of-function fracas actually has had accidents involving modified H5N1.
When lab accidents happen, “They don’t put it in the local newspaper, and I think it’s reasonable that they don’t,” said Stuart Newman, a cell biologist at New York Medical College who sits on his university’s institutional biosafety committee. “But because it’s all handled quietly, the general public isn’t aware of the frequency of incidents like this — or even that they exist.” Even though most incidents don’t lead to infection, Newman said, “Just the fact that they happen should be more widely known.”
The NSABB released preliminary recommendations last month. Critics say they’re incomplete. At the meeting where the results were unveiled, Harvard University epidemiologist Marc Lipsitch took issue with the section of the recommendations that deals with transparency, saying, “It’s too weak and too nonspecific.” Tucker said the final recommendations are expected in December or January.
Some biosafety advocates say that a broader overhaul is needed. “As long as all of the oversight is strictly advisory and none of it is enforceable with force of law, nothing ever will move forward — particularly so long as the oversight is housed in an institution that performs and funds research,” said Ebright. “It needs to come from Congress or the White House.”
One model for regulating pathogen research could be the Nuclear Regulatory Commission, which oversees all facilities that work with radiological materials and also funds research on safety and security.
Others say that nothing short of a dramatic shift in worldviews is needed. Jesse Bloom, an evolutionary virologist at Fred Hutchinson Cancer Center, compared it to research on human subjects. Until the 1970s, scientists regularly carried out experiments on prisoners, including for infection studies. Over time, opinions shifted. “At some point, it became accepted that even though experiments on prisoners were scientifically informative, they just aren’t ethical to do,” said Bloom.
For the rest of the world, how the United States regulates research with dangerous pathogens matters. “The United States has a special responsibility when it comes to oversight and getting it right,” said Filippa Lentzos, an expert on biosecurity and biological threats at King’s College London who co-chairs an international task force on high-risk pathogens with Bloom. “It is a leader in a lot of this research, and it’s where most of this research takes place.”
In 2014, the moratorium on gain-of-function work made it impossible for Kawaoka’s lab to continue with transmissibility studies. But five years later, Science reported that the Health and Human Services P3CO panel quietly greenlighted the controversial bird flu experiments to resume, without alerting the public. The agency did not release details on how the panel assessed the proposals or what evidence was evaluated. The grant was contingent on the lab following additional safety measures, but the agency did not announce what these were. The decision came to light only because word leaked to a journalist, a fact that two prominent experts writing in the Washington Post called “unacceptable.”
The 2013 accident reports and correspondence might have helped inform the discussion. But at that point, they weren’t public.
Update: November 2, 2022
This article has been updated with a comment from Yoshihiro Kawaoka that was received after publication.