Opinion | The Pandemic Threat That Hasn’t Gone Away


In December 2014, two monkeys in outdoor cages at the Tulane National Primate Research Center, about 40 miles north of New Orleans, became ill with Burkholderia pseudomallei, a deadly bacterium in the federal government’s highest risk category, reserved for pathogens like smallpox, anthrax and Ebola. This is the category for which the Centers for Disease Control and Prevention sees “significant potential for mass casualties or severe effects.”

A short drive from the cages was a lab working with the pathogen, which at the time had never been found naturally in the continental United States.

Some of the staff members who worked with the monkeys were not told of the infections for nearly a month after the pathogen was suspected and 10 days after it was confirmed.

In March 2015, the Tulane center’s director, Andrew Lackner, claimed that “various Burkholderia species have been present in domestic animals in Louisiana since at least 2004, long before any scientific study of the organism began” at the center.

Lackner also dismissed media reports of a possible lab leak as a “predictable news cycle” and said there was “no known threat to public health.”

By then, though, the C.D.C. had already determined that the bacterium in the monkeys was the exact strain studied in the center’s lab.

An Agriculture Department report later documented a history of lax practices at the laboratory as well as problems with its wastewater treatment, which may have been how the pathogen had leaked.

Much of this would not have been known if not for years of dogged reporting by Alison Young, a former investigative reporter for USA Today and a professor at the Missouri School of Journalism.

In her new book, “Pandora’s Gamble,” Young lays out the shocking extent of lax laboratory standards and procedures, and the lack of accountability and transparency, in the United States and around the world.

Young is a rare breed: an investigative journalist who has covered the C.D.C. and biosafety issues since at least 2006, when she worked for The Atlanta Journal-Constitution, long before the Covid pandemic.

Her book is full of calmly reported but jaw-dropping details of incident after incident in which research labs, more interested in public relations than public safety, have been opaque or even misleading about safety failures and laboratory-acquired infections. She reveals previously unreported lab accidents, and previously unknown details of others, at the Army’s biolab at Fort Detrick in Maryland, at the C.D.C., at a San Francisco veterans medical center and elsewhere.

In December 2019, a research trainee’s breathing tube, which delivered safe, filtered air, became disconnected while she was working at the University of Wisconsin with ferrets — whose upper respiratory tract resembles that of humans — potentially exposing her to the deadly bird flu virus H5N1. The strain had been genetically modified in an effort to determine whether it could spread among mammals through airborne transmission.

Such genetic modification, called gain-of-function research, had been allowed to resume only earlier that year, after an almost five-year ban. The ban followed a public outcry over news in 2011 that researchers in Wisconsin and the Netherlands had manipulated the bird flu virus so it could be transmitted through the air between ferrets — and thus, potentially, among humans. H5N1 doesn’t normally spread easily among people, a big relief, since the virus has killed about half of the people known to have been infected with it.

As Young reports, even though university officials were required to immediately report that the researcher had breathed room air, they waited two months to alert the Office of Science Policy of the National Institutes of Health, which oversees U.S. research with genetically engineered pathogens, and almost two months to tell the university’s own biosafety committee. The university on its own decided to end the trainee’s quarantine without telling state and local public health officials.

A representative of the university later defended its lack of openness to Young by telling her, “most people are also not equipped to appropriately evaluate the risk.” The N.I.H. told Young that university officials did not think they needed to report the incident because they felt there was “no reasonable risk of virus exposure,” presumably because the hose was disconnected only briefly from the source of purified air.

Yet during the Covid pandemic, we learned that airborne transmission can occur within a few seconds of exposure and even at a distance.

The trainee researcher turned out not to have been infected. But “trust us” is not the proper response to a case in which the scientist could have been infected and spread the virus, even without obvious symptoms.

Lax lab safety procedures are a global problem. A Washington Post investigation recently reported that in the summer of 2019, hundreds of people in Lanzhou, China, got sick after thousands were exposed to bacteria that can cause brucellosis, when a government-run biomedical complex failed to properly disinfect its waste. A scientific paper published in November called it “possibly the largest laboratory accident in the history of infectious diseases.”

And yet it took the authorities in China a month to discover and fix the problem, and four months to inform the public.

The failure to detect problems in real time is an all-too-common feature of lab accidents and biosafety mishaps. In 2012, in the first published report of its kind, the C.D.C. documented 11 laboratory-acquired infections over six years. Not one of the underlying incidents was noticed or reported at the time; each came to light only after a lab worker was discovered to have been infected.

Sometimes, researchers don’t even know about a dangerous pathogen in their lab. The Marburg virus, similar to Ebola, is named not after any place where it had been endemic but after the German city where the first known outbreak occurred, in 1967. Lab workers had handled monkey tissue infected with the then-unknown virus. Seven people died and 32 were infected.

In 2003 and 2004, SARS leaked from labs in China, Singapore and Taiwan so many times after its initial outbreak, which had been contained, that the World Health Organization said that if it ever came back, it would most likely be because of a lab mishap.

When researchers conceal problems, it can be because they are trying to avoid embarrassment and hoping for the best.

In 2003, a SARS researcher in a top biosafety lab in Taiwan, in a hurry to go on a trip, got infected when he rushed to clean a spill. He traveled anyway and then became very ill. Instead of informing the authorities, he self-quarantined. His father said the man “wanted to die at home” rather than bring shame to his lab and his country. The incident came to light only after the desperate father threatened to kill himself unless his son sought medical help.

In 2022, a lab worker conducting research on the new coronavirus in Taiwan exposed 110 people to Covid after she got infected when she removed her mask improperly — the blunder wasn’t discovered until later, when she tested positive. Back in 2003, her supervisor had been infected with SARS in a lab; Science magazine noted at the time that this was “a grim reminder, experts say, that the very researchers fighting SARS could unleash its next global outbreak.”

Of course, the large majority of lab research happens without such mishaps, and scientific research is how we learn about dangerous pathogens. And obviously, we had outbreaks and pandemics long before we had the ability to work with pathogens in labs.

But the abilities scientists have developed in the past few decades have increased the threat. In 2012, when some scientists published a paper titled “Reconstruction of the 1918 Influenza Virus: Unexpected Rewards From the Past,” about a strain of influenza that had killed tens of millions around the world, other scientists argued that any knowledge gained wasn’t worth the risk or could be obtained more safely.

Biosafety also involves field research and the animals kept at labs. Scientists studying animals in the wild can carry pathogens back to their labs and to the densely populated areas where those labs may be located. A scientist at the Wuhan C.D.C., which sits a few hundred yards from the market where the first known big outbreak of the Covid pandemic was detected in December 2019, appeared in a video released that month about his field research on bats. The video showed him conducting field work without full protective gear, even though he noted that “it is while discovering new viruses that we are most at risk of infection.”

And if scientists bring the animals themselves back to their labs, the danger can be even greater. Scientists at the Wuhan Institute of Virology reportedly kept colonies of live bats to study the viruses they carry. Those colonies could be a source of cross-infection of people.

Cities are a particular danger. Spillovers, when a novel animal virus infects a human, are common; pandemics, however, are rare. New viruses can’t trigger a pandemic unless they also evolve to spread effectively from one human to another, and rural communities do not provide the population density that cities do to facilitate that spread. (That’s also why urban wet markets are dangerous.)

Yet a new global biosafety report shows that 75 percent of BSL-4 labs, which have the highest safety level and are reserved for the most dangerous pathogens, and 80 percent of BSL-3+ labs, where viruses like SARS can be experimented on, are in cities: places with more than 50,000 people living within 2.5 miles of the lab.

* * *

A model for the sort of global collaboration in the interest of safety is commercial aviation, which relies on international cooperation, transparency and a culture of accountability, including investigations of accidents that don’t seek to assign blame. Instead of hiding an accident, air authorities and airlines are encouraged to come forward with the understanding that it will lead to improvements, not finger-pointing and blame. It’s clearly not perfect — as the two Boeing 737 Max crashes demonstrated — but it does work impressively well.

As Young’s remarkable book extensively documents, scientists are prone to the human and institutional instinct to deny problems and hope for the best. Startling new research can get them grants and get them published — an important incentive for many scientists.

Research involving experiments with pandemic-potential pathogens should have very strict requirements for merit, conduct and accountability, which should be externally reviewed in a transparent manner so that the public can be assured the utmost care has been taken.

Regulations can require the riskiest research to be conducted outside of cities and with mandatory quarantines of researchers — not just after incidents are noticed, but after every potentially highly dangerous experiment.

When a researcher at the University of Wisconsin pricked a finger with a needle containing an engineered H5N1 bird flu strain in 2013, the N.I.H., Young writes, was alarmed to learn that the university had no dedicated quarantine facilities, despite previous assurances to the contrary. The university let the researcher wait out the incubation period at home while monitoring for symptoms. In its defense, university officials said that “uncomfortable and confining” quarantine quarters, like hospital isolation rooms that minimize airborne spread, might make researchers hide their exposures. But the possibility of such reluctance is itself a reason for mandatory quarantine for all dangerous research.

There should also be uniform biosafety standards for research that isn’t necessarily going to spark a pandemic but nevertheless poses a threat to its immediate environs.

One of the most important measures could be for top scientific journals to set high biosafety standards for what type of research they are willing to publish — similar to how research violating basic human rights would not be publishable. Instead of one country’s practices being singled out, standards should be set for all.

Young’s book raises a much-needed alarm about the dangers of failing to do things right. The stakes are too high.
