Scientists at Boston University came under attack this week over an experiment in which they manipulated the Covid virus. Breathless headlines claimed they had created a deadly new strain, and the National Institutes of Health rebuked the university for not seeking the government's permission.
As it turned out, the experiments, which were conducted in mice, were not what the inflammatory media reports suggested: the engineered strain was actually less lethal than the original version of the virus.
But the furor has highlighted shortcomings in how the U.S. government regulates research on pathogens, and a seemingly haphazard pattern in the federal oversight policy known as the P3CO Framework.
Even as the government publicly reprimanded Boston University, it made no mention of other experiments in which researchers it funded had engineered coronaviruses in a similar fashion. One of those experiments was carried out by the government's own scientists.
Angela Rasmussen, a virologist at the Vaccine and Infectious Disease Organization at the University of Saskatchewan in Canada, said the Boston episode “certainly tells us that we need to rethink the P3CO paradigm quite dramatically.” “The whole process is kind of a black box, and it’s very difficult for researchers,” she said.
The NIH said all research it considers for funding is scrutinized for safety concerns by agency experts, who decide whether to escalate it to a higher-level committee on dangerous pathogens.
But some experiments were conceived after funding was granted, or did not rely directly on federal money, and so fell outside the scope of that process, causing confusion, biosafety experts said. The rules could soon be revised: after months of meetings, a panel of government advisers is expected to deliver updated recommendations for such research by December or January, the agency said.
The government's policy for such experiments, known as the Potential Pandemic Pathogen Care and Oversight, or P3CO, framework, was established five years ago in response to a series of controversial experiments in which researchers sought to adapt an influenza virus that infects birds so that it could infect mammals.
Under the policy, the NIH and other agencies are supposed to flag grant applications for experiments that could set off new pandemics. High-risk research may be denied funding or required to adopt additional safeguards.
Critics of P3CO complain that its evaluations are largely confidential and that it ignores projects not funded by the U.S. government. In January 2020, a government advisory panel, the National Science Advisory Board for Biosecurity, held a public meeting to discuss reforms. But the arrival of Covid, ironically, derailed the meetings that were to follow.
In the months since, Republican politicians have attacked the NIH for supporting earlier coronavirus research at the Wuhan Institute of Virology, suggesting that a leak from that lab may have caused the pandemic. (Dr. Rasmussen and other scientists published a study in July pointing instead to a Wuhan market as the pandemic's origin.)
Under this increasing scrutiny, the advisory board met in February and worked on new recommendations over the summer, releasing a draft last month. It suggested expanding the range of pathogens that could prompt a review beyond those with high fatality rates. Unlike smallpox or Ebola, Covid has a low fatality rate, but it is so contagious that it has still caused global devastation.
In its ongoing discussions, the board has also considered risks posed by computer software, such as programs that could work out how to make pathogens spread faster.
Researchers' reactions to the new guidelines were mixed.
“The first draft brought some important advances and did a lot, but it left a lot unresolved,” said Marc Lipsitch, an epidemiologist at the Harvard T.H. Chan School of Public Health who has called for stricter rules since the bird flu experiments more than a decade ago.
In comments submitted to the advisory board last month, Dr. Lipsitch and his colleagues said that proposed experiments must be justified by real practical benefits, not unsupported claims.
Other scientists, among them Robert F. Garry Jr., a virologist at Tulane University, welcomed clearer guidance but worried that cumbersome regulations could stall mundane, harmless experiments.
The ambiguity in government policy came to light this week when news broke about the experiment at Boston University.
Mohsan Saeed, a virologist at the school, and his colleagues posted a report online aimed at understanding the differences between Omicron and other variants. The researchers created a virus identical to the original version of Covid but carrying the Omicron spike protein. They then infected a strain of mice that is highly susceptible to Covid and widely used to study the disease.
An earlier study had found that the original strain of Covid killed 100 percent of these mice. The new study found that the modified virus was less lethal, killing 80 percent.
Last Sunday, The Daily Mail published an article under the headline “Scientists have created a deadly new Covid strain with an 80% fatality rate.” The next day, Emily Erbelding, an NIH official, told the news site STAT that Boston University should have discussed the experiment with the agency in advance.
But some researchers say that federal guidance is vague about what disclosures are required after a research proposal is approved, and that the agency generally has not applied the guidance to experiments devised after funding has been granted.
“The government needs to provide guidance to help people make sense of this,” said Gregory Koblentz, a biodefense expert at George Mason University.
In a statement to The New York Times, Boston University said the experiment had been approved by its own safety committee as well as the Boston Public Health Commission.
The university also said that while the scientists had received government funding for related research, they used university funds to pay for the experiments in question and were therefore not obligated to notify the NIH. The agency said it was looking into the matter.
Syra Madad, an infectious disease epidemiologist at NYC Health + Hospitals, said the high-profile dispute over technical laboratory protocols sent a mixed message to the scientific community and the public.
“It seems like an epic communication failure,” said Dr. Madad, who also sits on the National Science Advisory Board for Biosecurity. “This is why we are reviewing our policies, to make sure they are clear, transparent, make sense and are operationally viable.”
Dr. Madad and other experts agreed that Boston University's proposed experiment should have undergone a more rigorous evaluation. “In my opinion, this certainly looks like it meets the P3CO review criteria,” she said.
But even if the study had gone through that process, some scientists said, it probably would have been given a green light.
Because the coronavirus is already circulating among humans and has evolved far beyond the variants used in the experiment, even if the hybrid lab virus had leaked, it would have been unlikely to pose a serious threat, said Jesse Bloom, a virologist at the Fred Hutchinson Cancer Research Center.
“I understand why people are concerned, because you are creating a virus whose characteristics are not entirely predictable,” Dr. Bloom said. “But I don’t think this was particularly risky.”
The NIH's harsh public statement about the Boston University study raised questions about how it and other health agencies have evaluated similar experiments in the past. Last month, scientists at the Food and Drug Administration published a study in which, like the Boston team, they infected mice with a coronavirus engineered to carry the Omicron spike.
The FDA is required to comply with the P3CO framework. But officials said in a statement that the hybrid viruses created in the study did not represent a “new version of the virus,” because the work “aims to understand how the virus works, not to identify new ways to make it more powerful.”
Some independent experts said the agency's rationale did not explain why the study escaped review: an experiment cannot be exempt from the approval process simply because the researchers had no intention of creating a more dangerous virus.
“If research is expected to lead to enhanced potential pandemic pathogens (strains that are more contagious and/or virulent than those found in nature), it should be reviewed,” Dr. Tom Inglesby, director of the Johns Hopkins Center for Health Security at the Bloomberg School of Public Health, said in an email.
FDA researchers are not the only American scientists tinkering with coronaviruses in this way. At the University of Texas Medical Branch at Galveston, scientists relying in part on federal funding have studied whether vaccines protect against coronaviruses modified to carry Omicron spikes.
These techniques spare scientists months of waiting for samples of Omicron viruses from patients, allowing them to study the dangers of new variants and anticipate the need for booster shots. Outside experts said the Texas experiments were even less risky than the Boston study, because the researchers generally infected cells, rather than live animals, with the viruses.
The Texas team's proposal was reviewed by the NIH but was never escalated to the committee on dangerous pathogens. The agency did not disclose why. (Since 2017, the NIH said, only three studies it has proposed funding have been reviewed by that committee.)
“No one is really in charge of scanning the medical literature, and it could be a random occurrence that brings these particular experiments to the public eye,” Dr. Inglesby said. “And it shouldn’t be like that.”
Some raised another issue: research that is not funded by the government is not subject to these rules at all.
“Ultimately, I think everyone would agree that it would be ideal to publish a broadly applicable policy,” said Karmella Haynes, a biomedical engineer at Emory University and a member of the National Science Advisory Board for Biosecurity.
One possibility is the Federal Select Agent Program, which requires anyone seeking to work with certain dangerous pathogens, such as anthrax, to register with the government.
“Recommendations that don’t include codifying legally enforceable regulatory requirements mean nothing,” said Richard Ebright, a molecular biologist at Rutgers University.
He added that federal officials could come under pressure to step up oversight next year if Republicans, who have called for a crackdown, gain power in November's midterm elections.
Others, meanwhile, said that politically charged debates could make better regulation even harder to achieve.
“I’m worried that it will hamper our ability to understand these viruses that have killed millions of people,” said Gigi Gronvall, a biosafety expert at the Johns Hopkins Bloomberg School of Public Health.