Ebola virus, the bacterium that causes plague, a pandemic strain of flu – if any of these pathogens could be turned into bioweapons by terrorists or rogue nation states, they would threaten humanity. And information that might aid that weaponisation process is in danger of leaking out, says the US National Academies of Sciences, Engineering, and Medicine.
Most life scientists have little awareness of biosecurity issues, according to a National Academies report released today. And it says there are “multiple shortcomings” in the systems designed to stop potentially risky research from being published.
Current US policies restrict work on 15 pathogens and toxins whose study is classed as “dual-use research of concern” – in other words, research that could both benefit medicine and be used to kill.
The bacterium that causes anthrax is on this list. In 2001, a former US government scientist sent anthrax spores in the mail, infecting 22 people and killing five of them.
But this list should no longer be considered exhaustive, warns the report, partly because new techniques such as CRISPR make it easier to genetically edit microbes or to build novel life forms from scratch.
“The driving vision of synthetic biologists is that genes can be put together like Lego bricks,” says Filippa Lentzos of King’s College London, who wasn’t involved in the report. “We have focused on locking up dangerous pathogens so people don’t have access to them. But today you can just build them in the lab.”
Earlier this year, researchers at the University of Alberta in Edmonton, Canada, announced they had synthesised a virus called horsepox from genetic sequences they bought through the mail. While harmless in itself, horsepox is a relative of deadly smallpox. As smallpox vaccination stopped in the 1970s when the disease was eradicated, people under 40 would have no immunity to it, says Robin Weiss of University College London.
The horsepox work has yet to be published. Journal editors are supposed to consider security risks before publishing any research with dual-use potential. In 2011, a US biosecurity committee asked Science not to publish research on how bird flu could be genetically altered to make it spread between people more easily, although it was eventually cleared for publication.
But the National Academies report points out that there are other ways of disseminating such knowledge, such as “preprint” websites. The rules on potentially risky research only apply to institutions that get federal funding – not private companies or the “do-it-yourself” community.
Lentzos says advances in drug delivery techniques also raise concerns. “We are seeing a lot of new delivery techniques for vaccines and gene therapies, like sprays and inhalers, that might be used to deliver weaponised forms of these drugs,” she says. “You can imagine a swarm of drones at a public event.”
It is hard to know where to draw the line, says Gigi Gronvall of the Johns Hopkins Center for Health Security in Maryland. “You don’t want to curtail research being done for beneficial reasons unnecessarily,” she says. “There’s a plethora of things that could be misused. Nature has a lot of ways to kill people.”
The report doesn’t attempt to offer any solutions to the problem. It concludes that despite decades of effort, there is little international consensus on policies to address the risks.