Earlier this year, scientists published a paper describing how they pieced together segments of DNA in order to bring back a previously eradicated virus called horsepox.
The paper, written by two University of Alberta researchers and the co-founder of a New York pharmaceutical company, was controversial because, as various experts told the magazine Science, someone could use a very similar process to bring back a related virus: smallpox.
Smallpox, you’ll recall, killed hundreds of millions of people before the World Health Organization declared it eradicated in 1980. That was the result of a long vaccination campaign — so the idea of piecing the virus back together from bits of DNA raises the specter of a horrifying pandemic.
Two journals rejected the paper before PLOS One, an open access peer-reviewed journal, published it. Critics argue that the paper not only demonstrates that you can synthesize a deadly pathogen for what Science reported was about $100,000 in lab expenses, but even provides a slightly-too-detailed-for-comfort overview of how to do it.
Some of the horsepox scientists’ peers are still pretty upset about this. PLOS One’s sister journal, PLOS Pathogens, just published three opinion pieces about the whole flap, as well as a rebuttal by the Canadian researchers.
MIT biochemist Kevin Esvelt, for instance, wrote on Thursday that the threat is so grim that we shouldn’t even talk about it:
At present, we decidedly err on the side of spreading all information. Despite entirely predictable advances in DNA assembly, every human with an internet connection can access the genetic blueprints of viruses that might kill millions. These and worse hazards are conveniently summarized by certain Wikipedia articles, which helpfully cite technical literature relevant to misuse.
Note the deliberate absence of citations in the above paragraph. Citing or linking to already public information hazards may seem nearly harmless, but each instance contributes to a tragedy of the commons in which truly dangerous technical details become readily accessible to everyone. Given that it takes just one well-meaning scientist to irretrievably release a technological information hazard from the metaphorical bottle, it may be wise to begin encouraging norms of caution among authors, peer reviewers, editors, and journalists.
Then, in a Q&A with an MIT blogger, also published Thursday, Esvelt argued that people who discuss the risk with journalists share some of the blame for interest in smallpox synthesis:
DNA synthesis is becoming accessible to a wide variety of people, and the instructions for doing nasty things are freely available online.
In the horsepox study, for instance, the information hazard is partly in the paper and the methods they described. But it’s also in the media covering it and highlighting that something bad can be done. And this is worsened by the people who are alarmed, because we talk to journalists about the potential harm, and that just feeds into it.
The Canadian professors — though not their colleague from the pharmaceutical company, interestingly — fired back by arguing that, well, it was gonna happen anyway:
Realistically all attempts to oppose technological advances have failed over centuries. We suggest that one should instead focus on regulating the products of these technologies while educating people of the need to plan mitigating strategies based upon a sound understanding of the risks that such work might pose. In these discussions, a long-term perspective is essential.
It’s grim to envision engineered smallpox infecting and killing humans, the way its naturally evolved predecessor did. Perhaps some subjects aren’t worth studying, after all.
More on smallpox: Scientists Eradicated Smallpox. With Synthetic Biology, We Can Bring it Back.