This article was originally published at The Conversation. The publication contributed the article to LiveScience's Expert Voices: Op-Ed & Insights.
Last October, scientists in California sequenced the gene for the “type H” botulinum toxin. A single gram of this toxin would be enough to kill half a billion people, making it the deadliest substance yet discovered, and one for which there is no antidote. The DNA sequence was not placed in public databases, marking the first time genetic code has been withheld from the public over security concerns.
As biological discoveries accelerate, we may need to censor even more genetic data. With the advent of 3D printing technologies and DNA synthesisers, the line between digital data and the physical world is no longer as clear cut as it once was. Many people are familiar with the first 3D-printed gun, widely cited by the media as a dangerous development, but fewer realise that analogous technology can be used to “print” pathogens. The polio virus, for example, was recreated from its published sequence in 2002, and the 1918 flu virus was resurrected with the help of DNA synthesis in 2005.
But the increasing accessibility of this technology raises concerns about its “dual-use” nature: the same tools that advance medicine could be turned into unprecedented weapons. President Obama was worried enough to commission a report on the safety of synthetic biology, and volunteers have created screening software to detect malicious DNA sequences before an unsuspecting synthesis company prints them out.
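To give a rough sense of how such screening software works, here is a toy Python sketch that flags a synthesis order when it shares long exact subsequences with a watchlist of sequences of concern. This is an illustration of the general idea only, not any particular tool's method: real screeners rely on sequence-alignment software and curated databases, and the watchlist, order and window length below are entirely hypothetical.

```python
# Toy illustration of DNA synthesis-order screening: flag an order that
# shares long exact subsequences with a (hypothetical) watchlist of
# sequences of concern. Real screening tools use alignment software and
# curated databases; this sketch only conveys the basic idea.

K = 31  # window length; long enough that chance matches are very unlikely

def kmers(seq, k=K):
    """Return the set of all length-k windows in a DNA sequence."""
    seq = seq.upper()
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def screen_order(order_seq, watchlist):
    """Return the names of watchlist entries sharing any k-mer with the order."""
    order_kmers = kmers(order_seq)
    return [name for name, seq in watchlist.items()
            if order_kmers & kmers(seq)]

# Hypothetical data, for illustration only.
watchlist = {"sequence_of_concern_1": "ATG" + "ACGT" * 20}
order = "TTT" + "ACGT" * 20 + "GGG"

hits = screen_order(order, watchlist)
if hits:
    print("Flag for human review:", hits)
else:
    print("No matches to the watchlist.")
```

In practice, an order flagged in this way would be passed to a human expert for review rather than being rejected automatically.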
For the first time in human history, knowledge that is discovered has a reasonable chance of never being forgotten. And while this would normally be a great thing, it also creates a ratchet effect with dangerous information – once a bit of malicious code is online, the whole world can dissect and modify it.
We saw this with the infamous Stuxnet worm, which came to light in 2010: an elegantly engineered piece of malware designed to infiltrate Iran’s nuclear enrichment programme and manipulate its centrifuges to the point of breaking them. While this may have been a strategic boon for Israel and the United States, we now must contend with the availability of Stuxnet’s code, decompiled versions of which were later posted to GitHub. The ingenious mechanisms the worm used to bypass security systems are now available to the world for the delivery of alternative cyber payloads.
If a similar dynamic emerged with biological code rather than computer code, the results could be catastrophic. Roughly a century ago, an estimated 50 million people died of a particularly lethal strain of flu, and that virus’s genome is now available online. It has been estimated that if the same virus were released today, the initial death toll could exceed 80 million. Any knowledge or technology with the capacity for such destruction ought to be handled with the same caution we give to nuclear secrets, even if that means slowing advances in medical biotechnology.
The 1972 Biological Weapons Convention, which codified the international ban on developing and stockpiling biological weapons, lacks any formal verification mechanism and should be revamped if it is to be fully effective. Only a multilateral approach can solve the regulation problem posed by synthetic biology, since viruses can spread across international borders as quickly as the airplanes carrying them.
We also need to give some serious thought to how openly we want to develop biotechnology. As Nick Bostrom, founder of the Future of Humanity Institute at Oxford University, once said:
“It is said that a little knowledge is a dangerous thing. It is an open question whether more knowledge is safer. Even if our best bet is that more knowledge is on average good, we should recognise that there are numerous cases in which more knowledge makes things worse.”
In the case of synthetic pathogens, our probing could indeed make things much worse if we’re not careful.
Andrew Snyder-Beattie is an Academic Project Manager at the Future of Humanity Institute. FHI's research includes analysing the extreme tail risks of technological development.
This article was originally published at The Conversation. Read the original article. The views expressed are those of the author and do not necessarily reflect the views of the publisher. This version of the article was originally published on LiveScience.