Should information be withheld from academic journals because of the potential that it might fall into the hands of terrorists? The National Science Advisory Board for Biosecurity (NSABB) has asked the journals Science and Nature to keep certain details out of reports they intend to publish about experiments that produced a human-transmissible version of a flu virus that is deadly about 50 percent of the time.
The NSABB said conclusions should be published, but not “experimental details and mutation data that would enable replication of the experiments.” This government panel has not sought to ban the release of this information, so we’re not talking about formal censorship, but the request sits at an early point on the censorship continuum.
It would seem that withholding this information from academic journals might do some good. But the limiting factor on production of a newly transmissible virus is training in virology (or a related discipline) and access to the equipment that allows such work to be done—not access to data about the technique used in these experiments. Whether the data are published in these journals or not, a criminal or terrorist virologist could probably obtain them under the subterfuge of a genuine scientific interest.
So, to keep terrorists from acquiring bioweapons, do we limit training in virology? Control laboratory equipment as dual-use civilian/military technology? No, because the overwhelming share of training and equipment—something approaching, if not actually, 100 percent—will go to people who will use these things to make us safer, even if the rare bad actor tries to use virology skills to make us unsafe.
It’s a close call, and I’m not entirely certain about what I’ve just said, but this is a more difficult logic puzzle than most people think. Because the overwhelming majority of people use information for good, the diffusion of information will almost always be a net good. I doubt that the NSABB has sufficiently considered the costs of withholding information about the modified virus from people who would use that information to secure against its modification by whatever invention they bring to bear. (I can’t cite the invention because it hasn’t been invented yet!)
This is akin to the gun control issue. Consensus goes against guns because they make a loud bang and often draw blood when they’re used harmfully, but they are utterly silent in their beneficial use of deterring crime and violence, which is what they do the vast majority of the time. The idea of a massive epidemic strikes our primal imaginations with fear, while the notion of scientists converting diffuse knowledge into security against epidemics is a somber intellectual exercise.
Speaking of imagination, the idea of the terrorist super-villain is widespread, but imaginary. It’s important to remember that the 9/11 terrorists had box cutters. They had no idea their attack would produce the collapse of the twin towers, though many people reasoned backwards from that devastation to give them sophistication (and motivations) they didn’t actually have. It’s our psychology/imagination that gave terrorists access to chem/bio/rad/super-weapons over the last decade, a notion that almost certainly infects the considerations of the NSABB.
It’s probably a mistake to withhold scientific data from publication. We’re rather safer from the threat of biological terrorism than most people think, and we’d be marginally safer still if information about these virus experiments were easily available to any researcher who might use it to discover ways of making us safer yet.
(Related: Milton Leitenberg of the University of Maryland’s Center for International and Security Studies has a great contrarian piece in the Cato book Terrorizing Ourselves about the counterproductive mania around bioweapons, though his points don’t easily sync up with what I’ve said here.)