That title may sound silly, but it has a point. DURC stands for “Dual-Use Research of Concern,” and it refers to research whose dissemination may permit people to do harm. Some say that dissemination should therefore be restricted in some fashion short of declaring it “Classified” or “Top Secret.” Saying “If it quacks like a Top Secret” might be more apt, but it wouldn’t be nearly so much fun.
DURC came to prominence in connection with some alarming research on bird flu. As its name implies, “bird flu” affects primarily birds. It can also affect mammals, including humans, but it does not spread as easily among mammals as it does among birds. In particular, unlike the ordinary flu that we get vaccinated against every year, it does not spread via droplets sneezed or coughed into the air. Thus even though it may infect a few hundred people in an outbreak, and kill a sizable proportion of its victims, it holds little potential to cause a pandemic (widespread epidemic) akin to the 1918 flu, which killed at least 50 million people.
A few years ago, two research teams, led by Ron Fouchier of Erasmus MC in Rotterdam, the Netherlands, and Yoshihiro Kawaoka of the University of Wisconsin, Madison, modified the H5N1 bird flu virus so that it could move more easily through the air between ferrets (which respond to flu much as humans do). This made it potentially more hazardous to humans. When in 2011 the two groups submitted papers to Nature and Science explaining how they had modified the virus, the journals accepted the papers but refused to publish them until the United States’ National Science Advisory Board for Biosecurity (NSABB) could decide whether key details should be removed or “redacted” from the papers (and made available only to researchers with a clear need to know) in order to prevent terrorists from learning how to create a flu pandemic. Alarmed critics bloviated that the modified virus was an “Armageddon virus.”
The key details could be essential to identifying a more infectious version of the natural virus before it caused a pandemic, but the NSABB nevertheless recommended redaction. It soon reconsidered, though, and in 2012 the two papers—unredacted—were finally published. Later that year, the National Institutes of Health (NIH) proposed stringent reviews of whether similar research should receive government funding, or even be classified. Research resumed in 2013, but funding was put on hold from 2014 to 2019.
The DURC question remains live, even though researchers seem to have kept their heads down in recent years. At least, the dates on the examples below are a few years old.
But really, is it true that some research is too dangerous to permit or, if permitted, too dangerous to publish? An essential component of the scientific method is publication. Form a hypothesis, do the necessary experiments to determine whether you’re right, and then tell the world so others can check your work. If you work for a corporation, however, your reports might never be seen by anyone outside the company. If you work for the military or on defense contracts, your reports might be classified and, again, never seen by anyone without the appropriate security clearance. National security can come into play, as it did forty years ago when mathematicians working on cryptography ran into efforts at control by the National Security Agency. This is why spies exist.
Aside from proprietary and military contexts, is there any real reason why any knowledge, new or old, should be kept from the public eye? If you consider what one can do with very easily available knowledge, you might wonder. High school chemistry is enough to figure out how to build a bomb out of things found under the kitchen sink. High school biology can reveal how to make a whole prom ill by putting sewage in the punch or how to breed (by natural selection) antibiotic-resistant bacteria. A college microbiology text… A mycology (fungi, including deadly mushrooms) text… Well. And these days it’s all on the Internet, too.
A paper by Lawrence M. Wein and Yifan Liu, “Analyzing a Bioterror Attack on the Food Supply: The Case of Botulinum Toxin in Milk,” dealt with how terrorists might attack the U.S.’s milk supply (and therefore with how to safeguard it). It was scheduled for the May 30, 2005, issue of the Proceedings of the National Academy of Sciences until the Department of Health and Human Services asked the NAS not to publish the paper on the grounds that it provided “a road map for terrorists and publication is not in the interests of the United States.” The journal eventually published the paper anyway.
The rapid advances in genetics and DNA manipulation have provided even more examples. The July 2002 report that researchers had successfully assembled a polio virus from biochemicals and the virus’s gene map roused fears that if you can do it with polio, you can do it with smallpox. It also led directly to the formation of the NSABB. The October 2005 report that researchers had reassembled the deadly 1918 flu from synthesized subunits soon led to calls for researchers to censor their own work.
It is easy to say that all knowledge should be freely available. It is, after all, not what one knows, but what one does with what one knows, that creates problems. Unfortunately, the things that quack like a DURC tend to be the obvious things. Most of the problems that result from scientific and technological knowledge are ones that we do not see coming. The Green Revolution of the 1960s solved a looming hunger problem, but it permitted population to triple (so far) and thus recreate the original problem on a larger scale. The invention of the automobile gave us traffic jams, road rage, and urban sprawl. The invention of the transistor in 1947 led to computers, smartphones, video games, the Internet, and Artificial Intelligence, with issues of privacy, obsession, fake news, and unemployment, among others. The discovery of how to splice genes in the 1970s led to genetic engineering, GMO crops, manipulation of human embryos, and eventually transhumanism (all of which some insist are problems). And so on.
Most of us would say the benefits of scientific and technological progress outweigh the drawbacks. It is better to know more, and to do more, and to keep striving to know and do even more. We will stumble from time to time, but we will recover and move on.
But what about the bad actors, you say? The ones that DURC fans worry about? The terrorists and the deranged? Synthesizing a smallpox virus, or devising something even worse, requires advanced knowledge (even though some of the necessary equipment, such as DNA sequencers, is available on eBay). A foreign government is more likely to do it, and it has the personnel to work out the methods on its own (that it can be done is obvious; you just need the gene map, which is what DNA sequencing machines give you).
Making mischief from basic textbooks is both more likely and unstoppable. Those textbooks are everywhere. Rather than restricting research and publication, it might be better to create an agency whose mission is to imagine all the awful things that could result, deliberately or not, from available knowledge—textbooks, the Internet, research papers—and then to figure out how to cope, adapt, or fix the problems.
 The potential for accidental releases from labs was also a major concern; see Laurie Garrett, “The Bioterrorist Next Door,” Foreign Policy, December 15, 2011; Fred Guterl, “Waiting to Explode,” Scientific American, June 2012; and Tina Hesman Saey, “Designer Flu,” Science News, June 2, 2012.
 Jon Cohen, “Does Forewarned = Forearmed with Lab-Made Avian Influenza Strains?” Science, February 17, 2012.
 Masaki Imai, et al., “Experimental Adaptation of an Influenza H5 HA Confers Respiratory Droplet Transmission to a Reassortant H5 HA/H1N1 Virus in Ferrets” (http://www.nature.com/nature/journal/vaop/ncurrent/full/nature10831.html), and Sander Herfst, et al., “Airborne Transmission of Influenza A/H5N1 Virus Between Ferrets” (http://www.sciencemag.org/content/336/6088/1534.full).
 David Malakoff and Martin Enserink, “New U.S. Rules Increase Oversight of H5N1 Studies, Other Risky Science,” Science, March 1, 2013.
 Jocelyn Kaiser, “Controversial Flu Studies Can Resume, U.S. Panel Says,” Science, February 15, 2019.
 National Academies of Sciences, Engineering, and Medicine; Policy and Global Affairs; Committee on Science, Technology, and Law; Committee on Dual Use Research of Concern: Options for Future Management, Dual Use Research of Concern in the Life Sciences: Current Issues and Controversies (National Academies Press, 2017).
 There is also “sensitive but unclassified” or “controlled unclassified information,” which can mean restrictions and redactions of key details. See https://fas.org/sgp/cui/index.html.
 Gina Bari Kolata, “Cryptography: A New Clash Between Academic Freedom and National Security,” Science, August 29, 1980.
 Phillip A. Sharp, “1918 Flu and Responsible Science” [Editorial], Science, October 7, 2005.
 Yudhijit Bhattacharjee, “Should Academics Self-Censor Their Findings on Terrorism?” Science, May 19, 2006.