
The responsibility of research: the dual use dilemma

Editor’s Note: Anna Perman and David Robertson contributed equally to this article.

“Do you worry that some of the research you do might be exploited in a way that harms people?”

Adam pauses. He leans forward, hands clasped. We’ve been embedded in Imperial College’s Blast biomechanics lab, of which Dr. Adam Hill is a key part, for about a month. It’s the first time we’ve seen a crack in his composed, assured demeanour.

“I do worry that understanding the vulnerabilities in things, that might…” He struggles for words for a few moments. We are sitting in a Westminster cafe looking out over the Thames, on a sunny day, which is strangely incongruous with the weighty topic at hand.

The Blast lab experiments on real human body parts, subjecting them to severe stresses. And what they’re doing could determine whether a person lives or dies. Every so often though, that reality intrudes on them.

For Adam, this is a reason to pause, but not a reason to stop. Scientists frequently stand on the boundary between what we do and don’t know. Crossing that boundary is like opening Pandora’s box – there’s a risk of letting out something terrible. And therein lies the dual use dilemma. Greater knowledge about the world around us is not without its risks; people can abuse that knowledge and use it to inflict suffering on others, or use it unethically to put themselves in a position of power.

Scientists are entrusted as the best, and sometimes only, experts capable of assessing the risks that newly discovered information brings. But they can’t always control what their research will be used for. Consider the case of Arthur Galston, who discovered the key chemical in the defoliant Agent Orange in the 1940s. He had no idea that a byproduct of the manufacture of this chemical – dioxin – would devastate the lives of millions after Agent Orange’s mass deployment during the Vietnam War. Learning that his research had indirectly caused such harm led him to say in a 2003 interview that “The only recourse for a scientist…is to remain involved with [research] to the end”.

Sometimes the researchers who make the discoveries do not adhere to the ethical and moral guidelines laid out by our society and institutions. More recently, the threat of bioterrorism in the wake of the 9/11 attacks has refocused the spotlight on dual use. High-profile investigations into the 2001 anthrax mailings in America focused on several government scientists; the prime suspect was vaccine researcher Bruce Ivins, who later committed suicide. Although it is thought that the genetic evidence against him was overstated, the episode highlighted the potential for scientists to abuse their technical capabilities, as well as the need for extra scrutiny of their work.

While it may help, the peer-review process cannot be relied upon to single out dual use research before it is made publicly available. This was illustrated in 2005, when The Proceedings of the National Academy of Sciences (PNAS) received a controversial paper modelling a potential terror attack on the American milk supply. Despite warnings from several US government officials responsible for health and homeland security, PNAS published the paper.

PNAS was criticised for its decision to publish, though it defended itself by arguing that the paper did not give anyone specific enough information to actually carry out an attack. The episode led people to question how effective guidelines for journal editors really are at filtering sensitive information before it is made public, or indeed whether such filtering is their role at all. To better understand the difficulties of research in such a field, we spoke with an expert in modelling disease outbreaks, Professor Neil Ferguson, of Imperial College London.

He told us that to commit a simple attack, familiarity with the highest level of specialist knowledge isn’t necessary. “Most conclusions of research about the highest-risk release events in bioterrorism tend to state the blindingly obvious,” he said. “Intuition tells you what you would need to do.” For rogue individuals or small groups planning a malicious attack, scientific discussion of the error magnitude of lethal doses of a toxin is unlikely to really affect their plans. The only actors capable of truly exploiting the dual use potential of bioengineering research are states, and they are already subject to measures which discourage them from doing so.

And what of the scientists themselves, who are arguably the best placed to exploit this information? “There are a limited number of groups capable of actually doing useful research in the field of disease outbreak, and they’re populated by sensible scientists,” Professor Ferguson explained.

But some ethicists, such as Professor Malcolm Dando, have argued that the ethical guidelines for researchers don’t give them nearly enough guidance on this issue. Containment of materials is usually the focus, with far less attention paid to the end uses of research findings.

This raises a crucial distinction in dual use: information, which could be used to plan a malicious action, as opposed to actual harmful materials. That’s the key difference between the anthrax attacks and the PNAS controversy. The former involved misuse of restricted-access materials by a scientific specialist, while the latter focused on the broader issue of open publishing and access to information.

“We’ve got this line we can’t step over, and that imposes our boundaries,” Adam says. “All the rules are unwritten. And there are a lot.”

For Adam, the intended application of Blast’s research is always known – it is to make life safer for soldiers. What separates Adam from most other scientists is that when he is working on a question which could have dual-use implications, the Ministry of Defence (MoD) have a hand in the details of what he can study and publish. In this sense, he says, “a basic scientist might be able to fly a little below the radar.”

In the information age, knowledge is more accessible than ever before. Scientists need to consider the dual use implications of the knowledge they generate, and the progression of this knowledge should be as transparent as possible for the public. In the case of Blast’s research, more information, used well, can protect soldiers. But science moves forward stage by stage. And to address a weakness, you have to first expose it.

Image: “Moving into position” by The U.S. Army, via Flickr (CC-BY): http://www.flickr.com/photos/soldiersmediacenter/5852701586/. Also featured on Mother Jones: “We’re still at war”.

To truly understand the dilemmas Adam faces, imagine that you are a soldier in the field. It might give you comfort to know that research is being done, by dedicated scientists such as those in the Blast lab, into how best to keep you safe. But if you were to learn that a piece of kit you use day in, day out, is not as safe as it should be, how would you feel? What if research done in your own country shed light on the most damaging types of explosive against your vehicle, and that research was accessible to your enemies?

Back in the cafe in Westminster, Adam runs his hands over his cropped hair, and straightens up. With his usual efficiency and brevity, he sums up the dilemma that he and all scientists face.

“Knowledge is power – it might be best to know. It’s just a question of how many people you want to know.”

Discussion
  1. Hopefully, such research will do more to help us than hurt us and will be carefully disseminated. Our enemies are determined to gain such information, whether we do or not.

  2. No Vince, it’s not a new problem. If you look hard enough, almost every technological breakthrough has been used nefariously at some point or another. However, it’s not a dimension regularly discussed by, or of, scientists, until something goes wrong – as in the cases of Agent Orange and anthrax.
