Revelations that two international teams of researchers (in the US and the Netherlands) have mutated avian influenza (H5N1) to be both lethal and highly contagious have led to a difficult decision for scientific journals: to publish or not to publish.
I was involved in a thoughtful and considered look at this question here in Canada a few years ago. Under the auspices of Defence Research and Development Canada and Health Canada, a two-day workshop was held with numerous stakeholders to look at the issue of legitimate biological research that could pose a risk to biosecurity if it fell into the wrong hands, or “dual-use research of concern” as it is being called.
Following the workshop, I formed a committee of editors of Canadian microbiology and ethics journals to consider a Canadian response. In this regard, I would like to mention in particular Dr. Harry Deneer, a microbiology professor at the University of Saskatchewan and a co-editor of the Canadian Journal of Microbiology, who wrote a report and recommendations on this.
With the current H5N1 research controversy, the US National Science Advisory Board for Biosecurity (NSABB) is asking journals not to publish the research in its entirety because of its potential for bioterrorism. Instead, it is suggesting leaving out key details that would allow others to construct the mutated virus; these details could be available to legitimate researchers from the authors.
This is not the first time this issue has been raised with journals. The NSABB has approached major journals in the past, and most have published statements that they would take a reasoned, responsible approach to the peer review and publication of such reports. The US Department of Health and Human Services asked Proceedings of the National Academy of Sciences (PNAS) to pull a paper it was planning to publish in 2005 because of concerns about its potential use. There was also concern when the full genome of the highly lethal and contagious Spanish influenza virus was published some years ago.
There are several issues wrapped up in this controversy.
First, why do such research at all? Complex research requiring university laboratories must be approved by funding agencies that have taken a hard look at it. University administrations and ethical review boards may also be involved. Therefore, for such research to proceed, it must pass a variety of checks and approvals.
The reason the research is approved is that it’s valuable. Understanding Spanish influenza at a genetic level, for example, helps us understand what makes a pathogen contagious and lethal. Therefore, we can better foresee outbreaks, prevent transmission, develop vaccinations, and treat infections.
Next, why publish it?
Publication and openness are at the very core of scientific research. While there is some defence research that is done in a secure setting and not released, the basis of the academy and the scientific endeavour is transparency. Violations of these fundamental tenets must be justified and well-thought-out. Also, research is validated by reproducing it. For other teams to reproduce research, they must have open access to the full methods and results. There is also a practical reason to publish: other research groups may be able to take the data further and produce useful results.
The publication of the Spanish influenza virus was vindicated by important findings made by other research groups in the wake of publication.
The counter-argument is that such information would allow bioterrorists to create an agent for biological warfare. How serious a threat is this?
At the workshop I attended, I spoke with a researcher who had visited the former Soviet laboratory Vector, once it had been converted to benign use. While the Soviet regime never admitted to conducting research into biological agents, journals in the library at Vector were dog-eared and well-thumbed at articles about lethal pathogens, the researcher told me. It was clear what their interests were.
Scientists have told me that the main concern here is state-sponsored terrorism. The average freelance terrorist does not have the means or education to conduct the delicate, expensive experiments needed to engineer viruses or bacteria. A large, well-funded lab is needed, of the kind run by governments. Well-educated scientists and access to genetic material are also required. Concern that they may be training future bioterrorists led the US to deny entry to scientists and science students from several suspect countries. Canada has no such policy, and professors have told me that education in a Canadian setting helps budding scientists from these other countries to learn positive, ethical attitudes to research.
There are currently rogue states (run by sociopaths) that have large laboratories and foreign-trained scientists. So the risk of misuse of legitimate research exists.
So how to decide whether to publish solid, important research that runs this risk?
All journal editors and publishers I have talked with agree that the decision must remain the autonomous, independent decision of the editors, as it is for all publishing decisions. Editors should not be overruled by publishers (unless there is a problem with the editor’s competence) or dictated to by government organizations.
But they also agree it is incumbent upon editors in fields where the dual-use issue may arise to take a responsible stand on this. The American Society for Microbiology, for example, has a policy on this for its many journals. Submitted papers with a potential "dual-use" aspect are escalated and subjected to a special review for security risk. Several other publishers have issued statements and put procedures in place to identify and evaluate these papers specially.
Once reviewed, the journal editors decide whether to publish the research in full, in part, or not at all. Most would publish an editorial accompanying such a paper to indicate their reasons for publishing, or for suppressing part of the paper.
There are many factors influencing the editors’ final decision.
One of my former colleagues, an editor of an engineering journal, described a paper submitted to his journal that detailed how to blow up a bridge. He passed this through a quick ethical filter: Does the research have any legitimate use? No. Does it pose a high risk of misuse? Indeed. He told the author, essentially, “Are you kidding?” The author argued that understanding how to blow up a bridge would help engineers build more terrorism-resistant bridges, but the editor thought that argument specious. The author threatened to publish the paper elsewhere, but chances are he got the same reception at other journals.
"Dual-use" research is rarely so clear-cut. The benefit to humanity may outweigh the risks.
These are difficult, Solomonic decisions. Since we cannot foresee the outcome, editors must make decisions, observe the consequences, and learn. I don’t envy the editors who receive the papers about the H5N1 research, but I wish them well.