Cold War Human Radiation Experiments: A Legacy of Distrust
By Mark Goodman
The April 1995 APS Meeting in Washington DC marked two significant anniversaries in the history of ionizing radiation and health. A special session celebrated the 100th anniversary of Roentgen's discovery of x rays. Since this discovery, ionizing radiation and radioactive tracer materials have become ubiquitous tools in medical research, diagnosis, and treatment. Another session, which I organized, marked the 50th anniversary of the first use of nuclear energy for military purposes and delved into the darker history of Cold War human radiation research.
In December 1993, Energy Secretary Hazel O'Leary learned of a newspaper article by an Albuquerque reporter about people who had plutonium injected into their bodies to study the resulting risks. O'Leary was shocked, and called for an outside investigation of these and other experiments that had come to light. She persuaded President Clinton to establish the Advisory Committee on Human Radiation Experiments, to report on human radiation experiments performed by the Department of Energy and other agencies implicated in similar activities. This committee of experts in medical science, biomedical ethics and related fields released its final report in October.
The Advisory Committee's report has been generally well received, although some have expressed disappointment with its failure to condemn certain experiments and scientists. Reaching consensus on ethical judgments of past actions proved quite difficult given the limits of available information. But the committee was widely praised for the way it carried out its two other main tasks: providing a public accounting of the events of the past and making recommendations for the future based on lessons from these events.
I was not a member of this committee, but served on its staff. The staff was responsible for most of the historical research, and drafted findings and recommendations for consideration by the committee. My work focused on experiments involving the deliberate release of radioactive materials into the environment.
More than most scientists, biomedical researchers face ethical questions of what means are legitimate for gathering experimental data, particularly when the experiments involve human beings as subjects of research. Biomedical ethics involves two basic principles. First, researchers must weigh the anticipated benefits of an experiment against the anticipated risks. Second, people who are subjects of research must knowingly agree to take part in that research, a requirement known as informed consent.
These principles seem fairly obvious, and served as the basis for the Nuremberg Code that provided the standard for judging Nazi doctors accused of crimes against humanity. Still, they were not widely observed as the standard of practice until decades later. Even in the 1960s, the physician was vested with great authority and informed consent was often honored in the breach. But the Cold War history of human experimentation raises more specific and serious concerns about secrecy and whether national security interests overrode respect for basic human dignity.
One of the first challenges of the Manhattan Project was the safe handling of radioactive material. Radium was the most hazardous radioactive material known at the time. Women who painted radium watch dials suffered high rates of cancer and necrosis of the bone caused by the radium they ingested when they licked their brushes. Yet the entire pre-war stockpile of radium amounted to no more than 100 grams.
The Manhattan Project would produce tens of kilograms of plutonium and tons of radioactive waste. Like radium, plutonium is a strong emitter of alpha particles, but its biological hazard was unknown. Radiation safety therefore became an important part of the Manhattan Project, known by the code name of "health physics." The name stuck, and health physics is now a large and thriving specialty.
In the spring of 1945, urine samples suggested that those handling plutonium at Los Alamos were approaching the safety limits for plutonium within their bodies. These limits were based on animal studies, however, and health physicists therefore felt an urgent need for human experiments. Over the next few years, 18 patients at Oak Ridge and at the Universities of California, Chicago, and Rochester were injected with plutonium to determine how it was metabolized.
Certainly the patients gained nothing from these experiments, but neither did they suffer any immediate harm. Choosing patients with short life expectancies would have limited any long-term health risk. This probably explains why many subjects were misleadingly described as "hopelessly sick" or "terminal." In fact, several survived for decades with body burdens well in excess of current occupational safety guidelines. Still, the Advisory Committee found no evidence of clinical harm to any of them.
The Advisory Committee found these experiments to be clearly deficient on the basis of informed consent. In most cases, it appears that the patients did not know that they were experimental subjects. Even where they may have known they were subjects, they could not be informed of the nature of the experiment because plutonium was a secret material and radioactivity in general was a sensitive subject.
Perhaps the most powerful evidence that officials were concerned about the ethical problems with these experiments lies in their subsequent secrecy. Researchers wanted to publish an article on these studies in an official history of the Manhattan Project, but their draft paper was classified solely because it might expose the government to lawsuits. After that, the Atomic Energy Commission's insurance branch assumed a formal role in declassification, not to protect national security but to shield the AEC and its officials from legal liability.
Nowhere was this secrecy more pervasive than with the intentional release experiments that the Advisory Committee was asked to review. For example, a series of tests of prototype radiological weapons undertaken from 1949-1952 was not made public until 1994. The Army's Chemical Corps tested these mechanisms to disperse radioactive materials at the Dugway Proving Ground in Utah, but kept them secret from nearby residents. The main reason for this secrecy was to avoid public alarm, and there is no indication of a national security motive.
The most controversial of the intentional release experiments I studied was known as the Green Run, in which large quantities of radioactive gas were deliberately released from the Hanford plutonium separation plant. Conducted in December 1949, less than three months after the discovery of debris from the first Soviet nuclear test, Green Run tested equipment and techniques that could be used to monitor Soviet nuclear production. The Green Run remained secret until 1986, its intelligence purposes acknowledged only in 1993.
The Green Run released roughly 8,000 curies (a curie is 3.7 x 10^10 disintegrations per second, based on the radioactivity of a gram of radium) of iodine-131 into the atmosphere near Hanford. While dwarfed by the roughly 700,000 curies released during and soon after World War II, this was the largest one-day release ever from Hanford, more than 1,000 times the average daily emissions at the time. It was largely a matter of luck that the risk to nearby residents turned out to be small. Because the Green Run took place in the winter, dairy cows were not grazing on contaminated pastures, but this could not have been planned because the milk pathway was unknown at the time.
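For readers more accustomed to SI units, the figures above can be checked with a few lines of arithmetic. This is only an illustrative sketch using the curie definition and the release figures quoted in the text; the "implied daily emissions" value is simply what the stated thousandfold ratio would entail, not an independently documented number.

```python
# Convert the quoted activities from curies to becquerels
# (1 Ci = 3.7e10 disintegrations per second, per the definition above).
CI_TO_BQ = 3.7e10

green_run_ci = 8_000       # I-131 released in the Green Run (Dec. 1949)
wartime_ci = 700_000       # rough total released during and soon after WWII

green_run_bq = green_run_ci * CI_TO_BQ
print(f"Green Run release: {green_run_bq:.2e} Bq")  # about 3.0e14 Bq

# If the Green Run was "more than 1,000 times the average daily emissions,"
# routine daily releases at the time must have been under ~8 curies.
implied_daily_ci = green_run_ci / 1_000
print(f"Implied routine daily emissions: under {implied_daily_ci:.0f} Ci")
```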
Nevertheless, the Green Run raises questions that remain relevant today. What benefits were anticipated or actually resulted? How can the government find an appropriate balance between disparate benefits to national security and risks to public health? The available documents provide no indication that anyone ever weighed the risks and benefits of the Green Run and concluded it was worth the risk. Furthermore, there is no evidence of a lasting benefit to U.S. intelligence from the Green Run; information that proved useful for that purpose came from other sources. More disturbingly, it remains unclear whether environmental risks undertaken as part of classified programs receive adequate review even today.
The story uncovered by the Advisory Committee, though it deals mainly with medical research, raises more general questions about the ethical obligations of scientists to society. What limits should be imposed on the means scientists use to obtain new knowledge? How do we weigh the costs of research, in dollars and in human health, against the benefits of the knowledge to be gained?
These questions underlie today's debate on public funding for research. To scientists it seems obvious that the benefits of their endeavor outweigh the costs, but the citizens who support research through their taxes are not so sure. While we have all come to rely on the technological and medical fruits of modern science, most citizens show little understanding of or interest in science.
Beyond a mere lack of interest, the public has grown increasingly skeptical of scientific experts and government officials (often one and the same), viewing them as special interests. Reassurances about the safety of nuclear testing, which often proved to be misleading, fed this skepticism, as did the exaggeration of risks by other "experts." As a result, our society is remarkably averse even to very small risks.
Most scientists do not involve themselves in such controversies, yet we all bear some responsibility for our poor image. Many of us interpret our obligations as scientists too narrowly. We prefer to avoid thinking about the value of our work, and when we do, we tend to rely on shopworn platitudes about the benefits of scientific progress. To restore our credibility, we will have to take seriously both the need to explain what we are doing to those who pay for it, and the obligation to understand how our work affects people.
Mark Goodman is a physical science officer at the Arms Control and Disarmament Agency. In 1994-1995 he was a research analyst with the Advisory Committee on Human Radiation Experiments in Washington, DC. The views expressed are his own and do not necessarily represent those of the Advisory Committee. An earlier version of this article appeared in the July 1995 issue of Physics and Society, the Forum on Physics and Society newsletter.
©1995 - 2016, AMERICAN PHYSICAL SOCIETY
APS encourages the redistribution of the materials included in this newspaper provided that attribution to the source is noted and the materials are not truncated or changed.