A group of 4 letters continues the previous discussion on low-level radiation and nuclear power; the reader is reminded of the Jan. '97 issue of this journal, which was devoted to this important topic. As usual, all letters in Physics and Society may be edited or shortened by the Editor.

Principle of Collective Dose

In the January issue of Physics and Society, Richard Garwin claims that the views expressed by Bertram Wolfe are only "part of the story". He then goes on to apply the principle of collective dose to the public exposure that resulted from the Chernobyl accident. He implies that the majority of evidence and the scientific community consensus supports such an application.

I do not believe anything could be further from the truth. In 1995, the Health Physics Society (HPS) issued a position statement that included the following:

"In discussing the question of the limitations of extrapolation to estimate radiogenic risk in the millirem range, the National Academy of Sciences, in its 1990 BEIR V report noted '...the possibility that there may be no risks from exposures comparable to external natural background radiation cannot be ruled out. At such low doses and dose rates, it must be acknowledged that the lower limit of the range of uncertainty in the risk estimates extends to zero.' The Health Physics Society recommends that assessments of radiogenic health risks be limited to dose estimates near and above 10 rem. Below this level, only dose is credible and statements of associated risks are more speculative than credible." [1]

This statement, issued by a scientific organization dedicated exclusively to protecting people and the environment from radiation, and composed of more than 6,800 scientists, physicians, engineers, lawyers and other professionals representing academia, industry, government, national laboratories, trade unions and other organizations, does not appear to agree with Mr. Garwin's application of collective dose. He states that the average individual dose was 7 mSv (700 mrem), which is well below the cutoff level recommended by the HPS for making risk estimates.

Also, an extensive body of data exists that contradicts the use of the ICRP risk model, not only the single study in China referred to by Mr. Garwin. For a recent and brief review of those data, see Dr. Pollycove's article on the nonlinearity of radiation health effects [2], which reviews data from several studies that contradict the ICRP risk model Mr. Garwin has applied. A much more extensive review of the evidence contradicting this model can be found in the data document compiled by Radiation, Science, and Health, Inc. [3] This document references hundreds of studies that do not support the application of the ICRP model to the population exposure that resulted from the Chernobyl accident.

Finally, what is the harm in applying the linear ICRP risk model? Shouldn't nuclear power be able to stand up in a fair comparison with other energy sources? Absolutely, but the key word is "fair". Our regulatory policies are based on models similar to the ICRP risk model. They do not serve to protect the occupational worker or the public and may even endanger public health. [4-5] Monetary resources are scarce even for simple everyday medical care, while billions of dollars are spent to reduce perceived radiation hazards that are trivial. These billions of dollars increase the cost of producing electricity from nuclear power without any benefit to the public. Nuclear power is being forced to compete on an uneven playing field.

While nuclear power can be competitive in the current environment, as Mr. Garwin points out, why should it be forced to compete at a disadvantage when science simply does not support the basis of that competition?

[1] "Position Statement on Risk Assessment," Health Physics Society, April 1995.

[2] Pollycove, Myron, "Nonlinearity of Radiation Health Effects," Environmental Health Perspectives, Vol. 106 (suppl. 1), February 1998.

[3] "Low Level Radiation Health Effects," Radiation, Science, and Health, Inc., Edited by J. Muckerheide, March 19, 1998. http://cnts.wpi.edu/rsh/Data_Docs/index.cfm

[4] Rockwell, Theodore, "Our Radiation Protection Policy Is A Hazard To Public Health," The Scientist, Vol. 11(5), March 3, 1997.


[5] Rockwell, Theodore, "Discussions Of Nuclear Power Should Be Based In Reality," The Scientist, Vol 12(6), March 16, 1998.


Michael C. Baker, Ph.D., P.E., (505) 667-7334; Fax: (505) 665-8346

Radioassay and Nondestructive Testing Team, Environmental Science and Waste Technology Group Mail Stop J594

Los Alamos National Laboratory P.O. Box 1663, Los Alamos, NM 87545 mcbaker@lanl.

Defending the Linear No-Threshold Hypothesis

Michael Baker (see previous letter) responds to my letter in the January P&S in which I note that the International Commission on Radiological Protection (ICRP) and the Committee on the Biological Effects of Ionizing Radiation (BEIR) provide a best estimate of cancer deaths at the rate of 0.04 per person-sievert (one sievert = 100 rem = one gray for external gamma radiation). I said "There is little dispute over the collective exposure ... at 600,000 person-Sv ("p-Sv"). The cancer deaths are thus likely to be 24,000 ..." I did not say there was "scientific community consensus" (in the sense of unanimity), but I did quote the two organs that the community has set up to make these estimates. There is certainly great uncertainty in the coefficients, not excluding the value of zero asserted by Mr. Baker. But quoting a statement of "6800 scientists, physicians, engineers, lawyers and other professionals ..." does not add anything to our estimate.

Let's look at some of the evidence Mr. Baker cites (in Reference 3) attacking the "Linear No-Threshold Hypothesis" that underlies the use of "collective dose" with the recommended 0.04 cancer deaths per person-sievert. Ref. 3 states "... low-level radiation DNA damage is insignificant compared to normal oxidative DNA damage (0.3 cGy causes approximately six DNA damage events per cell, roughly the normal background radiation per year, compared to 240,000 per cell per day, or about 90 million per year, from normal oxidative DNA damage)."

John Graham(1) quotes the same 240,000 DNA events per day but goes on to say that the "Double-stranded breaks constitute 5% of the single-stranded breaks, so that with a background level of 240,000 breaks per cell per day, there are 10,000 to 12,000 double breaks." But that is incorrect. According to Maurice Tubiana(2) 4% of the breaks due to radiation are double-stranded, while only one in 15,000 is double-stranded in the case of the spontaneous damage. So Ref. 3 is telling us there are six DNA damage events per cell per year due to background radiation and 240,000/15,000 = 16 per day or 6000 per year from normal oxidative DNA damage.

Is the ICRP coefficient 0.04 cancer deaths per p-Sv inconsistent with these data? Assuming (without asserting) that double-stranded DNA damage is the cause of all cancer, the 6,000 double-stranded DNA breaks per year, accumulated for some 40 years, would account for the eventual 20% of the people who die from cancer in every developed society. It is generally believed that it requires on the average some 5-8 unrepaired damages to the DNA in a cell to permit that cell to escape from the normal strict regulation of growth and to be a source of cancer; the abnormal cell must also escape "apoptosis," cell death imposed by internal monitors. Using Tubiana's "4% of the DNA damage events from radiation are double-stranded" and Ref. 3's "six DNA damage events per cell per 0.3 cGy", we find that some 80 double-stranded breaks are caused by one Gy (or one Sv for gamma radiation).
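The break-rate arithmetic above can be verified with a quick sanity check, using only the figures quoted in the letter (six DNA damage events per cell per 0.3 cGy from Ref. 3, and Tubiana's 4% double-stranded fraction for radiation damage):

```python
# Sanity check on the double-stranded-break arithmetic in the letter.
# Assumed figures, all taken from the text: 6 DNA damage events per cell
# per 0.3 cGy, and 4% of radiation-induced breaks being double-stranded.

events_per_gray = 6 / 0.3 * 100   # 1 Gy = 100 cGy -> 2000 events per cell per Gy
ds_fraction_radiation = 0.04      # Tubiana: 4% double-stranded
ds_breaks_per_sievert = events_per_gray * ds_fraction_radiation

print(ds_breaks_per_sievert)      # -> 80.0, matching "some 80" in the text
```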

If we assume that the 80 double-stranded breaks due to 1 Sv are simply additive and of the same kind as the 6000 per year due to natural (non-radiation) events, we must consider that the 1 Sv effect is added to some 40 years of accumulated spontaneous damage--to some 240,000 spontaneous double-stranded breaks. But the increase in cancer rate would not be 80/240,000 = 1/3000, because the increased rate could apply to any one of the (let's say eight) steps required to transform a cell to be cancerous. A Taylor-series expansion(3) shows, in fact, that if there are n steps, then increasing the rate of each by a factor (1 + e) increases the effect by a factor (1 + n e), so that 1 Sv of radiation would increase the cancer rate not by 1/3000 but by about 1/400.
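A small numerical sketch of this multistage argument, assuming (as the letter does) eight steps and a relative perturbation of 80 breaks against 240,000 accumulated spontaneous ones:

```python
# Multistage (n-step) carcinogenesis: if each of n steps has its rate
# multiplied by (1 + e), the overall rate scales as (1 + e)**n ~ 1 + n*e
# for small e.  The values of n and e are those assumed in the letter.
n = 8                                  # assumed number of steps to cancer
e = 80 / 240_000                       # 1 Sv of breaks vs ~40 years' spontaneous ones

naive_increase = e                     # 1/3000, if only one step were affected
multistage_increase = (1 + e)**n - 1   # ~ n*e, i.e. roughly 8/3000

print(round(1 / multistage_increase))  # -> 375, i.e. roughly the 1/400 of the text
```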

This is hardly a quantitative justification of the ICRP coefficient, which would predict 0.04/0.20 = 1/5 rather than 1/400; the ICRP coefficient would require radiation induced double-stranded breaks to be 100 times as effective as spontaneous breaking in leading to cancer.

Baker's Ref. 3 ridicules the linear no-threshold hypothesis as "... equivalent to predicting that: if 5 persons die in each group of 10 persons given 100 aspirins each, giving one aspirin each to 1000 persons will result in five deaths." In fact, if one out of 100 sugar pills carries a lethal dose of poison, then if ten persons are given 100 sugar pills each, six or seven will die, and giving one sugar pill each to 1000 persons will result in ten deaths--a nearly linear relation. The question is not one of arithmetic but of understanding which model is appropriate.
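The sugar-pill arithmetic can be checked with a simple binomial model (assuming each pill is independently lethal with probability 1/100); the expectation for the concentrated group comes out between six and seven of the ten:

```python
# Expected deaths if 1 in 100 sugar pills carries a lethal dose.
p_lethal = 1 / 100

# Ten people take 100 pills each: each dies if at least one pill is lethal.
p_die_100 = 1 - (1 - p_lethal) ** 100
deaths_group = 10 * p_die_100        # ~6.3 of the ten

# One pill each to 1000 people: the expectation is simply 1000 * p.
deaths_spread = 1000 * p_lethal      # exactly 10.0

print(round(deaths_group, 1), deaths_spread)   # prints: 6.3 10.0
```

Spreading the pills out changes who is at risk but hardly changes the expected toll, which is the near-linearity the letter describes.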

Another major data point is also presented in Ref. 3, but, contrary to the implication, it does not contradict the ICRP coefficient. In brief, some 65,000 people were studied in a High Background Radiation Area in China, in which external radiation exceeded that in the control area (25,000 people) by 1.77 mGy/yr. The data in Table 3 of Ref. 3 show "all-cancer" rates equal in the two groups with 90% confidence limits of 0.86-1.15 for the risk ratio between the HBRA and the control area. What would the ICRP coefficient of 0.04 cancer deaths per p-Sv predict? Forty years of exposure would correspond to about 70 mSv which corresponds to an individual cancer risk of 0.28%. Compared with the normal cancer incidence of 20%, this would be an increase in cancer risk by 0.28/20 = 1.4%. Since the 90% confidence interval is +/- 15%, the experiment really has no power to detect predicted cancer augmentation to 1.014 times that in the control area; in no way does this epidemiological study contradict the ICRP estimate.
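The prediction for the Chinese study follows from the figures quoted in the paragraph; a minimal check, assuming 40 years of exposure at the quoted excess dose rate and a 20% baseline cancer risk:

```python
# ICRP prediction for the Chinese high-background-radiation study,
# using only figures quoted in the letter.
icrp_coeff = 0.04            # cancer deaths per person-sievert
extra_dose_rate = 1.77e-3    # Sv per year above the control area (1.77 mGy/yr)
years = 40
baseline_risk = 0.20         # normal lifetime cancer incidence

extra_dose = extra_dose_rate * years        # ~0.071 Sv, the "about 70 mSv"
extra_risk = icrp_coeff * extra_dose        # ~0.0028, the 0.28% of the text
relative_increase = extra_risk / baseline_risk   # ~0.014, i.e. a risk ratio of 1.014

print(round(extra_dose, 3), round(relative_increase, 3))   # prints: 0.071 0.014
```

A predicted risk ratio of 1.014 is invisible inside a 90% confidence interval of 0.86-1.15, which is the letter's point about the study's statistical power.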

One can perfectly well understand the frustration of those dealing every day with radiation when confronted with statements or even regulations that demonize the slightest dose. And one can likewise sympathize with those who are incensed with a portrayal of the spontaneous rate of double-strand lesions as 10,000 per day, when it is almost a thousand times less.

Combined with a 20% natural cancer incidence rate, Taylor's expansion tells us that there is sure to be a linear, no-threshold rate of induction of cancer by radiation; the coefficient might be negative.

I agree with Mr. Baker that it makes no sense to have a criterion of "as low as reasonably achievable" for radiation exposure risk. If we take the ICRP coefficient as 0.04 cancer deaths per Sv, and the value of a life lost (or saved) as $1 million, then it is worth $40,000 to avoid an exposure to the public of one p-Sv. According to the 1993 report of the United Nations Scientific Committee on the Effects of Atomic Radiation(4), one gigawatt-year (GWy) of electrical energy produced from coal contributed some 7 p-Sv of exposure, while the operation of a typical nuclear plant contributed some 1.8 p-Sv. The energy produced sells for some $400 million, so the damage to society from reactor operation of $72,000 is tiny in comparison. UNSCEAR data show that the reprocessing of fuel from the same 1 GWy, as it was done commercially in France, provided a further global exposure of 1250 p-Sv, 99% of which comes from carbon-14. In addition, mining and milling of ore contributes, on the average, 150 p-Sv for the once-through nuclear fuel cycle, while the 20% economy in uranium from reprocessing and recycle reduces the mining and milling component to 120 p-Sv per GWy. The British Nuclear Fuels Limited operation at Sellafield now captures C-14, reducing the exposure from reprocessing; and modern mining and milling can reduce that component by a factor 100 or more. The ICRP dose-response coefficient can play an important role in allocating resources, despite the uncertainty in its magnitude.
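The dollar figures in this paragraph all follow from one conversion; a rough bookkeeping sketch, using only the assumed values quoted in the letter ($1 million per life, UNSCEAR 1993 collective doses per GWy):

```python
# Rough cost-benefit bookkeeping from the letter's figures.
value_per_psv = 0.04 * 1_000_000   # $40,000 per person-sievert averted

# Collective doses per gigawatt-year of electricity (UNSCEAR 1993, as quoted).
doses_psv_per_gwy = {
    "coal": 7,
    "reactor operation": 1.8,
    "reprocessing (France, mostly C-14)": 1250,
    "mining and milling (once-through)": 150,
}

revenue_per_gwy = 400_000_000      # ~$400 million of electricity sold per GWy

for source, dose in doses_psv_per_gwy.items():
    cost = dose * value_per_psv
    print(f"{source}: ${cost:,.0f} ({cost / revenue_per_gwy:.2%} of revenue)")
```

Reactor operation comes out at the letter's $72,000 (0.02% of revenue), while reprocessing as practiced comes out at $50 million per GWy, which is why capturing the C-14 matters.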

Richard L. Garwin, IBM Fellow Emeritus

Thomas J. Watson Research Center

P.O. Box 218, Yorktown Heights, NY 10598-0218

(914)945-2555; FAX (914)945-4419



1. Graham, John, "The Benefits of Low Level Radiation," a speech to the Uranium Institute Annual Symposium, London, 1996.

2. Tubiana, Maurice, Radioprotection, 1996.

3. Crump, K.S., Hoel, D.G., Langley, C.H., and Peto, R. (1976), "Fundamental Carcinogenic Processes and Their Implications for Low Dose Risk Assessment," Cancer Research, 36:2973-2979; quoted in Wilson, R. (1997), "Low-Dose Linearity: An Introduction," Physics and Society, 26, January 1997.

4. UNSCEAR, 1993, Table 53.

Both Wolfe and Garwin are Right, but Incomplete

The argument between Dr. Wolfe and Dr. Garwin - each of whom obtained his Ph.D. in experimental nuclear or particle physics - is interesting because both are right in some respects and incomplete in others. Dr. Wolfe is correct: the deaths attributable to the Chernobyl accident are only about 40. But he is incomplete and, to this extent, misleading because, as Garwin points out, the calculated number of deaths worldwide may be 20,000. It is important to say "may be" because the calculation is based on a linear no-threshold theory, and most of the deaths (if they come) will come from very low exposure levels that are indistinguishable from the variable background. Neither scientist made the important point that the general arguments for a linear no-threshold theory apply to a vast number of other situations in which society conventionally ignores the effects of low exposures. Air pollution is the best-known example, but only one of many. Many scientists believe that air pollution is causing the (delayed) deaths of tens of thousands of people in the USA every year.

Unless this or a similar comparison is made (a comparison Dr. Garwin did not make here, and has not made on many other occasions), the discussion of the large number of calculated deaths from Chernobyl can be highly misleading.

Richard Wilson

Mallinckrodt Professor of Physics

Harvard University, Cambridge, MA 02138

(617) 495-3387; fax: (617) 495-0416; home: (617) 332-4823





Wolfe on Garwin and the LNT

Richard Garwin, in his letter (P&S January 1999), attacks one part of my October letter in which I raise the point that low-level radiation, like low-level sunlight, may be healthy. Garwin quotes the International Commission on Radiological Protection (ICRP) and the National Academy of Sciences Committee on the Biological Effects of Ionizing Radiation (BEIR), which adopt the linear no-threshold (LNT) theory of radiation.

The LNT says, in effect, that if gulping down thirty ounces of liquor will kill a person, then if thirty people each drink an ounce, one of them will die. The linear theory was adopted at the start of the nuclear industry as a conservative means of protecting the public at a time when the effects of low-level radiation were not known or understood. It is still in effect. But the question is whether it is doing more harm than good.

Because the effects are so small, it is hard to develop a clear measure of low-level radiation effects. But despite Garwin's arguments, the data we are now collecting[1] seem to indicate that at worst there is no effect below about 300 mSv, and at best it is healthy. Dr. Bernard Cohen of the University of Pittsburgh[2] found that people living in areas with high levels of radioactive radon develop less cancer than those in low-level areas. Dr. Sohei Kondo of Japan[3] finds that those Japanese who received low levels of radiation from the atomic bombs live longer than those who received no radiation. And as mentioned in my letter, those living in high-background-radiation areas, like Denver, live longer than those in low-radiation areas.[4] Despite Garwin's arguments about Chernobyl, there are still no measurable added death rates in the Soviet public.

There is much additional data that seems to support the positive effects of low-level radiation. But the effects are small, so the statistics are hard to verify.

One should understand that the linear theory, if wrong, can kill people. Even if one believes in the linear theory, it can do harm if it is not properly explained to the public. The European Chernobyl abortions are one example: should one have aborted when the extra radiation was less than the normal variations in nature's background radiation? And should we be spending many billions of dollars, and risking lives in cleanup activities and transportation, to reduce radiation at military sites (like Hanford, WA) below the normal background variations? Is the public being helped by being frightened about life-saving radiation sterilization of food?

Garwin and I apparently agree on the public benefits of nuclear energy. My concern is that those knowledgeable about radiation have not properly educated the public, and that extremists opposed to nuclear energy hurt the public by frightening them with distorted views about low-level radiation.

1. Low Level Radiation Effects: Compiling the Data (1998). Prepared by Radiation, Science, and Health, Inc. Editor: James Muckerheide.

2. Cohen, B.L. (1995), "Test of the linear-no-threshold theory of radiation carcinogenesis for inhaled radon decay products," Health Physics, 68, pp. 157-174.

3. Kondo, S. (1993), Health Effects of Low-Level Radiation. Kinki University Press, Osaka; and Medical Physics Publishing Co., Madison, WI.

4. Yalow, R.S. (1994), "Concerns with low level ionizing radiation," Mayo Clinic Proc., Vol. 69, pp. 436-440; ANS Transactions, Vol. 71.

Dr. Bertram Wolfe

Vice President of GE, Manager of its Nuclear Energy Division (Retired).

15453 Via Vaquero, Monte Sereno, CA 95030

Phone and Fax: 408 395 9039

Email: Bert.Wolfe@gene.GE.com

Is Science Bad for the Poor?

I found Caroline Herzenberg's article on planning for the future of American science very interesting. However, in commenting on attitudes toward science, even she fell prey to a common misconception. She wrote that "it (science) has been instrumental in the development of a civilian technology that systematically widens the gulf between the rich and the poor."

In the long term (centuries) this is manifestly untrue. In Western countries, the vast majority of the population now has adequate food, clothing, clean water, and shelter, necessities that were not universally available in the 18th and 19th centuries. Life expectancy has risen dramatically. Transport speeds have increased from 5 to 50 or 500 mph for everyone, not just the rich.

Even in the last fifty years, technology has not "widened the gulf." It might appear that way due to the nature of news: expensive new technologies are newsworthy, but cheaper existing technologies are not. The first televisions, antibiotics, lasers, personal computers, ... were not for the poor. In addition to the necessities, the vast majority of the population now has a telephone, refrigerator, television, and automobile, as well as access to vaccines and emergency medical care. On an absolute scale, the physical improvement of the poor has been phenomenal.

Only if you choose your time period carefully, define "rich" and "poor" restrictively, and define "gulf" as "the ratio of incomes" or "the relative availability of luxuries" can you perhaps say that the "gulf" has widened. It is not reasonable, but all too human, to focus on a small relative change in the status of two groups over a time when the absolute wealth of both has increased extraordinarily.

Science clearly does have a perception problem. It stems from several sources: 1) people do not make absolute comparisons, only relative ones; 2) humans have a short memory: today's consumer good is disconnected from yesterday's scientific advance; and 3) science has been so successful within its own realm that our failure to be equally successful in the realm of societal problems causes disappointment and even hostility.

To partially overcome these perception problems, we need to remind people (and ourselves) of the tremendous differences scientific and technological advances have made in their own lives.

Lawrence Weinstein

Associate Professor of Physics

Old Dominion University

Norfolk, VA 23529

757 683 5803




Should Physicists Dismiss Speculation?

Two letters (R. Riehemann, July 98, p. 2; V. Raman, October 98, p. 2) have questioned the value of drawing philosophical or religious lessons from modern science. Riehemann is "personally acquainted with persons who have been seriously misled by such books," and wonders whether "these books help or harm the public...." Raman claims that "writers...created an altogether new genre of scientific knowledge which consists largely of poetic and picturesque world views, dubiously related to hard-core science.... This has become fertile ground for unbridled imagination, mystical interpretations, and theological extrapolations...."

I agree that there is plenty of published nonsense connecting modern physics to broader notions. But before we physicists dismiss all such speculations, we should note that we ourselves have for centuries indulged in such speculations.

The "mechanical universe" has been part and parcel of physics since Newton. Although today we no longer accept Newton's version of it, we still use the term to describe classical physics. And many of us contribute, wittingly or unwittingly, to the perception among the general public that the universe described by science is in fact precisely this impersonal, automatic, mechanical universe, allowing little room for such commonly-believed notions as free will or ultimate purpose.

The mechanical universe, whether the one that Newton and Descartes puzzled over centuries ago, or a more modern version, is in fact a grand philosophical scheme with immediate religious implications, of just the sort criticized by those who are nervous about broad interpretations of modern physics. For one famous example, it represents a grand extrapolation of classical physics, far beyond its then-known range of validity, to conclude, with Laplace, that "an intelligence which at a given instant knew all the forces acting in nature and the positions of every object in the universe...could describe with a single formula the motions of the largest astronomical bodies and those of the lightest atoms. To such an intelligence, nothing would be uncertain; the future, like the past, would be an open book." Talk about sweeping generalizations!

Much of the philosophical speculation from modern physics is less far-fetched than the mechanical universe. For example, consider the notion that every particle in the universe is "entangled" with every other particle, and that every detection-type event (of the sort that causes "collapse of the wave packet") therefore causes a simultaneous quantum jump of every particle in the universe. That is, every eye-shift as you read this page causes a subtle instantaneous shiver throughout every particle in the universe. This seemingly outrageous notion is a straightforward implication of quantum theory coupled with the rather broadly accepted notion that the universe originated in a single quantum event.

We physicists are awfully quick to criticize broad implications suggested by modern physics, but we have failed, for more than three centuries now, to seriously critique the mechanical, predetermined universe that is suggested by Newtonian physics. This is inconsistent, to say the least, and should cause us to adopt a rather broad-minded and humble view of the philosophizing of others.

Art Hobson

University of Arkansas
