Volume 23, Number 1 January, 1994


Symposium on Perception of Risk and the Future of Nuclear Power: Confronting the Crisis of Public Confidence in Science and Technology

Physics and Society presents here papers based on the four talks given at the invited session on Risk and Nuclear Power, held at the APS meeting in Washington, DC, on 14 April 1993.

Perception of Risk and the Future of Nuclear Power

Paul Slovic

Scientists and policy makers were slow to recognize the importance of public attitudes and perceptions in shaping the fate of nuclear power. In 1976, Alvin Weinberg observed:

"As I compare the issues we perceived during the infancy of nuclear energy with those that have emerged during its maturity, the public perception and acceptance of nuclear energy appears to be the question that we missed rather badly..... This issue has emerged as the most critical question concerning the future of nuclear energy (1)."

Seventeen years later, the problem of public acceptance is even more critical. Either the problem is damn tough or we haven't been working hard enough to solve it (I suspect that both assertions are true). Public support for nuclear power has declined for a decade and a half, driven by a number of powerful forces and events. In March 1979, the movie The China Syndrome premiered, dramatizing the worst-case predictions of the earliest risk-assessment studies. Two weeks later, Three Mile Island (TMI) made the movie appear prophetic. Succeeding years have brought us Chernobyl and other major technological disasters, most notably Bhopal and the Challenger accident. The public has drawn a common message from these accidents--that nuclear (and other) complex technology is unsafe, that expertise is inadequate, and that government and industry cannot be trusted to manage nuclear power safely. These dramatic accidents and the distrust they have spawned have been reinforced by numerous chronic problems involving radiation, such as the discovery of significant radon concentrations in many homes, the continuing battles over the siting of facilities to store or dispose of nuclear wastes, and the disclosures of serious environmental contamination emanating from nuclear weapons facilities at Hanford, Fernald, Rocky Flats, and Savannah River.

Psychometric studies of risk perception

The nature and determinants of public attitudes and perceptions regarding nuclear power have been the focus of considerable research. The psychometric approach to studying risk perception (2) assumes that hazards can be characterized in terms of numerous characteristics or dimensions, analogous to the personality traits that characterize people. Nuclear power has a special distinction in the perception literature--it is, to date, the technological hazard with the most negative and most problematic constellation of traits. It stands apart in having qualities that make it fearsome and hard to manage socially and politically.

The mapping of nuclear power's "personality" began in the mid-1970s with a series of psychometric studies designed to determine why people were very concerned about some hazards and not others, and why these concerns often differed from experts' assessments of risk. An early study assessed perceived risk of death (for the US as a whole) from 30 activities and technologies. Three groups of lay people and a small group of risk assessment professionals took part in the study. The results demonstrated great concern regarding nuclear power (it had the highest perceived risk for two of the lay groups) and great disparity between the perceptions of lay people and the perceptions of experts (who placed nuclear power 20th from the top of the list of 30 hazards).

Some have argued that public concern about nuclear power reflects a concern about radiation risk in general, but the groups of lay people in this study rated another radiation technology, medical x-rays, rather low in risk (17th-24th), while the experts rated it relatively high (7th). Apparently it was not radiation per se that concerned these people, but radiation as one is exposed to it through the technology of nuclear power.

In an attempt to go beneath the surface of these global judgments, respondents were also asked to rate nuclear power, x-rays, and other hazards on a number of dimensions or attributes presumed relevant to perception and acceptance of risk. These ratings showed that nuclear power had the dubious distinction of scoring at or near the extreme negative end for most of the characteristics. Its risks were seen as involuntary, unknown to those exposed or to science, uncontrollable, unfamiliar, catastrophic, severe (fatal), and dreaded. Medical x-rays, in contrast, had a much more benign profile. Nuclear power's perceived benefits were also assessed and were found to be extremely low. These results have since been replicated with many different populations in numerous countries.

Perceptions of risk associated with nuclear waste and its management are similarly negative (3). When asked to state whatever images or associations came to mind when they heard the words "underground nuclear waste storage facility," a representative sample of Americans could hardly think of anything that was not frightening or problematic. The disposal of nuclear wastes is a technology that experts believe can be managed safely and effectively. The discrepancy between this view and the perceptions of the public is indeed startling.

The perception of nuclear power as a catastrophic technology was studied in depth by Slovic, Lichtenstein, and Fischhoff (4). They found that, even before the TMI accident, people expected nuclear-power accidents to lead to disasters of immense proportions. When asked to describe the consequences of a "typical reactor accident," people's scenarios resembled scenarios of the aftermath of nuclear war. Replication of these studies after the accident found even more extreme "images of disaster." The fact that the earliest technical risk assessments for nuclear power plants portrayed worst-case scenarios of tens of thousands of deaths and devastation over geographic areas the size of Pennsylvania likely contributed to such extreme images. These early projections received enormous publicity, as in the movie The China Syndrome.

Origins of nuclear fears

The origins of fears about nuclear energy appear deeply rooted in our social and cultural consciousness. Weart (5) argues that modern thinking about nuclear energy employs beliefs and symbols that have been associated for centuries with the concept of transmutation--the passage through destruction to rebirth. In the early 20th century, transmutation images became centered on radioactivity, which was associated with "uncanny rays that brought hideous death or miraculous new life; with mad scientists and their ambiguous monsters; with cosmic secrets of life and death; ... and with weapons great enough to destroy the world ..." (5).

But this concept of transmutation has a duality that is hardly evident in the imagery associated with nuclear power and nuclear wastes. Why has the evil overwhelmed the good? The answer undoubtedly involves the bombing of Hiroshima and Nagasaki, which linked this belief structure to reality. The sprouting of nuclear power in the aftermath of the atomic bombing led Smith (6) to observe: "Nuclear energy was conceived in secrecy, born in war, and first revealed to the world in horror. No matter how much proponents try to separate the peaceful from the weapons atom, the connection is firmly embedded in the minds of the public."

Signal value

During the past decade, research has also shown that individual risk perceptions and cognitions, interacting with social and institutional forces, can trigger massive social, political, and economic impacts. A theory aimed at describing how psychological, social, cultural, and political factors interact to amplify risk and produce ripple effects has been presented by Kasperson et al. (7). An important element of this theory is the assumption that the perceived seriousness of an accident or other unfortunate event, the media coverage it gets, and the long-range costs and other higher-order impacts on the responsible company, industry, or agency are determined, in part, by what the event signals or portends. Signal value reflects the perception that the event provides new information about the likelihood of similar or more destructive future mishaps.

The informativeness or signal value of an event, and thus its potential social impact, appears to be systematically related to the characteristics of the hazard. An accident such as a train wreck that takes many lives may produce relatively little social disturbance beyond the victims' families and friends, if it is part of a familiar system. However, a small accident in an unfamiliar system, such as a nuclear reactor, may have immense social consequences if perceived as a harbinger of further, and possibly catastrophic, mishaps.

The concept of accidents as signals helps explain society's strong response to problems involving nuclear power and nuclear wastes. Because these nuclear hazards are seen as poorly understood and catastrophic, accidents anywhere may be seen as omens of future disasters everywhere, and thus produce large social impacts.

A crisis of confidence

The research described above demonstrates extreme negative perceptions and attitudes associated with nuclear power. This degree of negativity is remarkable in light of the confidence most technical analysts have regarding the safety of nuclear technology. Chauncey Starr (8) has argued that "acceptance of any risk is more dependent on public confidence in risk management than on the quantitative estimates of risk ...". Public fears and opposition to nuclear waste disposal plans can be seen as a crisis in confidence, a profound breakdown of trust in the scientific, governmental, and industrial managers of nuclear technologies.

Viewing the nuclear-waste problem as one of distrust in risk management gives additional insights into its difficulty. Social psychological studies have validated folk wisdom by demonstrating that trust is quickly lost and slowly regained (9). A single act of embezzlement is enough to convince us that our accountant is untrustworthy. Subsequent opportunities to embezzle that are not taken do little to reduce the distrust.

Thus, it is apparent that the odds are stacked against nuclear power. The nature of any low-probability/high-consequence threat is such that adverse events appear to demonstrate the dangerousness of nuclear technology but demonstrations of safety require a very long time, free of damaging incidents or incidents perceived as damaging. As noted earlier, the high signal value associated with nuclear power mishaps assures that any significant problem, anywhere in the world, will be brought to the public's attention, continually eroding trust.

The future of nuclear power

What are the chances for a rebirth of nuclear power, driven by new reactor designs, heightened awareness of the need for nuclear power, and growing awareness of the risks associated with other sources of energy? Certainly an increased perception of benefit or need will increase public tolerance, if not acceptance, of nuclear risks. Society's tolerance of nuclear weapons testifies to this fact. However, in the absence of revolutionary changes in the ways that risks are managed in our society, it is not likely that public trust, confidence, and acceptance can easily be regained. For example, although the consensus opinion of technical experts asserts that nuclear wastes can be sequestered with essentially no chance of any member of the public receiving a non-stochastic dose of radiation (10), public perceptions do not reflect this view. Why will the public be more likely to believe that the new generation of reactors is inherently safe? Weinberg (10) argues that special-interest environmental groups (skeptical elites) could turn the tide of public opinion by siding with nuclear power as a solution to the greenhouse problem. It appears likely, however, that environmentalists will embrace conservation and energy efficiency rather than nuclear power (11).

In conclusion, before we spend billions of dollars pursuing a path that is destined for failure, we should pause to confront the problem of trust. Restoration and preservation of trust in risk management needs to be given top priority. A solution to the problem of trust is not immediately apparent. The problem goes beyond the nuclear industry; for instance, the chemical industry is similarly troubled. The problem is not due to public ignorance or irrationality, but is deeply rooted in individual psychology and in the adversarial nature of our social, institutional, legal, and political systems of risk management (12). Public relations efforts won't create trust. Aggressive and competent government regulation, coupled with increased public involvement, oversight, and control, and a "trouble free" performance record, might.

We can be sure, however, that without a serious effort to address the problem of trust, neither public acceptance nor a rebirth of civilian nuclear power in the United States will be achieved.

1.	A.M. Weinberg, American Scientist Vol. 64, 16-24 (1976).
2.	P. Slovic, Science Vol. 236, 280-285 (1987).
3.	P. Slovic, M. Layman and J. Flynn, Environment, April 1991, 6-11.
4.	P. Slovic, S. Lichtenstein and B. Fischhoff, in Energy Risk 
	Management, edited by G. Goodman and W. Rowe (Academic, London, 1979).
5.	S.A. Weart, Nuclear Fear:  A History of Images (Harvard University 
	Press, Cambridge, 1988), 42.
6.	K.R. Smith, Energy Environment Monitor Vol. 4, No. 1, 61-70 (1988).
7.	R.E. Kasperson, O. Renn, P. Slovic, H. Brown, J. Emel, R. Goble, 
	J. Kasperson and S. Ratick, Risk Analysis Vol. 8, 177-187 (1988).
8.	C. Starr, Risk Analysis Vol. 5, 97-102 (1985). 
9.	M. Rothbart and B. Park, J. of Personality and Social Psychology 
	Vol. 50, 131-142 (1986).
10.	A.M. Weinberg, in Proc. of Waste Management '89 (University of 
	Arizona, Dept. of Nuclear and Energy Engineering, 1989).
11.	J. Beyea, Forum for Applied Research and Public Policy Vol. 5, No. 3, 
	90-92 (1990).
12.	P. Slovic, Perceived Risk, Trust, and Democracy:  A Systems 
	Perspective, Rpt No. 93-3 (Decision Research, Eugene, Oregon, 1993).

The author is at Decision Research, 1201 Oak Street, Eugene, OR 97401. This paper is based on an article by the same title published in Proceedings of the First MIT International Conference on the Next Generation of Nuclear Power Technology, 1991, edited by M. Golay. Preparation of this manuscript was supported by a grant from the Alfred P. Sloan Foundation.


Fears, Fantasies and Fallout

Spencer Weart

[This article is reprinted, by permission, from New Scientist, 28 November 1992.]

The team that started up the first nuclear chain reaction, 50 years ago, was as nervous as if it were supervising the birth of Frankenstein's monster. If something went wrong in the primitive reactor pile as the control rod was gradually removed, simply replacing it would have reversed the process. As scientists they knew this--they had done the calculations to prove it. Yet one of them rigged up an additional neutron-absorbent rod to go in at the touch of a button. A second scientist added another rod, which would drop in automatically in case of trouble. A third hung a rod overhead by a rope, and stationed a reliable colleague alongside with an ax. Not satisfied, still another scientist organized a "suicide squad," with buckets of neutron-absorbent solution, ready to soak the pile and ruin it at the first sign of danger.

Was such caution beyond reason? In 1942 it was already clear that a nuclear reaction could "run away," overheating catastrophically and throwing out radioactive gas and dust that would endanger health for centuries. So there were sound reasons to fear nuclear energy. Yet there must be some point at which caution becomes excessive; people always mix irrational fears with their rational ones. Controlling the first chain reaction actually turned out, as everyone had calculated, to be as straightforward as tuning a radio.

Ever since nuclear energy was discovered at the turn of the century, it has touched a uniquely sensitive nerve. It has somehow become a source of more dread, and of more vehement and effective opposition, than any other technology. The scientists in 1942 already knew quite a lot about nuclear hazards. Besides their theoretical calculations, they had a generation of direct experience with the medical effects of radioactivity. These suggested nothing exceptionally frightening. What spurred the deepest anxieties was another heritage, far more ancient and laden with emotion.

Within a year of the discovery of nuclear energy, one of the pioneers, Frederick Soddy, the British chemist, had announced that the energy locked within atoms was so great that the Earth must be seen as a storehouse full of explosives. A man who could unleash this energy, he said, "could destroy the Earth if he chose." There was nothing new about the idea of an end to the world--Armageddon is a primal human concept. What was relatively new at the turn of the century was the idea that it could be brought on not by some act of God, nor by some cosmic catastrophe beyond human control, but by a group of people--even a single person. Journalists and science-fiction authors fueled the fear, warning that a careless nuclear experimenter could destroy the planet. In 1929, a writer for The New York Times suggested that not just the Earth but the entire Universe could be accidentally fired "like a train of gunpowder."

Scientists such as Ernest Rutherford, the British physicist, annoyed by the sensationalism suggesting that (as he put it) "some fool in a laboratory might blow up the universe unawares," tried to explain that the idea was scientific nonsense: if the world were so unstable, it would have disintegrated long since. But Rutherford was not successful: the notion of nuclear catastrophe had a fascination all its own.

Nor was it new to hear people exclaim that science was going too far. Almost all cultures have worried about men and women who poke unwisely into the great secrets of nature: first witches, sorcerers and alchemists, then such fictional figures as Faust and later Frankenstein. Now it was the turn of the mad scientist.

A typical example of this new stereotype appeared in The Invisible Ray, a 1936 film starring Boris Karloff. He played a scientist who tampered (as his mother warned him) "with secrets we are not meant to probe." The scientist devised a "radium ray projector," capable of blasting cities or curing people's illnesses, a sort of magic wand. He meant to use it only for good, but got a dose of his own weird radiation and began to glow in the dark. Having gone murderously insane, he crept about killing people with a touch of his hand.

The association of a new force with weapons was inevitable. Even before the First World War, physicists were speculating about nuclear arms. The phrase "atomic bomb" was first used by H.G. Wells, in a 1913 novel about a cataclysmic world war, The World Set Free. However, he also predicted a golden age that would come afterwards, the result of atomic bombs so terrible that they would mandate universal peace, while atomic energy would create a new Utopian society. This was in close accord with a very old mythic structure: in the tales of many cultures, Armageddon leads in turn to the Millennium, a time of peace and happiness.

The hopes for nuclear energy were just as grandiose as the fears. After the First World War, newspapers saw nuclear science as a wonderful enterprise, leading toward an earthly paradise. By 1930, there were about 100 patent medicines on the market whose active ingredient was radium. Pastes, tonics, pills and suppositories promised to cure everything from warts to baldness. Mineral springs were proud of the radioactive content of their waters--something most of them do not advertise now.

The public was aware that nuclear radiation had a harmful side. Newspapers reported all the main problems: sterility, genetic mutations and cancer. Yet the news was also full of similar risks from other things, such as household chemicals. In the hands of competent doctors, people said, radiation would save far more lives than it would ever take.

Divine madness

But away from this optimism, there was still an undercurrent of fear about radioactivity, fueled by science fiction stories and horror movies. This undercurrent was tapped in 1945 when the US dropped the first nuclear bomb. People's responses were guided by the images already in their heads. As soon as President Truman revealed that an "atomic bomb" had been used, journalists began to talk of doomsday, hellfire and cosmic secrets. "For all we know," intoned an NBC radio commentator, "we have created a Frankenstein." By this time physicists were beginning to understand that nuclear forces are neither more nor less cosmic than more familiar electrical forces. Yet most people believed that there was something supremely mysterious, almost divine, in any manifestation of nuclear energy.

With the bombing of Hiroshima and Nagasaki, nuclear weapons came to symbolize all the horrors of modern technological warfare. For the first time, the idea of destroying civilization and the world became a technical reality, which people found hard to consider objectively. There is evidence that most people preferred to ignore the awful thoughts. At first they were glad to leave decisions to the experts. But in the late 1950s the undercurrent of fear began to emerge as a spur to public action.

Fallout from bomb tests became a major issue after gray radioactive ash from a 1954 hydrogen bomb test showered a Japanese fishing boat, killing one of its crew and leading to vehement protests worldwide. Mothers began to worry about giving their children fresh milk, because it could be contaminated by strontium-90 from bomb tests. Something new was happening. Radioactivity was no longer seen as a mixture of white and black magic. It seemed only harmful, an ultimate pollutant.

People's anxiety took visible form in many popular films, such as Them! and Godzilla, about monsters created or released by radioactivity--giant ants, crabs, spiders, squids, even grasshoppers. These creatures were updated versions of the magician's demon and the mad scientist's creation, the monsters that always served as warnings (as the movies said explicitly) against those who "Went Too Far", who tried to grasp more than is proper. The implicit target was authority, with its craving for power. The leaders of the protests against nuclear fallout understood this, and stated plainly that their main fight was against overweening military and political authority.

It made sense to protest against the spread of radioactive dust, but the protest leaders admitted that fallout was chiefly a stalking-horse for the greater problem of nuclear war itself. So long as nuclear weapons might destroy civilization, the word "nuclear" would carry a burden of fear, anger and distrust.

There was another aspect of nuclear energy that could not be shoved under the rug: civilian nuclear reactors were going into operation at many sites, with the first entirely commercial reactor starting up at Shippingport, Pennsylvania in 1957. Ever since 1942, scientists had demanded elaborate precautions against reactor explosions; two decades later, the imagery and language of monstrous and polluting damage, first inspired by nuclear weapons, transferred to the civilian nuclear industry. For example, radioactive wastes stir greater anxieties, and have provoked more devoted opposition, than any other industrial hazard. Polls show that the public sees nuclear waste as a far more difficult problem than most technical experts do, with a wider divergence than on any other issue. Yet the image of radiation as the most apocalyptic pollution was only part of a still larger picture. Now that thousands of nuclear weapons were hanging over everyone's head, modern technology no longer sounded entirely wonderful. What was most dreaded was the unknown, and nuclear technology seemed supremely mysterious.

Image problem

Nuclear experts, constrained by government rules of secrecy, could not shake off the aura of sorcerous powers--and perhaps did not quite want to. Industrial and governmental officials got a reputation for haughtily brushing aside public concerns as ill-informed nonsense. The critics began to call the nuclear industry arrogant, secretive, heartless and dangerous. Nuclear energy began to stand for all the problems of modern bureaucracy and industrial power. People opposed nuclear reactors as a way of opposing all complex centralized power, including military, industrial and bureaucratic authority in general.

By now nuclear energy carries quite a burden. It is associated with images of weird polluting rays and mad scientists, with the destruction inherent in modern war, with everything people dislike about technology, with impersonal and manipulative authorities, and behind all that, always, with an ancient tradition of cosmic and secret forces of life and death. Such negative associations have become inseparable from the most seemingly rational discussion. For example, in 1989, three years after the Chernobyl accident, the government of Taiwan launched an elaborate and expensive "risk communication" program to promote public support for building a new reactor. Surveys showed that if the program made any difference, it was to increase public worries about reactors rather than alleviate them. Simply to be reminded of nuclear energy's power, even in the most reassuring context, was to become more anxious.

These strong, negative associations conceal an opportunity: If they can be dealt with, everyone will make progress toward handling feelings about science, technology and modern social authority in general. Hopes and fears must be respected and problems of reactors and weapons tackled. There must be a step-by-step improvement in the systems for power production and military security, nuclear and non-nuclear, in all their complex effects. It will take a long time to win confidence through truly safe practices, but it can be done.

It will be fruitless to work through some authority claiming to be rational and infallible--a set of scientists and bureaucrats who decide what is best for everyone. The only solution will come when the people who expect to benefit from a technology routinely respect the rights of the people who might be hurt by it. A familiar example is levying a fee on people who use mildly radioactive materials and must dispose of the waste, then giving the money to people who live near the proposed waste repository--money they can use, if they like, to hire their own experts and radiation monitors. In the long run, the way to a solution is to give everyone a share of power and a stake in the outcome. Dread of our future can only be removed when everyone has a part in helping to determine how we will share the benefits, and the risks, of whatever technology we must use.

The author is director of the Center for History of Physics of the American Institute of Physics in New York. He is the author of Nuclear Fear: A History of Images (Harvard University Press, 1988).

Public Perceptions of Nuclear Power: Some Observations from Experience

Richard Wilson

The fear of radiation

To the embattled nuclear energy advocate the world seems a contrary place. A nuclear burner is simple; a physicist can understand it. This contrasts with the process of combustion of fossil fuels, which I still find hard to understand in detail. The effects of radiation on people can be estimated somewhat better than the effects of other pollutants. In ordinary operation a power plant does almost nothing to public health. Accidents of course can happen. Yet the worldwide health effect of burning fossil fuels is so large that in a single year it equals the effect of the accident at Chernobyl. A nuclear plant is less unsightly than a fossil-fuel one, and some people prefer it to windmills. To the advocate, nuclear energy seems to offer an unlimited, environmentally benign source of energy to pull mankind out of poverty forever. Why is this not generally accepted? Is it because the public does not understand? To some extent this is true, and I address this next.

At its beginning, just after the Second World War, scientists insisted that nuclear matters be under civilian rather than military control, and by 1960 they had succeeded in declassifying most nuclear power information so that the technology could proceed in a completely open and transparent manner. This has happened to few technologies before. There are more public hearings and opportunities for public comment than for earlier technologies. This openness makes nuclear power peculiarly sensitive: for those in search of something to do, it becomes the technology easiest to attack.

Bernie Cohen has explored some of the consequent misrepresentation by the press. Alarmist and incorrect stories about the effects of radiation on people often get 30 articles on the major pages of major newspapers; a technically correct rebuttal often appears on a minor page of one paper only. It is self-evident that newspaper editors want to sell newspapers. But why does this happen more to nuclear matters than to others, and why do we let it happen?

After the accident at Three Mile Island none of the major newspapers seemed able to keep their units straight. Rems and millirems per hour were thrown about with abandon. In contrast, the press releases from the Nuclear Regulatory Commission (NRC) were informative and accurate. When reporting to the Governor of Massachusetts as Chairman of a special task force to study what Massachusetts should do, I recommended that in all future accidents the newspapers be requested to print the official press releases verbatim--and comment afterwards as they will. As I was making this recommendation at our press conference, the TV cameras switched off and the reporters walked out. Why are they unwilling to do this part of their job of conveying information to the citizenry? I do not know.

Proliferation of nuclear weapons

In 1945, nuclear scientists insisted that nuclear fission should be under civilian and not military control. The Atomic Energy Commission (AEC) had both military and civilian applications. When public opposition to nuclear armament became strong in 1970, the civilian program was also attacked as being a mere "justification" for the military one. The old-fashioned liberal Democratic position was that nuclear power was good; expansion of nuclear arms was bad. Yet extremists on both sides seemed deliberately to confuse the two. I suspect that Edward Teller, on one side, wanted the nuclear power community to support the Strategic Defense Initiative, and that the management of the Union of Concerned Scientists, on the other, found it easier to get support and funding to oppose a power plant than to stop a bomb.

That the position of liberal Democrats had changed was enunciated in a letter to the New York Times by Bethe and Seaborg in September 1988. They were unsuccessfully trying to change the position of the Democratic Presidential Candidate, Michael Dukakis. Can we return to the old position with the new Democratic government? I believe that we can if the "nuclear industry," utilities, manufacturers, regulators, and even academics, address the issue head on. I argue the importance of this by a risk comparison.

The probability of nuclear war is still large--comparable to the probability of a severe nuclear power accident--although it has certainly diminished since glasnost and perestroika; elsewhere I have commented on the positive effect of the Chernobyl accident in reducing this probability. The consequences of nuclear war are also far worse. Therefore we must be concerned about anything that gives even a small increase in that risk. If the existence of nuclear electric power is connected to the risk of proliferation of nuclear weapons, and through that to the risk of nuclear war, nuclear power could well be considered unacceptable.
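
The weight of this comparison can be made concrete with a toy expected-consequence calculation. This is only a sketch of the form of the argument; the probabilities and consequences below are invented placeholders, not the author's estimates.

```python
# Toy risk comparison: treat risk as (annual probability) x (consequence).
# All numbers are invented placeholders to illustrate the argument's form,
# not estimates from the article.

def expected_loss(annual_probability, consequence_deaths):
    """Expected deaths per year from a rare event."""
    return annual_probability * consequence_deaths

# Hypothetical figures: comparable probabilities, very different consequences.
power_accident = expected_loss(1e-4, 1e3)  # severe reactor accident
nuclear_war = expected_loss(1e-4, 1e8)     # nuclear war

# With equal probabilities, the expected loss scales with the consequence,
# so even a small increment to the war probability dominates the comparison.
print(power_accident, nuclear_war)
```

With comparable probabilities, the far larger consequence dominates the comparison, which is the author's point: any coupling of nuclear power to proliferation adds to the much larger term.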

Here I make a challenge. How can we get the world-wide network of commercial nuclear power to help stop proliferation of nuclear weapons and hence reduce the risk of nuclear war? If this can be done, and to the extent that it can be done, nuclear power should be supported by all peace-lovers.

The world nuclear industry did not behave well in the 1970s and the public has reason to be suspicious. I outline some of the contributors to nuclear proliferation:

--In 1965 the French sold a heavy-water reactor to Israel. It is believed that this has been used for making plutonium for bombs.

--In the 1960s the Canadians sold heavy-water reactors to Taiwan and India, and these were used to make plutonium for bombs.

--In 1975 the French agreed to sell a plutonium-processing plant to Pakistan. Yet among the reasons for canceling the order was an insistence by the Shah of Iran that this be stopped as a condition for getting an order for two nuclear power plants.

--In 1973 the Germans agreed to sell a reprocessing plant to Brazil.

--It is reported that in 1973 the Germans helped South Africa with their isotope separation.

In 1946, nuclear physicists and others returning from the war did not want the atomic bomb to be under military control, and insisted upon a civilian AEC to oversee all uses of nuclear fission. This decision, however useful it may have been in controlling military excesses, created deep problems for peaceful uses. For many years, military uses of nuclear energy, and with them military habits of secrecy, influenced the Commission. A myth arose that bombs and nuclear power stations are inseparable, even though most power-station engineers know less than many bright undergraduates about how to make a bomb, and no nation has ever used a nuclear power program in its quest for nuclear weapons. This mixing has led to official secrecy and to a confusion of thought eagerly exploited by a few anti-nuclear scientists.

There is no doubt that the system and the people who are knowledgeable about a nuclear fuel cycle can be used to plan and build a fuel cycle for bomb making. But such people can also prevent clandestine bomb-making. This is a vital issue which needs far more discussion than I can give here.

The cost of nuclear energy

The public has shown repeatedly that it does not want to spend money, but it might still support nuclear energy if nuclear energy were cheap. In 1970 nuclear power was competitive with coal, oil, and gas generation, and cheaper in some locations. Now it seems to be expensive. To understand what the cost might be in the future, it is therefore important to understand what has changed to make the cost increase. Unfortunately this is not easy.

In 1970 the busbar cost was 0.55 cents/kWh from Connecticut Yankee, and 0.828 cents/kWh from Yankee Rowe, although there was some federal subsidy for construction. By 1980 this had all changed. Some environmentalists were actively opposing nuclear energy, and costs had escalated.

Far more insidious has been a steady increase in "Operations and Maintenance" (O&M) costs. Leading nuclear scientists told the nuclear industry at the beginning of the last decade that "if you operate the nuclear power plants safely for the next 20 years, all will be well." They were overly optimistic and ignored the effect of increasing costs. Several events in the last year call attention to the effects of ignoring operating costs. For example, before 1970 the cost of operating Yankee Rowe was mainly pay-back of the "loan," or charge against capital. The 1970 cost of 0.82 cents/kWh is equivalent to a little over 2 cents/kWh in 1992 dollars. The capital-charge component must have gone down: utility-company practice has been to charge the construction cost early, and inflation must have diluted the payments. But the operating costs went up, owing both to safety improvements demanded by the NRC and to an increase in O&M. A clue comes from the staffing of the plant: the number of employees went up threefold over this period. I do not have a further breakdown, but it has been claimed that a large part of the increase was due to an increase in the number of security guards, which may or may not have been accompanied by an increase in security. This runs counter to all previous experience: one expects costs to come down as a new technology is learned. The costs of most technologies have followed a "learning curve"; with nuclear power we seem to have a "forgetting curve"! A learning curve is evident in subsets of the nuclear data--the later nuclear power plants built by Duke Power cost less than the earlier ones--but superimposed on it is an overall societal increase of cost.
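
The 1970-to-1992 conversion above is simple inflation arithmetic, sketched below. The cumulative deflator of roughly 2.5 is an assumption chosen to reproduce the author's "a little over 2 cents/kWh" figure; the article does not say which price index he used.

```python
# Inflation adjustment of a 1970 busbar cost into 1992 dollars.
# The deflator (~2.5x cumulative over 1970-1992) is an assumed value
# chosen to match the author's stated result, not a quoted index.

def to_1992_dollars(cents_per_kwh_1970, deflator=2.5):
    """Scale a 1970 cost (cents/kWh) into 1992 dollars."""
    return cents_per_kwh_1970 * deflator

yankee_rowe_1970 = 0.82  # cents/kWh, from the article
print(round(to_1992_dollars(yankee_rowe_1970), 2))  # prints 2.05
```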

I can see only one main reason for the cost increase: There has been a change in the perception of the need for expenditure on safety, which is the main determinant of cost.

Various alternative explanations have been suggested:

--Public Utilities Commissions (PUCs) and the NRC have insisted on expensive equipment and staff additions without regard to cost. It is unclear whether this increased cost led to improved safety.

--Licensing delays led to an increase in construction costs because of the interest charged on capital during the construction period, especially since interest rates rose with inflation after 1970.

--Even more important than delays themselves is the huge dislocation caused by an "off and on" approach to regulation. It becomes almost impossible to schedule the arrival of crucial equipment, of skilled workers from a distance, and so forth. The ability of a regulator to force a small costly delay has become a weapon, augmented at times by an apparent vindictiveness of a frustrated regulator.

The delays in turn have been due in part to increased licensing requirements; part is from public opposition, and part may be due to construction by less competent utility companies. There is a wide variation in these cost increases, sometimes but not always associated with public opposition.

Reversing the trend

It is generally accepted that nuclear power in the United States is in deep trouble and that the industry will die unless something is done. However, there is less general acceptance of the reasons for this state of affairs, and even less consensus on whether it is desirable to revive it. Assuming that we wish to do so, what can be done to reverse the trend, and in particular how can public opinion be brought to bear?

I believe that the present situation has been brought about by a minority of adamant nuclear opponents, and that all parts of the nuclear industry have been lily-livered and failed to do what is right. Let me start with the regulatory structure.

When the NRC started, it inherited a ruling from the old AEC on control of levels of radioactivity. In this, the Commissioners ruled on the meaning of ALARA (as low as reasonably achievable) in radiation protection. After hearing from all parties, they ruled that radiation doses must be reduced if this could be done at a cost of $1,000 per man-rem. This was a number larger than suggested by any person or organization at the hearing. Somewhat later, the NRC developed a set of safety goals to guide its regulation. Among them is that the risk to any member of the public be not more than 1% of the risk of comparable activities. By the nature of the discussion it is clear that these risks should be calculated in a realistic way. Yet the Commission has not often used this figure of $1,000/man-rem or the safety goals in its regulation and, when it has, has allowed the staff to use such a pessimistic ("conservative") approach to calculating risks that the number does not represent a real risk at all. I suggest that the Commission immediately take steps to use its safety goals and to calculate risks realistically.
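
The $1,000-per-man-rem ruling is a cost-effectiveness test, and its arithmetic is simple: a safety expenditure is "reasonably achievable" when its cost per man-rem of collective dose averted falls at or below the threshold. A minimal sketch, with a hypothetical upgrade as the example:

```python
# Sketch of the $1,000-per-man-rem ALARA test described in the text.
# The upgrade figures below are hypothetical, chosen only to show the
# arithmetic of the criterion.

DOLLARS_PER_MAN_REM = 1000.0  # the AEC/NRC figure cited in the text

def passes_alara_test(cost_dollars, man_rem_averted,
                      threshold=DOLLARS_PER_MAN_REM):
    """Return (cost per man-rem averted, whether the spending is justified)."""
    cost_per_man_rem = cost_dollars / man_rem_averted
    return cost_per_man_rem, cost_per_man_rem <= threshold

# Hypothetical upgrade: $50,000 spent to avert 200 man-rem of collective dose.
cost_per, justified = passes_alara_test(50_000, 200)
print(cost_per, justified)  # prints 250.0 True
```

The author's later suggestion about waste disposal is the same test inverted: spending should demonstrably avert at least one man-rem per thousand dollars.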

In 1990 the Commission proposed for public comment a "Below Regulatory Concern" statement: it would not regulate anything which could be calculated to produce a dose of less than 10 mrem per year. When this was opposed by a vocal political campaign, the NRC in a typical display of timidity withdrew the proposal. It did not wait to see whether there was support from scientists and the public. It could have kept the proposal and immediately held a generic public hearing, like the marathon public hearings on emergency core cooling systems and ALARA held by the AEC in 1972. The courts behaved better. The arguments for Below Regulatory Concern were used in a petition for summary judgment in the Rancho Seco case a month after the NRC proposed the rule. This has been upheld in the appeals court. I therefore believe that the NRC could easily have withstood challenges in the Congress or in the courts. By its vacillation, the NRC is seen by the public (probably correctly) as not knowing what it is doing. This pleases nobody. Some people even go so far as to say that the NRC is the most anti-nuclear organization they know of! The NRC should reintroduce the Below Regulatory Concern rule at once and show the public the courage the public pays it for.

In discussions of waste disposal generally, we should steadily and strongly push for similar actions both by the EPA and the NRC. For example, we are spending some billions of dollars on waste disposal. Those spending the money might well be required to show that they are reducing radiation exposure by one man-rem per thousand dollars spent.

This sounds like a demand for government action rather than a discussion of public perception. But I argue that the very inconsistency of EPA and NRC policy and actions is a considerable cause of the public's lack of confidence in the regulatory process, and a cause of an unfavorable public perception.

We may not accomplish these ends in our confused country. But I am urging these ideas on my friends in the countries of the Pacific Rim and have reason to believe that they may be brighter and more courageous than we are. This might be seen as one more example of why the next century will be an "oriental century." The resurgence of nuclear power may come from the Orient. Let us also hope that if and when it is again economically and environmentally attractive for the USA, our country will follow close behind. Otherwise our economy will inevitably decline and we will become an undeveloping country.

The author is Mallinckrodt Professor of Physics at Harvard University and Director of the NE Regional Center of the National Institute for Global Environmental Change (NIGEC). This is an abbreviated version of the paper presented at the symposium.

How Nuclear Power Controversies Become Amplified: Contrasts Between Technical Analysis and Public Expectations

Robert L. Goble

It is by now an old story in the US that technical analyses pertaining to nuclear power which were intended to reassure or reduce the concerns of an anxious public have led instead to increased anxiety and concern. Numerous examples can be found in discussions of reactor siting, of health effects from routine releases from reactors, of emergency planning for reactor accidents, and of radioactive waste management. Some of these stories have had a long run in the public debate.

A prominent example is the development and repeated citation of probabilistic risk assessments dating from the Reactor Safety Study or, as it is frequently called, the Rasmussen report (1). The original intent was to incorporate probabilities into any discussion of nuclear accidents so that the public would not be obsessed with extremely unlikely worst-case possibilities. The effect was to heighten public awareness of accident possibilities and to promote concern and discussion about what to do about them. The continuing saga of high-level waste disposal assessments is another example. It offers a stark contrast between a technical community convinced that radioactive waste management is a straightforward and solvable problem and a public which views the thought of a radioactive waste repository with fear and revulsion (2). Each new assessment appears only to widen the gulf between these perceptions.
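
For readers unfamiliar with the Rasmussen style of analysis, a probabilistic risk assessment multiplies an initiating-event frequency by the failure probabilities of the safety systems along each accident sequence. The sketch below uses invented numbers and a single sequence; real PRAs sum over many sequences and must scrutinize the independence assumption.

```python
# Toy event-tree calculation in the style of the Reactor Safety Study:
# an accident sequence occurs only if an initiating event happens AND
# each safety system along the sequence fails. All probabilities here
# are invented for illustration.

from math import prod

def sequence_frequency(initiator_per_year, failure_probs):
    """Frequency per year of one accident sequence, assuming
    independent failures (a strong assumption real PRAs must test)."""
    return initiator_per_year * prod(failure_probs)

# Hypothetical sequence: pipe break, then ECCS and containment both fail.
freq = sequence_frequency(1e-3, [1e-2, 1e-1])
print(freq)  # on the order of 1e-6 per reactor-year
```

It is exactly such small computed frequencies, paired with large consequences, that the report hoped would reassure the public and instead drew attention to the possibilities being quantified.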

This phenomenon--the mismatch between the intent of analyses and their effects--asks to be explained. The question is usually posed: "Why does the public think and behave in ways we don't expect?" For those who like to confront crises in public confidence, there is a related question: How can we change their thinking and their behavior?

These are questions about social behavior and it is a good idea to consult social scientists such as Paul Slovic and Spencer Weart before relying on such simple hypotheses as: The public is ignorant and/or the public is misled by irresponsible journalists and anti-nuclear agitators.

Physicists are different

An alternative question, which I would like to pose today, is: Why do physicists and other technical people think and behave in ways that the public finds inappropriate and/or difficult to understand? As a follow-up, I ask: Could the public be right about this?

There is plenty of evidence that physicists are different from representative members of the public. One piece of evidence is the repeated misunderstandings which are my subject. Further evidence comes from opinion surveys, such as those discussed by Paul Slovic. For instance, here is an interesting survey question (the respondent can strongly agree, agree, disagree, or strongly disagree): "When the risk is very small, it is okay for society to impose that risk on individuals without their consent." Physicists significantly more often agree or strongly agree with this sentence than do members of the public (3).

Here is my simple hypothesis (analogous to the simple hypotheses about the public) about how physicists and other scientific experts compare with members of the general public: physicists are more inclined to want controversies resolved, and to want them resolved by the methods of science.

The implications of this comparison can be explored by considering where the focus might be placed in the survey question. A member of the public might well skim over the first phrase, "When the risk is very small --", which sounds technical, and go on to the rest of the sentence, "--it is okay for society to impose that risk on individuals without their consent". This raises immediate concerns: one needs to face up to the possibility of irreconcilable conflict between individual interests and societal interests; that means making choices about values and social mechanisms; and such choices may well be informed by memories of past experience when risk language was part of the discussion --how well was the process of imposing risks handled by society?

A physicist, in contrast, might focus on the first phrase, "When the risk is very small --". It refers to scientific analysis; it suggests taking the limit of zero risk (and physicists enjoy taking limits--that is why we invented calculus); and it offers the hope that the social difficulties raised in the second phrase should disappear in that limit.

Thinking of this sort has spawned the idea of "acceptable risks" or "below regulatory concern," and has been an important stimulus to useful analysis. However, when presented with such analysis in its naked form, members of the public might simply add it to their experience as another example of what they were worried about in the first place, technicians skimming over social complexities at their expense.

Table 1, adapted from a table by the German social psychologist H. P. Peters (4), offers a richer description of differences between physicists (and other experts) and members of the public than is given by my simple hypothesis.

Table 1.  Differences in the worldview of nuclear policy experts and lay people
Approach used by experts		Approach used by lay people
Narrow scientific problem definitions	Open problem definitions
Complex scientific models		Simple scientific models
Naive social model			Complex social model
Precise scientific terminology		Imprecise terminology
Proceed from evidence to claims		Don't separate values and evidence
Cost-benefit value perspective		Multifaceted value perspective

A physicist's "mind set"

The differences, in either a simple or complex characterization, can be thought of as representing a physicist's "mind set." Technical presentations on nuclear power issues for the past decades exhibit four characteristic manifestations of this mind set that sometimes lead to severe problems in communication with the public, giving, in effect, misleading or confusing advice. These are the recurrent tendencies:

--Excessive optimism about technological capabilities and excessive confidence in technical answers. An ongoing example has been the high-level radioactive waste assessment program, which has been plagued by promises of, in the words of the National Research Council, "unattainable levels of safety" under a rigid schedule that "is unrealistic, given the inherent uncertainties of this unprecedented activity" (2).

--Treatment of a partial answer to a question as if it were the whole answer. An interesting example has been the long delays in including "external initiators" in probabilistic risk assessments for nuclear power plants, while many experts have quoted the partial numbers as estimates of the number of people who might be killed by nuclear power.

--Redefinition of a question, without securing consensus on the redefinition. Perhaps the best example of this phenomenon has been the scientific definition of "risks" from nuclear power. Particularly likely to be confusing are characterizations in terms of "numbers of deaths per year" or "days of lost life expectancy."

--Making claims outside one's area of expertise. Here I could cite a long and depressing litany. It will suffice to remind you of how many hasty assertions have been made about radiation epidemiology.

It is important to keep in mind that each of these tendencies is a "virtue" in the pursuit of science, one which we attempt to instill in our graduate students. Thus, a belief in problem-solving and a willingness to venture outside the immediate boundaries of knowledge are essential motivations in research. The effort to find those parts or aspects of a physics question which are measurable has been the impulse behind experimental physics, and the redefinition of questions has been the hallmark of theoretical advances.

But scientific virtues may become pathologies in the arena of public policy. Thus each of these tendencies provokes a public response. Excessive optimism and confidence provokes the response "you promised." A partial answer provokes the response "you forgot" or (worse) "you are hiding." Redefinition of a question provokes the response "I don't understand--you are trying to fool me." Making claims outside one's area of expertise provokes the response "scientists never agree about anything."

Can communication between physicists and the public be improved? We must not forget why these tendencies correspond to scientific virtues, and we do not want to give up doing science. Nuclear power risk assessments, for instance, were a brave venture and have contributed immeasurably to our understanding of nuclear technology. But we also must not forget that there are very serious limits to a scientific approach to public policy.

There are key social-science insights to assimilate: Public fears and distrust are serious and stem from real experience which has included broken promises and false impressions. Responsive technical analysis must deal with issues and questions as they are framed by the public. Some policy differences are not resolvable by science, so that technical fixes are not always regarded as helpful. Communication is best regarded as dialogue rather than instruction. Here are two approaches the physics community might take in attempting to improve its communication with the public on nuclear power:

Recommend corrective policies. A vigorous effort to recommend a comprehensive package of sensible major policy shifts in areas deemed critical by the public might shake up the present situation. The four critical areas are: (1) radioactive waste disposal, where the need is to address near-term planning as well as long-term storage; (2) nuclear safety, where a coherent treatment of new and old reactors and of emergency planning is needed; (3) cleanup of DOE facilities, where adequate public oversight mechanisms are needed; (4) nuclear disarmament, where US weapons policy, nuclear proliferation, and possible incidents stemming from the trade in nuclear materials are at issue. Note that while the last two items are not in the domain of nuclear power, they are inextricably attached to public concerns about nuclear power. For this approach to have a chance of making a difference, the package has to cover all four items, to advocate real change in the way things are now done, and to encourage explicit ongoing mechanisms for responding to public concerns.

Recognize our limitations. Both experts and lay people blur together facts, knowledge, guesses, values, and desires about nuclear power and policy. We physicists could try to distinguish: (1) what we know as physicists--technical analysis for which there is true consensus; (2) what are critical technical issues for which expertise in other fields is called for; (3) what we believe because of our special experience and interest in nuclear technology; (4) beliefs where physicists genuinely disagree with each other; (5) critically, on what issues is it inappropriate for physicists to claim special standing.

I would leave you with two final questions: Are these two approaches compatible? Can we make either of them work in practice?

Acknowledgments: I thank Miriam Forman for trying the interesting experiment of setting up a dialogue between physicists and social scientists. I also acknowledge my debt to two physicists and two social scientists--C. Hohenemser, R. Socolow, R. Kasperson, and P. Slovic--who have shaped my views on nuclear power issues over the years; none of them should be blamed for the particular perspective presented here.

1.	Reactor Safety Study, US Nuclear Regulatory Commission, Washington, DC (1975).
2.	Paul Slovic, James Flynn, and Mark Layman, Science 254, 1603-1607 (1991).
3.	H. Jenkins-Smith et al., Politics and Scientific Expertise: Scientists, Risk Perception, and Nuclear Waste Policy (1993).
4.	Hans Peters, PR Magazin, September 1992, 39-50.

The author is at the Physics Department, Program on Environment, Technology, and Society, Clark University, Worcester, MA 01610