The Science and Politics of Climate
By Freeman J. Dyson
In the nineteen-sixties the fluid dynamicist Syukuro Manabe was running global climate models on the supercomputer at the Geophysical Fluid Dynamics Laboratory in Princeton. Manabe began very early (before it became fashionable) to run models of climate with variable amounts of carbon dioxide in the atmosphere. He ran models with carbon dioxide at two and four times the present abundance, and saw in the computer output the rise in average ground temperature that is now called Global Warming. He told everybody not to believe the numbers. But the politicians in Washington believed. They wanted numbers, he gave them numbers, so they naturally believed the numbers.
It was not unreasonable for politicians to believe Manabe's numbers. Politics and science are two very different games. In science, you are not supposed to believe the numbers until you have examined the evidence carefully. If the evidence is dubious, a good scientist will suspend judgment. In politics, you are supposed to make decisions. Politicians are accustomed to making decisions based on shaky evidence. They have to vote yes or no, and they generally do not have the luxury of suspending judgment. Manabe's numbers were clear and simple. They said if the carbon dioxide goes up, the planet will get warmer. So it was reasonable for politicians to believe them. Belief for a politician is not the same thing as belief for a scientist.
Manabe's numbers were unreliable because his computer models did not really simulate the physical processes going on in the atmosphere. Over and over again he said that his purpose when he ran computer models was not to predict climate but to understand it. But nobody listened. Everyone thought he was predicting climate, everyone believed his numbers.
The biosphere of the earth contains four reservoirs of carbon: the atmosphere, the ocean, the vegetation and the soil. All four reservoirs are of comparable size, so that the problem of climate is inescapably mixed up with the problems of vegetation and soil. The intertwining between the four reservoirs is so strong that it makes no sense to consider the atmosphere and ocean alone. Computer models of atmosphere and ocean, even if they can be made reliable, give at best a partial view of the problem. The large effects of vegetation and soil cannot be computed but must be observed and measured.
The way the problem is customarily presented to the public is seriously misleading. The public is led to believe that the carbon dioxide problem has a single cause and a single consequence. The single cause is fossil fuel burning, the single consequence is global warming. In reality there are multiple causes and multiple consequences. The atmospheric carbon dioxide that drives global warming is only the tail of the dog. The dog that wags the tail is the global ecology: forests, farms and swamps, as well as power-stations, factories and automobiles. And the increase of carbon dioxide in the atmosphere has other consequences that may be at least as important as global warming - increasing crop yields and growth of forests, for example. To handle the problem intelligently, we need to understand all the causes and all the consequences.
Several successful, important programs of local observation have been started in recent years. One program is measuring directly the fluxes of carbon dioxide moving between the atmosphere and the biosphere. This is done by putting instruments on towers above the local trees or other vegetation. In daytime in the summer, the vegetation is vigorously absorbing carbon dioxide. At night or in winter, the flux is going the other way, with plants giving off carbon dioxide by respiration. The soil also gives off substantial fluxes of carbon dioxide, mostly from respiration of microbes and fungi. The instruments do not distinguish between vegetation and soil. They measure the total flux leaving or entering the atmosphere.
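Tower instruments of this kind typically infer the flux from the covariance of vertical wind speed and CO2 concentration (the eddy-covariance technique, which the article does not name; the method and all sample numbers below are illustrative assumptions, not data from any actual site):

```python
# Sketch: estimating a net CO2 flux from tower measurements by eddy
# covariance. Updrafts in daytime carry CO2-depleted air upward, so the
# covariance of vertical wind and CO2 density comes out negative: uptake.

def eddy_flux(w, c):
    """Mean covariance of vertical wind w (m/s) and CO2 density c
    (mg CO2 per m^3). Positive = flux into the atmosphere,
    negative = uptake by vegetation and soil."""
    n = len(w)
    w_mean = sum(w) / n
    c_mean = sum(c) / n
    # flux = mean of (w - w_mean) * (c - c_mean), in mg CO2 m^-2 s^-1
    return sum((wi - w_mean) * (ci - c_mean) for wi, ci in zip(w, c)) / n

# Hypothetical daytime samples over a forest canopy:
w = [0.5, -0.4, 0.3, -0.2, 0.6, -0.5]
c = [690, 710, 695, 708, 688, 712]
flux = eddy_flux(w, c)
print(flux)  # negative here: the vegetation is absorbing CO2
```

At night the sign flips: respiration enriches rising air in CO2 and the measured flux turns positive, which is exactly the seasonal and diurnal pattern the article describes.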
During the last few years, instrumented sites have been built in many countries around the world. Within a few years, we will know for sure how much of the carbon released by fossil fuel burning is absorbed by forests and how much by the ocean. And the same technique can be used to monitor the carbon fluxes over agricultural croplands, wetlands and grasslands. It will give us the knowledge we need to use the tools of land management intelligently to regulate the carbon in the atmosphere. Whether we manage the land wisely or mismanage it foolishly, we shall at least know what good or harm we are doing to the atmosphere.
The amount of money spent on local observations is small, but the money has been well spent. The Department of Energy is funding another successful program called Atmospheric Radiation Measurements (ARM). ARM's activities are mainly concentrated at a single permanent site in Oklahoma, where systematic observations of radiation fluxes in the atmosphere are made with instruments on the ground and on airplanes flying at various heights. Measurements are made all the year round in a variety of weather conditions. As a result, we have a database of radiation fluxes, in a clear sky and in cloud and between clouds.
One of the most important measurements is made by two airplanes flying one above the other at different heights. Each airplane measures the fluxes of radiation coming up from below and down from above. The difference measures the local absorption of radiation by the atmosphere. The measured absorption of sunlight turns out to be substantially larger than expected. The expected absorption was derived partly from theory and partly from space-based measurements. The discrepancy is still unexplained. If it turns out that the anomalous absorption measured by ARM is real, this will mean that all the global climate models are using wrong numbers for absorption.
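The arithmetic behind the two-airplane measurement is simple energy bookkeeping: the sunlight absorbed in the layer between the planes is the net downward flux at the upper plane minus the net downward flux at the lower one. A minimal sketch, with made-up flux values rather than ARM data:

```python
# Layer absorption from two stacked aircraft. Each plane records the
# solar flux arriving from above (down) and from below (up), in W/m^2.
# Whatever enters the layer and does not leave it was absorbed there.
# All numbers are illustrative assumptions, not ARM measurements.

def layer_absorption(down_upper, up_upper, down_lower, up_lower):
    net_at_top = down_upper - up_upper      # net downward flux at upper plane
    net_at_bottom = down_lower - up_lower   # net downward flux at lower plane
    return net_at_top - net_at_bottom       # W/m^2 absorbed within the layer

absorbed = layer_absorption(down_upper=900, up_upper=180,
                            down_lower=760, up_lower=120)
print(absorbed)  # 80 W/m^2 absorbed between the two flight levels
```

The anomaly the article describes is that this measured difference comes out larger than the theoretical absorption for the same layer, and no one yet knows why.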
A third highly successful program of local measurements is called Acoustic Thermometry of Ocean Climate (ATOC). It is the brainchild of Walter Munk at the Scripps Institution of Oceanography. ATOC uses low-frequency underwater sound to measure ocean temperatures. A signal is transmitted from a source on top of a seamount at a depth of three thousand feet near San Francisco, and received at six receivers in deep water around the north Pacific. The times of arrival of signals at the receivers are accurately measured. Since the speed of propagation depends on temperature, average temperatures of the water along the propagation paths can be deduced.
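The conversion from travel time to temperature rests on the fact that sound travels faster in warmer water, by roughly 4 m/s per degree Celsius (an approximate textbook sensitivity; the path length and arrival times below are hypothetical illustrations, not ATOC data):

```python
# Sketch of how ATOC-style acoustic thermometry turns a travel time
# into a path-averaged temperature change. The sensitivity dcdT and
# all path numbers are assumed round values for illustration.

L = 5.0e6          # acoustic path length in metres (hypothetical 5000 km)
c0 = 1500.0        # reference sound speed in seawater, m/s
dcdT = 4.0         # approximate sensitivity, (m/s) per degree C

t0 = L / c0                  # reference travel time, about 3333 s
t_measured = t0 - 0.6        # signal arrives 0.6 s early: warmer water

# A shorter travel time means a higher average sound speed on the path,
# and the speed excess divided by the sensitivity gives the warming:
c_measured = L / t_measured
delta_T = (c_measured - c0) / dcdT   # path-averaged warming, degrees C
print(round(delta_T, 3))
```

A fraction of a second in arrival time over a trans-Pacific path thus resolves a few hundredths of a degree of average warming, which is why the method is so well suited to detecting a slow climate signal against year-to-year noise.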
The main obstacle that Walter Munk had to overcome to get the ATOC project started was the opposition of environmental activists. This is a long and sad story which I don't have time to tell. The activists decided that Munk was an evil character and that his acoustic transmissions would endanger the whales in the ocean by interfering with their social communications. They harassed him with lawsuits, delaying the project for several years. Munk tried in vain to convince them that he also cared about the whales and was determined not to do them any unintentional harm. In the end, the project was allowed to go forward with less than half of the small budget spent on monitoring the ocean and more than half spent on monitoring the whales. No evidence was found that any whale ever paid any attention to the transmissions. But the activists are continuing their opposition to the project and its future is still in doubt.
During the two years that the ATOC system has been operating, seasonal variations of temperature have been observed, giving important new information about energy transport in the ocean. If measurements are continued for ten years and extended to other oceans, it should be possible to separate a steady increase of temperature due to global warming from fluctuations due to processes like El Niño that vary from year to year. Since the ocean is the major reservoir of heat for the entire climate system, a measurement of ocean temperature is the most reliable indicator of global warming. We may hope that the activists will one day admit that an understanding of climate change is as essential to the preservation of wildlife as it is to the progress of science.
To summarize what we have learned, there is good news and bad news. The good news is that we are at last putting serious effort and money into local observations. Local observations are laborious and slow, but they are essential if we are ever to have an accurate picture of climate. The bad news is that the climate models on which so much effort is expended are unreliable because they still use fudge-factors rather than physics to represent important things like evaporation and convection, clouds and rainfall.
Besides the general prevalence of fudge-factors, the latest and biggest climate models have other defects that make them unreliable. With one exception, they do not predict the existence of El Niño. Since El Niño is a major feature of the observed climate, any model that fails to predict it is clearly deficient. The bad news does not mean that climate models are worthless. They are, as Manabe said thirty years ago, essential tools for understanding climate. They are not yet adequate tools for predicting climate. If we persevere patiently with observing the real world and improving the models, the time will come when we are able both to understand and to predict. Until then, we must continue to warn the politicians and the public: don't believe the numbers just because they come out of a supercomputer.
Freeman J. Dyson, professor emeritus of physics at the Institute for Advanced Study in Princeton, New Jersey, is the recipient of the 1999 APS Joseph Burton Forum Award, and author of a number of books about science for the general public. His most recent is The Sun, the Genome, and the Internet, which will be published this year.
©1995 - 2016, AMERICAN PHYSICAL SOCIETY
APS encourages the redistribution of the materials included in this newspaper provided that attribution to the source is noted and the materials are not truncated or changed.
Editor: Barrett H. Ripin
Associate Editor: Jennifer Ouellette