Scientists who planned to attend the canceled 2020 APS March Meeting discuss their research and address obstacles faced when developing climate models
By Abigail Eisenstadt
The climate system is dynamic, and its behavior is irregular, making it hard to anticipate the full severity of anthropogenic effects on climate. Creating climate models that accurately predict these changes demands a comprehensive understanding of the physical, biological, and chemical processes on Earth.
However, collecting information about these processes can be difficult: Some phenomena, like turbulence inside clouds, are impossible to observe with existing technologies. Other information, like the rate at which plants take up carbon dioxide and emit water vapor, is difficult to constrain with model equations. High-resolution models that can test each possible parameter and simulate these processes are too computationally expensive to run on a global scale.
Overcoming the challenges of modeling climate has been an ongoing subject of rigorous discussion among physicists, and the topic was scheduled to have a dedicated session sponsored by the Topical Group on Physics of Climate during the canceled 2020 APS March Meeting. Michael Ghil, Tapio Schneider, and Katherine Dagon, who would have all been part of this session, are tackling different aspects of climate modeling through their research.
“A big concern that the climate community has become aware of is that, aside from the relatively smooth change of mean temperatures, there may also be other sudden [climatological] changes,” said Ghil, a physicist and professor at the University of California, Los Angeles. He has developed a new framework for climate modeling, synthesizing long-term climate trends with abrupt weather patterns, like the El Niño–Southern Oscillation.
[Image caption] One of the challenges in understanding climate is combining long-term behavior with sudden changes like ENSO (the El Niño–Southern Oscillation), shown here (red indicates higher sea level and thus higher temperature).
Natural weather patterns respond in surprising ways to manmade climate change. Current climate models try to account for these surprises by estimating the overall impact of factors like climate forcing. Positive climate forcing refers to a net energy surplus: Earth absorbs more sunlight than it radiates back into space as heat. Positive anthropogenic climate forcing occurs when manmade atmospheric pollution increases the amount of solar energy trapped on Earth, driving gradual global warming and altering the climate's natural variability.
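The forcing idea can be made concrete with a zero-dimensional energy-balance calculation. This is a textbook toy, not one of the models discussed in this article, and the feedback parameter below is an illustrative value, not a measured constant:

```python
# Zero-dimensional energy balance: a textbook toy, not one of the models
# discussed in this article.
SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0         # solar constant at Earth's orbit, W m^-2
ALBEDO = 0.30       # fraction of incoming sunlight reflected back to space

# Sunlight absorbed per square meter, averaged over the whole sphere.
absorbed = S0 * (1 - ALBEDO) / 4.0        # about 238 W m^-2

# Temperature at which Earth's emitted heat balances absorbed sunlight.
t_emission = (absorbed / SIGMA) ** 0.25   # about 255 K

# A positive forcing is a surplus of absorbed over emitted energy. Dividing
# by a climate feedback parameter gives the warming needed to restore balance.
FEEDBACK = 1.2                 # W m^-2 K^-1, illustrative mid-range value
FORCING_2XCO2 = 3.7            # canonical forcing for doubled CO2, W m^-2
warming = FORCING_2XCO2 / FEEDBACK        # about 3 K of equilibrium warming
```

The final line captures the logic of the paragraph above: a sustained positive forcing persists until the planet warms enough to radiate the surplus away.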
Ghil’s modeling framework combines intrinsic climate oscillations, like the El Niño–Southern Oscillation, with long-term anthropogenic warming trends. Major weather and climate patterns are either nearly periodic (like day and night) or irregular. Periodic climate patterns repeat at exact, equal intervals, while irregular patterns can be either deterministically aperiodic (non-random, but without exact repetition) or random. Anthropogenic forcing is aperiodic and deterministic: it does not exhibit exact repetitions, but it is not random either. Ghil’s framework therefore treats the climate system’s behavior as including both deterministically chaotic processes and random ones.
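These categories can be illustrated with a synthetic time series. The sketch below is a toy under stated assumptions, not Ghil's actual framework: it adds a periodic cycle, a deterministically chaotic component (the logistic map), random noise, and a steady deterministic trend standing in for anthropogenic forcing:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
t = np.arange(n)

# Periodic component: repeats at exact, equal intervals (an idealized cycle).
periodic = np.cos(2 * np.pi * t / 100.0)

# Deterministically aperiodic component: the chaotic logistic map, which is
# non-random but never repeats exactly.
chaotic = np.empty(n)
x = 0.4
for i in range(n):
    x = 3.9 * x * (1.0 - x)
    chaotic[i] = x - 0.5      # center roughly around zero

# Random component: white noise standing in for unresolved variability.
noise = rng.normal(scale=0.1, size=n)

# Aperiodic but deterministic trend standing in for anthropogenic forcing.
trend = 0.001 * t

signal = periodic + 0.3 * chaotic + noise + trend
```

Fitting a straight line to `signal` recovers the forcing trend, while the oscillatory, chaotic, and random parts average out, which is the kind of separation such a framework aims to formalize.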
Predicting climate volatility also requires observational data, but information about small-scale processes that impact global climate, like turbulent motion inside clouds, can be virtually impossible to obtain.
“If we want to predict how climate will change, first we must predict how the physical system will change. The equations governing all of that are essentially equations of classical physics,” said Schneider, climate scientist and professor of environmental science and engineering at the California Institute of Technology. “The challenge is that we have to solve [these equations] for the entire planet, and we have to solve them for scales of motion that range from millimeters to the planetary scale.”
He and his colleagues study how global cloud behavior will evolve as climate change progresses. Understanding global cloud dynamics requires information about small details in the turbulence of clouds and the micro-scale physics of droplet and ice crystal formation. Schneider’s team has developed coarse-grained models—which build a picture of overall cloud behavior by starting from molecule interactions—to represent these processes.
But calibrating these models of cloud dynamics and quantifying their uncertainties involves running simulations hundreds of thousands of times. Calibration ensures the model fits the data in the best way possible. Quantifying uncertainty helps scientists predict potential climate risks. However, the calibration and quantification process requires some of the world’s largest supercomputers, rendering it computationally expensive.
To overcome this obstacle, Schneider and his team developed an algorithm, combining ideas from data assimilation and machine learning, that cuts calibration down to around 1,000 model runs, roughly 1,000 times fewer than existing methods require. By reducing the amount of computation needed, the algorithm lowers the cost of running climate models.
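The article does not spell out the algorithm, but ensemble Kalman inversion is a standard data-assimilation approach to this kind of calibration. The sketch below is a hypothetical toy, not Schneider's code: it recovers the one tunable parameter of a cheap stand-in model from noisy synthetic "observations":

```python
import numpy as np

rng = np.random.default_rng(1)

# Cheap stand-in for an expensive climate model: one tunable parameter "a"
# maps inputs x to outputs a * sin(x). (A hypothetical toy.)
x = np.linspace(0.0, np.pi, 20)

def forward(a):
    return a * np.sin(x)

# Synthetic "observations" generated with a_true = 2.5 plus measurement noise.
a_true, noise_std = 2.5, 0.05
y_obs = forward(a_true) + rng.normal(scale=noise_std, size=x.size)

# Ensemble Kalman inversion: start from a broad prior ensemble of parameter
# guesses and nudge every member toward the data with a Kalman-style update.
ensemble = rng.normal(1.0, 1.0, size=50)         # prior guesses for "a"
gamma = noise_std**2 * np.eye(x.size)            # observation-noise covariance

for _ in range(10):
    g = np.array([forward(a) for a in ensemble])            # (50, 20) outputs
    a_mean, g_mean = ensemble.mean(), g.mean(axis=0)
    cov_ag = ((ensemble - a_mean)[:, None] * (g - g_mean)).mean(axis=0)
    cov_gg = (g - g_mean).T @ (g - g_mean) / ensemble.size
    gain = cov_ag @ np.linalg.inv(cov_gg + gamma)           # Kalman gain
    perturbed_obs = y_obs + rng.normal(scale=noise_std,
                                       size=(ensemble.size, x.size))
    ensemble = ensemble + (perturbed_obs - g) @ gain

a_estimate = ensemble.mean()
```

Each iteration costs 50 forward runs, so the whole calibration here takes 500 model evaluations, the same order of magnitude as the roughly 1,000 runs quoted above, rather than the hundreds of thousands a brute-force sweep would need.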
Using neural networks to derive data for climatological processes that are difficult to observe—like the rates of carbon dioxide and water vapor exchange through leaves’ pores, called stomatal conductance—could also boost models’ computational efficiency and reduce uncertainty.
“We’re using machine learning to build a simpler model, essentially to replicate the behavior of the complex climate model,” said Dagon, a climate physicist at the National Center for Atmospheric Research.
She and her colleagues employed machine learning to quantify uncertainty and to streamline the modeling process. They trained artificial neural networks, called emulators, to provide estimates of certain climate variables, like global photosynthesis, or CO2 uptake by plants. They then assessed the emulator’s accuracy by comparing its results to the more complex model’s predictions. The emulator mirrored the complex model’s predictions without needing as much data, because it could generate its own simulated data. It also took far less time to run, making it more computationally efficient.
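The emulator idea can be sketched in miniature. The example below is a hypothetical toy, not the land-model scheme Dagon's team actually emulates: a small neural network is trained on a few hundred runs of an "expensive" model (here, a saturating light-response curve for photosynthesis) and then stands in for it:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for an expensive model: a saturating light-response curve giving
# a photosynthesis rate from light intensity. (A hypothetical toy.)
def expensive_model(light):
    return 12.0 * light / (light + 200.0)

# Training set: a few hundred runs of the "expensive" model.
light = rng.uniform(0.0, 2000.0, size=500)
x = light[:, None] / 2000.0                  # inputs scaled to [0, 1]
y = expensive_model(light)[:, None] / 12.0   # outputs scaled to [0, 1]

# Emulator: a one-hidden-layer neural network trained by gradient descent.
w1 = rng.normal(0.0, 1.0, (1, 16)); b1 = np.zeros(16)
w2 = rng.normal(0.0, 1.0, (16, 1)); b2 = np.zeros(1)
lr = 0.1

for _ in range(10000):
    h = np.tanh(x @ w1 + b1)                 # hidden layer
    pred = h @ w2 + b2                       # network output
    err = pred - y
    # Backpropagation of the mean-squared-error loss.
    g_w2 = h.T @ err / len(x); g_b2 = err.mean(axis=0)
    g_h = (err @ w2.T) * (1.0 - h**2)
    g_w1 = x.T @ g_h / len(x); g_b1 = g_h.mean(axis=0)
    w2 -= lr * g_w2; b2 -= lr * g_b2
    w1 -= lr * g_w1; b1 -= lr * g_b1

def emulator(light_new):
    xn = np.asarray(light_new, dtype=float)[:, None] / 2000.0
    return (np.tanh(xn @ w1 + b1) @ w2 + b2).ravel() * 12.0
```

Once trained, a call to `emulator` costs only a handful of array operations, so sweeping thousands of parameter values to map out uncertainty becomes tractable, which is the point of replacing the complex model with a surrogate.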
“Once we have an emulator, we can optimize parameter values, like factors in the equation to calculate photosynthesis, that are a very large source of uncertainty in climate predictions,” said Dagon. “We can use these machine learning tools to see how much uncertainty is coming from those parameters.”
Although climate is irregular, models are costly, and small-scale observational data are limited, physicists throughout the world are working together to improve existing climate models.
“This is rapidly evolving and intensely energetic work... I think we all feel a great sense of urgency because climate is changing very rapidly, and we’d like to provide a prediction of how that change will happen before it does,” said Schneider.
The author is the former Science Communications Intern at APS.
©1995 - 2023, AMERICAN PHYSICAL SOCIETY
APS encourages the redistribution of the materials included in this newspaper provided that attribution to the source is noted and the materials are not truncated or changed.
Editor: David Voss
Staff Science Writer: Leah Poffenberger
Contributing Correspondent: Alaina G. Levine
Publication Designer and Production: Nancy Bennett-Karasik