
Narrowing Uncertainties

Dozens of climate models are in use throughout the world, and their predictions for global warming as a result of a doubling in atmospheric carbon dioxide vary from about 1 to 5°C over the next 30 years. A globally averaged increase of 1°C might not matter in many parts of the world, but the larger increase could mean vast changes in snowpack, rainfall, water availability, crop production, and ocean levels, affecting billions of people.

Predictions and accompanying margins of error are used constantly, for example, to foresee where the economy is headed, determine how much will be needed in the Social Security fund for aging boomers, estimate future oil production from a particular well, anticipate the efficacy of a new drug, or determine the chance of a terrorist attack in a U.S. city. Occasionally, the magnitude of the uncertainty can rival or even exceed the value of the prediction.

How to reduce uncertainty can be unclear, in part, because it can take many forms. For example, uncertainty may exist in regard to the assumptions and inputs to a model, the errors associated with experimental data, or the approximations inherent in the physics, numerical algorithms, and mathematics of the model itself. Furthermore, a prediction may include uncertainties from many factors that may be interrelated.

“At Livermore, significant advances in uncertainty quantification have been made in the weapons program,” says Richard Klein, a theoretical astrophysicist in the Laboratory’s weapons program and a professor of astronomy at the University of California at Berkeley. Several years ago, Lawrence Livermore and Los Alamos national laboratories worked together to develop an improved methodology for assessing the performance of nuclear weapon systems without nuclear testing. (See S&TR, March 2004, A Better Method for Certifying the Nuclear Stockpile.) Known as quantification of margins and uncertainties, the work entailed systematically combining the latest data from computer simulations, past nuclear tests, nonnuclear experiments, and theoretical studies to quantify confidence factors for the key potential failure modes in each weapon system in the stockpile.

Recognizing the applicability of this work to a wide range of scientific fields, a large collaboration at Livermore began studying uncertainty quantification (UQ) and error analysis. Klein leads this three-year Laboratory Directed Research and Development Strategic Initiative involving more than 20 scientists from four organizations: Weapons and Complex Integration, Physical and Life Sciences, Computation, and Engineering. “The experts in software, mathematics, statistics, and physics from these organizations create highly complex models and routinely deal with uncertainty,” says Klein. “With this research, our goal is to get them speaking the same language and advancing the science of UQ.”

Organizations from around the world have also been searching for ways to identify sources of uncertainty to improve the predictive capability of models. The Livermore project, which began in October 2009, brings to the table not only the Laboratory’s unique combination of expertise but also some of the largest, most powerful computers in the world.

Dark blue lines in the figure show the global average surface air temperature computed by 11 climate models run using the same simulation protocols. Thin gray lines result from more than 1,000 simulations using only the Community Atmosphere Model and different combinations of input parameters. Applying uncertainty quantification methods to a single climate model increases the spread in calculated temperature, which makes apparent the effects of uncertainties.

Variations in uncertain physical parameter inputs can dramatically affect climate model output variables. This simulation by the Community Atmosphere Model shows the magnitude of the changes in surface air temperature as a result of varying a single parameter at different points in a 21-dimensional parameter hypercube. Similar maps are used to identify and rank important sources of uncertainty. (Rendering by Kwei-Yu Chu.)

First on the Agenda
The project is focused primarily on quantifying uncertainty in climate prediction, where the consequences of uncertainty are vast. Carbon dioxide in the atmosphere is increasing, and climate change is upon us. As stated in Climate Change 2007, the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, “Warming of the climate system is unequivocal, as is now evident from observations of increases in global average air and ocean temperatures, widespread melting of snow and ice and rising global average sea level.” However, the dozens of predictive climate models now in use are not in agreement about what Earth will be like in another 50 or 100 years. New methodologies are needed to more rigorously quantify uncertainties so that sources of uncertainty can be reduced where possible. Improvements in UQ achieved in the Laboratory project will undoubtedly spin back to the weapons program and other Laboratory projects.

The primary U.S. climate model, known as the Community Climate System Model, is managed by the National Center for Atmospheric Research in Boulder, Colorado. The model simulates Earth’s past, present, and future global climate. The Laboratory has a long history of involvement in atmospheric research through its National Atmospheric Release Advisory Center and works primarily on the Community Atmosphere Model, one component of the larger model.

Atmospheric scientist Curt Covey, who has been involved in climate modeling for more than 20 years, notes that uncertainties have always been addressed in climate models. “However, applying UQ at the same level of rigor as it is being used in the weapons program is new.” Working with Covey are Don Lucas, John Tannahill, and Yuying Zhang of the Laboratory’s Atmospheric, Earth, and Energy Division.

The Curse of Dimensionality
UQ for climate modeling and other predictions is typically performed with an ensemble of models. Because computing power is limited, scientists identify a small subset of input quantities (usually 7 to 10) that they think are the dominant sources of prediction uncertainty. The ensemble is produced by running the model thousands of times using differing combinations of input quantities. These inputs are constrained by available data and have their own associated uncertainties. Statistician Gardar Johannesson, who refers to this method as the “shotgun approach,” says, “The process takes many computer runs, and the result of some simulations may be inconsistent with existing real-world data. Those cases narrow down the space of possible realizable parameter combinations.” Traditional approaches to UQ focus on parameter variations from a center point where all parameters assume their most likely values. This approach misses the corner regions, which may contain the most important possibilities for future scenarios—scenarios that may be less likely but more consequential.
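As a concrete illustration of the shotgun approach, the short Python sketch below draws random combinations of a few uncertain inputs, runs a toy stand-in model for each, and discards combinations whose output disagrees with an observation. The parameter ranges, toy model, observed value, and tolerance are invented for the example and are not quantities from any climate model.

```python
# Minimal sketch of the "shotgun" ensemble approach: sample random
# combinations of uncertain inputs, run a cheap stand-in model for each,
# and keep only the combinations consistent with an observation.
import numpy as np

rng = np.random.default_rng(0)

# Three hypothetical uncertain inputs, each constrained to a plausible range.
param_ranges = np.array([[0.5, 1.5],    # hypothetical parameter A
                         [0.0, 2.0],    # hypothetical parameter B
                         [10.0, 30.0]]) # hypothetical parameter C

def toy_model(p):
    """Stand-in for an expensive simulation returning one scalar output."""
    a, b, c = p
    return a * np.sin(b) + 0.1 * c

n_runs = 1000
samples = rng.uniform(param_ranges[:, 0], param_ranges[:, 1],
                      size=(n_runs, 3))
outputs = np.array([toy_model(p) for p in samples])

# Discard parameter combinations whose output is inconsistent with data.
observed, tolerance = 2.5, 0.5
consistent = np.abs(outputs - observed) < tolerance
print(f"{consistent.sum()} of {n_runs} runs fall within the observed range")
```

The surviving combinations are the "narrowed" parameter space Johannesson describes; everything else is ruled out by the data.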

Input parameters and their associated uncertainties are known to statisticians as dimensions, and the more dimensions, the less merry the statistician’s task. Two dimensions are easy enough to solve, as are three. Beyond that, the difficulty of accommodating all the different uncertainties grows exponentially, outstripping the capacity of the most powerful computers—a problem known as “the curse of dimensionality.”
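A quick back-of-the-envelope calculation, offered here only for illustration, shows how fast the cost climbs: even a coarse grid of five sample points along each uncertain input requires a number of model runs that grows exponentially with the number of dimensions.

```python
# Curse of dimensionality: the runs needed for a coarse grid of five
# points along each uncertain input grow exponentially with the number
# of inputs (dimensions).
points_per_axis = 5
for d in (2, 3, 7, 10, 21):
    print(f"{d:>2} dimensions -> {points_per_axis**d:,} grid samples")
```

At 21 dimensions, the count reaches roughly 5 × 10^14 runs, far beyond any conceivable computing budget.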

Charles Tong, a mathematician on the project, likens the curse of dimensionality to the old story of a blind man trying to identify an elephant by touching it part by part. Feeling the eyelashes leads to one conclusion about what the animal looks like, while feeling the trunk yields a different conclusion. And so it is with climate models. Upwards of 100 parameters can influence a climate model’s predictions, and each carries its own uncertainty, so different combinations can lead to very different results.

Uncertain climate model parameters include the humidity at which clouds form, the size of liquid droplets that make up clouds, the size at which droplets convert to rain, and many more. Covey’s team narrowed 100 or so climate parameters to the 21 most important for initial UQ studies. With a 21-dimensional hypercube, more than 2 million corners exist, and traditional Monte Carlo calculation methodologies (the shotgun approach) examine only a miniscule fraction of the total volume. Johannesson and Tong, together with physicists Bryan Johnson and Scott Brandon and mathematician Carol Woodward, are working to develop tools that reduce dimensional requirements. They are, for example, determining which parameters are most sensitive to changes in other parameters. Mathematician Timo Bremer is working to develop a new topological method for expressing dimensionality.
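The following sketch is a toy illustration rather than the team’s actual tools: it counts the corners of a 21-dimensional hypercube and ranks parameters with a simple one-at-a-time perturbation screen. The parameter weights and response function are invented stand-ins for a real model output such as surface air temperature.

```python
# Toy one-at-a-time sensitivity screen over a 21-dimensional parameter space.
import numpy as np

n_params = 21
print(f"A {n_params}-dimensional hypercube has {2**n_params:,} corners")

rng = np.random.default_rng(1)
weights = rng.exponential(scale=1.0, size=n_params)  # hidden "true" importances

def toy_response(x):
    """Stand-in for a model output such as surface air temperature."""
    return weights @ x + 0.5 * np.sin(3 * x[0])

# Perturb each normalized parameter, one at a time, about a baseline point.
baseline = np.full(n_params, 0.5)
delta = 0.05
effects = []
for i in range(n_params):
    up, down = baseline.copy(), baseline.copy()
    up[i] += delta
    down[i] -= delta
    effects.append(abs(toy_response(up) - toy_response(down)) / (2 * delta))

ranking = np.argsort(effects)[::-1]
print("Most influential (toy) parameters:", ranking[:5])
```

Rankings like this one suggest which dimensions can be fixed at nominal values and which deserve the expensive sampling effort.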


The colored surfaces in this image show the effects of varying three parameters in simulations using the Community Atmosphere Model across their uncertainty ranges. Specifically, the parameters are associated with cloud-forming processes, and the surfaces represent the amount of thermal energy that escapes Earth’s atmosphere. The variation in energy along the contour surfaces corresponds to about 5 watts per square meter. (Rendering by Kwei-Yu Chu.)

A Predictive Pipeline
The collaboration’s goal is a UQ computational “pipeline” that is self-adapting and self-guiding. It incorporates all data—assumptions, inputs, known errors, the relative importance of each variable on the output of the model, and approximations inherent in the physics and mathematics of the model itself—and, through a continuing series of iterations, “learns from itself.” The UQ pipeline will sample adaptively. That is, it is designed to determine where the most important responses and sensitivities lie in the vast field of 21 or more dimensions and to select the most informative sample points in that space.
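One common way to realize adaptive sampling, sketched here as an illustration under stated assumptions rather than a description of the Livermore pipeline itself, is to fit an inexpensive surrogate (here a Gaussian-process emulator from scikit-learn) to the simulations completed so far and then add new runs where the surrogate is least certain. The expensive_model function and the two-dimensional parameter space are placeholders.

```python
# Sketch of surrogate-driven adaptive sampling: fit an emulator to the
# runs completed so far, then add the candidate point where the emulator
# is least certain. expensive_model stands in for a full simulation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(2)

def expensive_model(x):
    """Placeholder for a full simulation run over normalized inputs."""
    return np.sin(5 * x[:, 0]) * np.cos(3 * x[:, 1])

dim = 2
X = rng.uniform(size=(8, dim))              # small initial design
y = expensive_model(X)

candidates = rng.uniform(size=(2000, dim))  # cheap-to-score candidate pool

for _ in range(20):                         # adaptive refinement loop
    gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
    _, std = gp.predict(candidates, return_std=True)
    x_new = candidates[np.argmax(std)]      # sample where uncertainty is largest
    X = np.vstack([X, x_new])
    y = np.append(y, expensive_model(x_new[None, :]))

print(f"Surrogate trained on {len(X)} adaptively chosen runs")
```

The appeal of this kind of loop is that the surrogate, rather than a fixed sampling plan, decides where the next expensive simulation is spent, which is the sense in which such a pipeline “learns from itself.”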

According to computer scientist David Domyancic, the pipeline will save expensive computer time and ultimately will be an automated decision-making tool. Klein says, “The pipeline will advance the process of integrating theory, simulation, and experiment—a major leap forward in UQ technology.”

Many of the same methodologies used successfully for stockpile stewardship are being applied to climate modeling as well as to target design for inertial confinement fusion experiments at the Laboratory’s National Ignition Facility. As UQ expands into other fields under Livermore’s direction, Klein hopes to establish a UQ institute at the Laboratory. “I see the work we are doing now as the first brick in the institute.”

—Katie Walter

Key Words: climate modeling, predictive pipeline, uncertainty quantification (UQ).

For further information contact Richard Klein (925) 422-3548 (klein4@llnl.gov).

