Analyzing climate variability with ensembles of simulations

Gerald Potter and Michael Wehner,
Lawrence Livermore National Laboratory

Research Objectives

It has long been known that the chaotic nature of the atmosphere makes detailed long-term prediction of weather events impossible. However, climate - the average of the weather - may be simulated by performing long-term integrations of the weather and averaging the results. With this technique, the details of the instantaneous simulated weather are not actually realized in the true climate system. However, given a sufficiently realistic model, the simulated events are realizable as possible states of the true system. By averaging over a sufficiently long period, simulated climate statistics may be generated that can be legitimately compared with observations of the true climate system.

Computational Approach

Recent advances in high-performance computing have revealed that model-averaged climate statistics also possess a degree of variability. This is borne out by several investigators who have performed ensembles of climate simulations in which each realization differs only slightly in its initial conditions. Not surprisingly, the degree of predictability depends on the field in question, the length of the temporal averaging, the geographic location, and the season of the year.

Our principal tool for simulating the climate is the LLNL parallel atmospheric general circulation model (AGCM). This finite-difference model of the global atmosphere, based on the UCLA model, is parallelized using straightforward two-dimensional domain decomposition techniques. The code is highly portable across all leading distributed memory parallel computing architectures. Our best performance to date has been achieved on the 512-processor NERSC T3E.
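The two-dimensional domain decomposition mentioned above can be illustrated with a minimal sketch: the global latitude-longitude grid is partitioned into rectangular slabs, one per process in a logical process grid. The function below is a hypothetical illustration of the bookkeeping involved, not the LLNL AGCM's actual routine, and the grid and process-grid sizes are invented for the example.

```python
def decompose_2d(nlat, nlon, prow, pcol):
    """Split an nlat x nlon global grid across a prow x pcol logical
    process grid. Returns a dict mapping each process coordinate (i, j)
    to the half-open index ranges (lat_start, lat_end, lon_start, lon_end)
    of the rectangular subdomain that process owns. Remainder points are
    spread over the leading processes so sizes differ by at most one.
    Illustrative sketch only."""
    def bounds(n, parts, i):
        base, extra = divmod(n, parts)
        start = i * base + min(i, extra)
        return start, start + base + (1 if i < extra else 0)

    slabs = {}
    for i in range(prow):
        for j in range(pcol):
            la0, la1 = bounds(nlat, prow, i)
            lo0, lo1 = bounds(nlon, pcol, j)
            slabs[(i, j)] = (la0, la1, lo0, lo1)
    return slabs

# Example: a 64 x 128 grid on a 4 x 8 process grid gives 32 slabs
# of 16 x 16 points each.
slabs = decompose_2d(64, 128, 4, 8)
```

In practice each process also exchanges halo (ghost-cell) data with its neighbors at every time step, which is what makes the decomposition portable across distributed memory machines such as the NERSC T3E.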

Accomplishments

We have previously completed an ensemble of 20 atmospheric simulations of the decade 1979-1988. Using standard statistical techniques, we related the variability of the model output, together with the tolerance and statistical certainty required of that output, to the minimum ensemble size needed. The accompanying figure shows the number of realizations required to calculate the decadal averaged seasonal surface temperature to within 0.5 K at 95% statistical certainty. As the figure shows, this ensemble size is relatively large despite the high degree of averaging. For shorter averaging periods, such as a single season, the required size is significantly larger.
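The ensemble-size calculation described above follows the standard confidence-interval argument: if the inter-realization standard deviation of a quantity is sigma, then the ensemble mean of n realizations has standard error sigma / sqrt(n), and requiring z * sigma / sqrt(n) to stay within the tolerance gives the minimum n. A minimal sketch, assuming approximately normally distributed ensemble members (the example sigma value is invented, not taken from the simulations):

```python
import math

def required_ensemble_size(sigma, tolerance, z=1.96):
    """Minimum number of realizations n such that the half-width of the
    confidence interval on the ensemble mean, z * sigma / sqrt(n), is no
    larger than the tolerance. z = 1.96 corresponds to 95% statistical
    certainty. Illustrative sketch, not the authors' actual code."""
    return math.ceil((z * sigma / tolerance) ** 2)

# Hypothetical example: an inter-realization standard deviation of 1.2 K
# and a tolerance of 0.5 K require 23 realizations at 95% certainty.
n = required_ensemble_size(1.2, 0.5)  # 23
```

Because the required n grows as the square of sigma / tolerance, regions or seasons with large natural variability - or shorter averaging periods, which raise sigma - drive the ensemble size up quickly, consistent with the results in the figure.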

Significance

The implication for climate change prediction is serious. Long-term climate simulations tax the capabilities of even the most powerful computers. It is desirable that the resolution of global models be substantially increased from that currently used in order to better simulate regional features. Comprehensive climate models must also include processes other than atmospheric circulation, such as ocean circulation, sea ice processes, biological processes, and atmospheric chemistry. These model improvements further increase the demand for computer resources. Our work now implies that single calculations are not sufficient to assess certain aspects of the simulated climate, further increasing the computational burden.

URL

http://www-pcmdi.llnl.gov/



The number of atmospheric general circulation model calculations required to compute the decadal averaged seasonal surface temperature to within 0.5 K at 95% statistical certainty.


