PCCM2.1 is a parallel version of the NCAR Community Climate Model, CCM2, implemented for MIMD massively parallel computers using a message-passing programming paradigm. The parallel implementation was developed for the Intel Paragon with 1024 processors and the IBM SP2 with 128 processors. The code can easily be ported to other multiprocessors that support message passing, or run on machines distributed across a network using PVM.
The parallelization strategy decomposes the problem domain into geographical patches and assigns each processor the computation associated with a distinct subset of the patches. With this decomposition, the physics calculations involve only grid points and data local to a processor and are performed in parallel. Using parallel algorithms developed for the semi-Lagrangian transport, the fast Fourier transform and the Legendre transform, both physics and dynamics are computed in parallel with minimal data movement and modest change to the original NCAR CCM2 source code.
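The patch decomposition described above can be sketched as follows. This is a minimal illustration, not PCCM2 source code: the grid sizes, processor-mesh shape, and function names (`patch_bounds`, `assign_patches`) are hypothetical, chosen only to show how contiguous latitude/longitude blocks map onto a 2D mesh of processors so that physics computations stay local.

```python
# Illustrative sketch of a geographical patch decomposition.
# All names and dimensions here are assumptions for exposition,
# not taken from the PCCM2 implementation.

def patch_bounds(n, nparts, part):
    """Split n points into nparts nearly equal contiguous blocks;
    return the half-open range [start, end) owned by block `part`."""
    base, extra = divmod(n, nparts)
    start = part * base + min(part, extra)
    end = start + base + (1 if part < extra else 0)
    return start, end

def assign_patches(nlat, nlon, prows, pcols):
    """Map each processor (i, j) in a prows x pcols mesh to its
    geographical patch of latitudes and longitudes."""
    patches = {}
    for i in range(prows):
        lat0, lat1 = patch_bounds(nlat, prows, i)
        for j in range(pcols):
            lon0, lon1 = patch_bounds(nlon, pcols, j)
            patches[(i, j)] = ((lat0, lat1), (lon0, lon1))
    return patches

# Example: a T42 Gaussian grid (64 latitudes x 128 longitudes)
# distributed over a hypothetical 4 x 8 processor mesh.
patches = assign_patches(64, 128, 4, 8)
print(patches[(0, 0)])  # ((0, 16), (0, 16))
```

Because every grid point of a patch (and its column of vertical levels) lives on one processor, the column physics needs no communication; only the transforms in the dynamics require data movement.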
The parallel code writes sequential history tapes and also reads input files (in standard history tape format) sequentially, to maintain compatibility with production use of the model on other computer systems. Support for the Intel Paragon Parallel File System (PFS) is also provided for production runs. Restart (checkpoint) data are written in parallel format on the supported systems.
The computational rate of PCCM2 is measured as the wall-clock time required to simulate one model day. The following graph shows the computational rate versus the number of processors for several parallel computing platforms: the Cray Research Inc. C90 and T3D, the Thinking Machines CM-5, the IBM SP2, and the Intel Paragon XPS. Times are reported for T42 and T170 horizontal resolution with 18 vertical levels, using double-precision (64-bit) arithmetic.

Figure: ``Performance comparison using PCCM2''
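To relate the benchmark metric above to a throughput figure, the seconds of wall-clock time per simulated day can be converted into simulated years per day of computing. The timing used below is purely illustrative, not a measured PCCM2 result.

```python
def simulated_years_per_day(seconds_per_model_day):
    """Convert wall-clock seconds per simulated model day into
    simulated years per wall-clock day of computing."""
    model_days_per_wall_day = 86400.0 / seconds_per_model_day
    return model_days_per_wall_day / 365.0

# Example with an illustrative (not measured) cost of 60 s per model day:
print(round(simulated_years_per_day(60.0), 2))  # 3.95
```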
PCCM2.1 was developed for the Department of Energy
CHAMMP program by a collaboration of researchers from
Oak Ridge National Laboratory,
Argonne National Laboratory
and the National Center for Atmospheric Research.
Questions and comments may be addressed to either bbd@ornl.gov or itf@mcs.anl.gov.