1998 Annual Report
Biological and Environmental Research

Modular Ocean Model Code Developments

C. H. Q. Ding, R. K. Owen, and H. Anand, Lawrence Berkeley National Laboratory
R. Pacanowski and V. Balaji, Geophysical Fluid Dynamics Laboratory

GFDL scientists used MOM for a data assimilation experiment employing 1983 El Niño data.


Research Objective

The goal of this project is to develop the Geophysical Fluid Dynamics Laboratory's (GFDL's) Modular Ocean Model (MOM) so that it will run effectively and efficiently on massively parallel high performance computers for large-scale, high-resolution, century-long ocean simulations. Techniques and software tools that are developed are expected to be relevant and useful to the entire climate research community.

Computational Approach

MOM uses finite-difference methods to solve the primitive equations for flow, energy, heat, and other tracer components on regular grids. The initial scalable implementation of MOM version 3 (MOM3) achieves moderate parallelism through one-dimensional domain decomposition, i.e., distributing the latitude dimension among the processors. MOM development efforts to date have focused on data input/output (I/O) and the implementation of scalable I/O. Bit-reproducibility of results is also a requirement in any parallel development of MOM.
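The one-dimensional decomposition described above can be sketched as follows. This is an illustration in Python rather than the model's own code, and the function name is ours, not MOM's: each processor is assigned a contiguous block of latitude rows, with any remainder spread over the first processors.

```python
def decompose_latitudes(n_lat, n_procs):
    """Assign a contiguous block of latitude rows to each processor.

    Returns one (start, end) index pair per processor (end exclusive),
    distributing any remainder rows to the lowest-ranked processors.
    """
    base, extra = divmod(n_lat, n_procs)
    blocks = []
    start = 0
    for rank in range(n_procs):
        size = base + (1 if rank < extra else 0)
        blocks.append((start, start + size))
        start += size
    return blocks

# Example: 100 latitude rows distributed over 8 processors
blocks = decompose_latitudes(100, 8)
print(blocks)
```

Each processor then time-steps its own slab of latitudes, exchanging boundary rows with its two neighbors.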

Accomplishments

Systematic runs on the Cray T3E showed good scaling to larger numbers of processors for the three-dimensional baroclinic and tracer portions of the model: these portions ran 30 times faster on 32 processors than on one processor. However, the two-dimensional barotropic portion of the model, which is solved by an explicit free surface method, did not scale as well (15 times faster on 32 processors). The scaling discrepancy was due to the proportionally larger communication overhead for two-dimensional arrays.
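A toy latency/bandwidth model makes the discrepancy plausible. This is our illustration, not GFDL's analysis, and all timing parameters are invented: under the same one-dimensional latitude decomposition, a two-dimensional field does a factor of nz less computation per halo exchange than a three-dimensional field, while the per-message latency cost stays the same, so communication takes a larger share of the runtime.

```python
def parallel_speedup(nx, ny, nz, procs,
                     t_flop=1e-8, t_lat=1e-5, t_word=1e-8):
    """Estimated speedup for one stencil timestep under a toy cost model.

    Each processor updates its slab of nx * (ny/procs) * nz points and
    exchanges two halo rows of nx * nz points with its neighbors.
    All timing constants are illustrative assumptions.
    """
    compute = nx * (ny / procs) * nz * t_flop
    comms = 2 * (t_lat + nx * nz * t_word)
    serial = nx * ny * nz * t_flop
    return serial / (compute + comms)

# 3D baroclinic field (e.g., 25 vertical levels) vs. 2D barotropic field
s3d = parallel_speedup(360, 180, 25, 32)
s2d = parallel_speedup(360, 180, 1, 32)
print(round(s3d, 1), round(s2d, 1))
```

With these assumed constants the three-dimensional field retains roughly twice the parallel speedup of the two-dimensional one at 32 processors, qualitatively matching the 30x versus 15x behavior reported above.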

The snapshot part of data I/O in MOM3 was reimplemented, speeding up this part dramatically (by a factor of 50). The old I/O scheme, inherited from a memory-limited environment, wrote out field configurations one latitude row at a time. The flexibility of the network Common Data Form (netCDF) resolved the problem of index switching between data in memory and data in disk files.

In the new I/O scheme, index switching is done first, then the entire three-dimensional configuration is written in one shot. This reduces the file system overhead significantly and results in the 50-fold speedup.
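The contrast between the two schemes can be sketched as follows. This is a minimal Python illustration of the idea, not the model's Fortran or its netCDF calls; the function names and the raw-bytes output are our simplifications.

```python
import array
import io

def switch_indices(field):
    """Transpose a 2D slice so the in-memory layout matches the file layout."""
    return [list(col) for col in zip(*field)]

def write_row_at_a_time(f, field):
    """Old scheme: one small write per latitude row."""
    for row in field:
        f.write(array.array("d", row).tobytes())

def write_one_shot(f, field):
    """New scheme: switch indices first elsewhere, then issue a single
    large write of the entire field."""
    flat = array.array("d", [x for row in field for x in row])
    f.write(flat.tobytes())

# Both schemes produce identical bytes; the one-shot version simply
# replaces many small writes with one large one.
field = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
old_buf, new_buf = io.BytesIO(), io.BytesIO()
write_row_at_a_time(old_buf, field)
write_one_shot(new_buf, field)
print(old_buf.getvalue() == new_buf.getvalue())
```

Since the output bytes are identical, the entire speedup comes from eliminating per-row file system overhead, which is consistent with the 50-fold factor reported above.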

With the assistance of S. Luzmoor of SGI/Cray and G. Davis of Unidata, several important problems concerning the netCDF library were resolved, making it ready for use in MOM3 and other climate-related research. In the parallel distributed memory T3E environment, arrays defined with the unlimited dimension suffered from synchronization and overwrite problems, which were fixed. NetCDF was interfaced with the new Cray file system, so that a subset of processors can now open a global file; before, all processors had to open the file (which was often inconvenient). These innovations demonstrated that netCDF can be used efficiently in a real, large-scale application.

Significance

GFDL scientists are preparing for a large-scale, eddy-resolving southern ocean simulation. This unprecedented high-resolution, decade-long simulation depends critically on the efficiency of the MOM3 codes we are developing.

Climate researchers worldwide also use MOM for climate and ocean modeling. This community will benefit from running MOM on state-of-the-art, high performance computers, as well as common platforms and workstations -- made possible by efficient use of cache-based processor architectures, significantly improved data I/O, and a more convenient user interface.
