Annual Report
2001
YEAR IN REVIEW

Computational Science at NERSC
Model protein-substrate system used in a molecular dynamics simulation of yeast chorismate mutase.

As a national facility for scientific research funded by the Department of Energy, Office of Science (DOE SC), NERSC served 2,130 scientists throughout the United States in FY 2001. These researchers work in DOE laboratories, universities, industry, and other federal agencies. Figure 1 shows the proportion of NERSC usage by each type of institution, while Figures 2 and 3 show the laboratories, universities, and other organizations that used more than 100,000 MPP processor hours during the year. Computational science conducted at NERSC covers the entire range of scientific disciplines but is focused on research that supports DOE's mission and scientific goals, as shown in Figure 4.

Allocations of computer time and archival storage at NERSC are awarded to research groups, regardless of their source of funding, based on an annual review of proposals. Each proposal receives a peer review that evaluates the quality of the science, how well the proposed research aligns with the mission of DOE SC, the need for computing resources, and the readiness of the specific application and applicant to fully utilize the requested resources.

DOE's Supercomputing Allocations Committee (SAC, see Appendix D) reviews and makes award decisions for all requests, reflecting DOE's mission priorities. In addition, NERSC's Program Advisory Committee (PAC, see Appendix B) peer reviews certain requests and also makes award decisions. The SAC allocates 60 percent of NERSC resources, and the PAC allocates the remaining 40 percent.

Two other groups provide general oversight: the NERSC Policy Board (Appendix A) advises the Berkeley Lab Director on the policies that determine the impact and performance of the NERSC Center, and the NERSC Users Group (Appendix C) advises the NERSC management and provides feedback from the user community. DOE program management is provided by the Office of Advanced Scientific Computing Research (Appendix E), with advice from the Advanced Scientific Computing Advisory Committee (Appendix F).

This section of the Annual Report gives an overview of the research supported by NERSC and points out some of the year's achievements, which are described further in the Science Highlights section.



Figure 1. NERSC MPP usage by institution type, FY01.
 


Figure 2. Leading DOE laboratory usage at NERSC, FY01 (>100,000 MPP processor hours).
 

 

 
Figure 3. Leading academic and related usage at NERSC, FY01 (>100,000 MPP processor hours); includes industry and non-DOE government labs.
 

 

 
Figure 4. NERSC MPP usage by scientific discipline, FY01.
 

Advanced Scientific Computing Research

Research sponsored by DOE's Office of Advanced Scientific Computing Research (ASCR) is having an impact in a number of scientific fields. For example, predicting the formation of pollutants in flames requires detailed modeling of both the carbon chemistry in the flame and nitrogen chemistry that leads to pollutants. Recent flame simulations, performed using a parallel adaptive mesh refinement algorithm, modeled the behavior of 65 chemical species and 447 reactions; the computed results showed excellent agreement with experimental data and provided new insights into the mechanisms of pollutant formation.

Computational studies are also contributing to our understanding of transverse jets, which have been studied extensively for their relevance to aerodynamics and to environmental problems such as pollutant dispersion from chimneys or the discharge of effluents into the ocean. The mixing and combustion dynamics of transverse jets are important to a variety of engineering applications, including industrial furnaces, gas turbines, and oil wells.

ASCR supports a variety of research in computer science and applied mathematics. Some of the more general research involves developing and implementing highly efficient, scalable algorithms for solving sparse matrix problems on various classes of high-performance computer architectures. Sparse matrix problems are at the heart of many scientific and engineering applications, including fusion energy, accelerator physics, structural dynamics, computational chemistry, and groundwater simulation. Improved algorithms are expected to have a significant impact on the performance of these applications.
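To make the underlying data structure concrete, the sketch below implements a matrix-vector product in the compressed sparse row (CSR) format, a standard storage scheme at the core of many sparse solvers. This is an illustration only; the 4x4 matrix is a made-up example, not drawn from any NERSC application.

```python
# Sketch of a sparse matrix-vector product using the compressed sparse row
# (CSR) format. Only the nonzero entries are stored and touched, which is
# what lets sparse kernels scale to very large, mostly-zero matrices.

def csr_matvec(values, col_idx, row_ptr, x):
    """Compute y = A @ x for a matrix stored as (values, col_idx, row_ptr)."""
    y = [0.0] * (len(row_ptr) - 1)
    for i in range(len(y)):
        # Entries of row i live in values[row_ptr[i]:row_ptr[i+1]].
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += values[k] * x[col_idx[k]]
    return y

# Hypothetical example matrix (6 nonzeros out of 16 entries):
# A = [[4, 0, 0, 1],
#      [0, 3, 0, 0],
#      [0, 0, 2, 0],
#      [1, 0, 0, 5]]
values  = [4.0, 1.0, 3.0, 2.0, 1.0, 5.0]
col_idx = [0, 3, 1, 2, 0, 3]
row_ptr = [0, 2, 3, 4, 6]

print(csr_matvec(values, col_idx, row_ptr, [1.0, 1.0, 1.0, 1.0]))
# prints [5.0, 3.0, 2.0, 6.0]
```

Production solvers add much more on top of this kernel (reordering, preconditioning, parallel partitioning), but the storage idea is the same.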

Research jointly funded by ASCR and the Office of Biological and Environmental Research has resulted in a global optimization algorithm that performed very well in an international competition to test the performance of various methods of protein structure prediction. The global optimization method produced the best prediction for one of the most difficult targets of the competition, a new fold protein of 240 amino acids. This method relies less on information from databases and more on energy and solvation functions, enabling it to do a reasonable job of discriminating misfolds from correct folds even in relatively novel proteins.


Basic Energy Sciences

NERSC provides computational support for a large number of materials sciences, chemical sciences, geosciences, and engineering projects sponsored by DOE's Office of Basic Energy Sciences.

Computational materials scientists continue to make important discoveries about a wide range of technologically significant materials, including silicon carbide semiconductors, cadmium selenide nanocrystals, colossal magnetoresistance oxides, and iron-manganese/cobalt interfaces. One research group is developing a first-principles approach to the atomic-scale design of novel catalytic materials, tailored to perform specific reactions with the desired activity and selectivity. Custom-designed catalysts could have widespread applications in pollution prevention technologies, fuel cells, and industrial processes.

Alex Zunger, one of the pioneers of first-principles methods for predicting properties of solids, was named the 2001 recipient of the Aneesur Rahman Prize for Computational Physics by the American Physical Society. The prize is presented annually to an individual for "outstanding achievement in computational physics research." Zunger, head of the Solid State Theory Group at DOE's National Renewable Energy Laboratory in Colorado, was cited for his "pioneering work on the computational basis for first-principles electronic theory of solids." Zunger developed theoretical methods for quantum-mechanical computation and prediction of the properties of solids from atomic numbers and the laws of quantum physics. In the 15-plus years since this work was first published, his techniques have become the standard tools for predicting properties of solids from first principles. Zunger's current research has achieved detailed predictions of the effects of nanoscale atomic structures on the electronic and optical properties of semiconductor systems.

Recent simulations of electron-CO2 scattering represent the first time that all aspects of an electron-polyatomic collision, including not only the determination of the fixed-nuclei electronic cross sections but also a treatment of the nuclear dynamics in multiple dimensions, have been carried out entirely from first principles. This computational approach to collision processes will soon be extended to allow study of systems containing more than two electrons. Electron collision processes are central to problems of interest to DOE, playing a key role in such diverse areas as fusion plasmas, plasma etching and deposition, and waste remediation.


Biological and Environmental Research

DOE's Office of Biological and Environmental Research is a major supporter of global climate studies as well as computational biological research using NERSC resources. Because the emission of carbon dioxide into the atmosphere from fossil fuel combustion is potentially a major contributor to global warming, the global carbon cycle and carbon sequestration are important areas of research.

This year, the highest-resolution global simulations to date of direct injection of CO2 into the oceans indicated that this may be an effective carbon sequestration strategy. In the simulation, approximately 80% of the injected carbon remains in the ocean permanently, while the remaining 20% leaks back to the atmosphere on a time scale of several hundred years. If these results are confirmed by experiments, direct injection of CO2 into the ocean could play an important role in mitigating global warming.

Climate researchers are also developing the first comprehensive coupled climate/carbon cycle model in the U.S. This model will allow better predictions of future climate, because it will take into account feedback effects of climate change on the absorption of carbon by the ocean and the terrestrial biosphere—effects that are ignored in present U.S. climate models. This model will be more useful to policymakers because it will use CO2 emission rates, rather than atmospheric CO2 concentrations, as the fundamental input, thus allowing direct assessment of the climatic impact of specified rates of fossil fuel burning.

Global climate simulations are typically performed on a latitude-longitude grid, with grid cell sizes of about 300 km. Although coarse-resolution simulations can provide useful information on continental and larger scales, they cannot provide meaningful information on regional scales. Thus, they cannot provide information on many of the most important societal impacts of climate change, such as impacts on water resource management, agriculture, human health, etc. A team from Lawrence Livermore National Laboratory, using supercomputers there as well as NERSC's IBM SP, recently ran a global climate change model at 50 km resolution, the highest spatial resolution ever used. Preliminary analysis of the results seems to indicate that the model is very robust to a large increase in spatial resolution, which may improve the realism of the model on both global and regional scales.

 
Japanese pufferfish (Fugu rubripes)
 
Figure 5. The genome of the Japanese pufferfish (Fugu rubripes) was draft sequenced using the efficient but computationally demanding whole genome shotgun sequencing method. The unusually short fugu genome will provide important clues for deciphering the human genome.
 

On the biological front, a substantial shortcut to the information embedded in the human genome has been taken by an international research consortium with the completion of a draft sequence of the genome of the Japanese pufferfish Fugu rubripes (Figure 5). The DOE's Joint Genome Institute (JGI) was one of the leaders of the consortium. Although the fugu genome contains essentially the same genes and regulatory sequences as the human genome, the entire fugu genome is only one-eighth the size of the human. With far less so-called "junk DNA" to sort through, finding genes and controlling sequences in the fugu genome should be a much easier task. The information can then be used to help identify these same elements in the human genome.

The fugu genome is the first vertebrate genome to be draft sequenced after the human genome, and the first public assembly of an animal genome by the whole genome shotgun sequencing method. After the genome was chopped into pieces small enough to sequence, the challenge was to reassemble it by putting together nearly four million of these overlapping fragments. Solving this puzzle was made possible by a new computational algorithm, JAZZ, developed at JGI to handle large genome assembly projects. JGI used NERSC's IBM SP to develop and test JAZZ, which reconstructs contiguous genome sequences by overlapping the short subsequences.
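The details of JAZZ are beyond the scope of this report, but the core idea of overlap-based assembly can be shown with a toy sketch: repeatedly merge the pair of fragments with the longest suffix-prefix overlap. The fragments below are invented for illustration; a production assembler such as JAZZ must also scale to millions of reads and cope with sequencing errors and repeated sequence, which this sketch ignores.

```python
# Toy illustration of overlap-based sequence assembly. NOT the JAZZ
# algorithm; just the basic idea of reconstructing a sequence from
# overlapping fragments.

def overlap(a, b):
    """Length of the longest suffix of a that is also a prefix of b."""
    for n in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def greedy_assemble(frags):
    """Greedily merge the pair of fragments with the largest overlap."""
    frags = list(frags)
    while len(frags) > 1:
        n, a, b = max(((overlap(a, b), a, b)
                       for a in frags for b in frags if a is not b),
                      key=lambda t: t[0])
        frags.remove(a)
        frags.remove(b)
        frags.append(a + b[n:])  # join, keeping the overlapping bases once
    return frags[0]

# Made-up fragments of a short sequence:
print(greedy_assemble(["GGATTAC", "TTACAGG", "CAGGCAT"]))
# prints GGATTACAGGCAT
```

Real shotgun assembly is far harder because overlaps can be spurious (repeats) or imperfect (sequencing errors), which is why a dedicated algorithm like JAZZ and supercomputer time were needed.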


Fusion Energy Sciences

Research funded by the Office of Fusion Energy Sciences reached a milestone this year with the first simulation of turbulent transport in a full-sized reactor plasma by researchers at the Princeton Plasma Physics Laboratory. This breakthrough full-torus simulation, which produced important and previously inaccessible new results, used 1 billion particles, 125 million spatial grid points, and 7,000 time steps. It was made feasible by a new generation of software and hardware—better physics models and efficient numerical algorithms, along with NERSC's new 5 teraflop/s IBM SP.

This simulation addressed a key issue in designing fusion reactors: the realistic assessment of the level of turbulent transport driven by microscopic-scale instabilities (ITG modes). Turbulent transport is a major concern because it can lead to loss of plasma confinement. Until very recently, the effects of turbulent transport were assessed by extrapolating to larger reactors the transport properties observed in smaller experimental devices, an approach that relies on models of transport scaling whose reliability has often been debated. The new simulation of a reactor-sized plasma produced a result that was as welcome as it was surprising: as the size of the plasma increased, the turbulent transport reached a plateau and leveled off. This "rollover" from one scaling regime to another will be an important topic for future research.

Major progress has also been made in understanding magnetic reconnection, a sudden release of magnetic energy which is the cause of large-scale explosive events in plasma systems. These events include solar flares, storms in the earth's magnetosphere, and sawtooth instability in tokamak experiments. In magnetic reconnection, the magnetic field self-annihilates in locations where it reverses direction, transferring its energy to plasma flows and intense, high-energy beams. The result in fusion experiments is sawtooth instability, which involves sudden changes in magnetic topology and plasma temperature.

The short time scale of this energy release cannot be explained by resistive magnetohydrodynamic models, which break down at the small spatial scales where magnetic reconnection occurs. At these scales, recent simulations show that whistler and kinetic Alfvén waves dominate the dynamics. The dispersive property of these waves causes reconnection to occur quickly in the simulation, consistent with observations, even when the out-of-plane magnetic field is large and/or the system size is very large. The new model resolves the longstanding discrepancy in the energy release time between models and observations of magnetic reconnection, and recently reported evidence of whistler waves in the magnetosphere bolsters the theory. Computations at NERSC were essential to the discovery and demonstration of the role of dispersive waves.

High Energy and Nuclear Physics

The DOE Office of High Energy and Nuclear Physics sponsors major experimental facilities and theoretical studies, as well as computational simulations and analyses of experimental data.

One of this year's most significant physics discoveries provides a solution to a 30-year-old mystery—the puzzle of the missing solar neutrinos. Since the early 1970s, several experiments have detected neutrinos arriving on Earth, but they have found only a fraction of the number expected from detailed theories of energy production in the Sun. This meant there was something wrong with either the theories of the Sun, or our understanding of neutrinos.

New data from the Sudbury Neutrino Observatory (SNO) show that the solution lies not with the Sun, but with the neutrinos, which change as they travel from the core of the Sun to the Earth. The new results show that the total number of electron neutrinos produced in the Sun is just as predicted by solar models, but the neutrinos are oscillating in transit, changing in type or "flavor" from electron neutrinos (the flavor produced in the Sun) to muon or tau neutrinos. SNO results also show that solar neutrinos do have some small mass, but not enough to account for much of the dark matter in the Universe. These results will require some adjustments to the Standard Model of fundamental particles.
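For readers unfamiliar with oscillation, the textbook two-flavor vacuum formula gives the probability that an electron neutrino is still an electron neutrino after travelling a distance L: P = 1 - sin^2(2θ) sin^2(1.27 Δm² L / E), with Δm² in eV², L in km, and E in GeV. The sketch below simply evaluates this standard formula; it is not SNO's analysis, which must also account for matter effects inside the Sun, and the numbers used are illustrative.

```python
# Illustrative two-flavor vacuum neutrino oscillation formula.
# Not SNO's analysis; just the standard textbook expression.

import math

def survival_probability(dm2_ev2, theta_rad, L_km, E_GeV):
    """P(nu_e -> nu_e) for two-flavor vacuum oscillation.

    dm2_ev2: mass-squared difference in eV^2; theta_rad: mixing angle;
    L_km: baseline in km; E_GeV: neutrino energy in GeV.
    """
    phase = 1.27 * dm2_ev2 * L_km / E_GeV
    return 1.0 - math.sin(2.0 * theta_rad) ** 2 * math.sin(phase) ** 2

# Sanity check: with zero mixing angle, neutrinos never change flavor.
print(survival_probability(1e-4, 0.0, 1.5e8, 0.01))  # prints 1.0
```

The sin^2(2θ) factor is why a nonzero mixing angle is essential: with θ = 0 the survival probability is always 1, and the solar neutrino deficit could not occur.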

The SNO collaboration has relied on NERSC's HPSS facility for data storage and distribution, and on the Parallel Distributed Systems Facility (PDSF) for algorithm development, detector performance studies, simulations to reduce background data, and analysis of the first year's data.

The STAR detector at Brookhaven National Laboratory also performed well during its first year's run and has made significant progress in mapping out the soft physics regime at the Relativistic Heavy Ion Collider (RHIC). The measured particle ratios were found to be consistent with quark coalescence. The elliptic flow measured in STAR is larger than observed at lower energies and is in agreement with hydrodynamic calculations, suggesting thermalization at an early stage of the collision. Currently, the STAR project yields about 300 gigabytes of data daily, which are transferred from Brookhaven to the NERSC HPSS facility. STAR has about 30 terabytes of data in the NERSC HPSS system; extrapolating from current usage, the project is expected to have about 100 terabytes in HPSS by the end of FY 2002.

Among the year's many accomplishments in lattice QCD studies, one research team carried out a production run on a quenched 20⁴ lattice with a lattice spacing of 0.15 fm. The extent in each direction is thus 3 fm, the largest volume that any lattice calculation has attempted. They also pushed the pion mass as low as ~200 MeV, also a record, and found many interesting features of chiral symmetry at such a low mass and large volume. For example, they found that the zero-mode contribution to the pion propagator is not negligible in the small-time region, and its contamination must be avoided when extracting pion properties such as the pion mass and decay constant. They also found that after nonperturbative renormalization, the pseudoscalar meson decay constant is nearly constant over the range between the strange and up/down quark masses.
