
A Grand Way to Visualize Science
The Laboratory's high-performance computing machines allow researchers to push the frontiers of science.

WHETHER supercomputers are used for modeling the effects of climate change or the complex nature of molecular dynamics, they play a major role at Livermore in scientific discovery. The Laboratory’s high-performance computing (HPC) machines consistently make headlines as some of the fastest, highest-capability computers in the world, enabling simulations of physical processes that could not be investigated by experiment alone.

Livermore’s HPC systems are predominantly dedicated to performing simulations in support of the National Nuclear Security Administration’s Stockpile Stewardship Program for maintaining a safe, secure, and reliable nuclear deterrent. Time is also dedicated on these machines for other research that is important to the Laboratory’s mission and programs, including projects funded through the Laboratory Directed Research and Development (LDRD) Program, which promotes potentially high-payoff projects.

In 2006, Livermore expanded access to its HPC resources to the broader Laboratory community through the Institutional Unclassified Computing Grand Challenge Awards. This program aims to further scientific innovation and at the same time advance supercomputing capabilities. “With Grand Challenge projects, we are looking to push the frontiers of computational science and to achieve scientific breakthroughs that would not be possible without HPC resources,” says Fred Streitz, director of Livermore’s Institute for Scientific Computing Research and chief computational scientist in the Laboratory’s Physical and Life Sciences Directorate. “As a result of the Grand Challenge projects, we’ve seen teams develop new, more complex algorithms that enable them to substantially advance their research.” Last year, more than 400 million computing hours (the number of processors multiplied by the number of hours used) were allocated to teams from across the Laboratory working on unclassified, mission-relevant projects that meet Grand Challenge objectives. In a typical year, 10 to 12 projects are given allocations for a one-year term, with more time provided to those projects that have greater Laboratory relevance and promise of maximum benefit from the HPC machines. Says Streitz, “Projects awarded allocations are expected to receive high-level recognition from mission sponsors, the computing community, and the scientific community at large.”
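To give a concrete sense of the computing-hour unit, the short sketch below works the arithmetic for a hypothetical job; the processor count and runtime are illustrative and are not figures from an actual award.

```python
# Computing hours = number of processors x number of hours used.
# All numbers below are hypothetical, chosen only to illustrate the scale.
processors = 16_384                  # processors assigned to one hypothetical job
hours = 24 * 30                      # running around the clock for a month
computing_hours = processors * hours
print(f"{computing_hours:,} computing hours")  # 11,796,480 -- roughly 3% of 400 million
```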

Divvying Up the Goods
Laboratory researchers apply for Grand Challenge allocations once a year during a formal call for proposals. Each proposal goes through a rigorous review process, which is managed by Streitz and Brian Carnes, director of Multiprogrammatic and Institutional Computing (M&IC). The M&IC program manages time allocations on all the Laboratory’s unclassified supercomputers.

Streitz and Carnes spearhead the Institutional Grand Challenge Awards Committee, which arranges for each proposal to be peer-reviewed by internal and external subject-matter experts. Based on these experts’ assessments and input from the committee, M&IC, and the Laboratory’s Institutional Science and Technology Office, time allocations and priority are awarded. The projects are judged on five main criteria: quality and potential impact of proposed science or engineering; significance and potential impact of proposed computation; ability to effectively utilize a high-performance, institutional computing infrastructure; quality and extent of internal and external collaborations; and alignment with the Laboratory’s science and technology strategic vision.

This article highlights five Tier 1 projects—those with the highest allocations—that were awarded in November 2010 to teams studying a wide range of high-energy-density (HED) physics topics. Simulations are run on the unclassified BlueGene/L and Sierra machines, both of which perform more than 200 trillion floating-point operations per second (teraflops). At the end of this year, award winners will formally present their work during a weeklong Laboratory colloquium.

A Universal Mystery
Understanding the origin and evolution of our universe has been a key scientific pursuit for nearly a century. The big bang theory suggests that roughly 13.7 billion years ago, in an instant lasting no longer than a trillionth of a second, our universe began as a hot, dense plasma. Almost immediately, this primordial mixture rapidly expanded and cooled, producing the first elementary particles and their masses and causing quarks and gluons—the building blocks of all nuclear material—to bind together into stable nuclear particles, in particular protons and neutrons.

Billions of years later, exactly how these events unfolded remains a mystery. In an attempt to help answer the tough questions associated with the origins of the universe, a team led by Livermore physicists Pavlos Vranas, Tom Luu, and Ron Soltz is using its Grand Challenge allocation to simulate the particle events that occurred shortly after the big bang. “We are performing numerical simulations to unravel the mechanism by which mass was created in the visible universe,” says Vranas. “We are also calculating the transition and thermal properties of the quark–gluon plasma and the properties of the nuclear force as it emerges from the interactions of quarks and gluons.”

Experimentally, particle accelerators, such as the Large Hadron Collider in Switzerland, the Tevatron at Fermi National Accelerator Laboratory (Fermilab), and the Relativistic Heavy Ion Collider at Brookhaven National Laboratory, create conditions similar to those that occurred after the big bang. Inside accelerators, particles are smashed together at extremely high energies, generating different particles that in turn provide details about the interaction itself. Simulations on supercomputers help explain current experimental findings and predict future results.

With the help of the unclassified BlueGene/L, the team, which also includes researchers at Yale University, University of California at Davis, Columbia University, University of Washington, Boston University, Fermilab, and Argonne and Los Alamos national laboratories, is modeling a technicolor theory in an attempt to simulate how nuclear particles obtain mass. (Technicolor theories are models of physics beyond the standard model that address the mechanism through which elementary particles acquire mass.)

The team is also modeling quantum chromodynamics (QCD), the theory that explains how quarks and gluons interact to form stable nuclear particles. In both technicolor theories and QCD, particle interactions are modeled in a four-dimensional grid of points connected with links, known as a lattice. (See S&TR, January/February 2008, Quark Theory and Today’s Supercomputers: It’s a Match.)
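The sketch below is a minimal illustration of how such a lattice can be organized in code: a four-dimensional grid in which every site carries one link matrix per direction, and the smallest closed loop of links (a plaquette) is the basic quantity measured. It is an illustrative toy, not the collaboration's production code; the tiny lattice size and the "cold" identity start are assumptions chosen for simplicity.

```python
import numpy as np

# A toy 4D lattice gauge field: each site (x, y, z, t) carries one 3x3 link matrix
# per direction. Here every link is the identity (a "cold" start); real calculations
# sample these matrices statistically and use far larger lattices.
L = 4
links = np.zeros((L, L, L, L, 4, 3, 3), dtype=complex)
links[...] = np.eye(3)

def plaquette(links, site, mu, nu):
    """Normalized trace of the smallest closed loop of links in the mu-nu plane."""
    s = np.array(site)
    unit = np.eye(4, dtype=int)
    s_mu = tuple((s + unit[mu]) % L)      # neighboring site one step in direction mu
    s_nu = tuple((s + unit[nu]) % L)      # neighboring site one step in direction nu
    s = tuple(s)
    loop = (links[s + (mu,)] @ links[s_mu + (nu,)]
            @ links[s_nu + (mu,)].conj().T @ links[s + (nu,)].conj().T)
    return loop.trace().real / 3.0

print(plaquette(links, (0, 0, 0, 0), 0, 1))   # 1.0 for a cold start
```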

“For the first time, we have discovered strong evidence of mass generation in technicolor theories,” says Vranas. “We’ve also begun incorporating our own lattice calculation of the QCD equation of state into a detailed, multistage model of a heavy-ion collision. This research is improving our understanding of how QCD manifests itself as nuclei and how these nuclei interact.”

Ultimately, the team’s research will help predict experimental results from the newest particle colliders. Vranas says, “High-performance computing may be the only way to understand the behavior of strongly interacting physics processes such as technicolor and QCD. It is an indispensable tool for probing deeper into the cosmic mysteries of our time.”

A Carbon Conundrum
How matter began to form after the big bang is just one of the many secrets the universe holds. Another mystery that scientists have been sleuthing for decades involves the process by which stars, such as our Sun, and red giants, such as Betelgeuse, fuse helium particles to create all the carbon in the universe, including that on planet Earth. “The ability to know for certain how fusion happens in stars, especially carbon formation, is considered one of the Holy Grails in nuclear physics,” says Livermore physicist Erich Ormand. Stars such as Betelgeuse are also responsible for the enigmatic process that results in helium and carbon fusing together to create oxygen.

To solve this fusion mystery, scientists must not only define the structure of light nuclei but also identify the exact mechanisms by which these particles interact. “We have used past Grand Challenge allocations to simulate the structure of these nuclei,” says Ormand. “During this Grand Challenge cycle, we are attempting to simulate how the more realistically structured particles interact with each other.” The work is being performed in collaboration with TRIUMF in Canada, University of Arizona, Iowa State University, San Diego State University, and Ohio State University.

At the heart of this research lies the need to develop a deeper understanding of nuclear properties, in particular, how nucleons interact and bind together to create atomic nuclei. For decades, scientists have sought to develop a first-principles approach to nuclear properties, but their efforts have been thwarted by a lack of adequate theory and the computational power necessary to run the calculations. Fortunately, over the last 10 years, significant advances have been made in both areas that could bring this goal to fruition. “With Livermore’s supercomputers, we can simulate two- and three-body reactions, such as deuterium–tritium fusion, occurring at low energies,” says Ormand. “Our research has already shown that pair-wise (two-body) interactions between nucleons are strongly augmented by triplet (three-body) interactions, which not only make nuclei more tightly bound but also alter their low-lying structure, such as the ground-state spin.”

The team is using the no-core shell model combined with the resonating group method and state-of-the-art interaction codes to simulate the quantum states of particles involved in fusion processes. “Our approach is one of the few available today that is capable of simultaneously describing bound and scattered states in light nuclei based on first principles,” says Ormand.

Current experiments are limited in their ability to create the right conditions for studying complex nuclear properties, but HPC machines coupled with advanced codes enable researchers to delve into never-before-seen physical processes. “With these simulations, we can model fusion reactions occurring at temperatures much lower than those that can be achieved in experiments,” says Ormand. Ultimately, what the team discovers about nuclear properties will be instrumental in expanding scientific understanding of the complex physical processes involved in the universe, including the fusion reactions that occur in stars and in experiments performed at the Laboratory’s National Ignition Facility (NIF).

Graph showing results from a technicolor simulation.
A Grand Challenge team uses a technicolor theory to attempt to explain how elementary particles obtain their mass. The team's recent simulation results show that as the input mass is decreased to zero, nonzero particle masses are generated solely as a result of the inherent technicolor dynamics. That is, the inherent technicolor dynamics can produce mass on their own, without requiring theorists to put it in "by hand." Technicolor simulations may one day be able to explain why the electron has the mass we observe.

Building a “Solid” Understanding
NIF—the world’s most energetic laser—will be a hub for conducting the next generation of HED physics experiments and, as such, will provide scientists with a new, more powerful tool for studying materials under extreme temperatures, pressures, and strain rates. (See S&TR, April/May 2010, A Stellar Performance.) Researchers will need supercomputers to analyze the atomic-level processes occurring within these experiments. Livermore physicist Robert Rudd says, “Simulations are like microscopes that allow us to see minute details that would not be visible by reviewing the experimental data.” A team led by Rudd, along with collaborators from the University of Oxford, is simulating the extreme deformation of solid metals—tantalum and vanadium—under ramp-wave compression.

The team’s research is directly correlated with a three-year LDRD strategic initiative led by Bruce Remington that is developing the capability on multibeam lasers, such as NIF, both to drive high-pressure ramp waves in solids and to generate x rays for in situ diffraction studies. X-ray diffraction techniques allow scientists to visualize the crystal lattice structure—the arrangement of atoms—within materials, which affects the material’s behavior. “Using our Grand Challenge allocation, we can model what is happening in an experiment and account for every single atom in the simulated material,” says Rudd. “Our goal is to predict the microscopic processes of plasticity that occur in laser-driven material experiments and to develop novel predictive simulations of plasticity in high-pressure ramp waves.”
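As a simple illustration of how diffraction reveals compression, the sketch below applies Bragg's law, n·λ = 2d·sin(θ): squeezing the lattice planes closer together shifts the diffraction angle. The x-ray wavelength and plane spacings are hypothetical values chosen for illustration, not data from a NIF experiment.

```python
import numpy as np

# Bragg's law relates the x-ray wavelength and diffraction angle to the spacing d
# between crystal planes. Compressing a material shrinks d, which shifts the
# diffraction angle -- the signature an in situ diffraction experiment looks for.
wavelength = 0.154e-9            # x-ray wavelength in meters (~8-keV photons)
d_ambient = 2.33e-10             # hypothetical plane spacing at ambient pressure (m)
d_compressed = 0.9 * d_ambient   # the same planes after ~10% compression

for d in (d_ambient, d_compressed):
    theta = np.degrees(np.arcsin(wavelength / (2 * d)))   # first-order (n = 1) angle
    print(f"d = {d:.3e} m  ->  Bragg angle = {theta:.1f} degrees")
```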

Understanding how ramp waves initiate phase changes in materials is of particular interest to scientists because unlike shock waves, ramp waves generate a relatively small amount of heat in materials during compression. As a result, materials remain in their solid state longer, enabling them to be compressed at much higher pressures. “With these experiments, we can test a material’s behavior at relatively low temperatures,” says Rudd. The ability to compress materials at lower temperatures and higher pressures is relevant to many Laboratory missions including fusion research.

Modeling ramp waves requires more computational power than modeling shock waves because ramp wavefronts are less abrupt and evolve over larger temporal and spatial scales. Through its Grand Challenge allocation, the team now has access to more processors than before on the Laboratory’s HPC machines. With the Livermore-developed molecular dynamics code dccMD, the researchers can use hundreds of thousands of processors at increased efficiency to model how atoms exert force on other atoms in a material. The code incorporates a modeling approach called generalized pseudopotential, which considers the quantum mechanical bonding of the atoms. The latest version of dccMD handles the large density variations associated with dynamic compression. This upgraded code improves the quantitative, predictive simulation of solid deformation in ramp-wave compression experiments.
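The basic molecular dynamics loop that such codes parallelize is simple to state: compute the force each atom feels from its neighbors, then advance positions and velocities in small time steps. The sketch below shows that loop for a three-atom toy system using a generic Lennard-Jones pair potential as a stand-in; dccMD itself uses a quantum-mechanically based generalized pseudopotential model and runs across hundreds of thousands of processors, neither of which this sketch attempts to reproduce.

```python
import numpy as np

# A toy molecular dynamics loop (illustrative only, reduced units). A simple
# Lennard-Jones pair potential stands in for the interatomic forces used in
# production codes.
def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces on a small cluster of atoms."""
    forces = np.zeros_like(pos)
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = np.dot(r, r)
            sr6 = (sigma**2 / d2) ** 3
            # Force from V(r) = 4*eps*(sr^12 - sr^6), directed along r
            f = 24.0 * eps * (2.0 * sr6**2 - sr6) / d2 * r
            forces[i] += f
            forces[j] -= f
    return forces

def velocity_verlet_step(pos, vel, mass, dt):
    """Advance positions and velocities by one time step."""
    vel += 0.5 * dt * lj_forces(pos) / mass
    pos += dt * vel
    vel += 0.5 * dt * lj_forces(pos) / mass
    return pos, vel

# Three atoms placed slightly farther apart than the potential minimum.
pos = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0], [0.0, 1.2, 0.0]])
vel = np.zeros_like(pos)
for _ in range(500):
    pos, vel = velocity_verlet_step(pos, vel, mass=1.0, dt=0.005)
print(pos)
```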

In conjunction with the LDRD strategic initiative, these simulations will facilitate development of a new laser-based x-ray diagnostic for probing the lattice structure and properties of materials at ultrahigh pressures. “Through this Grand Challenge allocation,” says Rudd, “we will simulate and interpret diffraction signals derived from next-generation HED experiments and provide an improved predictive modeling capability for Laboratory research.”

Plot showing simulation results of two- and three-body nuclear reactions occurring at low energies.
Another Grand Challenge team uses Livermore's supercomputers to simulate two- and three-body nuclear reactions occurring at low energies. This plot shows the first ab initio calculation (red line) of a deuterium–helium-3 fusion reaction as a function of energy for the incident deuterium nucleus compared with laboratory experimental data (points). Ab initio theories allow scientists to extrapolate data to the very low energies required for modeling stars.

The Fast and the Curious
Achieving ignition at NIF is a key mission at Lawrence Livermore. The National Ignition Campaign, currently under way, is designed to demonstrate thermonuclear burn and energy gain for the first time in a laboratory. Although indirect-drive inertial confinement fusion is the predominant method by which NIF scientists are attempting to attain ignition, other methods are also being researched, including a technique called fast ignition. In this process, the target is first heated and compressed using the main laser, then a separate high-intensity, ultrashort-pulse laser ignites the fuel. Fast ignition has the potential to offer higher energy gains and has more relaxed requirements for implosion velocity and symmetry compared to conventional indirect-drive experiments.

For several years, Livermore physicist Andreas Kemp has been studying ultraintense laser–plasma interactions for fast ignition and for new types of HED physics experiments at NIF. To this end, Kemp has garnered multiple Grand Challenge allocations over the last five years. As part of earlier projects, Kemp helped develop a collisional particle-in-cell code called PSC that scales to thousands of processors for modeling short-pulse laser interaction with preformed plasma. This code has been further extended so that within a simulation, it can handle the orders-of-magnitude changes in plasma density found in experiments while achieving significantly improved efficiencies. Understanding laser–plasma interactions at ultrahigh intensities, as well as electron transport and velocity distributions, is paramount to making fast ignition a reality.
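The particle-in-cell method behind codes such as PSC repeats three steps each time step: deposit the particles' charge onto a grid, solve for the fields on that grid, and push the particles in those fields. The sketch below shows a bare-bones, one-dimensional electrostatic version of that cycle in normalized plasma units; it omits the laser, collisions, relativity, and the density-spanning hybrid algorithm described above, so it illustrates the method rather than PSC itself.

```python
import numpy as np

# Minimal 1D electrostatic particle-in-cell cycle (illustrative, normalized units).
ng, n_part, L, dt = 64, 10_000, 2 * np.pi, 0.1
dx = L / ng
rng = np.random.default_rng(0)
x = rng.uniform(0, L, n_part)        # electron positions
v = rng.normal(0.0, 0.1, n_part)     # electron velocities (small thermal spread)

for _ in range(200):
    # 1. Deposit: count electrons per cell to form the charge density,
    #    against a fixed, neutralizing ion background of density 1.
    cells = (x / dx).astype(int) % ng
    n_e = np.bincount(cells, minlength=ng) * (L / n_part / dx)
    rho = 1.0 - n_e
    # 2. Field solve: Poisson's equation d2(phi)/dx2 = -rho, in Fourier space.
    k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
    k[0] = 1.0                        # avoid dividing by zero for the mean mode
    phi_k = np.fft.fft(rho) / k**2
    phi_k[0] = 0.0
    E = -np.real(np.fft.ifft(1j * k * phi_k))
    # 3. Push: accelerate each electron in the field at its cell, then move it.
    v -= E[cells] * dt                # electron charge is -1 in these units
    x = (x + v * dt) % L
```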

Most recently, Kemp has been working with colleagues at Livermore, the University of Nevada at Reno, and the University of Munich in Germany to further improve the PSC code. “We now propose to push the state of the art of short-pulse modeling by combining our particle-based hybrid algorithm with the radiation hydrodynamics code Hydra,” says Kemp. The team will use this new version of the code to simulate the properties of solid-density plasmas. “We are able to model HED physics experiments on Livermore’s Titan laser with almost quantitative accuracy. Our computational goal is to facilitate the first integrated simulation of a fast-ignition experiment that includes everything from the hydrodynamics of the capsule implosion to the high-power-laser interaction and heating of the dense core to ignition conditions,” says Kemp. The simulations represent laser–plasma events occurring over picosecond timescales and will serve as a basis for designing targets for fast-ignition experiments.

“The sheer magnitude of the Grand Challenge computing allocations and the quality and consistency of the machines enable us to conduct simulations of the temporal and spatial domains found in actual experiments, which would not be possible with conventional allocations,” says Kemp. “The work we are undertaking is ambitious, but with the Laboratory’s supercomputers, we can create more realistic simulations that will eventually lead to a full-scale computer model of a fast-ignition experiment.” Similar to the work done by Rudd, this research supports an LDRD strategic initiative for developing advanced inertial confinement fusion designs and diagnostics that will elucidate high-resolution details of HED experiments.

Rendering of a solid material compressed in a ramp wave.
As a solid material is compressed in a ramp wave, shear stresses can develop that are sufficiently strong to cause defects to form in the crystal lattice. Shown here is the nucleation of twins from a perfect single crystal of tantalum. The red spheres represent atoms in twin boundaries. Surrounding atoms are transparent to show the morphology of the twin. (Rendering by Kwei-Yu Chu.)

A Plethora of Particles
In 2009, Livermore scientists created the largest concentration of laser-generated positrons ever diagnosed in a laboratory setting by irradiating a millimeter-thick gold target with an ultrafast, high-energy laser. (See S&TR, January/February 2009, Simulations Explain High-Energy-Density Experiments.) Until that time, the prevailing belief was that positrons could be produced more effectively using ultrathin foil targets a few micrometers in thickness. However, by simulating the laser–plasma–solid interaction on Livermore supercomputers, physicist Scott Wilks showed that thicker targets were a more efficient vehicle for generating positrons, increasing the yield many times over. LDRD funded the initial work on this project.

When a laser pulse hits the thicker target, a plasma containing electrons and ions is created on the surface of the material. The strong electric field produced by the laser accelerates the electrons within the plasma, resulting in electrons with extremely high kinetic energies of more than 1 megaelectronvolt, the approximate threshold energy for electron–positron pair production. These hot electrons can interact with atoms in the target either to create electron–positron pairs directly or to create photons that in turn interact with other atoms in the solid to produce an electron–positron pair. Although the additional step of first creating photons may seem roundabout, it produces many more electron–positron pairs because the cross section for pair creation is about 100 times higher for photons than for electrons. These simulations, coupled with experimental data, proved that this process generated tens of billions of positrons in about a picosecond (a trillionth of a second).

Using this year’s Grand Challenge allocation, Wilks and collaborators from Ohio State University are applying a hybrid particle-in-cell plus fluid code, known as LSP, to simultaneously model positron generation and hot-electron excitation in three dimensions. LSP provides large-scale plasma simulations that model laser–plasma interactions. The model generates and tracks charged particles and photons through solid dense plasmas and calculates the interaction between particles and the electromagnetic fields they create. “This work will allow us to perform, for the first time, an integrated simulation that includes all the physical effects believed to occur in laser-generated pair production, and to specifically see the connection between the escaping positrons and the hot electrons that created them,” says Wilks. Although LSP is not typically used for positron research, the team has developed a plug-in Monte Carlo code that works in conjunction with LSP to effectively model positron origination and propagation.

Electron energy and distribution cannot be measured through experiments alone because the electron excitation process occurs in front of the solid surface, and the large electric potential inhibits electrons from escaping the solid. However, because a positron’s energy is directly correlated with the energies of the electron and photon that created it, positron measurements can provide insight into the actual electron distributions inside the solid. Using their Grand Challenge allocation, Wilks and his colleagues have access to the computational resources necessary for simulating every aspect of these interactions. The data they obtain about the excitation and propagation of hot electrons and positrons can be used to design experiments for a variety of HED physics applications. “This knowledge could have tremendous implications for several applications involving ultraintense lasers,” says Wilks. “Tabletop acceleration of ions may be useful in cancer therapy and homeland defense, and the successful demonstration of fast ignition may lead to the practical application of nuclear fusion for power generation.”

Simulation of the effect of preformed plasma in a fast-ignition target.
A Grand Challenge team is performing particle-in-cell simulations to explore the effect of preformed plasma in a fast-ignition target that uses a cone-shaped component to focus the laser into the dense core. In the results shown here, the two white lines indicate the 100-millijoule and 7.5-millijoule prepulses that were injected into the target before the main pulse during two separate simulations. The simulated K-alpha x-ray images compare well with experimental results.

Simulation showing physical effects in laser-generated electron-positron pair production.
A Grand Challenge allocation is allowing researchers to perform, for the first time, an integrated simulation that includes all the physical effects believed to occur in laser-generated electron–positron pair production and to see the connection between the escaping positrons and the hot electrons that created them. A superposition of the accelerating electric field (surface plot) on the rear of a millimeter-long target at the time positrons (colored dots) escape shows three “bunches” of positrons at three distinct times: inside the target (far left), exiting the target (middle, near peak of accelerating field), and after completely leaving the target (far right). The relative height of each bunch is a measure of the positrons’ energy and indicates that the positrons gain several tens of megaelectronvolts as they exit the target.

Seeking the Unknown
Livermore’s Grand Challenge program is an excellent mechanism by which scientists can achieve scientific breakthroughs. “When we allot time on our supercomputers, we want researchers to use these machines to expand their scientific scope and go beyond what is currently known in their fields,” says Streitz. Over the last five years, Grand Challenge allocations have provided Livermore scientists with access to some of the most advanced tools for conducting research, allowing them to take scientific exploration to whole new dimensions and to visualize science in truly “grand” ways.

—Caryn Meissner

Key Words: Grand Challenge Program, high-energy-density (HED) physics, high-performance computing (HPC), Institute for Scientific Computing Research, Multiprogrammatic and Institutional Computing (M&IC), supercomputing.

For further information contact Fred Streitz (925) 423-3236 (streitz1@llnl.gov).

