Can Exascale Computing Help Us Understand Extreme Materials?

Some things are difficult to understand—higher math, relationships, the appeal of reality TV—whereas other things are understood to be difficult—brain surgery, two-year-olds, learning to speak Finnish. Then there’s the response of a material hit by a shock wave, which is not only difficult to understand but also so difficult to simulate that even the world’s most powerful computers currently can’t do it.

A shock wave is an extremely energetic disturbance that moves through matter at supersonic speeds. Like a flash flood tearing through a slot canyon, it arrives without warning. Matter suddenly finds itself immersed in the wild pressure and temperature maelstrom that trails the wall-like shock front. As the shock propagates through, say, a solid, it generates enormous mechanical stresses that can deform, crack, even shatter the material. Even if there is no structural damage, will the material properties be the same as they were before?

Only select groups of people—demolition experts, makers of body armor, certain types of physicists—know that the answer to that question is “We don’t know” and are frustrated by it. But the much larger materials-science community is similarly frustrated by a related problem: the inability to produce the next generation of so-called extreme materials that can survive and function in extreme environments. The core of an advanced nuclear reactor is an extreme environment. So is the radiation-filled vacuum of near-Earth space or any environment where a shock wave comes to visit.

Extreme materials deserve our attention because if researchers could create polymers that withstand high temperatures and pressures, alloys that resist corrosion, or Earth-friendly materials that can tolerate excessive exposure to chemicals, radiation, or electromagnetism, then a bevy of already-thought-of advanced technologies could come off the drawing boards and possibly turn our world into the sustainable, energy-secure übercosm we’d all like it to be. But the materials community hasn’t been able to produce designer materials, and a 2009 Department of Energy (DOE) report, Scientific Grand Challenges for National Security, suggests that what’s lacking is a “predictive, mechanistic understanding of real materials,” a real material having a more complicated microscopic structure than a simple material such as a single crystal of pure copper.

“We can model simple metals pretty well,” says Tim Germann, a physicist at Los Alamos and an expert on materials modeling, “and have had some success with more complex materials. But our ability to predict the properties of real, engineering-scale materials in extreme environments is close to nil.”

Extreme materials and shocked matter are of particular interest to scientists at Los Alamos National Laboratory because one of the Laboratory’s missions is to ensure the continued safety, reliability, and performance of our nation’s nuclear deterrent. It so happens that the performance of a nuclear weapon depends intimately on how its components fare when hit by the shock waves generated inside the detonated device.

After five decades of nuclear tests followed by another two decades of laboratory experiments, computer simulations, and hands-on inspections, weapons scientists know how the weapons in the nuclear arsenal work and how to keep them safe. They know the weapons will perform as expected when triggered properly and won’t perform at all when not—devices will not go nuclear if dropped or jarred.

But in the absence of any future nuclear tests, how long can such certainty be maintained? The interior of a nuclear weapon is an extreme environment. The radioactive decay of the nuclear materials produces radiation that changes the internal structure of the weapons components, atom by atom. All of the weapons in the stockpile were originally fielded decades ago, so at what point does the sum of many individually insignificant changes become significant? The answer is not known to any acceptable degree of accuracy, and gaining such knowledge will require the ability to simulate chunks of matter containing perhaps a billion billion atoms, simulations so challenging that they will take an ultra-supercomputer operating at phenomenal speed to do them. That means moving on up to the exascale.
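To put “a billion billion atoms” in perspective, here is a minimal back-of-envelope sketch in Python. The per-atom figures are assumptions chosen only for illustration: roughly 48 bytes of state and about a thousand floating-point operations per atom per timestep, typical ballpark values for a classical molecular-dynamics simulation, run on a machine sustaining an exascale rate of 10^18 operations per second.

```python
# Back-of-envelope arithmetic: what "a billion billion atoms" demands of a computer.
# The per-atom figures are assumed ballpark values for classical molecular dynamics,
# not measurements from any particular materials or weapons code.

ATOMS = 1e18                   # "a billion billion" atoms
BYTES_PER_ATOM = 48            # assumed: 3 position + 3 velocity doubles, 8 bytes each
FLOPS_PER_ATOM_PER_STEP = 1e3  # assumed: a short-range force evaluation per atom
EXASCALE_RATE = 1e18           # exascale: 10^18 floating-point operations per second

memory_exabytes = ATOMS * BYTES_PER_ATOM / 1e18
seconds_per_timestep = ATOMS * FLOPS_PER_ATOM_PER_STEP / EXASCALE_RATE

print(f"Storing positions and velocities alone: ~{memory_exabytes:.0f} exabytes")
print(f"One simulation timestep at exascale:    ~{seconds_per_timestep:.0f} seconds")
```

Even under these generous assumptions, just holding the atoms’ positions and velocities takes tens of exabytes, and a single timestep occupies the full machine for roughly a quarter of an hour. Since molecular-dynamics timesteps are typically on the order of a femtosecond, following a shock for even a nanosecond would mean about a million such steps, which is why exascale is the entry point for this problem rather than the finish line.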

Read the entire story (pdf). Article courtesy of 1663, Los Alamos National Laboratory's science and technology magazine.

For more information, contact Los Alamos National Laboratory.