ASCR Monthly Computing News Report - January 2012



This monthly survey of computing news of interest to ASCR is compiled by Jon Bashor (JBashor@lbl.gov) with news provided by Argonne, Fermi, Lawrence Berkeley, Lawrence Livermore, Los Alamos, Oak Ridge, Pacific Northwest and Sandia national labs.

In this issue:

Research News
Bubbles Help Break Energy Storage Record for Lithium-Air Batteries
PNNL Scientists Use NERSC to Plot the Great Gas Hydrate Escape
PNNL Researchers, Collaborators Develop Novel Uncertainty Quantification Algorithms
Team Uses NERSC to Develop Largest-Ever 3D Color Map of Universe
Researchers Use Jaguar to Better Understand Solar Storm/Magnetosphere Interaction
PNNL Researchers Develop Automatic Computer Learning for Power Grid Modeling

People
Argonne's Snir Honored as One of HPCwire's "People to Watch" in 2012
Warren Washington's Keynote Address Features ALCF Groundbreaking Simulation
Sandia Researcher Bochev Lectures in the Netherlands and Singapore
ESnet Staff Share Expertise, 100G Experiences at Joint Techs Meeting

Facilities and Infrastructure
Powered by NERSC, a Database of Billions of Genes
Can Cloud Computing Meet Scientific Computing Requirements of DOE Researchers?
The ALCF Is Getting Q'd Up: First Racks of Blue Gene/Q Arrive
OLCF Computer Scientists Collect Computing Tools for Next-Generation Machines

Outreach and Education
OLCF Staffers Share Career, Internship Info with Next Generation of Women in Physics

Bubbles Help Break Energy Storage Record for Lithium-Air Batteries

One of the biggest weaknesses of today's electric vehicles (EVs) is battery life-most cars can only go about 100-200 miles between charges. But researchers hope that a new type of battery, called the lithium-air battery, will one day lead to a cost-effective, long-range EV that could travel 300 miles or more between charges. Using supercomputers at the National Energy Research Scientific Computing Center (NERSC) together with microscopy, a team of researchers from Pacific Northwest National Laboratory (PNNL) and Princeton University recently built a novel graphene membrane that could produce a lithium-air battery with the highest energy capacity to date.

Composed of an atom-thick sheet of carbon, graphene is both the strongest and most conductive material ever measured. The new membrane is built from graphene formed around bubbles and resembles broken eggshells. The researchers believe that this black, porous material could replace traditional smooth graphene sheets in lithium-air batteries, which become clogged with tiny particles during use. Because the material does not rely on platinum or other precious metals, the researchers say that its potential cost and environmental impact are significantly lower than those of current technology.

Read more.

[Image: Graphene membrane] Using a new approach, the PNNL team built a graphene membrane for use in lithium-air batteries, which could one day replace conventional batteries in electric vehicles. Resembling coral, this porous graphene material could replace the traditional smooth graphene sheets, which become clogged with tiny particles during use.

PNNL Scientists Use NERSC to Plot the Great Gas Hydrate Escape

For some time, researchers have explored flammable ice as a low-carbon or alternative fuel, or as a place to store carbon dioxide. Now, a computer analysis of the ice-and-gas compound, known as a gas hydrate, reveals key details of its structure. The results show that hydrates can hold hydrogen at an optimal capacity of 5 weight-percent, a value that meets a Department of Energy target and would make gas hydrates practical and affordable for hydrogen storage.
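
To make the 5 weight-percent figure concrete, the short sketch below computes the hydrogen storage capacity implied by a given H2-to-water molar ratio. It is a minimal back-of-the-envelope illustration in Python; the molar ratio shown is an assumption chosen to land near 5 wt%, not a value taken from the PNNL paper.

```python
# Back-of-the-envelope check of hydrogen storage capacity in a gas
# hydrate, expressed as weight-percent (mass of H2 over total mass).
# The H2:H2O molar ratio below is an illustrative assumption, not a
# value from the PNNL study.

M_H2 = 2.016    # molar mass of H2 (g/mol)
M_H2O = 18.015  # molar mass of H2O (g/mol)

def hydrogen_wt_percent(n_h2_per_h2o: float) -> float:
    """Weight-percent of H2 for a given H2:H2O molar ratio."""
    m_h2 = n_h2_per_h2o * M_H2
    return 100.0 * m_h2 / (m_h2 + M_H2O)

# Example: roughly 0.47 H2 molecules per water molecule gives ~5 wt%.
print(f"{hydrogen_wt_percent(0.47):.2f} wt%")  # prints ~5.00 wt%
```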

The analysis marks the first time researchers have accurately quantified the molecular-scale interactions between the gases - either hydrogen or methane, aka natural gas - and the water molecules that form cages around them. A team of researchers from the Department of Energy's Pacific Northwest National Laboratory published the results online in Chemical Physics Letters on December 22, 2011. The simulations were performed on the Carver and Hopper systems at DOE's National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory (Berkeley Lab) in California.

Read more.

[Image: Trapped methane] The methane trapped in frozen water burns easily, creating ice on fire.

PNNL Researchers, Collaborators Develop Novel Uncertainty Quantification Algorithms

Realistic representation of the stochastic inputs associated with various sources of uncertainty in fluid flow simulations leads to computationally prohibitive high-dimensional representations. To address this problem-known as the "curse of dimensionality"-PNNL researchers and their collaborators at Brown University investigated adaptive ANOVA decomposition as an effective dimension-reduction technique for modeling steady incompressible and compressible flows. The researchers developed three different adaptivity criteria and compared the adaptive ANOVA method against three other uncertainty quantification methods (sparse grid, Monte Carlo, and quasi-Monte Carlo) to evaluate its relative efficiency and accuracy. They demonstrated that even draconian truncations of the ANOVA expansion lead to accurate solutions, with a speedup of three orders of magnitude compared to Monte Carlo and at least one order of magnitude compared to sparse grids for comparable accuracy.
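
The sketch below illustrates the basic adaptivity idea behind this class of methods: expand the output in ANOVA component functions, estimate how much variance each first-order term contributes, and retain higher-order terms only among the dimensions judged active. This is a minimal anchored-ANOVA sketch with a single variance-based criterion; the test function, tolerance, and quadrature are illustrative assumptions, and the three criteria developed in the paper differ in detail.

```python
# A minimal sketch of adaptive (anchored) ANOVA dimension reduction.
# Idea: expand f(x) = f0 + sum_i f_i(x_i) + higher-order terms, and keep
# a second-order term (i, j) only if both first-order terms f_i and f_j
# were judged "active". The variance-share criterion here is one
# plausible choice, not the paper's exact criteria.
import itertools
import numpy as np

def adaptive_anova_active_dims(f, dim, anchor=None, tol=1e-2, n_q=16):
    """Return active first-order dimensions and candidate 2nd-order pairs.

    f      : function of a length-`dim` array
    anchor : anchor point c (component functions vanish at c)
    tol    : keep dimension i if its relative variance share exceeds tol
    """
    c = np.zeros(dim) if anchor is None else anchor
    f0 = f(c)
    nodes = np.linspace(-1.0, 1.0, n_q)        # sample inputs on [-1, 1]
    shares = np.empty(dim)
    for i in range(dim):
        x = np.tile(c, (n_q, 1))
        x[:, i] = nodes                         # vary only dimension i
        fi = np.array([f(row) for row in x]) - f0   # first-order term f_i
        shares[i] = fi.var()
    shares /= shares.sum()
    active = [i for i in range(dim) if shares[i] > tol]
    pairs = list(itertools.combinations(active, 2))  # 2nd-order candidates
    return active, pairs

# Toy example: only 2 of 10 inputs matter, so the truncated expansion
# keeps 2 first-order terms and a single second-order candidate.
f = lambda x: x[0]**2 + 0.5 * x[1] + 1e-4 * x[2:].sum()
print(adaptive_anova_active_dims(f, dim=10, tol=0.05))
```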

So far the adaptive ANOVA method has been successfully applied to quantify the uncertainty in stochastic flow benchmark problems. The researchers are striving to further improve the method and to apply the approach to quantify uncertainty in carbon sequestration, global climate prediction, and large-scale power systems. This research will be published in the Feb. 20, 2012, issue of the Journal of Computational Physics.

Team Uses NERSC to Develop Largest-Ever 3D Color Map of Universe

Using NERSC systems, Berkeley Lab scientists and their Sloan Digital Sky Survey (SDSS) colleagues have produced the biggest 3D color map of the universe ever. The team also achieved the most accurate calculation yet of how matter clumps together - from a time when the universe was only half its present age until now.

"The way galaxies cluster together over vast expanses of the sky tells us how both ordinary visible matter and underlying invisible dark matter are distributed across space and back in time," says Shirley Ho, an astrophysicist at Berkeley Lab and Carnegie Mellon University, who led the work. "The distribution gives us cosmic rulers to measure how the universe has expanded, and a basis for calculating what's in it: how much dark matter, how much dark energy, even the mass of the hard-to-see neutrinos it contains. What's left over is the ordinary matter and energy we're familiar with."

For the present study, Ho and her colleagues first selected 900,000 luminous galaxies from among over 1.5 million such galaxies gathered by the Baryon Oscillation Spectroscopic Survey, or BOSS, the largest component of the still-ongoing SDSS-III. The galaxies chosen for this study populate the largest volume of space ever used for galaxy clustering measurements. Their brightness was measured in five different colors, allowing the redshift of each to be estimated. The team's results were presented January 11 at the annual meeting of the American Astronomical Society in Austin, Texas, and have been submitted to the Astrophysical Journal. They are currently available online at http://arxiv.org/abs/1201.2137.
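
As a generic illustration of how five-band brightness measurements can yield redshift estimates, the toy sketch below regresses redshift against galaxy colors with a nearest-neighbor scheme. Everything here (the synthetic magnitudes, noise level, and estimator) is an assumption for illustration; it is not the SDSS/BOSS photometric-redshift pipeline.

```python
# A toy photometric-redshift estimate: brightness in five bands (as in
# SDSS ugriz) is regressed against known spectroscopic redshifts using
# nearest neighbors in color space. Purely illustrative; not the
# actual BOSS/SDSS pipeline.
import numpy as np

rng = np.random.default_rng(0)

def make_galaxy(z):
    """Fake 5-band magnitudes whose colors drift smoothly with redshift."""
    base = 20.0 + 2.0 * z
    offsets = np.array([1.5, 0.8, 0.0, -0.4, -0.7]) * (1.0 + z)
    return base + offsets + rng.normal(0.0, 0.05, 5)

# "Training" galaxies with known redshifts, and one test galaxy.
z_train = rng.uniform(0.2, 0.7, 2000)
mags_train = np.array([make_galaxy(z) for z in z_train])
z_true = 0.45
mags_test = make_galaxy(z_true)

# Photo-z estimate: average redshift of the k nearest neighbors in
# color space (magnitude differences remove overall brightness).
colors_train = np.diff(mags_train, axis=1)
colors_test = np.diff(mags_test)
d = np.linalg.norm(colors_train - colors_test, axis=1)
k = 20
z_phot = z_train[np.argsort(d)[:k]].mean()
print(f"true z = {z_true:.2f}, photometric estimate = {z_phot:.2f}")
```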

Researchers Use Jaguar to Better Understand Solar Storm/Magnetosphere Interaction

A team led by Homa Karimabadi, a space physicist at the University of California-San Diego (UCSD), in close collaboration with William Daughton at Los Alamos National Laboratory, used the Oak Ridge Leadership Computing Facility's Cray XT5 Jaguar supercomputer to better understand the processes giving rise to space weather. Specifically, Karimabadi has used about 30 million hours on Jaguar to study how plasma spewed from the Sun interacts with Earth's magnetosphere, investigating exactly what happens when the two meet and precisely what gets through and why.

"One of the surprising outcomes of our research is the ubiquity and nature of turbulence in the magnetosphere," said Karimabadi. "This is important since turbulence implies more efficient mixing of the plasma and fields, and after all, space weather arises because the plasma and fields emanating from the sun can penetrate and mix with the plasma and fields of Earth's magnetosphere."

With petascale computers, though, it has now become possible to model magnetic reconnection, including kinetic effects-the behavior of the individual particles that make up the ionized gas, called plasma, ejected by the sun. "If you want to model the plasma correctly," said Karimabadi, "you have to model and resolve the orbits of the electrons and ions in a plasma. With petascale computing we can now perform 3D global particle simulations of the magnetosphere that treat the ions as particles, but the electrons are kept as a fluid. It is now possible to address these problems at a resolution that was well out of reach until recently."
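
A minimal 1D sketch of the hybrid scheme Karimabadi describes-ions advanced as discrete particles, electrons folded in as a massless, isothermal fluid-is shown below. Under quasineutrality the electron fluid reduces to an electric field proportional to the gradient of the log of ion density. All parameters and the 1D electrostatic setting are illustrative assumptions, far removed from the team's 3D global magnetosphere runs.

```python
# A minimal 1-D sketch of a "hybrid" plasma model: ions advance as
# discrete particles while electrons are a massless, isothermal fluid.
# Under quasineutrality the electron fluid gives an electric field
# E = -(Te/e) d(ln n_i)/dx. Units, grid size, and parameters are
# illustrative, not from the Jaguar simulations.
import numpy as np

np.random.seed(1)
nx, L, n_part, Te, dt = 64, 1.0, 20000, 1.0, 2e-3
dx = L / nx

x = np.random.uniform(0.0, L, n_part)          # ion positions
v = np.random.normal(0.0, 0.2, n_part)         # ion velocities
v += 0.5 * np.sin(2 * np.pi * x / L)           # seed a perturbation

for step in range(200):
    # Deposit ions on the grid to get the ion density n_i (mean 1).
    n_i, _ = np.histogram(x, bins=nx, range=(0.0, L))
    n_i = n_i.astype(float) / (n_part / nx)
    # Massless-fluid electrons: phi = Te * ln(n_i), E = -dphi/dx.
    phi = Te * np.log(np.maximum(n_i, 1e-6))
    E = -np.gradient(phi, dx)
    # Gather the field at each particle and push (leapfrog).
    idx = np.minimum((x / dx).astype(int), nx - 1)
    v += dt * E[idx]            # ion charge/mass set to 1
    x = (x + dt * v) % L        # periodic boundary
```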

The team published its research in Nature Physics.

PNNL Researchers Develop Automatic Computer Learning for Power Grid Modeling

Using GPS time-stamped data collected from high-speed sensors, PNNL researchers are teaching computers to automatically learn correct model parameters for the power grid. The models will then be used to perform analyses that yield the grid's health status, power transfer limits, and other information important for its secure operation.

The ultimate goal is to continuously update all models of the elements in the power grid, including conventional generators, solar and wind power plants, and load centers, with more accurate parameters identified through real-time measurements. The approach for the automatic computer learning process is called particle filtering. The filter combines knowledge of the power grid model structure and parameter distributions with sensor data to produce an optimal estimate of parameter values, resulting in model responses that closely match the behavior of the actual system. So far the particle filtering approach has been successfully applied to models of conventional generators at PNNL. The team, including Shuai Lu, Ning Zhou, and Guang Lin, is striving to make the computation work under real-time constraints and to apply the approach to renewable power plants and load models.
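
The sketch below shows the particle-filtering idea in miniature: an ensemble of candidate parameter values is propagated through the model, weighted by agreement with sensor measurements, and resampled. The stand-in model (a scalar damping equation), the noise levels, and the jitter step are illustrative assumptions, not PNNL's actual generator models.

```python
# A minimal sketch of particle filtering for model-parameter
# calibration. The "grid model" is a stand-in first-order damping
# equation: particles are candidate values of an unknown damping
# parameter, weighted by how well the simulated response matches
# noisy sensor data, then resampled.
import numpy as np

rng = np.random.default_rng(42)
a_true, dt, sigma_meas, n_steps, n_part = 0.8, 0.1, 0.05, 100, 500

def step(x, a, u):
    """One step of the stand-in model dx/dt = -a*x + u."""
    return x + dt * (-a * x + u)

# Synthetic "sensor" data from the true system.
u = np.sin(0.3 * np.arange(n_steps))            # known input
x, meas = 0.0, []
for t in range(n_steps):
    x = step(x, a_true, u[t])
    meas.append(x + rng.normal(0.0, sigma_meas))

# Particle filter over the parameter: each particle carries a
# candidate parameter value and its own simulated state.
a = rng.uniform(0.1, 2.0, n_part)               # parameter particles
xs = np.zeros(n_part)
for t in range(n_steps):
    xs = step(xs, a, u[t])
    w = np.exp(-0.5 * ((meas[t] - xs) / sigma_meas) ** 2)
    w /= w.sum()
    keep = rng.choice(n_part, size=n_part, p=w)   # resample by weight
    a = a[keep] + rng.normal(0.0, 0.01, n_part)   # jitter for diversity
    xs = xs[keep]

print(f"true a = {a_true}, estimate = {a.mean():.3f} +/- {a.std():.3f}")
```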

Contact: Mary Anne Wuennecke, maryanne.wuennecke@pnl.gov

People

Argonne's Snir Honored as One of HPCwire's "People to Watch" in 2012

Argonne National Laboratory's Marc Snir has been named one of HPCwire's "People to Watch" in 2012. These individuals are selected from leaders in academia, government, industry, and vendor communities who HPCwire believes will influence high-performance computing in the near future and beyond.

Snir heads Argonne's Mathematics and Computer Science (MCS) Division, which has long been a world leader in advanced computing and scalable software. His research focuses on parallel architectures and algorithms, programming models, tools, and performance analysis. In addition to his position at Argonne, Snir is Michael Faiman and Saburo Muroga Professor at the University of Illinois at Urbana-Champaign (UIUC). He also continues to serve as co-PI of the petascale Blue Waters project at the National Center for Supercomputing Applications, and he is co-director of the Illinois-INRIA Center for Petascale Computing.

Contact: Marc Snir

Warren Washington's Keynote Address Features ALCF Groundbreaking Simulation

On January 23, INCITE PI Warren Washington of the National Center for Atmospheric Research presented vital results from his INCITE project on climate research at the 92nd Annual Meeting of the American Meteorological Society. Washington gave the keynote address at the 12th Presidential Forum, the theme of which was "Technology in Research and Operations-How We Got Here and Where We're Going." Washington's talk included an animation of a recent groundbreaking simulation performed on the Argonne Leadership Computing Facility's Blue Gene/P, Intrepid, with Version 5 of the NSF/DOE Community Atmosphere Model (CAM5).

"We are on the threshold of simulating the global high-resolution atmosphere circulation on decadal and century time scales, and this animation demonstrates this new capability," Washington said.

The two-year simulation ran at 1/8-degree global resolution, with full prognostic aerosols and sea-surface temperature from observations. It was performed on 64K cores of Intrepid, running at 0.25 simulated-years-per-day and taking 25 million core-hours. This is the first simulation using both the CAM5 physics and the highly scalable spectral element dynamical core. The animation of total precipitable water clearly shows hurricanes developing in the Atlantic and Pacific.

Contact: Warren Washington, wmw@ucar.edu

Sandia Researcher Bochev Lectures in the Netherlands and Singapore

Sandian Pavel Bochev was one of six invited speakers at the recent 36th Woudschoten conference in the Netherlands. The conference, organized yearly since 1976 by the Werkgemeenschap Scientific Computing (WSC), highlights new topics and developments in scientific computing presented by renowned experts from abroad, and is attended by virtually all Dutch and Flemish computational scientists. Bochev also participated in the program on "Multiscale modeling, simulation, analysis and applications" organized by the Institute for Mathematical Sciences, Singapore, where he delivered one of the plenary lectures at the workshop on "Mathematical Theory and Computational Methods for Multiscale Problems."

Contact: Pavel Bochev, pbboche@sandia.gov

ESnet Staff Share Expertise, 100G Experiences at Joint Techs Meeting

Twice a year, staff from ESnet and Internet2, two of the leading research and education networks, meet to exchange information and experiences on topics of mutual interest. The meetings, known as the Joint Techs, also draw an international crowd of networking experts. The Winter 2012 meeting was held Jan. 22-26 in Baton Rouge, La., where ESnet staff gave a series of presentations.

ESnet staff also made other contributions to the program. Eli Dart played a leading role in programming Tuesday's special focus on data, and Brian Tierney, Dart and Eric Pouyoul taught a well-attended workshop on "Achieving a Science DMZ." At the GLIF (Global Lambda Integrated Facility) meeting held concurrently, Inder Monga made three presentations.

Facilities and Infrastructure

Powered by NERSC, a Database of Billions of Genes

Microbes are microscopic organisms that live in every nook and cranny of our planet. Without them, plants wouldn't grow, garbage wouldn't decay, humans wouldn't digest food, and there would literally be no life on Earth, or at least not as we know it. By examining the genetic makeup of these "bugs," scientists hope to understand how they work, and how they can be used to solve a variety of important problems like identifying new sources of clean energy.

One important tool for this analysis is the IMG/M (Integrated Microbial Genomes with Microbiome Samples) data management system, which supports the analysis of microbial communities sequenced by the Department of Energy's Joint Genome Institute (JGI). With computing and storage support from the National Energy Research Scientific Computing Center (NERSC), this system currently contains more than 3 billion microbial genes-more than any other similar system in the world.

"Last December IMG/M crossed the boundary of 1 billion genes recorded in the system, and we wouldn't have been able to reach this important milestone without NERSC," says Victor Markowitz, Chief Informatics Officer and Associate Director at JGI. Markowitz also heads the Lawrence Berkeley National Laboratory's Biological Data Management and Technology Center (BDMTC). Since the December milestone, billions more genes have been recorded in the system.

Read more.

Can Cloud Computing Meet Scientific Computing Requirements of DOE Researchers?

After a two-year study of the feasibility of cloud computing systems for meeting the ever-increasing computational needs of scientists, Department of Energy researchers have issued a report stating that the cloud computing model is useful, but should not replace the centralized supercomputing centers operated by DOE national laboratories.

Cloud computing's ability to provide flexible, on-demand and cost-effective resources has found acceptance for enterprise applications, and as the largest funder of basic scientific research in the U.S., DOE was interested in whether this capability could translate to the scientific side. Launched in 2009, the study was carried out by computing centers at Argonne National Laboratory in Illinois and Lawrence Berkeley National Laboratory in California. Called Magellan, the project used similar IBM computing clusters at the two labs. Scientific applications were run on the systems, as well as on commercial cloud offerings for comparison.

At the end of the two years, staff members from the two centers produced a 169-page report (PDF, 10 MB) with a number of findings and recommendations. Overall, the project members found that while commercial clouds are well suited for enterprise applications, scientific applications are more computationally demanding, and therefore the computing systems require more care and feeding. In short, the popular "plug and play" concept of cloud computing does not carry over to scientific computing.

Read more.

The ALCF Is Getting Q'd Up: First Racks of Blue Gene/Q Arrive

Engineers from the Argonne Leadership Computing Facility (ALCF) and IBM have made significant strides in building the infrastructure that will support Mira, the ALCF's 10-petaflops Blue Gene/Q system-and not a moment too soon. The first two racks of the new system-named Cetus and Vesta-arrived at the datacenter on January 4. Cetus will be used for science application development and debugging, and Vesta will be used for system software development and testing.

Once Cetus and Vesta are installed and pass a series of clearance criteria, the first researchers to access them will be those selected in 2010 through the Early Science Program. In addition, key developers of essential computational tools for users will be given access to the system to complete the process of porting the tools for use on the new Blue Gene/Q architecture.

Said ALCF's Deputy Director Susan Coghlan, "With a system this innovative and powerful, we have a slew of researchers and collaborators eagerly vying for time on the new Q, even though full production could be a year away. We're very pleased with the installation progress to date and hope that it continues to run ahead of our scheduled milestones."

For more information, visit the ALCF website.

OLCF Computer Scientists Collect Computing Tools for Next-Generation Machines

Researchers using the OLCF's resources can foresee substantial changes in their scientific application code development in the near future. The OLCF's next supercomputer, a Cray XK6 named Titan with an expected peak speed of 10-20 petaflops, will use a hybrid architecture of conventional, multipurpose central processing units (CPUs) and high-performance graphics processing units (GPUs), which, until recently, primarily drove modern video game graphics. Titan is set to be operational by early 2013.

With Titan's arrival, fundamental changes to computer architectures will challenge researchers from every scientific discipline. Members of the OLCF's Application Performance Tools (APT) group are working to make the transition as smooth as possible. The group is working to ensure that researchers receiving allocations on leadership-class computing resources will not have to spend huge amounts of time learning how to effectively use their codes as the OLCF shifts to hybrid computing architectures.

An application tool can do anything from translating one computer programming language into another to finding ways to optimize performance. In an interview with HPCwire, APT Group Leader Richard Graham explained that many of the same tools that helped Jaguar advance beyond the petascale threshold will also play important roles in getting Titan up and running toward the exascale-a thousand-fold increase over petascale computing power. However, just like scientific application codes, software tools will have to be scaled up for a larger machine.

Read the interview.

Outreach and Education

OLCF Staffers Share Career, Internship Info with Next Generation of Women in Physics

The number of female physicists at Oak Ridge National Laboratory (ORNL) increased greatly on January 13 as 107 female physics students from 50 universities and institutions toured the ORNL campus. The undergraduates were attending the Third Annual Southeast Conference for Undergraduate Women in Physics (SCUWP), a four-day event held on the University of Tennessee, Knoxville (UTK) campus as part of the Conference for Undergraduate Women in Physics, held concurrently at six academic locations across the nation. Hai Ah Nam, a computational scientist at the OLCF, and Channa Palmer, ORNL university recruiter, led the SCUWP group on tours of ORNL's historic Graphite Reactor, the Spallation Neutron Source, the OLCF, and the EVEREST Powerwall, a 30-foot screen for scientific visualizations.

"The goal of the SCUWP is to increase recruitment and retention of women to physics and to improve their career prospects," said Christine Nattrass, postdoctoral researcher at the UTK Physics Department and chair of the conference organizing committee. "Most of these young women studying physics had never been to or met someone who worked at a national lab. By incorporating a tour of ORNL into the conference, these women gained a better understanding of opportunities available at national laboratories."

After the tour, students were welcomed to a lunch and talks by Jim Roberto, associate laboratory director for science and technology; Michelle Buchanan, associate laboratory director for physical sciences; and National Center for Computational Sciences Director Jim Hack, all of whom shared the diverse and groundbreaking research occurring across the lab. The students also got practical advice from a panel of ORNL staffers on careers at a national lab and application information for the many internship and job opportunities at the Bredesen Center for Interdisciplinary Research and Graduate Education, Research Alliance in Math and Science, and Oak Ridge Institute for Science and Education.

Contact: Jayson Hines, hinesjb@ornl.gov
