Archive of past stories
Age of uncertainty (December 12, 2011)
Calculating degrees of uncertainty will be important to understand
how much faith to place in the highly detailed models exascale
computing will enable later this decade. But first, researchers
led by Sandia National Laboratories need to get a grasp on
uncertainty quantification itself.
Extreme teamwork (August 23, 2011)
Specific science programs must overcome a number of shared obstacles
if they’re to run on the next generation of supercomputers.
Scientists are linking with mathematicians and hardware and software
experts to meet these challenges.
The cloud versus E. coli (July 5, 2011)
When biologists around the world clamored for computer power to
quickly research bacteria suspected of killing dozens and sickening
thousands in Europe, Argonne National Laboratory researchers
turned to the Magellan cloud testbed for rapid results.
Scaling up for security (May 5, 2011)
Powerful new computers will expand the scientific toolkit
for safeguarding the nation. Exascale machines will help
guarantee the readiness and safety of nuclear weapons,
search for threats and predict the impact of tsunamis and
other natural disasters.
Physics to the max (February 25, 2011)
Scientists probing the physics of phenomena ranging from subatomic
particles to black holes are bumping up against the limits of computing
power. The next generation of big machines is needed, they say, to
answer some of the universe’s fundamental questions.
Getting a reaction (November 11, 2010)
Growing interest in nuclear energy is fueling research into computer
models designed to get more out of reactors now in operation and
to develop next-generation plants. Oak Ridge National Laboratory
and North Carolina State University researchers lead a collaboration
to transform modeling capability for the nuclear industry.
Watching the detectors (July 15, 2010)
Flipping the standard approach to processing detector data could
make sensor networks better at spotting smuggled nuclear materials
and other sources of low-level radioactivity, Oak Ridge National
Laboratory and Purdue University scientists say.
Slick solution (November 3, 2009)
Soap isn’t simple. The surfactants in detergents, shampoos
and other oil-removing and grease-lifting substances are
difficult-to-design molecular concoctions. That’s why
industry and academic researchers are using computer simulations to
test them and find possible
ways to reduce chemical waste.
Cleaning up coal (September 11, 2009)
Scientists are using the world’s most powerful computer for open
science to guide engineers designing bigger, more efficient reactors
to convert coal into synthetic gas. The work could help make a
ubiquitous energy source gentler on the environment.
Subterranean blues (June 26, 2009)
Understanding how water carries contaminants underground – and
how those chemicals react with soils – is key to stopping the
spread of radionuclides left by nuclear fuel production. A research team’s
simulations are helping decipher the process so site managers can choose
the best cleanup methods.
Keeping the beat (May 7, 2009)
For millions of people, disease can derail the sequence of chemical and
electrical events that make heart muscles contract, causing dangerous
arrhythmias. Scientists are using computer models to understand these
problems.
Going underground (February 9, 2009)
Two brothers are developing mathematical models of subsurface flow and
transport to better understand how contaminants migrate
through groundwater. They're also quantifying the uncertainty of these
and other models.
Code collaboration (June 4, 2007)
Ravi Samtaney knew that the codes he designed to simulate a reactor
the size of ITER, the international experiment to develop commercially
feasible fusion energy, would require computing resources beyond
anything practically available.
Filling cavities virtually (April 16, 2007)
When they’re miles around and buried underground, it’s tough to
improve particle-smashing accelerators – unless you use simulation,
as a collaboration between computer researchers and physicists did.