Computing cycles awarded to Grand Challenge projects

Teams with winning proposals were allocated time starting this week on the Lab's supercomputers, including Atlas, a 44.2-teraFLOPS (trillion floating-point operations per second) machine, shown here.

Research proposals in areas ranging from molecular imaging and climate change to astrophysics, inertial confinement fusion, catalytic chemistry, and seismic and nuclear explosion monitoring were among the 17 projects allocated time on Laboratory supercomputing resources under the fourth annual Institutional Unclassified Computing Grand Challenge Awards announced this week.

The computing Grand Challenge program awards allocations totaling more than 7.5 million CPU-hours/week to projects that address compelling, Grand Challenge-scale, mission-related problems that push the envelope of capability computing.

"These significant allocations are an essential component of our science and technology investment strategy," said Dona Crawford, associate director for Computation. "The Computing Grand Challenge program leverages the supercomputing expertise developed in support of Stockpile Stewardship to promote game-changing, open scientific research, which will in turn strengthen our ability to execute our national security missions. I look forward to the unprecedented scientific and technical advances that will result from the success of these proposals."

Project proposals were reviewed by both internal and external referees. Fred Streitz, director of the Institute for Scientific Computing Research (ISCR), chaired the internal scientific review panel. Criteria used in selecting the projects included: quality and potential impact of the proposed science and/or engineering, impact of the proposed utilization of Grand Challenge computing resources, ability to effectively utilize a high-performance institutional computing infrastructure, quality and extent of external collaborations, and alignment with the Laboratory's science and technology strategic vision. Decisions were based on review scores, with input from the Multiprogrammatic and Institutional Computing (M&IC) program and the Institutional Science and Technology Office.

"The total allocation requested exceeded the computer cycles available by almost a factor of four," Streitz said. "Although the strength of the proposals made the selection process challenging, it is exciting to see so many teams willing to take on the challenge of extending the state-of-the-art in scientific computing."

Teams with winning proposals were allocated time starting this week on Atlas, a 44.2-teraFLOPS machine, and uBGL, a 229.4-teraFLOPS machine — systems dedicated to unclassified research through the Laboratory's Multiprogrammatic and Institutional Computing (M&IC) program. Almost 400 million CPU hours have been allocated to these projects over the next year in two categories, Tiers 1 and 2. Tier 1 projects will receive a higher allocation and a higher priority. Central processing unit (CPU) time is measured across the multiple CPUs in a computer. For example, two CPU hours can be one CPU used for two hours or two CPUs used for one hour. High-performance computers generally consist of thousands of CPUs; the Atlas system contains 9,216 CPUs, while uBGL has 81,920 CPUs.
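The CPU-hour accounting described above can be sketched in a few lines of Python. The figures are the article's own; the helper function name is illustrative:

```python
def cpu_hours(num_cpus, wall_hours):
    """CPU time accumulates across all CPUs in use simultaneously."""
    return num_cpus * wall_hours

# Two CPU hours: one CPU for two hours, or two CPUs for one hour.
assert cpu_hours(1, 2) == cpu_hours(2, 1) == 2

# The weekly Grand Challenge allocation, extended over a year, lines up
# with the article's "almost 400 million CPU hours".
weekly_allocation = 7_500_000          # CPU-hours/week, from the article
yearly = weekly_allocation * 52
print(yearly)                          # 390000000
```

The two program figures are thus consistent: 7.5 million CPU-hours per week over 52 weeks comes to 390 million CPU-hours, i.e. "almost 400 million."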

The Institutional Unclassified Computing Grand Challenge Awards program is now in its fourth year and reflects the rapid growth of supercomputing resources being made available for unclassified research. Over the last 12 years, high-performance computing resources dedicated to unclassified institutional research have increased more than a thousandfold, from 72 gigaFLOPS in 1999 to more than 400 teraFLOPS today. Additional unclassified high-performance computing resources are expected to be brought online this year.
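The growth factor quoted above follows directly from the article's two figures (a quick check, using the standard SI prefixes):

```python
# SI prefixes: giga = 10^9, tera = 10^12
giga = 1e9
tera = 1e12

flops_1999 = 72 * giga    # 72 gigaFLOPS in 1999 (article's figure)
flops_2009 = 400 * tera   # more than 400 teraFLOPS today (article's figure)

growth = flops_2009 / flops_1999
print(round(growth))      # 5556 — indeed "more than a thousandfold"
```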

Listed below are the Tier 1 (top) and Tier 2 (bottom) Grand Challenge proposals selected this year for initial allocations:

Tier 1
Kyle Caspersen: First Principles Modeling of Planetary Atmospheres
Cathy Chuang: Very-High-Resolution Atmospheric Simulations with Interactive Chemistry and Sectional Aerosol Microphysics
Stephan Hau-Riege: Single Molecule Imaging at LCLS
Rob Hoffman: Supernova Grand Challenges on Atlas
Andreas Kemp: Large-Scale Modeling of Intense Short-Pulse Laser Interaction for HEDLP and Fast Ignition
Richard Klein: The Advance of UQ Science with Application to Climate Modeling, ICF Design and Stockpile Stewardship Science

Tier 2
Tarabay Antoun: Mesoscale Modeling of Granular Materials under Dynamic Loading
Peter Eggleton: Stellar Structure and Evolution with Djehuty
George Gilmer: Modeling of Ultra-Short Laser Pulse Ablation
Nir Goldman: First-Principles Computational Design of Catalytic Materials
Patrick Huang: Ab Initio, Atomic-Scale Simulations of the Electric Double Layer at the Alpha-Alumina(0001)/Electrolyte Interface
Jeff Latkowski: Dynamic Chamber Processes for LIFE
Kevin Kramer: High-Fidelity Burnup Analysis and Subsequent Full-System Modeling for Laser Inertial Fusion Energy (LIFE)
John Moriarty: Toward Petascale Atomistic Simulations with Quantum-Level Accuracy (Alloys and Dynamic Fracture)
Arthur Rodgers: Global High-Resolution Simulations of Seismic Waves from Nuclear Explosions
Robert Rudd: Alloying Effects on Materials Strength from First Principles
Xueqiao Xu: Gyrokinetic and Fluid Simulations of Turbulence and Transport in Tokamak Plasmas
Nov. 13, 2009