‘Grand Challenge’ science projects tap into supercomputing resources at Lawrence Livermore National Laboratory

April 20, 2007

Lawrence Livermore National Laboratory’s institutional “Grand Challenge” scientific computing program has allocated 83.7 million CPU hours on Laboratory supercomputers to 17 research projects spanning astrophysics, chemistry, materials science, the biosciences, and earthquake and climate simulation.

“The Atlas Grand Challenge program makes high performance computing resources available to compelling science projects vital to U.S. Department of Energy (DOE), National Nuclear Security Administration (NNSA) and Laboratory (LLNL) missions,” said Cherry Murray, deputy director for Science and Technology. “Our institutional program aligns with national security needs and serves as a complement to the DOE Office of Science effort to allocate time on high performance computing systems to research projects of national interest in energy, environment, bioscience and physical science.”

Research projects were selected by the Laboratory’s Deputy Director for Science and Technology, the Laboratory Strategic Program Board and the Laboratory Science and Technology Office. To be considered, proposals had to “address a grand-challenge-scale, mission-related problem that promises unprecedented discoveries in a particular scientific and/or engineering field of research, and if successful, will result in high-level recognition by the scientific community at large.”

For example, a multi-institutional group of climate scientists will conduct the most detailed multi-decade global climate simulations ever performed as part of the larger effort to understand how climate changes over time. “By performing these climate simulations of unprecedented detail and fidelity and making them available to the climate research community, we have the potential to dramatically advance our understanding of climate and how it may be affected by anthropogenic factors,” said Dave Bader of LLNL, principal investigator on the project, which also includes scientists from the Scripps Institution of Oceanography, the National Center for Atmospheric Research, the University of Michigan, Los Alamos National Laboratory and Oak Ridge National Laboratory.

Grand Challenge recipients have been allocated time on Atlas, the new 44-teraFLOP (trillion floating-point operations per second) machine, and Thunder, a 22-teraFLOP machine -- systems dedicated to unclassified research through the Laboratory’s Multi-programmatic and Institutional Computing program. Central processing unit (CPU) time is measured across all of a computer’s processors: two CPU hours can mean one CPU used for two hours or two CPUs used for one hour. High performance computers generally consist of thousands of CPUs; the Atlas system contains 9,216 CPUs.
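
The CPU-hour accounting described above is simple multiplication. A brief sketch, using only figures quoted in this article (9,216 Atlas CPUs, an 83.7-million-hour allocation), illustrates the scale:

```python
# CPU-hours = number of CPUs x wall-clock hours, per the article's definition.
def cpu_hours(num_cpus: int, wall_hours: float) -> float:
    return num_cpus * wall_hours

# Two CPU-hours: one CPU for two hours, or two CPUs for one hour.
assert cpu_hours(1, 2) == cpu_hours(2, 1) == 2

# One day of the full 9,216-CPU Atlas machine:
one_day = cpu_hours(9216, 24)              # 221,184 CPU-hours per day

# The 83.7-million-hour allocation, expressed as full-machine days:
total_allocation = 83.7e6
print(round(total_allocation / one_day))   # roughly 378 full-machine days
```

In practice the 17 projects share the machine concurrently rather than running one at a time, so the allocation is spent across many simultaneous jobs.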

Over the last 10 years, high performance computing resources dedicated to unclassified institutional research have grown more than a thousandfold, from 72 gigaFLOPS (72 billion FLOPS) in 1997 to 81 teraFLOPS today.

“High performance computing is today an integral part of the scientific process and can serve to advance science in areas of vital interest to the country and the global community, including energy, bioscience and medicine, and climate change,” said Dona Crawford, associate director for Computation at LLNL. “The new high performance computing resources we have put in place for unclassified research leverage the supercomputing expertise and infrastructure we have in place for stockpile stewardship. The Atlas Grand Challenge projects will in turn benefit our national security missions.”

Physical science grand challenge projects include: Accretion Disks Around Rapidly Rotating Black Holes; Dislocation Patterns – Fractals or Cells?; Study of Ramp Compression of Metals Using Molecular Dynamic Simulations; Toward a Predictive Capability for Laser Backscatter in National Ignition Facility Ignition Targets; Advanced Explosion and Earthquake Simulations; Supernova Grand Challenges; Longtime Scale Shock Dynamics of Astrophysical Ices; Modeling Nuclear Materials for Advanced Nuclear Reactors; Realistic Shock Response of Transition Metals via Atomistic Simulation; and From Quantum Chromodynamics to Nuclei to Stars.

Bioscience grand challenge projects include: Understanding Catalytic and Structural Properties of Proteins in Biological Processes Through First Principles Simulations; Molecular Simulations of Huge Biomolecular Complexes for Single Molecule X-Ray Imaging and New Anti-Viral Strategies; Simulation of Membrane Protein and Nanodisc Particles; and From Rubisco to Polio Virus: Tailored Design of Polymer Thin Film Transistors.

Atlas System Details
  • 1152 nodes
  • 8 processors/node
  • Processor type: AMD dual-core Socket F Opteron @ 2.4 GHz
  • 44.24 TFLOPS system peak performance
  • 16 GB memory/node
  • 18,432 GB of total RAM
  • Infiniband interconnect
  • Lustre parallel file system
  • CHAOS Linux OS
Thunder System Details
  • 1024 nodes
  • 4 processors/node
  • Processor type: Intel Itanium2 Madison Tiger4 @ 1.4 GHz
  • 22.9 TFLOPS system peak performance
  • 8 GB memory/node
  • 8,192 GB of total RAM
  • Quadrics switch interconnect
  • Lustre parallel file system
  • CHAOS Linux OS
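
The peak-performance figures in both spec lists follow from cores × clock rate × floating-point operations per cycle. A minimal check of that arithmetic, noting that the per-cycle figures (2 for the dual-issue Opteron core, 4 for Itanium2’s two fused multiply-add units) are standard values for those chips and are not stated in the article:

```python
def peak_tflops(nodes: int, procs_per_node: int, ghz: float, flops_per_cycle: int) -> float:
    """Theoretical peak = total cores x clock (GHz) x FLOPs retired per cycle."""
    cores = nodes * procs_per_node
    return cores * ghz * flops_per_cycle / 1000.0  # GFLOPS -> TFLOPS

# Atlas: 1152 nodes x 8 Opteron cores @ 2.4 GHz, 2 FLOPs/cycle (assumed)
print(peak_tflops(1152, 8, 2.4, 2))  # 44.2368, matching the quoted 44.24 TFLOPS

# Thunder: 1024 nodes x 4 Itanium2 @ 1.4 GHz, 4 FLOPs/cycle (assumed)
print(peak_tflops(1024, 4, 1.4, 4))  # 22.9376, matching the quoted 22.9 TFLOPS
```

These are theoretical peaks; sustained performance on real applications is typically a fraction of these numbers.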

“Institutional computing has been an essential component of our science and technology investment strategy and has been instrumental in helping us achieve recognition in many scientific and technical forums,” Murray said. “We must continue to leverage these resources to continue generating headline scientific and technical accomplishments that benefit the nation and the global community.”

Collaborators on these Grand Challenge projects are from Stanford University; Florida State University; the University of California (UC) campuses at Los Angeles, Berkeley, Santa Barbara and Santa Cruz; University of Southern California; Harvard University; Iowa State University; San Diego State University; College of Charleston; University of Michigan; University of Pennsylvania; University of Minnesota; University of Illinois Urbana-Champaign; University of Nevada, Reno; Dartmouth College; Massachusetts Institute of Technology; State University of New York at Stony Brook; University of Oxford, UK; University of Augsburg, Germany; the Russian Academy of Sciences, Yekaterinburg, Russia; RIKEN, the Institute of Physical and Chemical Research, Wako, Japan; Scripps Institution of Oceanography; the National Center for Atmospheric Research; the Institute of Physical Chemistry, University of Zurich; Argonne National Laboratory; Lawrence Berkeley National Laboratory; Oak Ridge National Laboratory; Los Alamos National Laboratory; Pacific Northwest National Laboratory; General Atomics; and Tech-X Corporation.

Founded in 1952, Lawrence Livermore National Laboratory has a mission to ensure national security and to apply science and technology to the important issues of our time. Lawrence Livermore National Laboratory is managed by the University of California for the U.S. Department of Energy’s National Nuclear Security Administration.