As the beam of a flashlight is to the illumination of a modern sports stadium, so is the power of an average personal computer to the Department of Energy's Blue Pacific supercomputer. And now, the Blue Pacific, located at Lawrence Livermore National Laboratory, is about to be eclipsed by another, even higher performance supercomputer. Dubbed White, the monster machine developed by IBM for Lawrence Livermore is the latest computing system in DOE's Accelerated Strategic Computing Initiative (ASCI), a program whose mission is to produce a system capable of calculating at 100 teraops (trillion floating-point operations per second) by 2004. "On a logarithmic scale," says Mark Seager, Livermore's principal investigator for ASCI platforms, "we're halfway to that goal with the 10-teraops ASCI White."




One Program—Three Laboratories

The goal of the Accelerated Strategic Computing Initiative (ASCI) is to provide the numerical simulation capability needed to model the safety, reliability, and performance of a complete nuclear weapon, from start to finish. The three national laboratories working on this initiative are Lawrence Livermore, Sandia, and Los Alamos. Project leaders at each laboratory, guided by the DOE's Office of the Deputy Administrator for Defense Programs, work with ASCI's industrial and academic partners. The overriding challenge is to synchronize the various technological developments. For example, sufficient platform power must be delivered in time to run new advanced codes, and networking capabilities must be in place to enable the various parts of the system to behave as one.
ASCI's program elements are Applications Development, Platforms, Pathforward, Problem-Solving Environment, Alliances, Visual Interactive Environment for Weapons Simulation, Verification and Validation, and Distributed and Distance Computing.






The Clock Ticks for the Stockpile
The push to produce a 100-teraops machine is tied to the aging of the U.S. nuclear weapons stockpile and the country's commitment to maintaining and preserving a nuclear deterrent. With the advent of the U.S. nuclear test moratorium in 1992, the rules of the game changed for the nuclear weapons stockpile. No longer can the reliability, safety, and performance of these weapons be confirmed by underground nuclear tests. Originally designed with lifetimes of 20 years or more, these weapons must now be kept ready to serve the country indefinitely.
Doug Post, associate leader for the Laboratory's Computational Physics Division, explains, "In the early days, when the Lab was in the business of developing nuclear weapons, the overriding design question was 'Would it work?' Back then, we had nuclear testing to prove out the designs. It wasn't really necessary to be efficient as long as Mother Nature got to vote." As the number of underground tests shrank from hundreds to tens per year and the simulation codes and computers running them became more capable, the design process for nuclear weapons and supporting computer codes became more efficient. "After the test ban, the question became 'How long?' Aging nuclear weapons had never been an issue, because we'd always replaced old systems with new designs before their retirement dates. And we'd had underground nuclear tests to confirm the viability of the stockpile."
The answer to that question became DOE's Stockpile Stewardship Program. Weapon reliability, safety, and surety, all once verified by underground nuclear tests, must now be confirmed using nonnuclear experiments, historical underground nuclear test data, and high-fidelity computer simulations.
A key component of the Stockpile Stewardship Program, ASCI is pushing computational power far beyond present capabilities so weapon scientists and engineers can, with confidence, simulate the aging of nuclear weapons and predict their performance. If high-performance supercomputing were constrained to continue developing at a normal pace as predicted by Moore's Law (that is, computer speed doubling every 18 months to 2 years), the capability necessary to perform these calculations would still be a long way from reality by the year 2004. The three national laboratories that are part of ASCI (Lawrence Livermore, Los Alamos, and Sandia) have teamed up with the supercomputing industry to accelerate the development of high-performance supercomputing in the marketplace and meet this critical timeline.
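To put the required acceleration in perspective, here is a rough back-of-the-envelope sketch in Python; the starting point, dates, and Moore's Law doubling times used below are illustrative assumptions, not program figures.

    # Back-of-the-envelope sketch (assumed figures, for illustration only):
    # what performance-doubling time is needed to go from roughly 10 teraops
    # in 2000 to 100 teraops in 2004, compared with Moore's Law?
    import math

    start_teraops, start_year = 10.0, 2000      # assumed: ASCI White era
    target_teraops, target_year = 100.0, 2004   # ASCI goal

    doublings = math.log2(target_teraops / start_teraops)    # about 3.3
    months_available = (target_year - start_year) * 12       # 48 months
    required_doubling = months_available / doublings          # about 14 months

    print(f"Required doubling time: ~{required_doubling:.0f} months, "
          f"versus roughly 18 to 24 months under Moore's Law")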
The 10-teraops IBM White machine is scheduled to arrive at Lawrence Livermore this summer. With more than two-and-a-half times the horsepower of Blue Pacific, White is an important rung in the ASCI ladder. "One hundred teraops is the entry level we need to do full-scale simulations of exploding nuclear weapons," Seager explains. "These simulations must take into account three-dimensional, multiple-physics, high-resolution, coupled calculations. ASCI White will shed light on those areas of physics we don't yet understand, so that when we have the 100-teraops machine in 2004, we'll have a better handle on the issues."





Beating Moore's Law
To reach its 2004 goal, ASCI is building the world's fastest supercomputers using commercially manufactured parts. "Because we need to move faster than Moore's law allows," explains Seager, "we're building our supercomputers by taking thousands of processors-basically just like the PC on your desk-and linking them together. For ASCI White, we are tying together 8,192 processors to get a corresponding increase in capability and speed."
The processors in White are based on IBM's latest chip, the POWER3-II. Sixteen processors are grouped into a symmetric multiprocessor node; 512 nodes form the system. Code developers take advantage of the fact that proximity plays an important role in the speed of communication: the processors within a node can pass information among themselves quickly, while information between nodes moves a bit more slowly. Housing and powering this enormous parallel machine is a project in and of itself. For the machine that will replace White, Lawrence Livermore is constructing the Terascale Simulation Facility. (See the box below.)
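A short sketch shows how those counts add up. The node and processor numbers below come from the description above; the per-processor peak rate is an assumption added here for illustration (on the order of 1.5 billion floating-point operations per second for a POWER3-II-class processor).

    # Sketch of ASCI White's processor hierarchy and rough aggregate peak.
    # Node and processor counts are from the article; the per-processor peak
    # rate is an assumed figure used only for illustration.
    processors_per_node = 16       # one symmetric multiprocessor (SMP) node
    nodes = 512
    per_processor_gigaops = 1.5    # assumed peak for a POWER3-II-class chip

    total_processors = processors_per_node * nodes             # 8,192
    peak_teraops = total_processors * per_processor_gigaops / 1000.0

    print(f"{total_processors} processors, roughly {peak_teraops:.1f} teraops peak")
    # about 12 teraops peak, consistent with a machine billed at 10 teraops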
As Seager points out, "When we speak of accelerating the technology, we're talking about more than just the computer. Everything around the computer platform must also develop at an increased pace—applications, infrastructure, networks, archives, visualization tools. In fact, for every dollar spent on computer hardware, the initiative spends two on software." (See the box entitled Tomorrow's Supercomputer Today.)




The Terascale Simulation Facility

The time scales and demands of ASCI require a facility of extraordinary scope: the Terascale Simulation Facility (TSF). The $89-million facility will house a computer complex and data assessment and networking capability. It will require a significant increase in electrical power, mechanical support, physical space, and networking infrastructure. The TSF will provide an acre of raised computer floor and a power plant equivalent to that servicing a city of 15,000 homes.
ASCI White, with a delivery date of mid-2000, will reside next door to the proposed TSF in Building 451, where power and cooling systems have been tripled to meet the requirements of the 10-teraops supercomputer.








Building on Blue's Triumphs
It's not easy to develop applications software for these parallel machines, notes Post. Many of the algorithms used in past weapons-related codes were not designed with parallel computing in mind. Plus, they were written to examine physics in two dimensions, not three, and had to run on machines much less capable than today's workstations. Consequently, simplified assumptions about the physics processes and the geometries involved are part of those codes. To produce the high-fidelity simulations required for stockpile stewardship, ASCI codes must remove these assumptions and replace them with much more accurate methods based, in many cases, on a first-principles approach.
As part of ASCI, the Laboratory's code developers set about writing codes to take advantage of the architecture of this new generation of supercomputers. Simulation codes for ASCI include nuclear performance codes that can predict the details of an exploding nuclear weapon, materials modeling codes that use molecular dynamics to study the long-term degradation of materials under the influence of low-level radiation, and multiphysics codes that simulate what happens within an inertial confinement fusion target during implosion. Many of these codes have run on the ASCI Blue Pacific machine with impressive results.





Modeling a Primary
In one notable ASCI Blue Pacific triumph, Lawrence Livermore, with strong support from IBM, completed the first major ASCI scientific milestone in November 1999: the first-ever three-dimensional simulation of an explosion of a nuclear weapon primary. (Nuclear weapons have two main components: the primary, or trigger, and the secondary, which produces most of the energy.) Demonstrating the ability to computationally simulate, visualize, and analyze what happens to each of a nuclear weapon's components is a critical step in simulating an entire nuclear weapon's explosion in three dimensions.
The simulation required about 300,000 megabytes of random-access memory (RAM). For comparison, a conventional desktop computer typically comes with only about 100 megabytes of RAM. The calculation—which would have taken 30 years on a desktop—ran for more than 20 days on the Blue Pacific using 1,024 processors. The complex computer model, referred to as the burn code, employed tens of millions of zones, hundreds of times more than a comparable two-dimensional simulation using traditional codes. The calculations produced 50,000 data files containing 6 terabytes of numerical information. Just looking at this much information presents a formidable challenge.
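Taking those round numbers at face value, a quick check shows what the run implies about parallel efficiency and memory per processor; the figures below simply restate the ones quoted above and are only illustrative.

    # Rough consistency check using the round numbers quoted above.
    desktop_years = 30            # estimated single-desktop run time
    processors = 1024
    actual_days = 20              # "more than 20 days" on Blue Pacific
    total_ram_megabytes = 300_000

    serial_days = desktop_years * 365.25
    ideal_days = serial_days / processors      # about 10.7 days with perfect scaling
    efficiency = ideal_days / actual_days      # roughly 50 percent
    ram_per_processor = total_ram_megabytes / processors   # about 290 megabytes

    print(f"Perfect scaling: {ideal_days:.1f} days; implied efficiency ~{efficiency:.0%}")
    print(f"Memory per processor: ~{ram_per_processor:.0f} megabytes")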
Tom Adams, an associate division leader in the Defense and Nuclear Technologies Directorate, explains, "We had so much data that we needed to use the ASCI machine to analyze the results as well." ASCI's visualization software was also put to the test. Results were displayed as movies, in which scientists could look inside the primary to see what was happening at different points in time, and as graphic renditions of temperatures, pressures, and so on.
"This burn code `milepost' was originally set in 1996 as a target to hit before the year 2000," continues Adams. "It was a daunting objective that some thought we'd never meet. In accomplishing it, we've shown that the ASCI program is on track, that the hardware and software systems are working, and that we can bring in results. The run also tested the teamwork of the people from the physics code teams, the Computation Directorate, and IBM. They all worked hard to solve the problems and make this happen. We see this exercise on Blue Pacific as a model for what we plan to do with White."
Adams expects that with White, the simulations will get more detailed. "We'll be able to model a larger part of the full weapon system, input more details on the weapon's configuration, and use more complex coupled physical models. All of this will improve the fidelity of our simulations."
The next ASCI scientific milestone is to model the secondary, something that will require White's increased power and speed. "In addition, we're moving from this, our first complete calculation, to something that nuclear weapon scientists and engineers can use for analyzing the stockpile," says Adams. "We're working with these users to apply the codes to current stockpile stewardship issues."





Dissecting ICF
Several of the codes developed for ASCI examine various physics processes related to inertial confinement fusion (ICF). For instance, one three-dimensional code examines what occurs inside an ICF pellet when x rays hit the pellet's surface. The code models the fluid motion within the capsule as temperature, pressure, and density increase. It also models the transport of x rays emitted by the high-temperature material and the energy released from the fusion process. Another effort focuses on methods for modeling radiation transport. Seager notes, "Radiation transport is central to many physics applications, including nuclear weapons, inertial confinement fusion, plasma processing, and combustion. Many production codes in ASCI spend about 80 percent of their time calculating radiation transport. Doing these calculations more efficiently is a big time-saver."
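The payoff Seager describes can be seen with a simple Amdahl's-law-style estimate: if about 80 percent of a run is spent on radiation transport, speeding up only that portion gives the overall gains sketched below (the speedup factors are hypothetical examples).

    # Amdahl's-law-style sketch: overall speedup from accelerating only the
    # radiation-transport portion, assumed here to be 80 percent of run time
    # (the article's figure). Speedup factors are hypothetical examples.
    transport_fraction = 0.80

    def overall_speedup(transport_speedup: float) -> float:
        rest = 1.0 - transport_fraction
        return 1.0 / (rest + transport_fraction / transport_speedup)

    for s in (2, 5, 10):
        print(f"Transport {s}x faster -> whole code {overall_speedup(s):.2f}x faster")
    # 2x -> 1.67x, 5x -> 2.78x, 10x -> 3.57x overall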
Another parallel code, which ran on the ASCI Blue Pacific in 1999, simulated the flux of fusion neutrons that comes out of the Nova laser target chamber. This high-fidelity simulation had to take into account complex three-dimensional geometries, a wide range of distances (from the 6-meter-diameter test chamber to a target pellet less than 1 millimeter in diameter), and a number of different materials (air, aluminum, gold) with densities varying by a factor of 10⁸. This calculation employed more than 160 million zones with over 15 billion unknowns and took 27 hours to solve on 3,840 processors.




From Shock Wave to Buckyballs
In other first-time-ever Blue Pacific calculations, researchers explored such diverse areas as turbulence, ab initio molecular dynamics, and quantum chemistry.
What happens when a shock wave passes through an interface of two fluids of differing density? A detailed simulation on 5,832 processors aimed at answering this question netted a team from Lawrence Livermore, the University of Minnesota, and IBM the prestigious Gordon Bell Award last November. The simulation, the largest calculation of its type, achieved a greater level of detail than any previous turbulence simulations. The effort has applications in a variety of disciplines, including supersonic propulsion, combustion, and supernova evolution.
Researchers at the Laboratory used 3,840 processors to simulate from first principles the molecular interactions of hydrogen fluoride, an extremely toxic and corrosive byproduct of insensitive high-explosive detonations. Little experimental data are available on hydrogen fluoride, particularly at high temperatures and pressures. Quantum molecular dynamics simulated the interaction of hydrogen fluoride and water at the microscopic level, the only input being the identities of the atoms and the laws of quantum mechanics. This ASCI simulation involved 600 atoms with 1,920 electrons. The simulations provided crucial insight into the properties of hydrogen fluoride-water mixtures at high pressures and temperatures, adding to the understanding of how insensitive high explosives perform. (See S&TR July/August 1999, On Target: Designing for Ignition, for more information about quantum molecular dynamics and this simulation.)
A research collaboration from Lawrence Livermore, the University of California at Berkeley, and Sandia National Laboratories at Livermore used ASCI Blue Pacific to perform the largest first-principles quantum chemistry calculations ever done. One of the initial applications was to determine the three-dimensional structure and electronic state of the carbon-36 "buckyball," one of the smallest, most stable members of the buckminsterfullerene family of compounds. Possible applications for these unusual compounds include high-temperature superconductors and precise delivery of medicines to cancer cells. Previous quantum chemical calculations narrowed down the possible structure of carbon-36 to one of two possibilities, shown below right. Experimental data favored the structure in (a), but theoretical results slightly favored the one in (b). The two structures have different chemical properties, so determining which is more stable was critical to understanding this compound. A high-fidelity ASCI calculation provided the definitive, and unexpected, answer. As it turned out, the structure in (a), the one favored by the experimentalists, was the more stable. "This exercise was a reminder of how low-fidelity simulations with their simplifications and interpolations can lead to catastrophic results," notes Seager. "In a weapons calculation, for instance, you don't want to get the wrong answer to the question 'Will it work?'"







Universities Logging On
Significant scientific work on ASCI Blue Pacific is proceeding not only at the national laboratories, but also at associated universities through DOE's Academic Strategic Alliance Program. This program aims to engage the U.S. academic community in advancing science-based modeling and simulation technologies. Although the specific computing problems universities are tackling do not directly involve nuclear weapons research, the methodologies and tools being developed can be applied to all of ASCI's areas of research.
The program funds five major centers of excellence. Each uses multidisciplinary teams working over the long term to provide large-scale, unclassified simulations that represent ASCI-class problems. The centers collectively have access to up to 10 percent of the ASCI computing resources at the three national laboratories. The projects are part of a 10-year program and come up for renewal after five years.
"The problems being studied in these projects are comparable in their complexity to those involving nuclear weapons," explains Dick Watson, Lawrence Livermore's manager for the program. "We expect that through this program, the laboratories and universities will see revolutionary advances in both the physical and engineering sciences and the mathematical and computer sciences."
The Center for Simulating Dynamic Response of Materials, based at the California Institute of Technology, is developing simulation codes for its virtual shock physics test facility. Simulations have applications to materials design, oil exploration, earthquake prediction, and environmental analysis.
Stanford University's Center for Integrated Turbulence Simulations focuses on jet-engine simulations. Researchers at this center are developing massively parallel codes for high-fidelity flow and combustion simulation.
The Center for Astrophysical Thermonuclear Flashes at the University of Chicago is studying the physics of supernovas, including the physics of ignition, detonation, and turbulent mixing of complex fluids and materials. Once completed, the integrated code will provide the highest resolution calculation ever done showing how these stellar outbursts begin.
The University of Illinois at Urbana-Champaign is home to the Center for Simulation of Advanced Rockets. This center plans to provide a detailed, whole-system simulation of solid propellant rockets. Earlier this year, center staff completed a simplified version of a three-dimensional integrated rocket simulation code. The next-generation code will characterize various burn scenarios, and the fully integrated code will address potential component failures. Research will benefit technologies such as gas generators used for automobile air bags and fire suppression.
Finally, the University of Utah is focusing on the physics of fire at the Center for the Simulation of Accidental Fires and Explosions. These problems draw on fundamental gas- and condensed-phase chemistry, structural mechanics, turbulent flows, convective and radiative heat transfer, and mass transfer.
At Lawrence Livermore, all of the alliance work is being conducted on the unclassified sector of ASCI Blue Pacific.







Building a Model for the Future
"One of the key challenges to fielding the fastest supercomputers in the world is that this scale of computing requires a new operational model in order to succeed," says Seager. "In the past, the model has been a large, but fundamentally traditional, scientific computing center. ASCI needs to be run as if it were an experimental research facility. The scientific applications being developed today promise a level of physical and numerical accuracy that is more like that of a scientific experiment than a traditional numerical simulation. The effort required to run a full-scale, three-dimensional scientific application is like running a big experimental weapon test. In a way, this operational change parallels the changing role of large-scale experiments in the physics world."
For a long time, the two accepted branches of physics have been theory and experiment. Yet, there are experiments that, for various reasons, are unrealistic: either the conditions can't be created in the laboratory, or it's far too dangerous or expensive to do so. And, in many cases, the theory has been too complex to analyze, even with many simplified assumptions. "All that has changed with the coming of ASCI's supercomputers," says Seager. "Computer simulation is now poised to become the new branch of science, on the same level as experimentation and theory."
—Ann Parker




Key Words: Academic Strategic Alliance Program, Accelerated Strategic Computing Initiative (ASCI), ASCI White, Blue Pacific, Stockpile Stewardship Program, Terascale Simulation Facility (TSF).

For more information contact Mark Seager (925) 423-3141 (seager1@llnl.gov).

Also see the following Web sites:

  • ASCI at Lawrence Livermore, www.llnl.gov/asci/
  • ASCI at Los Alamos, www.lanl.gov/asci/
  • ASCI at Sandia, www.sandia.gov/ASCI/
  • ASCI's Academic Strategic Alliances Program, www.llnl.gov/asci-alliances/

    ABOUT THE SCIENTIST

    MARK SEAGER is the principal investigator for the Accelerated Strategic Computing Initiative's terascale system at Lawrence Livermore National Laboratory. His interests are currently in large, scalable system architecture, performance, and integration. Seager received his B.S. in mathematics and astrophysics from the University of New Mexico at Albuquerque in 1979 and a Ph.D. in numerical analysis from the University of Texas at Austin in 1984.






