From aeronautical engineers designing passenger aircraft in virtual wind
tunnels to molecular biologists designing anticancer drugs in a
virtual laboratory, computer simulation is often the research tool
of choice. At Lawrence Livermore, home to some of the most powerful
supercomputers in existence, computer simulation is a growing part
of every research effort. Indeed, the pages of Science & Technology
Review are increasingly devoted to Laboratory employees' pioneering
uses of simulation in fields as diverse as materials science, environmental
remediation, and the safe stewardship of nuclear weapons.
But the increasing use of
computer simulation has raised fundamental questions. Where is simulation
taking science and engineering research? When, if ever, can simulation
techniques replace experimental observation? Can scientists really
describe "reality" with computer simulations?
Last October, some 60 of
the nation's leading simulation experts gathered at Lawrence Livermore
to try to answer these questions and explore ways to advance their
craft. In discussions that ranged from the philosophy of science
to the pitfalls of software, participants passionately cited the
accomplishments and limitations of their rapidly evolving field
(see boxes below).
The workshop, called "Barriers
to Predictive Simulation in Science and Engineering," was held at
the University of California at Davis Department of Applied Science,
a center of graduate research and training located adjacent to Lawrence
Livermore. Laboratory physicist Giulia Galli Gygi, a workshop organizer,
said the session was envisioned as a way for experts to explore
the entire range of barriers to fully predictive simulations. "Although
every discipline has its own simulation challenges, we wanted to
bring together the best people in the different fields to look for
areas where there were common challenges," she said.
Lawrence Livermore has been
one of the leading simulation centers in the world since the 1950s.
Laboratory computational biologist and workshop chair Mike Colvin
notes, however, that simulation has become such an important tool
for every industry and research field that Laboratory researchers
have much to learn from other research centers. Consequently, they
are seeking to strengthen their collaborations with colleagues nationwide.
Colvin, who was a dynamic force behind this workshop, is at the
forefront of such efforts.
Partners with Theory
Lawrence Livermore's Dave
Cooper, associate director for Computation, told attendees that
simulation has become a full partner with theory and experimentation.
He pointed to the significant accomplishments of the Department
of Energy's Accelerated Strategic Computing Initiative (ASCI), a
vital element of its Stockpile Stewardship Program to assure the
safety and reliability of the nation's nuclear weapons. Cooper said
ASCI has demonstrated that high-resolution simulations of nuclear
detonation can be performed efficiently on supercomputers using
thousands of relatively simple microprocessors working in tandem.
He asked participants what
would be required to make similarly revolutionary simulation advances
in their disciplines. For example, he asked what barriers would
need to fall to accurately predict the exact path of a hurricane.
Lawrence Livermore physicist
Berni Alder, one of the founders of computer simulation, gave a
personal perspective on the growth of the field. "There's too much
emphasis on building new machines," said Alder, who did pioneering
work in the 1950s using computers that could describe only 100 molecular
collisions per hour. Alder's seminal simulations in the early 1960s
on Lawrence Livermore's LARC (Livermore Advanced Research Computer,
the supercomputer of its day) changed kinetic molecular theory,
showing that simulations can significantly affect a scientific field.
Several speakers noted that
experiment and theory must evolve together, with each needing the
other. However, they described the challenge of comparing even closely
related experiments and simulations. "There is not always an obvious
relationship between the two—we don't understand all that is involved,"
said Galli Gygi. She said that setting up a good simulation is similar
to setting up a good experiment in that "you have to ask the right questions."
Participants discussed the
observation of famed British physicist Paul Dirac, one of the pioneers
of quantum mechanics, that even if all of the relevant equations
are known, a simulation is often impossible to conduct because it
would require far too many supercomputers far too many years to
complete. "The fact is," said Colvin, "to simulate a chemical or
biological process, you can't simply throw a bunch of atoms together
and try to use brute force computational approaches."
Barriers to Simulation
- Not knowing the appropriate scientific questions to address with simulations.
- Not knowing the underlying equations to describe the phenomena.
- Intrinsic limitations to computability.
- Inability to meaningfully analyze simulation data.
- Lack of experimental data to initialize or validate simulations.
- Inability to scale algorithms to increased model size and resolution.
- Limitations in the speed and efficiency of computer hardware.
Instead of trying to describe
a complex chemical or biological process entirely in terms of the
underlying quantum mechanics equations, some simulations are broken
into a hierarchy of size and time scales, each involving a different
simulation method. Such multiscale modeling was discussed with considerable
enthusiasm, although a number of major challenges remain. Under
development at Lawrence Livermore (see S&TR, December,
"Materials over Time and Space") and elsewhere, multiscale modeling
was seen as essential because a "single numerical scheme is not
feasible in materials and chemistry," according to Princeton University's Roberto Car.
"Multiscale is the only way
to go," Car said, but integrating the different length and time
scales represents a formidable barrier. As an example, he discussed
the challenge of combining a simulation based on quantum mechanics
with one based on classical physics. Lawrence Livermore physicist
Tomas Diaz de la Rubia agreed that combining scales is vital for
accurate materials models. He also noted that real materials contain
impurities and other imperfections that are not addressed in ideal models.
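To make the coupling idea concrete, here is a minimal sketch of one way such a partition can work. It is our illustration only, with placeholder force models and a hand-picked quantum region, not an actual materials code:

```python
# Minimal multiscale sketch: treat a small "hot" region with an
# expensive quantum-level model and the surrounding bulk with a cheap
# classical model, then stitch the forces together. The force
# functions below are placeholders, not real physics.

import numpy as np

def quantum_force(positions):
    """Stand-in for an expensive quantum mechanical force calculation."""
    return -0.5 * positions

def classical_force(positions):
    """Stand-in for a cheap classical, pair-potential-like force."""
    return -0.1 * positions

def multiscale_forces(positions, quantum_region):
    """Use the expensive model only in the small region where it matters."""
    forces = classical_force(positions)
    forces[quantum_region] = quantum_force(positions[quantum_region])
    return forces

positions = np.random.rand(1000, 3)     # 1,000 atoms (illustrative)
quantum_region = np.arange(50)          # say, 50 atoms near a defect
print(multiscale_forces(positions, quantum_region).shape)   # (1000, 3)
```

The hard part, as Car noted, is the handshake between the regions: real schemes must treat the boundary between the quantum and classical descriptions, and their mismatched time scales, with great care.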
David Ceperley from the University
of Illinois said the multiscale approach was mandatory in part because
the largest computers can now handle simulations of up to one billion
particles, but real-world problems have vastly more particles. "We're
never going to be able to do 10^23
particles [10^12 is a trillion],
so we need to do multiscale," he said.
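A back-of-the-envelope calculation, offered here as our own illustration and assuming only 24 bytes to store three coordinates per particle, shows why brute force cannot close that gap:

```python
# Illustrative arithmetic only: why 10^23 particles is out of reach.
# Assumes 24 bytes per particle (three 8-byte coordinates) and a
# hypothetical machine with 10 terabytes of memory.

bytes_per_particle = 3 * 8       # x, y, z in double precision
particles_wanted = 1e23          # roughly a mole of atoms
particles_feasible = 1e9         # about what the largest machines handle

memory_needed = bytes_per_particle * particles_wanted    # bytes
machine_memory = 10e12                                   # 10 TB

print(f"memory needed: {memory_needed:.1e} bytes")
print(f"memory shortfall: {memory_needed / machine_memory:.0e}x")
print(f"particle gap: {particles_wanted / particles_feasible:.0e}x")
```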
While it was clear that computer
simulation is an important tool for both theorists and experimentalists,
Galli Gygi asked if simulations could lead to a major scientific
discovery. "Are computational tools an essential part of the discovery
path, or will they be?" Most argued that it remained an important
open question in most fields, but that with the steady advances
in computers and software, the answer would inevitably become "yes."
Some, however, questioned whether simulation could ever discover
a new field such as, say, superconductivity.
Lawrence Livermore engineer
Kim Mish observed a distinction between scientific and engineering
simulations when it came to discovery. "Science is concerned about
fundamental truth, whereas engineering is an integrative process
about systems you know a lot about," he said.
Lawrence Livermore physicist Berni Alder's pioneering computer simulation work
was published in Physical Review in 1962. Shown here is an example
of Alder's research performed on Lawrence Livermore's LARC supercomputer.
This simulation tracked 870 particles over time and contributed
to the understanding of matter.
This quantum-level simulation of a mixture of hydrogen fluoride and
water molecules at high temperatures and pressures took 15 days
on the Accelerated Strategic Computing Initiative machine at
Lawrence Livermore. (Image by Francois Gygi, Lawrence Livermore.)
Simulations Still Have Limits
Several speakers discussed
the limits of simulation validity, especially in simulations involving
many phenomena. Paul Dimotakis, from the California Institute of
Technology, noted that many things still cannot be computed, especially
those containing heterogeneous materials and phases. Burning a piece
of paper involves two phases of matter (soot particles and gases)
and more than 2,000 chemical reactions involving more than 100 chemical
species. Simulating such a system is probably beyond present capabilities, he said.
Another multiphenomena simulation
is global climate modeling, which must take into account atmospheric
physics, ocean physics, the effects of Earth's orbit, human activities,
and the details of clouds, aerosols, water, and ice. UCLA's James
McWilliams said climate modeling has matured as a simulation tool
that involves many phenomena continually changing and affecting
each other. He cited two grand challenges in the field: turbulence
and pattern recognition. Although existing theories don't yet interface
well with observed behavior, "We're learning an enormous amount
from simulations," he said.
The University of Michigan's
Joyce Penner, a former Lawrence Livermore scientist, traced the
increasing refinement of global warming models, which have been extended
to include scattered radiation, aerosols, biomass burning, soot,
and sulfates. How the subsystems combine cannot yet be reconciled
in full detail, she said. Nevertheless, climate modelers are closing
in on the problem of long-term climate prediction.
Lawrence Livermore's Philip
Duffy described the monumental task of simulating climate change,
which requires taking 1 million time steps on grids whose cells
are several hundred kilometers on a side. Even with such a coarse
resolution, it may take up to two months to complete a simulation.
"Higher resolution is the holy grail of climate modeling," he said.
Improvements will come, he suggested, from better computer designs
and better representations of data, as well as a better understanding
of how volcanic eruptions, solar variability, and aerosols affect the climate.
Climate models running on multiprocessor computers are divided into
subdomains so that each processor handles a limited range of
latitude and longitude and unlimited depth. This approach is
typically the most efficient because it minimizes the amount
of time spent exchanging information among processors, allowing
the computer to perform the maximum number of calculations per
second. (Image courtesy of Philip Duffy, Lawrence Livermore.)
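A minimal sketch of the decomposition described in the caption, using illustrative grid sizes and processor counts rather than those of any actual climate code:

```python
# Sketch of latitude-longitude domain decomposition: each processor
# owns a rectangular patch of the horizontal grid and the full
# vertical column beneath it, so most work is local and only patch
# edges need to be exchanged with neighboring processors.

def decompose(n_lat, n_lon, p_lat, p_lon):
    """Split an n_lat x n_lon grid among p_lat x p_lon processors."""
    subdomains = []
    for i in range(p_lat):
        for j in range(p_lon):
            lat_slice = slice(i * n_lat // p_lat, (i + 1) * n_lat // p_lat)
            lon_slice = slice(j * n_lon // p_lon, (j + 1) * n_lon // p_lon)
            subdomains.append((lat_slice, lon_slice))  # full depth implied
    return subdomains

# Example: a 180 x 360 grid (1-degree cells) split among 4 x 8 processors.
for rank, (lat, lon) in enumerate(decompose(180, 360, 4, 8)[:3]):
    print(rank, lat, lon)
```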
Simulation's New Frontier
Caltech's William Goddard
predicted that in the next three years, advanced simulations would
reveal the structure and function of many proteins and enzymes.
Biologists worldwide, including those at Lawrence Livermore, are
studying how proteins (polymers consisting of up to many thousands
of atoms) fold in one-thousandth of a second into three-dimensional,
functional structures measuring 2 to 3 nanometers in diameter.
Stanford University's Michael
Levitt called biology "the ideal system for simulation" and drew
similarities between mechanical engineering simulations and protein-folding
simulations. Protein-folding studies have been influenced by experiments
conducted in the Critical Assessment of Structure Prediction project,
which is managed by a team in Livermore's Biology and Biotechnology
Research Program Directorate. In those experiments, the amino acid
sequences of proteins are posted on the Internet, and researchers
from around the world predict the corresponding three-dimensional
structures. The correct structures are concurrently determined experimentally
by x-ray crystallography, and the predictions are revealed at a
biennial conference. Workshop participants discussed whether this
blind process could be valuable in other fields as a means to test
different simulation software.
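One simple way such blind predictions can be scored, offered here only as an illustrative sketch (real assessments use more sophisticated measures and proper structural alignment), is the root-mean-square deviation between predicted and experimentally determined coordinates:

```python
# Illustrative sketch: compare a predicted structure with the
# experimentally determined one by root-mean-square deviation (RMSD)
# after removing the translational offset. Real assessments also
# handle rotation (for example, with the Kabsch algorithm).

import numpy as np

def rmsd(predicted, experimental):
    """RMSD between two (n_atoms, 3) coordinate arrays, after centering."""
    p = predicted - predicted.mean(axis=0)
    q = experimental - experimental.mean(axis=0)
    return np.sqrt(((p - q) ** 2).sum(axis=1).mean())

# Toy coordinates for three atoms (nanometers); a real protein has thousands.
pred = np.array([[0.0, 0.0, 0.0], [0.15, 0.0, 0.0], [0.15, 0.15, 0.0]])
expt = np.array([[0.0, 0.0, 0.0], [0.14, 0.0, 0.0], [0.16, 0.14, 0.02]])
print(f"RMSD: {rmsd(pred, expt):.3f} nm")
```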
Lawrence Livermore biologist
Elbert Branscomb, the first director of the DOE Joint Genome Institute,
described a major challenge: simulating the regulatory control of
genes. The genome's regulatory logic is "profound and complex,"
he said, with the locations of regulatory mechanisms seemingly "chaotic
and crazy." He compared building a computer model of gene regulation
to one describing the functioning of a computer chip. "The real
barrier is the complexity barrier," he said.
Observations Arising from the Workshop
- Increases in raw computer speed are not the sole barrier to progress.
- The desired outcome varies, ranging from discovering fundamental physical
laws to determining the most efficient airfoil shape.
- The observation of Paul Dirac still applies: all of the relevant equations
are known but remain difficult to solve.
- Sensitivity to initial conditions (for example, crack formation and turbulence)
in some systems may limit results.
- The notion of robustness, in which many emergent behaviors are insensitive
to model detail and starting conditions, points to promising systems to simulate.
- Comparing even closely related experiments and simulations remains a challenge.
- Limits of simulation must be overcome, especially in multiphenomena systems.
- Computers need to do more than just numerics—they need to help
set up grids, evaluate outputs, and analyze experimental data.
- Software development and management are major hurdles and perhaps lend
themselves to interdisciplinary collaboration.
- The role of blind prediction experiments in protein folding could
have valuable applications to other fields.
- Funding for solving problems with existing methods is easier to get
than funding for developing better methods.
- Better material models are required for realistic computational simulations
of macroscopic phenomena.
Simulations Need Basis in Reality
Christopher Barrett from Los Alamos National Laboratory described novel software
that his group has developed to help authorities respond better
to emergencies. The software simulates a host of situations such
as bioterrorism, earthquakes, or commercial power-grid outages.
It includes models that show how to reduce congestion, thereby
allowing faster emergency response. "Computer simulations have become
a commonplace, but artful tool for addressing these problems," he said.
Lawrence Livermore engineer
Dave McCallen discussed what can happen when seismic engineering
models are not based on real experiments: "Things can go bad when
we don't fully understand the physics of the process." McCallen
cited a newly constructed bridge that collapsed in the 1971 San
Fernando Earthquake because "we didn't know then how bridges vibrate."
McCallen said engineers now
have adequate computer power to model regional seismic activity
and the response of structures. He pointed to a collaboration between
the Laboratory and the University of California at Berkeley on the
seismic response of long-span bridges (see S&TR, May
1999, "The Revelations
of Acoustic Waves" and December
1998, "A Shaker
Exercise for the Bay Area"). The major 1999 earthquake in Taiwan
provided a wealth of ground-motion data that validated the occurrence
of huge ground displacements that were produced in Lawrence Livermore
simulations. Many seismic experts had originally considered those predictions
to be unrealistically large. "Our ability to compute has vastly outstripped
our ability to validate," he said.
In that respect, participants
drew a distinction between verification and validation: verification
involves making sure models and equations have been implemented
correctly, while validation ensures that the simulation faithfully
represents the real-world system being modeled.
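A toy example of the distinction, using a hypothetical one-dimensional diffusion solver of our own (not any Laboratory code): verification compares the code against a known analytic solution, while validation would compare it against measured data.

```python
# Illustrative sketch of verification vs. validation for a toy 1D
# diffusion solver. Verification: does the code reproduce a known
# analytic solution? Validation: does it agree with measurements?

import numpy as np

def diffuse(u, dx, dt, steps, kappa=1.0):
    """Explicit finite-difference diffusion with fixed (zero) endpoints."""
    u = u.copy()
    for _ in range(steps):
        u[1:-1] += kappa * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    return u

# Verification: a sine initial condition should decay as exp(-kappa*pi^2*t).
n = 101
dx = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
dt, steps = 0.4 * dx**2, 500
numerical = diffuse(np.sin(np.pi * x), dx, dt, steps)
analytic = np.exp(-np.pi**2 * dt * steps) * np.sin(np.pi * x)
print("verification error:", np.abs(numerical - analytic).max())

# Validation would instead compare `numerical` with measured temperatures.
```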
K. K. Muraleetharan of the
University of Oklahoma said it was difficult to get data on the
material properties of soils for use in simulating the seismic response
of new structures. Muraleetharan cited two other barriers to civil
engineering simulations: a litigation-driven society and the reluctance
of people in his field to try new approaches.
Workshop speakers made it
clear that in some systems, predictive accuracy may always be limited
by dependency on the precise details of initial conditions. For
example, the propagation of a crack in a material is affected by
what goes on at the crack's very tip. As a result, said Northwestern
University professor Ted Belytschko, realistically predicting the
formation of cracks is still problematic.
Simulations involving climate
change also have a high sensitivity to starting conditions. UCLA's
McWilliams noted that numerical weather prediction is 50 years old,
but predictions are useful for only about one week in advance. "There
is a fundamental limit of predictability because small disturbances
become amplified," he said.
Nobel Prize-winning Livermore
physicist and Stanford professor Robert Laughlin suggested that
some physical properties seem to be protected from sensitivities
to starting conditions and model details, and he encouraged the
workshop participants to seek out such systems for simulation. One
example of such a protected system is a phase transition, such as
when water turns to ice.
Higher spatial resolution is needed to accurately forecast both regional
and large-scale climate change using global climate models.
Typical global models (top image) use grid cells with horizontal
sizes of 250 to 300 kilometers, preventing accurate forecasts
for specific geographical regions (for example, California).
The bottom image shows preliminary results of a simulation at
50-kilometer resolution performed at Lawrence Livermore on computers
of the Accelerated Strategic Computing Initiative. At this resolution,
much more accurate forecasts should eventually be possible.
(Images by Philip Duffy, Lawrence Livermore.)
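Rough arithmetic, offered as our own illustration and assuming that cost grows with the number of grid cells and that finer cells force proportionally smaller time steps, shows why the jump from roughly 250- to 50-kilometer resolution is so expensive:

```python
# Illustrative arithmetic: going from ~250-km to 50-km grid cells
# multiplies the number of horizontal cells by 5 x 5 = 25, and the
# shorter time step adds roughly another factor of 5, so a run that
# already takes weeks becomes on the order of 100 times more costly.

coarse_km, fine_km = 250.0, 50.0
refinement = coarse_km / fine_km           # 5x finer in each direction

horizontal_factor = refinement ** 2        # more cells in latitude and longitude
timestep_factor = refinement               # smaller steps for numerical stability
total_cost_factor = horizontal_factor * timestep_factor

print(f"cell count factor: {horizontal_factor:.0f}")
print(f"total cost factor: {total_cost_factor:.0f}")
```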
Belytschko also described
the virtual proving grounds of U.S. car manufacturers that model,
for example, a car's suspension system. A virtual car can be run
over a pothole, with the resulting stress on the suspension system
measured. A realistic simulation must include thousands of welds,
some of which inevitably fail, sometimes because of poor workmanship.
Predicting which welds will fail, when they will fail, and why is
a tough challenge.
Scientist Jacqueline Chen
of the Sandia National Laboratories Combustion Research Facility
described efforts to simulate turbulent mixing and combustion found
in diesel engines. Turbulent mixing, an irregular process of stirring
and mixing, greatly enhances combustion by increasing the flame area.
Chen noted that in recent experiments by Sandia's John Dec, laser
diagnostics of diesel combustion reactions involving 80 to 100 atmospheres
of pressure and temperatures of 2,000 kelvins have significantly
changed the conceptual understanding of diesel combustion. Further
advances in the fundamental understanding of the relevant physics
will be aided by first-principles numerical simulations.
Capturing the chaotic nature of turbulence is one of computer simulation's
great challenges. This three-dimensional image of laboratory
data shows the tremendous complexity of a turbulent jet. (Image
courtesy of Paul Dimotakis, California Institute of Technology.)
Although speakers gave credit to the unprecedented power of current computers,
raw computer speed was not identified as the sole barrier to progress
in simulation. Livermore physicist Bill Nellis challenged the conventional
thinking that the key to better simulations was more powerful computers.
He said that there are too many "brute force" simulations with not
enough thought behind them. "The key to doing good science is using
your head," he said. "You need as much intuition to do computation
as you need to do experiments."
One participant said it made sense to focus more on developing advanced algorithms
because the increased speed of computing has been as much due to
improvements in algorithms as to new hardware. He also noted that
it was easier to develop algorithms on personal computers than on supercomputers.
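A classic illustration of that point, ours rather than the speaker's, is the fast Fourier transform, which replaces an O(N^2) calculation with an O(N log N) one:

```python
# Illustrative arithmetic (the FFT example is ours, not the speaker's):
# replacing an O(N^2) discrete Fourier transform with an O(N log N)
# fast Fourier transform yields a speedup comparable to decades of
# hardware improvement, without buying a new machine.

import math

n = 1_000_000                   # samples in a signal
naive_ops = n ** 2              # direct DFT operation count
fft_ops = n * math.log2(n)      # FFT operation count

print(f"algorithmic speedup at N = {n:,}: {naive_ops / fft_ops:,.0f}x")
```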
Other speakers said there was plenty of room for both greater computational
power and better algorithms. They voiced their concerns, however,
that policymakers excessively emphasize massively parallel computing
designs, such as ASCI machines using thousands of microprocessors.
Different kinds of machines with fewer but more powerful processors
would work better for some whole-system problems such as climate
science, which involves interweaving data from physical, chemical,
and biological processes.
Many participants cited the development and management of simulation
software as a key barrier. Several regretted that there is no Moore's
Law for software. A common request was for robust software with
intuitive interfaces that could be used by any engineering student
or by a small engineering firm. Lee Taylor, from TeraScale LLC in
Albuquerque, New Mexico, raised the issue of accessibility for small
firms that may not be able to purchase an advanced computer, yet
recognize that simulation "lets you design closer to the limits."
Simulating combustion in an engine demands close links to experiments.
(a) Images of OH and CH in an actual flame-vortex interaction
and (b) an image of OH from a simulation illustrate the complexity
of modeling chemical response to fluid-chemistry interactions.
(Image (a) is provided courtesy of The Combustion Institute,
and (b) is by Jacqueline Chen, Sandia National Laboratories.)
One idea was to make software writing more efficient, perhaps by collaboration.
UCLA's McWilliams said, "We all do everything for ourselves. We
keep reinventing simple solutions that take lots of hours to develop
and debug." Several participants urged free exchange of software,
but others cautioned that some institutions have cultural barriers
against sharing software developed in-house.
Another suggestion was an interdisciplinary collaboration on common software
components. Although such an effort would necessitate going beyond
rigidly defined academic disciplines, many felt that the time had
come for researchers to have a broader knowledge of science and
technology. In that light, Princeton's Car suggested creating new
positions such as "software physicist" or "software chemist."
Participants also voiced concern that Silicon Valley was drawing off some of
the best software minds. Lawrence Livermore's Krzysztof Fidelis
commented, "We need better software, but we live in a world where
software experts tend to go to industry."
Several participants observed that it is much easier to get funding to solve
problems with existing methods than to develop better methods. Many
agreed with William Goddard of the California Institute of Technology
that funding was needed to explore new directions that offered long-term
but sizable payoffs. Unfortunately, funding sponsors want a particular
problem solved and are not interested in funding better methods
that may not directly solve their problem.
More and more, seismic engineers are using powerful computer models
to design seismic retrofits to existing structures. The simulation
above shows a severe earthquake in Northern California, and
the simulations at right show how the earthquake affects the
San Francisco-Oakland Bay Bridge. The topmost one of those simulations
is of the bridge before the earthquake, and the bottom two show
the bridge's response to the earthquake. Simulations
are helping to reveal the three-dimensional structures of proteins.
This image shows the structure of a protein determined by computational
modeling. (Image by Ceslovas Venclovas, Lawrence Livermore.)
Two areas of major interest to Lawrence Livermore are nuclear weapons
physics and its close relative, astrophysics. Lawrence Livermore
physicist David Nowak, ASCI program leader, said, "ASCI has led
the U.S. to world leadership in high-performance simulation." The
first major barrier to success, said Nowak, is resources, which
include funding, hardware costs, and recruiting and retaining computational
experts. The second barrier is technical, for example, validating
new software that simulates imperfectly understood processes such
as turbulence and materials properties. ASCI simulations, he said,
must reflect a host of data that include engineering details, weapon
designer comments from 30 years ago, information gathered in nuclear
tests, and the results of equation-of-state experiments.
Although astrophysics is not a classical experimental science, simulation
plays a strong role in that field. University of California at Berkeley's
Christopher McKee said the primary barrier in astrophysics simulations
is the vast range of time and distance scales, from nuclear reactions
to the formation of entire galaxies, with time scales ranging up
to a billion years.
"We are stretching the ability of computers" with these problems, McKee
said. He added that computer simulations can perform virtual experiments
whose outcomes can be validated with observations. The grand challenge
is to follow the simulated gravitational collapse of a molecular
cloud to the formation of one or more stars.
Livermore physicist Richard Klein said an important barrier to astrophysics
simulation is the difficulty in obtaining enough data to validate
models. A new testbed for validation is emerging in the University
of Rochester's Omega laser and the National Ignition Facility, currently
under construction at the Laboratory. Klein has used simulation
to predict photon bubble oscillation, a new phenomenon on the surface
of neutron stars, with structures the size of New York's World Trade Center.
Livermore astrophysicist Dave Dearborn noted that astrophysics simulation
incorporates lots of physics, some of it not well known. "We're
still human," he said. "We need the brightest people to pose new
Speakers and Panelists
Berni Alder, Lawrence Livermore National Laboratory
Christopher Barrett, Los Alamos National Laboratory
David McCallen, Lawrence Livermore National Laboratory
K. K. Muraleetharan, University of Oklahoma
Theofanis Theofanous, University of California at Santa Barbara
Richard Becker, Lawrence Livermore National Laboratory
James Belak, Lawrence Livermore National Laboratory
Ted Belytschko, Northwestern University
Lee Taylor, TeraScale LLC
Physics and Simulation
Robert Laughlin, Lawrence Livermore National Laboratory
Roberto Car, Princeton University
David Ceperley, University of Illinois
William Goddard, California Institute of Technology
Bill Nellis, Lawrence Livermore National Laboratory
Accelerated Strategic Computing Initiative
David Nowak, Lawrence Livermore National Laboratory
David Dearborn, Lawrence Livermore National Laboratory
Richard Klein, Lawrence Livermore National Laboratory
Christopher McKee, University of California at Berkeley
Elbert Branscomb, Lawrence Livermore National Laboratory
Krzysztof Fidelis, Lawrence Livermore National Laboratory
Michael Levitt, Stanford University
John Moult, University of Maryland
Climate and Weather
Philip Duffy, Lawrence Livermore National Laboratory
James McWilliams, University of California at Los Angeles
Joyce Penner, University of Michigan
Jacqueline Chen, Sandia National Laboratories
Paul E. Dimotakis, California Institute of Technology
Anthony Jameson, Stanford University
Organizers say the workshop was so successful that plans are under way for
a number of follow-on meetings. Malvin Kalos will be chairing a simulation
workshop this summer addressing a number of other simulation fields.
Livermore's Materials Research Institute, under director Mike McElfresh,
is holding a Computational Materials Science and Chemistry Summer
Institute where graduate students can explore cutting-edge computational
methods (see http://www.llnl.gov/mri/
for more information). A workshop on advanced simulation software
is being organized by Mish for the summer of 2002.
Discussions are also continuing about extended programs involving visiting faculty
and graduate students who would research a single topic. More informally,
individual researchers are working on ways to build on existing
collaborations and newfound friendships formed at the workshop.
Fortunately, the barriers to lasting friendship are less formidable
than those required for scientific and engineering simulations.
Members of the simulation workshop steering committee, from left, are
Malvin Kalos, Mike Colvin (chairman), John Holzrichter, Steve
Libby, Paul Miller, Giulia Galli Gygi, and Francois Gygi.
Key Words: Accelerated Strategic Computing Initiative (ASCI), blind prediction
experiments, computing speed, Dirac observation, model validation,
multiscale modeling, predictive simulations, software.
For further information contact Giulia Galli Gygi (925) 423-4223 (firstname.lastname@example.org).
About the Scientist
Giulia Galli Gygi is the group leader of the Quantum Simulation Group in the Chemistry
and Materials Science Directorate. She joined Lawrence Livermore
as a staff physicist in 1998, after holding the position of
senior scientist at the Swiss Federal Institute of Technology
in Lausanne, Switzerland. She received a B.S. in physics from
the University of Modena in Italy, and an M.A. and Ph.D. in
physics from the International School for Advanced Studies in
Trieste, Italy. Thereafter, she was a postdoctoral research
associate at the University of Illinois at Champaign-Urbana
and then at the IBM Zurich Research Center, Switzerland. Galli
Gygi has published over 70 papers in refereed international
journals. Her areas of interest are in systems and processes
relevant to condensed-matter physics, physical chemistry and
materials science, and quantum simulations. Current topics of
investigation include modeling of fluids under pressure, DNA
in solution, and complex surfaces and nanostructures.