
Developing technology to keep the nuclear stockpile safe, secure and reliable


Phil Pagoria synthesizes a new compound at the Energetic Materials Center in 1997. Work at the center, which includes simulations and experiments to better understand the fundamental physics and chemistry of high explosives and to assess how they perform and age in nuclear weapons, provides important technologies to stockpile stewardship.

The last nuclear test, code-named Divider, took place 30 years ago, on Sept. 23, 1992. That year, President Bush declared a temporary moratorium on nuclear testing, which became permanent in 1995, during the Clinton administration. The end of the testing era coincided with a presidential announcement of the beginning of stockpile stewardship.

As the decision to potentially halt nuclear testing approached, the nuclear security enterprise (NSE) provided explicit analysis of what could be done with a limited number of tests, with continued tests at lower yields and under a no-test regime. Once the decision was made to end testing, it became clear that plans for a non-testing regime needed to be further developed to ensure the safety, security and reliability of the nuclear stockpile could be maintained without testing. Leaders from the Department of Energy (DOE) and the Lawrence Livermore, Los Alamos and Sandia national laboratories convened to develop a complete strategy and map out an R&D effort that would come to be known as the Stockpile Stewardship Program (SSP). Its mission was to ensure the readiness of the nation’s nuclear deterrent force without nuclear tests.

During the era of design and test, nuclear weapon designers could detonate weapons underground, measure their explosive yields and other physical parameters and adjust their designs to produce weapons that met the Department of Defense’s (DoD’s) requirements. Without testing, how would they know these devices would work if they were ever fired?

“The [SSP’s] goal would be to attain sufficiently detailed scientific understanding of the nuclear explosive process to discover, comprehend, and correct any anomalies that might occur during the lifetimes of the stockpile weapons…,” wrote Victor Reis and two co-authors in a 2016 Physics Today article. Reis, former DOE assistant secretary for defense programs, was the primary architect of Science-Based Stockpile Stewardship (SBSS), later known as the SSP — designed with the help and advice of experts throughout the nuclear security enterprise.

Stewardship consists of performing surveillance on the nuclear stockpile, assessing its readiness through scientific and technical means, designing and manufacturing replacement limited-lifetime components and developing experimental platforms to assess stockpile vulnerabilities and hardening against threat environments. Each year, the three nuclear security laboratories participate in the Annual Assessment Process. After thorough technical evaluations, each of the three laboratory directors sends a letter to the secretaries of Energy and Defense assessing the condition of the stockpile. Based on these letters and other input, the secretaries prepare a joint memorandum to the president concerning the safety and reliability of the stockpile and whether a resumption of nuclear testing is needed.

To annually assess the stockpile, the labs would need to develop new tools and technologies for determining the condition of weapons and for advancing scientific understanding to resolve any unknowns uncovered in surveillance, such as issues that might affect the performance of nuclear devices. Their challenge was to develop the needed tools and understanding at a pace faster than concerns about aging weapons arose. They also would have to ensure that a trained workforce, with knowledge and experience in a wide variety of fields, would be available in perpetuity to perform the work.

According to Bruce Tarter, former Laboratory director (1994–2002) and author of The American Lab, a history of Livermore that covers the story of stockpile stewardship through 2010, the creation of the SSP was a long, difficult, sometimes dramatic process, full of twists and turns. The program’s achievements were indispensable to keeping the stockpile ready and the nation secure during the post-Cold War era. The past 30 years have been an unprecedented period of technology development and scientific discovery. A few of the program’s successes are reviewed in this series of articles.

Twin pillars — experiment and simulation

Experiments addressing the behavior of matter at high energy densities, and the simulation of nuclear explosions using high-performance computers (HPC), are the twin pillars upon which much of the assessment of the nuclear stockpile rests. Successful stewardship requires that the two activities support each other’s advancement — the computer simulation models the current understanding of the physics taking place within a nuclear device as it detonates, and lab-based experiments provide corrective, reality-based data that improve the models’ predictions. The laboratories needed to develop new technologies to carry out both.

“In terms of technology, from the Lab’s point of view, almost everyone would start at computing, computing, computing,” said Tarter. “We knew by the early 1990s that the computers you would need for stewardship would require parallel computing.”

The beginning of stewardship roughly coincided with a paradigm change in HPC, from specialized, expensive single-processor computers to parallel computers built from many central processing units (CPUs). “The change came when we realized we could leverage commodity technology like chips for desktops and servers,” said Rob Neely, the program director for the Weapon Simulation and Computing Program, who began his career at Livermore around the beginning of this change. “These are cheap now, so we no longer had to buy specialized computers. Now we could strap together these processors and learn how to do parallel computing.” To make the transition to parallel computing, DOE funded the Accelerated Strategic Computing Initiative (ASCI), now called Advanced Simulation and Computing (ASC). The program supported the development of progressively more powerful parallel computers and the research to develop software for them.
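
To make the commodity-cluster idea concrete, here is a minimal sketch, assuming an MPI installation and the mpi4py package, of how many inexpensive processors cooperate on one problem: each process owns a slice of the data and the partial results are combined by message passing. The problem itself is an invented placeholder, not anything from a weapons code.

    # Run with, e.g., `mpirun -n 8 python cluster_sketch.py`
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()            # which processor am I?
    size = comm.Get_size()            # how many processors are cooperating?

    n_total = 1_000_000               # hypothetical global problem size
    n_local = n_total // size         # each rank owns an equal slice

    # Each rank builds and works on only its own piece of the data.
    local = np.arange(rank * n_local, (rank + 1) * n_local, dtype=np.float64)
    local_sum = local.sum()

    # Message passing combines the partial results across the whole machine.
    global_sum = comm.allreduce(local_sum, op=MPI.SUM)

    if rank == 0:
        print(f"{size} ranks cooperated to compute sum = {global_sum:.6e}")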

The Laboratory’s computing researchers began with a 24-CPU system, and subsequent systems eventually grew to hundreds and then thousands of CPUs. The IBM BlueGene/L, with more than 131,000 CPUs, was installed at the Lab in 2005. Inexpensive relative to the 596 teraflops (596 trillion floating point operations per second) of computing power it delivered, BlueGene/L revolutionized simulation in stewardship, making possible studies such as molecular dynamics simulations of materials aging within nuclear devices that required days rather than weeks of computing time.
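
For readers unfamiliar with the technique, the sketch below shows the core of a classical molecular dynamics calculation: a velocity Verlet integrator stepping particles forward in time under a force law. The harmonic force and every parameter are toy assumptions; materials-aging studies use fitted interatomic potentials and vastly more atoms.

    import numpy as np

    def force(x, k=1.0):
        """Toy spring-like force; real MD uses fitted interatomic potentials."""
        return -k * x

    def velocity_verlet(x, v, dt, steps, mass=1.0):
        """Advance positions x and velocities v through `steps` time steps."""
        f = force(x)
        for _ in range(steps):
            x = x + v * dt + 0.5 * (f / mass) * dt**2    # update positions
            f_new = force(x)
            v = v + 0.5 * (f + f_new) / mass * dt        # update velocities
            f = f_new
        return x, v

    # 1,000 independent "atoms" advanced for 10,000 steps.
    x0 = np.random.uniform(-1.0, 1.0, size=1000)
    v0 = np.zeros(1000)
    xf, vf = velocity_verlet(x0, v0, dt=0.01, steps=10_000)
    print(xf[:3], vf[:3])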

Computer simulations must analyze the complicated physical movements of atoms and molecules according to physical laws. The simulations are used to understand the properties of materials in detail (e.g., the equation of state) and dynamic phenomena such as turbulence, which is essentially three-dimensional (3D) in nature. Traditionally, the challenge has been to account for the physics of 3D phenomena in the 1D and 2D hydrodynamics codes used to understand weapon performance — often through adjustable parameters.

“Repeatable hydrodynamics simulations have been very important to stockpile stewardship,” said physicist Richard Ward, who began his career at Livermore as a weapon designer in 1982. “They’re important for understanding the complex hydrodynamics taking place at the material interfaces — what’s happening there and how materials mix … When I came to the Lab, a big two-dimensional calculation divided the simulation of a process into thousands of zones, not enough to do full-system calculations. Now we can model in three dimensions, with several orders of magnitude more zones — we can model things in so much detail.”
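
As a rough illustration of what “zones” mean in a hydrodynamics code, the sketch below advances a toy one-dimensional Lagrangian mesh: each zone carries mass, density and internal energy, pressure comes from an assumed ideal-gas equation of state, and the zone edges move with the flow. Everything here, from the equation of state to the initial conditions and the simple artificial viscosity, is a stand-in chosen for illustration, far removed from the 2D and 3D physics in production weapon codes.

    import numpy as np

    # Toy 1D Lagrangian hydrodynamics: the domain is split into "zones" whose
    # edges (nodes) move with the material. Shock-tube-like initial conditions.
    gamma = 1.4                                   # assumed ideal-gas constant
    n_zones = 200
    x = np.linspace(0.0, 1.0, n_zones + 1)        # node positions
    u = np.zeros(n_zones + 1)                     # node velocities (end walls stay fixed)
    rho = np.ones(n_zones)                        # zone densities
    e = np.where(np.arange(n_zones) < n_zones // 2, 2.5, 0.25)  # zone internal energy
    m = rho * np.diff(x)                          # zone masses, fixed in a Lagrangian frame

    dt = 5e-5
    for _ in range(2000):
        p = (gamma - 1.0) * rho * e               # ideal-gas equation of state
        du = u[1:] - u[:-1]                       # velocity jump across each zone
        q = np.where(du < 0.0, 2.0 * rho * du**2, 0.0)  # artificial viscosity at shocks
        ptot = p + q
        node_mass = 0.5 * (m[:-1] + m[1:])
        u[1:-1] += dt * (ptot[:-1] - ptot[1:]) / node_mass  # accelerate interior nodes
        x += dt * u                               # move the mesh with the flow
        new_vol = np.diff(x)
        e -= ptot * (new_vol - m / rho) / m       # p dV work changes internal energy
        rho = m / new_vol                         # new density from new zone volume

    print(f"peak compressed density: {rho.max():.2f}")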

“Livermore’s approach to HPC has led the way nationally because we pay close attention not just to codes,” explains Brad Wallin, principal associate director for Weapons and Complex Integration (WCI), “but to industry partnerships on HPC developments in the marketplace, and to our infrastructure — for example, the Exascale Computing Facility Modernization.” The recently completed project, known as ECFM, upgrades the Livermore Computing Center by significantly expanding its power and cooling capacity in anticipation of the delivery of exascale supercomputing hardware.

Rise of exascale computing

CPU-only parallel computing has run its course because still-larger systems would use too much electrical power to be practical. “This has led to the exascale computing program,” said Neely. “This new architecture is based on a computer node consisting of a CPU with a graphics processing unit (GPU). Nodes of this construction now form the basis of this heterogeneous architecture.” El Capitan, Livermore’s first exascale system, will provide 2 exaflops (2 billion billion floating point operations per second) of computing power when it arrives in 2023.

The necessary second element of computing’s application to stewardship has been the development of parallel software codes and, now, codes for the exascale architecture. Thirty years of code development has produced hydrodynamics software used to simulate how materials behave under the high temperatures and pressures of a nuclear detonation. The challenge in parallel algorithms is to keep the many processors operating efficiently and in sync, “talking to each other,” when important physics phenomena must be modeled on different timescales and at different spatial resolutions and locations in the device.

“Livermore is extremely simulation-oriented — that was driven by scientists like John von Neumann,” said Bruce Goodwin, retired senior fellow, Center for Global Security Research, and former principal associate director for WCI. In the 1990s, briefing the JASON Group, a distinguished independent cohort of scientists who advise the federal government on nuclear policy, Goodwin laid out computing power requirements to simulate nuclear detonations. “I quoted numbers for simulation computing power that I thought were insane. Yet we’ve surpassed those insane numbers. The Stockpile Stewardship Program has created an astounding tool for both scientific research and nuclear weapons stockpile stewardship.”

Models to simulate nuclear devices have evolved rapidly over the last three decades. Laboratory researchers developed the three components of stewardship models in tandem. “They are the parallel algorithms that you need to drive the model,” said Teresa Bailey, the associate program director for Computational Physics in the Weapon Simulation and Computing Program, “the numerical methods, and the physics improvements in the code to the underlying model. Advancing these three technologies together is Livermore’s secret sauce.” Bailey, who began working at the Laboratory in 2008, saw the scaling up of codes to the Sequoia system. “Livermore advanced high-performance computing [developed] into a discipline unto itself, pushed forward by the Department of Energy’s Accelerated Strategic Computing Initiative,” she said. The codes that perform multiphysics simulations have also increased in complexity, and they provide multiple pathways to stockpile assessments.

Over the last 30 years, she asserts, Laboratory researchers have made many advances in numerical methods, the techniques used to solve complex equations on a computer. “We understand how to apply numerical methods in our computer architecture. Today, we can run three-dimensional calculations in a reasonable amount of time at high fidelity,” said Bailey. “This was the dream of the ASCI program.” Physics improvements to the codes have allowed approximations to be removed from the underlying models of fundamental physics, giving researchers more confidence that the models reflect basic physical processes.

Computer codes for these systems have come a long way. “In the beginning, we were writing codes to advance these parallel architectures. About seven years ago, we started a next-generation code project for the weapons program,” said Neely. “These codes are now being used both for research and assessment.” The codes being developed under this effort will enable next-generation stockpile modeling on El Capitan.

Experiments provide real-world checks

Livermore is widely known not only for its simulation and modeling prowess, but also for its world-leading experimental facilities. Experiments must provide the checks and balances to ensure that models reflect reality. “When you start to believe in simulations as gospel, you get into trouble,” said George Miller, former Laboratory director (2006–2011). “Skepticism has to be learned, and it is learned in real experiments.”

To ground simulation in living, breathing physics, the SSP developed experimental facilities with groundbreaking new technologies to gather data, some of which could not be measured during nuclear testing. “Facilities are all important. These are the only places where you can experimentally control a nuclear problem, the only places where you can mimic what goes on in a nuclear device. These testing facilities contribute to the confidence we have in the stockpile today,” concluded Miller.

The National Ignition Facility (NIF) is the crown jewel of the SSP, providing the only testbed capable of investigating high-energy-density physics in regimes relevant to thermonuclear conditions. Its stockpile-related experiments are essential to stockpile certification. NIF is also the culmination of a laser technology development effort at Livermore that began shortly after the invention of the laser in 1960. A succession of larger, more powerful LLNL laser facilities nurtured expertise among Livermore’s staff and enabled NIF’s development.

For Miller, laser-based investigation for the stockpile was essential. Not long after he started as a weapon designer, Miller was asked by his Laboratory mentor to design a nuclear test that would measure “all of the physical parameters I cared about,” he said. “After six months, I concluded that all of what I cared about could not be measured in underground tests. This is one reason I became interested in using lasers.” Later, as a group leader in WCI, he started a program to measure opacity, the degree to which a material impedes the transmission of electromagnetic radiation. Advanced lasers were an essential tool of this research. “Measurement is really hard,” said Miller. “Since the SSP began, we’ve been able to do better measurements, theory and simulations thanks to all the technology we developed.”

This technology includes facilities at other labs in the complex: the Dual Axis Radiographic Hydrodynamic Test Facility (DARHT) at Los Alamos, whose two large X-ray machines produce freeze-frame radiographs (high-powered X-ray images) of materials moving at very high speeds, and the Microsystems Engineering, Science and Applications Facility (MESA) at Sandia, which develops the non-nuclear systems and technologies necessary for the operation of nuclear devices.

Studying energetic materials

Chemical explosives are components of nuclear devices. Livermore’s High Explosives Applications Facility (HEAF) and the Contained Firing Facility (CFF) provide significant capabilities to develop and test new energetic materials such as insensitive high explosives (IHE), which are safer replacements for conventional high explosives (HE) in the current stockpile. CFF is located at Site 300, in an undeveloped area to the Laboratory’s east. “Ongoing research into the development of new energetic molecules, and the tools to develop them, is an enabler of stockpile stewardship,” said Lara Leininger, director of Livermore’s Energetic Materials Center (EMC), which was established 30 years ago after the cessation of nuclear testing.

“Modeling and simulation and developing advanced diagnostics have been the two major contributing technologies,” Leininger said. The thermochemical modeling code Cheetah is an example of a non-weapon simulation code that has had a great impact on understanding the physics of HE performance for a wide variety of civilian and military systems, where testing can be used to validate the results. Cheetah simulates detonations, predicts the behavior of chemicals in the detonation, models velocity and energy, and assists in designing optimized, customized explosives. Multiscale modeling, the ability to model energetic materials at the atomic, molecular, grain and larger size scales, has been valuable in advancing the understanding of their physics and chemistry.

Leininger also points to EMC’s studies of HE aging. Forecasting the long-term stability of these materials has been challenging, but Livermore researchers have risen to the challenge by developing diagnostic technologies and tests, housed at the Thermal Aging Facility, to quantify how these materials perform as they age and experience environmental stress. Ultrafast laser-based imaging technology in EMC’s High Explosives Laser Imaging System captures high-speed images of detonators and initiators. EMC researchers use other analytical facilities to monitor the long-term chemical changes of HE materials. “We’ve become a leader within the enterprise in modeling the aging of energetic materials, the processes such as absorption, diffusion and chemical reactions that take place over decades,” said Leininger. EMC’s studies of the mechanical and thermal safety of energetic materials have provided the scientific understanding underlying the safety basis for mitigating their risks, and for improving stockpile safety. Some of this work takes place at Livermore and some at user facilities such as the Linac Coherent Light Source at SLAC National Accelerator Laboratory, the OMEGA laser at the University of Rochester’s Laboratory for Laser Energetics and Argonne’s Advanced Photon Source (APS). Energetic materials also play an important role in tests that study the physics of material behavior under extreme conditions, known as hydrodynamic, or hydro, tests.

Hydrodynamic testing’s experimental impacts

In a hydrodynamic test, high explosives detonate around a mass of inert material within a thick, shock-proof chamber. As the intense shock wave hits, the material’s behavior is measured by advanced diagnostic equipment installed throughout the chamber, which captures thousands of data points and X-ray images. Hydrodynamic tests have supported the nuclear stockpile for more than 60 years. During the nuclear testing era, hydro tests provided essential data that helped plan nuclear tests. Since testing ended, they have provided data that help improve and fine-tune the computer models scientists use to simulate nuclear detonations.

In recent years, new measurement technologies developed by Livermore and others have helped provide better data from hydrodynamic tests. Photon Doppler velocimetry (PDV) measures the velocity of the device’s motion by recording the slight shift in the frequency of light reflected from it, known as the Doppler effect. Broadband laser ranging (BLR), a technique also developed by researchers from several national laboratories including Livermore, tracks the movement of fragments of the device after implosion. “Improved, increasingly accurate diagnostics such as PDV and flash X-rays have helped improve our models,” said Leininger.
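
As a back-of-the-envelope illustration of the PDV principle, the snippet below converts a beat frequency, produced when Doppler-shifted return light is mixed with the reference laser, into a surface velocity. The wavelength and beat frequency are hypothetical values chosen for illustration, not data from any experiment.

    # PDV relation for a surface moving toward the probe: v = (wavelength / 2) * f_beat
    wavelength = 1550e-9        # m, a typical telecom-band PDV laser wavelength
    beat_frequency = 2.58e9     # Hz, hypothetical measured beat frequency

    velocity = 0.5 * wavelength * beat_frequency
    print(f"inferred surface velocity: {velocity:.0f} m/s")   # roughly 2,000 m/s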

Today these tests, some of which are conducted at the CFF, provide data about energetic materials that increase confidence in models and support the planning and development of subcritical experiments at the Nevada National Security Site (NNSS, formerly the Nevada Test Site), the only location where nuclear materials like plutonium can be studied this way. The Joint Actinide Shock Physics Experimental Research (JASPER) Facility and the U1a complex at NNSS play important roles in stockpile stewardship. “Subcrits” can take years to plan and prepare, and extensive planning and validation support the required pre-experiment certification that the experiment will remain subcritical. Data about the behavior of the material during these experiments provide input to ensure that plutonium is correctly represented in hydrodynamic computer models. “Subcritical experiments at U1a have been really important to improving our understanding of plutonium,” said physicist Rick Kraus. Experiments like these have contributed to the criteria, or performance metrics, that the labs use to certify the stockpile.

Confidence through performance metrics

Experiments conducted at facilities like NIF have produced the knowledge that the laboratories use to assure the readiness of the nuclear stockpile. “One of the most important technologies we’ve developed is the theoretical basis for performance metrics using observables,” said Cynthia Nitta. “This has pointed us in the direction of the diagnostics we need to validate the stockpile. With those metrics, we don’t need to test — but we do need to take measurements.” Nitta, Livermore’s chief program advisor for the Future Deterrent, Weapons Physics and Design, began her career as a weapon designer during the nuclear testing era.

NIF’s ability to perform survivability testing of a warhead’s non-nuclear components has also made important contributions to stewardship. Nuclear survivability of those components is a concern if the weapon encounters defenses or if a nearby nuclear detonation has occurred. “We used to do survivability testing experiments during underground weapons tests,” she said. “These tests are mostly about measuring radiation effects.” Some of the diagnostics (unique technologies designed to measure specific physical parameters) developed during the era of nuclear testing have been applied to NIF experiments. Laboratory researchers developed other NIF diagnostics expressly for the facility’s program of stockpile stewardship, inertial confinement fusion (ICF) research and basic discovery science.

Models, according to Nitta, are tools to help understand physical processes. “Models give us insight into how a physical system worked,” she said. “Then we make a judgment based on models and experimental data. Because of the metrics I’ve mentioned, I’m very sure we can certify weapons confidently based on them.”

Validation and verification (V&V) tools, which confirm that models correctly capture how a nuclear weapon will perform, have come to play a significant role in stewardship. “We need a large number of experiments to validate our understanding of how well a nuclear weapon will perform,” said Kraus, “as well as the computational capabilities to model complex devices in a weapon and verify that all the physics has been calculated correctly. As our need for accurate simulation increases, we require higher-fidelity facilities to experimentally verify that accuracy.”

Collectively, these tools have provided the labs with a road to certification, asserted Tom Horillo, W80-4 Life Extension Program manager: “In my present role, we are making changes to the nuclear explosive package [NEP] of the W80-4. We need to know how those changes will affect the system.” Horillo began working at Livermore in the 1970s and was involved in planning and executing underground nuclear tests. “I’m an engineer, and at the time testing ended, I was not sure how we could certify these weapons without it. We were forced to think differently.”

The end of testing forced weapon designers and researchers to “flip things around,” he said. “Instead of starting with nuclear testing and physical data as the basis for certification, we start with physics and simulation, and then use hydrodynamic tests and legacy data from the testing era to validate our simulation codes. This has provided us with a clear path to certifying the stockpile. If we had continued with just nuclear testing, we wouldn’t have the toolbox that we have today for the SSP — NIF, simulations, high-performance computers and the rest are truly amazing tools that provide a more detailed fundamental understanding of weapon performance.”

Through these tools, researchers have been able to determine how to build wide margins of performance into the stockpile to help assure its reliability. “The builders of Roman bridges and Gothic cathedrals,” said Goodwin, “understood how to build structures that last without testing. ‘Engineering safety factors’ is the key — you figure out the worst things that could happen to a structure, and then build them much stronger to withstand failure — you build in lots of margin to failure.”
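
A generic illustration of the safety-factor idea, with made-up numbers that have nothing to do with weapons data: estimate the worst credible demand on a component, compare it with the component’s capacity, and require a comfortable margin before accepting the design.

    def safety_factor(capacity, worst_case_demand):
        """Ratio of what a component can withstand to the worst load it should see."""
        return capacity / worst_case_demand

    capacity = 120.0          # stress the part can tolerate (arbitrary units)
    worst_case = 40.0         # worst credible stress it will ever experience

    sf = safety_factor(capacity, worst_case)
    print(f"safety factor = {sf:.1f}, margin = {capacity - worst_case:.0f} units")
    if sf < 3.0:              # hypothetical design rule requiring at least 3x margin
        print("insufficient margin: redesign required")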

Keeping the stewardship mission on track

At the Nevada National Security Site (formerly the Nevada Test Site), the Joint Actinide Shock Physics Experimental Research (JASPER) gas gun fires small, fast-moving projectiles into samples of plutonium, yielding data about the behavior of plutonium under high-pressure shock conditions. Facilities such as JASPER enable experiments that are critical to stockpile stewardship’s success.

The next few years of stewardship will see some exciting developments. El Capitan, the exascale HPC system, is expected to arrive in 2023. Experiments will continue at NIF using targets built by the new target fabrication facility, which produces plutonium targets for equation-of-state (EOS) experiments.

At the Nevada National Security Site, researchers from the complex will finish building Scorpius, a 125-meter-long linear induction accelerator diagnostic tool designed to generate high-speed, high-fidelity radiographic images of contained subcritical experiments using fissionable nuclear material. Los Alamos National Laboratory leads the joint effort to design and build Scorpius, leveraging expertise from NNSS and Lawrence Livermore and Sandia national laboratories.

On a longer time horizon, new facilities may be needed to further advance the mission. “NIF was always intended as an intermediate stepping stone,” said Wallin. “A couple of orders of magnitude increase in the energy produced at NIF would be needed if the nation asked us to make much more significant changes to the stockpile than we have been asked to date.”

Applying artificial intelligence and machine learning (AI/ML) technology to data is another direction the Laboratory is exploring. “With these tools, we’d be able to use every bit of data that we have, such as images of laser shots that we currently don’t fully use,” said Wallin, “and combine it with our best physics-based models to improve predictive capability.” AI/ML tools will also play a role in accelerating design iteration and manufacturing capabilities. “Applying AI-supervised modeling and simulation to manufacturing would help us shorten the design–manufacturing cycle and optimize its performance.”
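
One common pattern behind such AI/ML efforts is the surrogate model: fit a fast statistical model to a modest number of expensive simulation runs, then use it, along with its uncertainty estimates, to explore a design space cheaply. The sketch below, which assumes scikit-learn is available, is a minimal illustration of that pattern; the “simulator” is a stand-in analytic function, not a physics code.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)

    def expensive_simulation(design):
        """Placeholder for a costly physics calculation of a performance metric."""
        return np.sin(3.0 * design) + 0.5 * design**2

    # A few dozen "simulation" runs at sampled design points.
    X_train = rng.uniform(-2.0, 2.0, size=(40, 1))
    y_train = expensive_simulation(X_train).ravel()

    # Train a Gaussian-process surrogate, then predict (with uncertainty) elsewhere.
    surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
    surrogate.fit(X_train, y_train)

    X_new = np.linspace(-2.0, 2.0, 5).reshape(-1, 1)
    mean, std = surrogate.predict(X_new, return_std=True)
    for d, m, s in zip(X_new.ravel(), mean, std):
        print(f"design = {d:+.2f}  predicted metric = {m:+.3f} ± {s:.3f}")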

“We need any kind of science and technology that supports making things faster,” said Nitta. “The current enterprise is scaled for a mission of a certain size. We need the capability to do more of everything.”

“The big change that I see coming is in advanced manufacturing, whose biggest impact is still to come,” said Goodwin. “It will dramatically reduce the cost of sustaining deterrence.”

“One of the unappreciated challenges of stockpile stewardship is that ensuring confidence in something that’s changing is a harder task than designing something new,” said Miller. “When you design something new, you run up against problems, but you can design around them.” Miller looks forward to the programs developing a better understanding of the physics of fusion: “One of the most important elements of nuclear weapons is fusion. We don’t yet understand fusion, and how physical processes degrade as fusion winds down.” But there is hope — a NIF shot on August 8, 2021, reached the threshold of fusion ignition, and with it, the science of stockpile stewardship entered a new era.

Part 2 of this series will survey a few of the scientific discoveries and advances that have contributed to stockpile stewardship.

-Allan Chen
______

IM #1061164