Lab Report


The Lab Report is a weekly compendium of media reports on science and technology achievements at Lawrence Livermore National Laboratory. Though the Laboratory reviews items for overall accuracy, the reporting organizations are responsible for the content in the links below.

March 13, 2020

El Capitan Supercomputer

With its advanced central processing units and graphics processing units (CPUs/GPUs) developed by AMD, El Capitan’s peak performance is expected to exceed 2 exaFLOPS, which would make it the fastest supercomputer in the world when it is deployed in 2023. Illustration courtesy of Hewlett Packard Enterprise.

El Capitan leads the way

Hewlett Packard Enterprise and AMD will deliver what they expect to be the world's fastest supercomputer in 2023: a $600 million machine at Lawrence Livermore National Laboratory called El Capitan that they promise will perform at 2 exaflops, or 2 quintillion calculations per second. That's fast enough that if the world's human population could perform one such calculation per second, it would take everybody eight years to match one second's worth of El Capitan computing.
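The arithmetic behind that comparison can be checked in a few lines; the world-population figure used here (roughly 7.8 billion, the 2020 estimate) is an assumption:

```python
# Rough check of the claim above: how long would it take everyone on
# Earth, each doing one calculation per second, to match one second
# of El Capitan's computing?
el_capitan_flops = 2e18          # 2 exaflops
population = 7.8e9               # ~7.8 billion people (assumed)
seconds_needed = el_capitan_flops / population
years_needed = seconds_needed / (365 * 24 * 3600)
print(f"{years_needed:.1f} years")   # ~8.1 years
```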

The current fastest machine, as measured by the Top500 ranking released by supercomputing researchers twice each year, is the IBM-built Summit supercomputer at Oak Ridge National Laboratory in Tennessee. Upgrades have boosted its performance to 143 petaflops.

"We expect when it's delivered to the laboratory in 2023, it will be the fastest supercomputer in the world," LLNL Director Bill Goldstein said.

Supercomputers are mammoth systems assembled from hundreds or thousands of computers linked with high-speed interconnects to shuttle data and coordinate operations. They occupy rooms the size of tennis courts, use thousands or millions of processors, cost millions of dollars and consume enough electricity to power a town.


The Sequoia supercomputer, which once sat in the top spot of the fastest supercomputers in the world, has been decommissioned.

Out with the old, in with the new

After eight years of service, Sequoia has been felled. Once the most powerful publicly ranked supercomputer in the world, Sequoia — hosted by Lawrence Livermore National Laboratory (LLNL) — has been decommissioned to make room for LLNL’s new exascale supercomputer, the “El Capitan” system. Sequoia was taken offline on Jan. 31 and will be dismantled over the first few months of 2020.

Sequoia was a water-cooled IBM BlueGene/Q system composed primarily of 98,304 compute nodes spread across 96 racks, along with 55 PB of disk storage. At its launch in 2012, Sequoia delivered 16.3 Linpack petaflops out of 20.1 theoretical petaflops of performance. In 2013, an optimized benchmarking run increased its Linpack score to 17.2 petaflops.
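Those figures imply a respectable Linpack efficiency (measured performance as a fraction of theoretical peak); a quick calculation using only the numbers quoted above:

```python
# Sequoia's Linpack efficiency: measured petaflops / theoretical peak.
rpeak = 20.1        # theoretical peak, petaflops
run_2012 = 16.3     # Linpack score at launch
run_2013 = 17.2     # 2013 optimized benchmarking run
print(f"2012: {run_2012 / rpeak:.0%}")  # ~81%
print(f"2013: {run_2013 / rpeak:.0%}")  # ~86%
```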

Shortly after Sequoia was fully deployed at LLNL, it claimed the top spot on the June 2012 Top500 list. Later that same year, Oak Ridge National Laboratory took the title from Sequoia with Titan, which was itself decommissioned last summer. Sequoia remained a strong contender on the Top500 until its decommissioning: It held a top-10 position as recently as 2018, and as of the November 2019 list, it still placed 12th.


Stratocumulus clouds are ubiquitous across the global oceans, reflecting sunlight and keeping Earth cool. Global climate models predict that their ability to shade the Earth weakens with global warming — an amplifying feedback, and one that has strengthened in the latest models. Image courtesy of NASA.


Warming projections looking cloudy

Recent climate models project that a doubling of atmospheric CO2 above pre-industrial levels could cause temperatures to soar far above previous estimates. A warming Earth, researchers now say, will lead to a loss of clouds, allowing more solar energy to strike the planet.

It is the most worrisome development in climate science in a long time. An apparently settled conclusion about how sensitive the climate is to adding more greenhouse gases has been thrown into doubt by a series of new studies from the world’s top climate modeling groups.

The studies have changed how the models treat clouds, following new field research. They suggest that the ability of clouds to keep us cool could be drastically reduced as the world warms — pushing global heating into overdrive.

Clouds have long been the biggest uncertainty in climate calculations. They can both shade the Earth and trap heat. Which effect dominates depends on how reflective they are, how high they are, and whether it is day or night. Things are made more complicated because cloud dynamics are complex and happen on small scales that are hard to include in the models used to predict future climate.

But it seems increasingly likely that clouds could thin or burn off entirely in a warmer world, leaving clearer skies that let in enough extra sunlight to add a degree Celsius or more to global warming. As Mark Zelinka of the Lawrence Livermore National Laboratory, lead author of a review of the new models published earlier this year, puts it: The models “are shedding their protective sunscreen in dramatic fashion.”


In 2012, countless lakes across the central United States went dry, including Teller Lake No. 5, in Boulder County, Colorado. Scientists are exploring the physical processes that can drive flash droughts, the existing capabilities to predict them and what is needed to establish an effective early warning system. Credit: UCAR. Photo by Carlye Calvin.

Droughts arrive in a flash

Flash droughts are a type of extreme event distinguished by rapid intensification of drought conditions with severe impacts. They unfold on timescales of weeks to months, presenting a new challenge for predicting when flash droughts will occur.

In new research, a multi-institutional collaboration including Lawrence Livermore climate scientist Celine Bonfils explored the current understanding of the physical processes that can drive flash droughts, existing capabilities to predict them and what is needed to establish effective early warning of flash droughts.

“There is a growing awareness that flash droughts can trigger severe agricultural impacts while the mechanisms at their origin need more investigation. This makes flash droughts a compelling frontier for research, monitoring and prediction,” Bonfils said.

Drought is perhaps the most complex and least understood of all weather and climate extremes. Despite an increasing drought risk in a future warmer climate, this risk is often underestimated and remains a “hidden hazard.” Drought can span timescales from a few weeks to decades, and areas from a few kilometers to entire regions. Impacts usually develop slowly, are often indirect and can linger long after the end of the drought itself.


Examples of two different TATB crystal structures synthesized under different conditions, shown at identical magnifications.

Life in the fast lane

Lawrence Livermore National Laboratory (LLNL) and its partners rely on the timely development and deployment of diverse materials to support a variety of national security missions. However, the path from initial discovery of a new material to deployment at scale can take many years.

An interdisciplinary team of LLNL researchers from the Physical and Life Sciences, Computing and Engineering directorates is developing machine-learning techniques to remove bottlenecks in the development cycle and, in turn, dramatically decrease time to deployment.

One such bottleneck is the effort required to test and evaluate the performance of candidate materials such as TATB, an insensitive high explosive of interest to both the Department of Energy and the Department of Defense. Because of slight variations in the conditions under which the synthesis reaction occurs, TATB samples can exhibit different crystal characteristics (e.g., size and texture) and therefore differ dramatically in performance.

The LLNL team is exploring a novel approach to predicting material properties. By applying computer vision and machine learning to scanning electron microscopy (SEM) images of raw TATB powder, they have avoided the need for fabrication and physical testing of a part. The team has shown that it is possible to train models to predict material performance from SEM images alone, demonstrating a 24 percent error reduction over the current leading approach (i.e., domain-expert assessment and instrument data). In addition, the team showed that machine-learning models can discover and use informative crystal attributes that domain experts had underutilized.
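The team’s actual features and models are not described here, but the general shape of such a pipeline — regressing a performance value from image-derived features — can be sketched. Everything below (the toy texture features, the random-forest model, the synthetic images and labels) is an illustrative assumption, not the LLNL method:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

def texture_features(img):
    """Toy stand-in for SEM texture descriptors: mean intensity,
    intensity variance, and gradient energy."""
    gy, gx = np.gradient(img.astype(float))
    return [img.mean(), img.var(), (gx**2 + gy**2).mean()]

# Synthetic stand-in for SEM images and a measured performance value;
# real work would use micrographs and physical test results.
rng = np.random.default_rng(0)
images = rng.random((200, 64, 64))
performance = np.array([im.var() * 10 + rng.normal(0, 0.01) for im in images])

X = np.array([texture_features(im) for im in images])
X_train, X_test, y_train, y_test = train_test_split(
    X, performance, random_state=0)

model = RandomForestRegressor(random_state=0).fit(X_train, y_train)
print("test MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```

In practice, learned features from a convolutional network would typically replace the hand-coded descriptors, but the train/evaluate structure is the same.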