LAB REPORT

Science and Technology Making Headlines

Aug. 21, 2020


The Southern Ocean, Southern Hemisphere summer, February 2017. This floating iceberg was located off the Antarctic continent in the Weddell Sea, south of South America. This part of the global ocean has been found to be particularly sensitive to changes driven by ongoing climate change. Photo courtesy of Yves David/Association of Polar Early Career Scientists (APECS).

In hot water

More than 50 percent of the world's oceans may already be affected by climate change, with that figure rising to 80 percent over the coming decades, a research team that includes Lawrence Livermore oceanographer Paul Durack has found.

Changes to the global ocean are important to monitor because the ocean is the largest heat sink in the climate system. The global ocean covers 70 percent of Earth's surface and has absorbed more than 90 percent of the excess heat associated with global warming. Along with the temperature changes driven by climate change, observed ocean salinity patterns reflect coincident changes to Earth's water cycle, ice and the cryosphere, and land-based water stores as water moves through the climate system.

The team found that climate models simulate the same changes to ocean salinity and temperature across each of the three primary ocean basins (the Pacific, Atlantic and Indian) as have been recorded in long-term reconstructions of observed change.
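As a rough illustration of the basin-by-basin comparison described above, the sketch below fits linear trends to basin-mean salinity series from an "observed" reconstruction and a model run, then compares them. Everything here (the data, trend size and noise level) is a synthetic placeholder, not the team's actual datasets or method.

```python
# Minimal sketch (synthetic data): compare linear salinity trends per basin
# between an "observed" reconstruction and a "modeled" series.
import numpy as np

years = np.arange(1950, 2020)

def trend_per_decade(series, years):
    """Least-squares linear trend, expressed per decade."""
    slope, _intercept = np.polyfit(years, series, 1)
    return 10.0 * slope

rng = np.random.default_rng(seed=0)
for basin in ("Pacific", "Atlantic", "Indian"):
    # Placeholder series: a small imposed trend plus noise (practical salinity units)
    observed = 35.0 + 0.002 * (years - years[0]) + rng.normal(0.0, 0.01, years.size)
    modeled = 35.0 + 0.002 * (years - years[0]) + rng.normal(0.0, 0.01, years.size)
    print(f"{basin}: observed {trend_per_decade(observed, years):+.3f}/decade, "
          f"modeled {trend_per_decade(modeled, years):+.3f}/decade")
```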


LLNL and Cerebras Systems have installed the company’s CS-1 artificial intelligence (AI) computer into Lassen, making LLNL the first institution to integrate the cutting-edge AI platform with a large-scale supercomputer. Photo by Katrina Trujillo/LLNL.

Chipping in for Lassen

Lawrence Livermore National Laboratory (LLNL) has integrated Cerebras Systems’ new product, which the company claims is the world’s largest computer chip, into LLNL’s Lassen supercomputer for the National Nuclear Security Administration.

Technicians recently connected the Silicon Valley-based company’s massive, 1.2-trillion-transistor Wafer Scale Engine chip, designed specifically for machine learning and AI applications, to the 23-petaflop Lassen. The new wafer-sized chip will help accelerate AI research at LLNL. Lassen is the unclassified companion to the IBM- and Nvidia-based Sierra system, which the National Nuclear Security Administration uses to assess the nation’s nuclear weapons stockpile; Lassen is No. 14 on the latest Top500 list of the world’s most powerful supercomputers.

LLNL said the computer will enable researchers to investigate novel approaches to predictive modeling.


White dwarfs are among the most ancient stellar objects, so their temperatures and predictable lifecycles enable them to work as “cosmic clocks” that can help determine the age of the universe and nearby stars and galaxies. Credit: Greg Stewart/SLAC National Accelerator Laboratory

The pressure is on

Lawrence Livermore and SLAC National Accelerator Laboratory researchers developed a way to measure the basic properties of matter at the highest pressures so far achieved in a controlled laboratory experiment.

Using the power of the National Ignition Facility (NIF), the world’s highest-energy laser system, the international team measured basic properties of matter, such as the equation of state (EOS), at those record pressures. The results are relevant to the conditions at the cores of giant planets, the interiors of brown dwarfs (failed stars), the carbon envelopes of white dwarf stars and many applied science programs.

According to the team, the overlap with white dwarf envelopes is particularly significant — this new research enables experimental benchmarks of the basic properties of matter in this regime. The results should ultimately lead to improved models of white dwarfs, which represent the final stage of evolution for most stars in the universe. They are among the most ancient stellar objects, so their temperatures and predictable lifecycles enable them to work as “cosmic clocks” that can help determine the age of the universe and nearby stars and galaxies.
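For orientation, an equation of state relates pressure to density (and temperature). The sketch below evaluates one textbook limiting case, the pressure of a cold, non-relativistic degenerate electron gas, at a density loosely typical of white dwarf matter. It is not the EOS measured in the NIF experiments, only an illustration of the kind of quantity being benchmarked.

```python
import math

H = 6.626e-34      # Planck constant, J*s
M_E = 9.109e-31    # electron mass, kg
M_U = 1.6605e-27   # atomic mass unit, kg

def degenerate_electron_pressure(rho, mu_e=2.0):
    """Pressure (Pa) of a cold, non-relativistic degenerate electron gas.

    rho  : mass density in kg/m^3
    mu_e : mean molecular weight per electron (about 2 for carbon/oxygen matter)
    P = (1/20) * (3/pi)^(2/3) * (h^2 / m_e) * n_e^(5/3)
    """
    n_e = rho / (mu_e * M_U)  # electron number density, m^-3
    return (1.0 / 20.0) * (3.0 / math.pi) ** (2.0 / 3.0) * (H ** 2 / M_E) * n_e ** (5.0 / 3.0)

# Illustrative density of 1e9 kg/m^3 (1e6 g/cm^3); prints roughly 3.2e+21 Pa
print(f"{degenerate_electron_pressure(1e9):.2e} Pa")
```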



Simulated strength of shaking from a magnitude 7.0 Hayward Fault earthquake showing peak ground velocity (colorbar) and seismograms (blue) at selected locations (triangles).

Down to the ground

Even with the pandemic raging, natural disasters are having a busy 2020: tornadoes ravaged Nashville a few months ago; the chances of a new “big one” have risen dramatically in California’s fault zones; and meteorologists are anticipating a stronger-than-usual hurricane season for the U.S. More than ever, understanding and anticipating these events is crucial, and Lawrence Livermore National Laboratory (LLNL) researchers have now used supercomputers to run the highest-resolution earthquake simulations yet.

Using a code developed at LLNL, the researchers simulated a magnitude 7.0 earthquake on the Hayward Fault, which runs along the east side of the San Francisco Bay Area. The new simulations ran at double the resolution of previous iterations, resolving seismic waves with wavelengths as short as 50 meters across the entire fault zone. These simulations required extraordinary computing power: in this case, LLNL’s Sierra system, which delivers 94.6 Linpack petaflops and places third on the most recent Top500 list. The Sierra runs took place during the machine’s open science period in 2018, before it switched to classified work. The team also made use of LLNL’s Lassen system, an unclassified machine with an architecture similar to Sierra’s, which delivers 18.2 Linpack petaflops and ranks 14th.
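For context on what resolving 50-meter seismic waves implies, the sketch below applies the standard bookkeeping for such simulations: the highest frequency a run resolves is the minimum shear-wave speed divided by the shortest resolved wavelength, and the grid spacing follows from how many points are needed per wavelength. The 500 m/s wave speed and eight points per wavelength are illustrative assumptions, not parameters reported for the LLNL runs.

```python
def max_resolved_frequency(v_s_min, wavelength_min):
    """Highest resolved frequency (Hz): f_max = v_s,min / lambda_min."""
    return v_s_min / wavelength_min

def required_grid_spacing(wavelength_min, points_per_wavelength=8):
    """Grid spacing (m) needed to sample the shortest wavelength adequately."""
    return wavelength_min / points_per_wavelength

# Assumed minimum shear-wave speed of 500 m/s and the 50 m wavelength from the text
print(max_resolved_frequency(500.0, 50.0))   # 10.0 Hz
print(required_grid_spacing(50.0))           # 6.25 m
```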

“We used a recently developed empirical model to correct ground motions for the effects of soft soils not included in the Sierra calculations,” said Arthur Rodgers, a seismologist at LLNL. “These improved the realism of the simulated shaking intensities and brought the results into closer agreement with expected values.”
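The empirical model Rodgers mentions is not reproduced here; the sketch below only illustrates the general idea of scaling rock-site ground motions by a site-amplification factor that grows as the near-surface shear-wave speed (Vs30) decreases. The functional form and coefficients are placeholders, not the model the team used.

```python
import numpy as np

def soft_soil_correction(pgv_rock, vs30, vs30_ref=760.0, exponent=-0.5):
    """Scale rock-site peak ground velocity by (vs30 / vs30_ref) ** exponent.

    pgv_rock : simulated PGV on stiff ground (m/s)
    vs30     : time-averaged shear-wave speed in the top 30 m (m/s)
    Softer sites (lower Vs30) amplify motion; the exponent here is illustrative.
    """
    return np.asarray(pgv_rock) * (np.asarray(vs30) / vs30_ref) ** exponent

# Example: a soft-soil site with Vs30 = 270 m/s versus the 760 m/s reference
print(soft_soil_correction(0.4, 270.0))  # about 0.67 m/s, up from 0.4 m/s
```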


LLNL researchers are using a naturally occurring protein to extract rare earth elements found in electronic waste such as computer components and smartphones.

Scraping by

A Lawrence Livermore team has used a natural protein, called lanmodulin, to extract rare earth elements from e-scrap, also known as electronic waste.

The protein has “several unique and exciting properties,” said LLNL researcher Gauthier Deblonde. These include its tolerance of industrially relevant conditions, such as high temperatures and low pH, and its ability to reversibly sequester rare earth element ions.

Rare earth elements are used in numerous electronic devices such as computer components, high-power magnets, wind turbines, mobile phones, solar panels, superconductors, hybrid/electric vehicle batteries, LCD screens and night vision goggles. The vast majority of rare earths are sourced from Chinese suppliers, and disruptions in that supply chain in recent years have spurred U.S. interests to seek alternative sourcing possibilities. Recycling the rare earths from end-of-life electronics has emerged as one viable alternative.

Existing chemical processes for extracting rare earth elements are complex and harmful to the environment, but using the lanmodulin protein could be game-changing.

“We were all amazed to discover that a natural protein can be so efficient for metal extraction,” Deblonde said. “I have worked on many molecules for metal purification but this one is really special.”


Get the latest LLNL coverage in our weekly newsletter

The Lab Report is a weekly compendium of media reports on science and technology achievements at Lawrence Livermore National Laboratory. Though the Laboratory reviews items for overall accuracy, the reporting organizations are responsible for the content in the links below.