Lab Report

Get the latest LLNL coverage in our weekly newsletter

The Lab Report is a weekly compendium of media reports on science and technology achievements at Lawrence Livermore National Laboratory. Though the Laboratory reviews items for overall accuracy, the reporting organizations are responsible for the content in the links below.

June 24, 2022


LLNL postdoc Nathan Woollett and LLNL staff scientist and ADMX co-spokesman Gianpaolo Carosi work on the cryostat system in which LLNL places its test microwave cavity.

Spending a lifetime looking for answers

In physics, experiments to answer the big questions can take decades to run — and might not produce any findings at all.

The Axion Dark Matter Experiment (ADMX) is one of the world’s best shots at uncovering the hypothetical axion particle — a leading candidate for dark matter. ADMX researchers have set out a clear path to finding what they seek.

Theory suggests one of the few ways to spot axions — which could be constantly showering Earth without our knowing — is with strong magnetic fields, which should change axions into photons. Once they’re photons, researchers would then measure the light’s frequency, which would directly relate to the axion mass.
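The frequency-to-mass link is simply the photon energy relation: a converted photon carries essentially the axion's rest energy (the kinetic energy of halo axions is a negligible correction), so in standard notation

```latex
h f \simeq m_a c^2 \qquad\Longrightarrow\qquad m_a \simeq \frac{h f}{c^2}
```

For a rough sense of scale, a microwave photon near 1 GHz corresponds to an axion mass of about 4 micro-electronvolts.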

ADMX aims to do just that. “It’s really a glorified AM radio,” said Gianpaolo Carosi, LLNL physicist and ADMX co-spokesperson. If axions do exist and the instrument is tuned to precisely the right wavelength, its cavity will resonate, amplifying their signal so that ultra-sensitive quantum electronic detectors can pick it up.

“Every 100 seconds or so we just sit at one frequency and get noise like that hiss that you hear on your radio when you don’t have signal,” Carosi said. “Then we’ll move just a small amount, about a kilohertz, and we’ll do another 100 seconds.”
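The scan Carosi describes can be sketched as a simple loop: dwell at one frequency, step by about a kilohertz, and repeat across the band. The short sketch below uses only the numbers quoted above; the band limits, function names, and the 100-second/1-kHz settings as constants are illustrative assumptions, not actual ADMX parameters.

```python
# Illustrative sketch of a stepped frequency scan, using the dwell time and
# step size quoted in the article. Band limits below are hypothetical.

DWELL_S = 100     # integration time per frequency step (seconds)
STEP_HZ = 1_000   # tuning step between dwells (~1 kHz)


def scan_steps(band_start_hz: float, band_stop_hz: float,
               step_hz: float = STEP_HZ) -> int:
    """Number of dwell points needed to cover the band, inclusive of both ends."""
    return int((band_stop_hz - band_start_hz) // step_hz) + 1


def scan_time_days(band_start_hz: float, band_stop_hz: float) -> float:
    """Total integration time for one pass over the band, in days."""
    return scan_steps(band_start_hz, band_stop_hz) * DWELL_S / 86_400


if __name__ == "__main__":
    # A purely illustrative 100 MHz band: ~100,001 steps, ~116 days of
    # pure integration time -- before any re-scans or candidate follow-up.
    print(scan_steps(650e6, 750e6))
    print(round(scan_time_days(650e6, 750e6), 1))
```

Even this toy estimate makes the article's point concrete: at 100 seconds per kilohertz, covering a wide band takes years, which is why the search begun in 2018 is expected to run into the mid-2020s.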

First constructed in 1995, ADMX only achieved the full sensitivity needed to probe whether the axion might be the dark matter particle in 2018. Since then, researchers have been slowly turning the dial through the frequencies. They will complete the current search around 2025.

Though work to optimize the axion hunt is unending, and random fake signals injected into the detector keep the team on their toes, Carosi needs little extra motivation to keep going, even with the very real prospect of listening to seven years of static.

“I would love for the axion to show up, but if we find dark matter elsewhere, or the axion is ruled out as a candidate, I’m fine with that,” he said. “We’ve already kind of drunk the Kool-Aid.”


Sofia Quaglioni and Jennifer Pett-Ridge have been honored as E.O. Lawrence award recipients.

Honor of a lifetime

U.S. Secretary of Energy Jennifer Granholm this week announced 10 U.S. scientists and engineers as recipients of the prestigious Ernest Orlando Lawrence Award for their exceptional contributions to research and development supporting the Energy Department’s missions in science, energy and national security.

LLNL scientists Jennifer Pett-Ridge and Sofia Quaglioni are among the winners. Pett-Ridge was recognized in biological and environmental sciences for pioneering work in quantitative microbial ecology, while Quaglioni was honored in nuclear physics for seminal contributions unifying the theory of structure and reactions of light nuclei, providing predictive capability critical for understanding inertial fusion and nuclear astrophysics.

The Lawrence Award was established to honor the memory of Ernest Orlando Lawrence, who invented the cyclotron — an accelerator of subatomic particles — and was named the 1939 Nobel Laureate in physics for that achievement. Lawrence later played a leading role in establishing the U.S. system of national laboratories, and today, the DOE’s national laboratories in Berkeley and Livermore bear his name. The Secretarial award is among the longest-running and most prestigious science and technology awards bestowed by the U.S. government.


Lawrence Livermore National Laboratory co-author Caitlyn Krikorian Cook, a group leader and polymer engineer in the Lab’s Materials Engineering Division, characterized the curing kinetics of the nanocomposite silica glass resin with light exposure. Photo by Garry McLeod/TID.


Making glass in a jiffy

Glass is increasingly being used in fiber optics, consumer electronics and microfluidics for “lab-on-a-chip” devices. Unfortunately, traditional glassmaking can be costly and slow, and 3D-printed small glass objects have rough surfaces, making them unsuitable for lenses.

To solve those problems, a research team from Lawrence Livermore and the University of California, Berkeley has devised a new 3D-printing method known as volumetric additive manufacturing (VAM), which forms glass parts all at once rather than layer by layer. The team used VAM to print microscopic, delicate, layer-less objects in silica glass in only seconds or minutes.

The new technique could have real-world applications including micro-optics in high-quality cameras, consumer electronics, biomedical imaging, chemical sensors, virtual-reality headsets, advanced microscopes and microfluidics with challenging 3D geometries such as “labs-on-a-chip.” In addition, the benign properties of glass lend themselves to biomedical applications and to uses that must withstand high temperatures or chemical exposure.


Eric Robey, left, a Sandia mechanical engineer, and Joseph Pope, a Sandia technologist, prepare a plexiglass cube for a small-scale explosion. The information from this study could be used to create new geothermal energy systems in locations where it is currently not feasible. Credit: Bret Latter.

Geothermal energy is a blast

Geothermal energy holds great promise as a renewable energy source that does not depend on the sun shining or the wind blowing, but it faces several challenges to wide adoption.

One challenge is that only a few places in the U.S. naturally have the right combination of hot rock close to the Earth's surface and available underground water. Another is the initial start-up cost of drilling and connecting geothermal wells. Researchers from Sandia and Lawrence Livermore national laboratories are leading a team to explore whether explosives can reduce those two challenges. Livermore’s role is modeling the explosions and improving the predictability of fracture-network formation.

LLNL has trusted computer models of underground explosions based on decades of experiments, starting with underground nuclear testing in the 1960s, said Oleg Vorobiev, a computer model expert at LLNL. However, the experiments these models were based on were mostly conducted in cold rock rather than the hot rock needed for geothermal energy production.

"Eventually, the goal is to understand how to create fracture networks in hot, compressed granite at significant depths," Vorobiev said. "This is very challenging, computationally, because events occur at different timescales. Shockwave propagation is very fast compared to microfracture formation caused by the explosive gases."



On common ground

Lawrence Livermore National Laboratory (LLNL) and Amazon Web Services (AWS) have signed a memorandum of understanding (MOU) to define the role of leadership-class high performance computing (HPC) in a future where cloud HPC is ubiquitous.

Under the MOU, LLNL and AWS will explore software and hardware solutions spanning cloud and on-premises HPC environments, with the goal of establishing a common stack of open-source software components that can run equally well at both large HPC centers and on cloud resources.

“The cloud HPC market is growing, and clouds are becoming a viable way to run HPC jobs,” said computer scientist Todd Gamblin, who is leading the effort for LLNL. “More software is being developed for the cloud environment than for traditional HPC centers, and larger portions of our workflows are going to start looking like cloud software. So, we want to be in tune with mainstream software development, take advantage of this ecosystem and be able to deploy it easily, the way that clouds do.”