LAB REPORT

Science and Technology Making Headlines

April 30, 2021



The Climeworks direct air capture plant in Hinwil, Switzerland, binds carbon dioxide to water and pumps it far underground. Image courtesy of Climeworks.

Time is running out on climate change

Solar panels, wind turbines and electric cars will go far in helping California and the Biden administration meet their aggressive climate goals — but not far enough. As time runs short, scientists and government officials say the moment to break out the giant vacuums has arrived.

Though it is a challenge, carbon capture and storage is one way the world can keep global temperatures from rising more than 1.5 degrees Celsius, the central commitment of the Paris Agreement on Climate Change, which aims to avert cataclysmic effects.

Pipelines need to be built, vast geological reservoirs deep underground need to be fashioned into carbon dioxide storage facilities, and costly new technologies for vacuuming carbon from the air and factories need to be brought up to scale.

California regulators are closely watching the progress of the hulking direct-air-capture facility Carbon Engineering is building with Occidental Petroleum in the Permian Basin of Texas. The 100-acre operation aims to capture up to 1 million metric tons of carbon dioxide each year.

Even if the Texas plant meets its goals, the carbon dioxide it removes would represent less than 1 percent of the emissions California needs to pull from the atmosphere to hit its climate targets, according to estimates by Lawrence Livermore National Laboratory.

sci tech


LLNL researchers have found that the current locations of many planetary bodies in the solar system are not where they originally formed. Credit: NASA

Putting the pieces back together

As the solar system was developing, the giant planets (Jupiter and Saturn) formed very early, and as they grew, they migrated both closer to and farther away from the sun to stay in gravitationally stable orbits.

The gravitational effect of these massive objects caused immense reshuffling of other planetary bodies that were forming at the time, meaning that the current locations of many planetary bodies in our solar system are not where they originally formed.

Lawrence Livermore scientists set out to reconstruct these original formation locations by studying the isotopic compositions of different groups of meteorites that all derived from the asteroid belt between Mars and Jupiter. The asteroid belt is the source of almost all of Earth’s meteorites, but the asteroid belt itself formed from material swept in from across the solar system.



LLNL scientists have suggested that a nuclear explosion could throw an asteroid dashing toward Earth off course.

The right stuff

If you want to move an asteroid, you need the right kind of nuclear explosion.

In a plot straight out of a 1990s action flick, scientists from Lawrence Livermore and the U.S. Air Force are studying the ways humans could detonate nuclear weapons to deflect an asteroid hurtling toward Earth.

“In the future, a hazardous asteroid will find itself on a collision course with Earth,” the researchers write in their new study.

Relatively speaking, an asteroid doesn’t have to be huge to cause serious destruction. The simulated one in this study was 300 meters in diameter, or about 1,000 feet — six Olympic-length swimming pools laid end to end. It’s not big by asteroid standards, but it would still take out several big city blocks with the initial impact alone.
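The size comparison is easy to sanity-check. This minimal Python sketch uses only standard conversion factors (not figures from the article) to restate the quoted 300-meter diameter:

```python
# Sanity-check the article's size comparison: a 300 m asteroid
# expressed in feet and in 50 m (Olympic-length) swimming pools.
diameter_m = 300
feet_per_meter = 3.28084     # standard meters-to-feet conversion
olympic_pool_m = 50          # regulation Olympic pool length

diameter_ft = diameter_m * feet_per_meter
pools = diameter_m / olympic_pool_m

print(round(diameter_ft))    # ~984 ft, i.e. "about 1,000 feet"
print(pools)                 # 6.0 pool lengths
```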

NASA has found nuclear weapons are “10 [to] 100 times more effective” at moving asteroids out of a collision course with Earth than non-nuclear options, the scientists say. Here’s why: The energy density of nuclear devices is many times higher than that of conventional materials. This gives the nukes the right amount of force to push near-Earth asteroids out of the way.



This image shows the spectral feature from the optical Thomson scattering diagnostic that is used to infer density and temperature. This feature appears due to laser light scattering off background density fluctuations in the plasma.

Right out of the oven

A team of scientists including Lawrence Livermore researchers has analyzed directly driven gold-sphere experiments to test the heat-transport models used in inertial confinement fusion (ICF) and high-energy-density (HED) modeling. The team found that models that overly restrict the heat flux disagree with the measurements.

This research was conducted in an effort to determine the source of the "drive deficit," a long-standing problem in ICF and HED modeling, whereby capsule bang-times always seem to occur later than simulations predict and the amount of X-ray flux in a hohlraum is over-predicted by simulation.

Lawrence Livermore's Will Farmer, who served as the lead designer, compared energy balance in a hohlraum to cooking a cake in an oven. "You've put your cake in the oven," he said. "And you want to know when you have to take it out. In order to know, you need to understand how much energy you're putting into the oven, how much energy is reflected by the walls and how much energy is lost via conduction out of the walls, so that your estimate of the oven temperature is correct."

Farmer said that, for whatever reason, the oven is cooler than we think it should be and the cake always seems to take longer to cook than expected, and this work is trying to figure out why the oven is "leakier" than predicted. "We've determined that heat transport plays a role, at least for directly driven gold spheres. No one wants a runny cake," he explained.

Farmer said the work sits at the heart of the Lab's stockpile stewardship mission.

datanami


LLNL researchers are using self-training deep learning models to diagnose disease from X-rays.

Taking a deep dive into diagnosis

Improving disease diagnosis is one of the most popular applications for deep learning researchers, and now they’re looking to streamline the process: new research from Lawrence Livermore National Laboratory (LLNL) uses “self-training” deep learning models to diagnose diseases from X-ray imaging.

Data acquisition and labeling are big pain points for AI applications in diagnosis: health data is hard to acquire, costly when available and difficult to label. This, in turn, impedes the AI models’ ability to diagnose, as they lack the data necessary to make inferences. 

“Building predictive models rapidly is becoming more important in health care,” said Jay Thiagarajan, a computer scientist at LLNL. “There is a fundamental problem we’re trying to address. Data comes from different hospitals and it’s difficult to label — experts don’t have the time to collect and annotate it all. It’s often posed as a multi-label classification problem, where we are looking at the presence of multiple diseases in one shot. We can’t wait to have enough data for every combination of disease conditions, so we built a new technique that tries to compensate for this lack of data using regularization strategies that can make deep learning models much more efficient, even with limited data.”
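The self-training idea Thiagarajan describes can be illustrated with a generic pseudo-labeling loop: train on the small labeled set, adopt the model's confident predictions on unlabeled data as extra labels, and retrain. The NumPy sketch below is a toy illustration, not the LLNL method — the data, the simple logistic-regression model, and the 0.9 confidence threshold are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_logreg(X, y, lr=0.5, steps=300):
    """Fit a logistic-regression weight vector by gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-np.clip(X @ w, -30, 30)))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def predict_proba(X, w):
    return 1.0 / (1.0 + np.exp(-np.clip(X @ w, -30, 30)))

# Toy data: two well-separated Gaussian blobs, 100 points each,
# but only 5 points per class carry labels.
X_all = np.vstack([rng.normal(+2.0, 1.0, size=(100, 2)),
                   rng.normal(-2.0, 1.0, size=(100, 2))])
y_all = np.array([1.0] * 100 + [0.0] * 100)
labeled = np.array([0, 1, 2, 3, 4, 100, 101, 102, 103, 104])
unlabeled = np.setdiff1d(np.arange(200), labeled)

# Round 1: train only on the small labeled set.
w = train_logreg(X_all[labeled], y_all[labeled])

# Pseudo-label the unlabeled pool where the model is confident
# (0.9 is a hypothetical confidence threshold).
p = predict_proba(X_all[unlabeled], w)
confident = (p > 0.9) | (p < 0.1)
pseudo_y = (p > 0.5).astype(float)

# Round 2: retrain on labeled + confidently pseudo-labeled data.
X2 = np.vstack([X_all[labeled], X_all[unlabeled][confident]])
y2 = np.concatenate([y_all[labeled], pseudo_y[confident]])
w = train_logreg(X2, y2)

acc = np.mean((predict_proba(X_all, w) > 0.5) == y_all)
print(f"accuracy after self-training: {acc:.2f}")
```

On separable toy data like this, the second round typically recovers nearly all labels from only ten annotated points, which is the intuition behind using self-training when expert labels are scarce.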


The Lab Report is a weekly compendium of media reports on science and technology achievements at Lawrence Livermore National Laboratory. Though the Laboratory reviews items for overall accuracy, the reporting organizations are responsible for the content in the links below.