Lab Report


The Lab Report is a weekly compendium of media reports on science and technology achievements at Lawrence Livermore National Laboratory. Though the Laboratory reviews items for overall accuracy, the reporting organizations are responsible for the content in the links below.

Feb. 14, 2020


Stratocumulus clouds like these are ubiquitous across the global oceans, reflecting sunlight and keeping Earth cooler. Global climate models predict that their ability to shade the Earth weakens with global warming. Image courtesy of NASA.

Cloudy climate change projections

An apparently settled conclusion about how sensitive the climate is to adding more greenhouse gases has been thrown into doubt by a series of new studies from the world’s top climate modeling groups.

The studies have changed how the models treat clouds, following new field research. They suggest that the ability of clouds to keep us cool could be drastically reduced as the world warms — pushing global heating into overdrive.

Clouds have long been the biggest uncertainty in climate calculations. They can both shade the Earth and trap heat. Which effect dominates depends on how reflective they are, how high they are and whether it is day or night. Cloud dynamics are complex and happen on small scales that are hard to include in the models used to predict future climate.

And it's increasingly likely that clouds could become thinner or burn off entirely in a warmer world, leaving more clear skies through which the sun may add a degree Celsius or more to global warming. As Mark Zelinka of the Lawrence Livermore National Laboratory, lead author of a recently published review of the new models, has put it: The models “are shedding their protective sunscreen in dramatic fashion."


This figure illustrates how the potential for plants to remove CO2 from the atmosphere (and transform it into plant biomass) is strongly regulated by nitrogen and phosphorus. Credit: Victor O. Leshyk.

Nutrients take it to the limit

Nitrogen and phosphorus in soils limit the amount of carbon that can be taken up and stored in plants and soils, but global maps of where this limitation occurs are lacking.

A Lawrence Livermore scientist and international collaborators have developed a framework for testing nutrient limitations and a benchmark of nitrogen and phosphorus limitation for models to be used for predictions of the terrestrial carbon sink.

CO2 emissions from human activities have a double effect. On one hand, CO2 causes global warming; on the other, it can stimulate photosynthesis. The increase in photosynthesis can boost plant growth, creating a feedback that helps absorb some of the CO2 in the atmosphere and slow global warming.

However, plants also need the right amounts of nutrients for growth, namely nitrogen and phosphorus, not just CO2. Understanding the limiting role of nutrients in the capacity of plants to help slow climate change is a priority to accurately model and predict climate change.

Elevated atmospheric CO2 and warming-induced longer growing seasons are likely resulting in greater nutrient limitation of terrestrial ecosystems.


Using the 3D “brain-on-a-chip” platform, LLNL researchers were able to keep human-derived neurons alive, networked and communicating in a 3D gel, while non-invasively recording their electrical activity using Lab-developed, thin-film microelectrode arrays. Photo by Julie Russell/LLNL.

Chip on the brain

Engineers and biologists at Lawrence Livermore National Laboratory (LLNL) have developed a 3D “brain-on-a-chip” device capable of recording the neural activity of living brain cell cultures. This is a significant advance in realistically modeling the human brain outside the body.

LLNL researchers report that the creation of a 3D microelectrode array platform allowed them to keep hundreds of thousands of human-derived neurons alive, networked and communicating in a 3D gel. Using LLNL-developed, thin-film microelectrode arrays, they were able to non-invasively record the neurons' electrical spikes and bursts for up to 45 days.

Researchers said the knowledge gained from the device could inform science in developing countermeasures for warfighters exposed to chemical or biological agents, model disease or infection, evaluate environmental toxins or aid in drug discovery without the need for animal models.


Emulators speed up simulations, such as this NASA aerosol model that shows soot from fires in Australia. Image courtesy of NASA.

Simulations are quick on the draw

Modeling immensely complex natural phenomena such as how subatomic particles interact or how atmospheric haze affects climate can take many hours on even the fastest supercomputers. Emulators, algorithms that quickly approximate these detailed simulations, offer a shortcut. New research shows how artificial intelligence (AI) can easily produce accurate emulators that can accelerate simulations across all of science by billions of times.

“This is a big deal,” says Donald Lucas, who runs climate simulations at Lawrence Livermore National Laboratory. He says the new system automatically creates emulators that work better and faster than those his team designs and trains, usually by hand. The new emulators could be used to improve the models they mimic and help scientists make the best of their time at experimental facilities. If the work stands up to peer review, Lucas says, “It would change things in a big way.”

A typical computer simulation might calculate, at each time step, how physical forces affect atoms, clouds, galaxies — whatever is being modeled. Emulators, based on a form of AI called machine learning, skip the laborious reproduction of nature. Fed with the inputs and outputs of the full simulation, emulators look for patterns and learn to guess what the simulation would do with new inputs. But creating training data for them requires running the full simulation many times — the very thing the emulator is meant to avoid. 
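The train-then-query workflow described above can be sketched in a few lines. This is a minimal illustration, not the AI system from the article: the "expensive simulation" is a stand-in function, and the surrogate is a simple quadratic least-squares fit rather than the neural-network emulators the researchers built. The point is only the pattern — run the full model a limited number of times, fit a cheap approximation to its input/output pairs, then answer new queries from the approximation.

```python
# Hypothetical "expensive simulation": in reality this could be a
# climate or particle-physics code taking hours per run; here it is a
# simple function so the sketch is self-contained.
def expensive_simulation(x):
    return 3.0 * x * x - 2.0 * x + 1.0

# Step 1: run the full simulation a limited number of times to build
# training data (inputs paired with outputs).
train_x = [i / 10.0 for i in range(21)]          # inputs 0.0 .. 2.0
train_y = [expensive_simulation(x) for x in train_x]

# Step 2: fit a cheap surrogate -- here a quadratic y ~ a + b*x + c*x^2
# solved via the normal equations (standing in for the machine-learned
# emulators in the article).
def fit_quadratic(xs, ys):
    n = len(xs)
    sx = sum(xs); sx2 = sum(x**2 for x in xs)
    sx3 = sum(x**3 for x in xs); sx4 = sum(x**4 for x in xs)
    sy = sum(ys); sxy = sum(x * y for x, y in zip(xs, ys))
    sx2y = sum(x * x * y for x, y in zip(xs, ys))
    # Augmented 3x3 system, solved by Gauss-Jordan elimination.
    A = [[n, sx, sx2, sy],
         [sx, sx2, sx3, sxy],
         [sx2, sx3, sx4, sx2y]]
    for i in range(3):
        pivot = A[i][i]
        A[i] = [v / pivot for v in A[i]]
        for j in range(3):
            if j != i:
                f = A[j][i]
                A[j] = [vj - f * vi for vj, vi in zip(A[j], A[i])]
    return A[0][3], A[1][3], A[2][3]

a, b, c = fit_quadratic(train_x, train_y)

# Step 3: the emulator answers new queries without re-running the
# full simulation.
def emulator(x):
    return a + b * x + c * x * x

print(emulator(1.5), expensive_simulation(1.5))
```

As the article notes, the catch is Step 1: building the training set still requires many runs of the full simulation, which is exactly the cost an emulator exists to avoid.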


A new LLNL report outlines ways California could reach its goal of becoming carbon neutral by 2045.

Putting biowaste to use

In fall 2018, as California governor Jerry Brown’s final term was winding down, he signed an executive order that set a bold climate goal: The world’s fifth-largest economy needed to become carbon neutral by 2045.

That means the state will have to remove enough greenhouse gases from the atmosphere to balance out whatever amount its residents and businesses are still emitting 25 years from now.

But many wondered whether California, or any other major economy, could actually stop adding to the greenhouse gases driving climate change with the technology we have today. Researchers at Lawrence Livermore National Lab determined the answer is yes.

“We do have to do an enormous amount of development. We have to scale up,” says Roger Aines, a co-author of the report who leads the lab’s Carbon Initiative, a research program on carbon dioxide removal. “But we don’t have to invent new things.”

The new study estimates California will need to remove at least 125 million metric tons of carbon dioxide per year by 2045. The authors conclude this can be done for less than $10 billion a year, about 0.4 percent of the state’s annual GDP.

The bulk of emissions reductions, 84 million tons a year, would come from waste biomass, including agricultural byproducts, food waste, livestock manure, human sewage, sawmill residue and trees removed from forests.
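The study's headline figures can be sanity-checked with back-of-the-envelope arithmetic. The GDP value below is an assumption (California's GDP was roughly $2.5–3 trillion around the time of the report), not a number from the article; the tonnage and cost figures come from the study as reported.

```python
# Figures from the report, as summarized above.
removal_tons_per_year = 125e6   # metric tons of CO2 removed per year by 2045
annual_cost_dollars = 10e9      # "less than $10 billion a year"
biomass_tons_per_year = 84e6    # share from waste biomass

# Assumed value (not from the article): California GDP, order of magnitude.
state_gdp_dollars = 2.8e12

cost_per_ton = annual_cost_dollars / removal_tons_per_year
share_of_gdp = annual_cost_dollars / state_gdp_dollars
biomass_share = biomass_tons_per_year / removal_tons_per_year

print(f"~${cost_per_ton:.0f} per ton of CO2 removed")
print(f"~{share_of_gdp:.1%} of state GDP")
print(f"waste biomass covers ~{biomass_share:.0%} of the removal target")
```

The implied cost works out to about $80 per ton removed, and $10 billion against a multi-trillion-dollar economy lands near the article's 0.4 percent figure, with waste biomass covering roughly two-thirds of the removal target.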