July 2, 2021
Even if the world manages to reduce its greenhouse gas emissions significantly, “carbon sinks” such as those created by regenerative farming are considered essential in facing the climate crisis because they sequester carbon dioxide that would otherwise remain in the atmosphere for thousands of years.
While newer technologies for carbon capture include injecting carbon dioxide into rock formations and storing it deep underground, land management is a time-tested approach. Plants absorb carbon dioxide during photosynthesis, making forests, farms and rangelands some of the world's biggest natural depositories of carbon. When managed with regenerative practices, they can sequester even more.
In a 2020 report, Lawrence Livermore National Laboratory included carbon farming among the practices California will need to reach its goal of becoming carbon neutral by 2045. The report estimated that using such practices on the majority of the state's cropland, along with minor emissions reductions in farming, could remove 10.6 million metric tons of greenhouse gas emissions annually.
Economic researchers are collecting new data on the downstream effects of tech transfer at the Lawrence Livermore National Laboratory in California.
The Laboratory, known for its work on nuclear weapons under the National Nuclear Security Administration, also has made significant advances in supercomputing and other fields with commercial applications.
Enabled by patent license agreements and cooperative research agreements, private businesses have leveraged the Lab’s scientific prowess to create new products and new jobs.
To quantify the impacts of those public-private agreements, the economic research team at TechLink is now contacting approximately 450 companies, covering tech transfer agreements between 2000 and 2010.
Lawrence Livermore and Sandia national laboratories have set gold and platinum standards where few have gone before.
Like two superheroes finally joining forces, Sandia's Z machine — generator of the world's most powerful electrical pulses — and Lawrence Livermore's National Ignition Facility — the planet's most energetic laser source — in a series of 10 experiments have detailed the responses of gold and platinum at pressures so extreme that their atomic structures momentarily distorted like images in a fun-house mirror.
Similar high-pressure changes induced in other settings have produced oddities like hydrogen appearing as a metallic fluid, helium in the form of rain and sodium as a transparent metal. But until now there has been no way to accurately calibrate these pressures and responses, the first step to controlling them.
Following experiments on the two big machines, researchers developed tables of gold and platinum responses to extreme pressure. Data generated by experiments at these pressures — roughly 1.2 terapascals (a terapascal is 1 trillion pascals), an amount of pressure relevant to nuclear explosions — can aid understanding the composition of exoplanets, the effects and results of planetary impacts and how the moon formed.
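For scale, the 1.2-terapascal figure quoted above can be put in more familiar units with simple arithmetic. A quick sketch (the pressure value is from the text; the conversion factors are standard SI and atmosphere definitions):

```python
# Convert the ~1.2 terapascal experimental pressure into other units.
PA_PER_TPA = 1e12      # 1 terapascal = 1 trillion pascals
PA_PER_GPA = 1e9       # 1 gigapascal = 1 billion pascals
PA_PER_ATM = 101_325   # one standard atmosphere, in pascals

pressure_tpa = 1.2                    # pressure reached in the experiments
pressure_pa = pressure_tpa * PA_PER_TPA

print(pressure_pa / PA_PER_GPA)       # 1200.0 gigapascals
print(round(pressure_pa / PA_PER_ATM / 1e6, 1))  # 11.8 (million atmospheres)
```

In other words, roughly twelve million times atmospheric pressure at sea level.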
There was an outside chance that China might pull a surprise on the HPC community and launch the first true exascale system — meaning capable of more than 1 exaflops of peak theoretical 64-bit floating point performance if you want to be generous, and 1 exaflops sustained on the High Performance Linpack benchmark if you don’t — but that didn’t happen. And so, we wait.
There was a reasonable amount of churn on the semi-annual Top500 ranking of supercomputers, the June list of which was divulged at the International Supercomputing (ISC) conference as is tradition. ISC21 was hosted online.
To get onto the Top500 list this time around, a machine had to have at least 1.52 petaflops of sustained Linpack performance. To be in the Top100, a machine has to have at least 4.13 petaflops running Linpack. The aggregate compute power of the entire Top500 list stands at 2.8 exaflops. The “El Capitan” machine going into Lawrence Livermore National Laboratory next year is expected to be somewhere north of 2 exaflops peak — call it 2.3 exaflops with maybe 1.5 exaflops sustained on Linpack.
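The flops figures above can be sanity-checked with a little unit arithmetic. A sketch (the petaflops and exaflops values are taken from the text; the Linpack efficiency is a derived, approximate number, not an official projection):

```python
# Relate the Top500 numbers quoted above; all values are in flops.
PFLOPS = 1e15   # one petaflops
EFLOPS = 1e18   # one exaflops

top500_entry = 1.52 * PFLOPS    # minimum sustained Linpack to make the list
top100_entry = 4.13 * PFLOPS    # minimum to crack the Top100
list_total   = 2.8  * EFLOPS    # aggregate compute of the whole list

# El Capitan estimates from the text: ~2.3 EF peak, maybe ~1.5 EF sustained.
el_capitan_peak      = 2.3 * EFLOPS
el_capitan_sustained = 1.5 * EFLOPS

efficiency = el_capitan_sustained / el_capitan_peak
print(round(efficiency * 100))                 # implied Linpack efficiency: 65 (%)
print(el_capitan_sustained > list_total / 2)   # True: one machine would beat
                                               # half of today's list combined
```

That last comparison is what makes a true exascale machine such a milestone: a single system roughly matching the combined sustained performance of hundreds of current entries.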
Articular cartilage is a connective tissue lining the surfaces of joints. When the cartilage severely wears down, it leads to osteoarthritis (OA), a debilitating disease that affects millions of people globally.
New research could lead to the development of effective therapeutic approaches for the early diagnosis, prevention and treatment of OA.
Articular cartilage is composed of a dense extracellular matrix (ECM) and a sparse population of cells known as chondrocytes, which vary in form and potentially in function. A deep, molecular-level understanding of the various chondrocyte subpopulations and their functions is critical to developing effective methods for cartilage repair and regeneration.
Until now, a thorough molecular profiling of chondrocytes from healthy articular cartilage had not been achieved, owing to the technical difficulty of isolating and studying individual chondrocyte subpopulations from dense cartilage. Lawrence Livermore scientists profiled articular cartilage from healthy and injured mouse knee joints at single-cell resolution, identifying nine chondrocyte subtypes with distinct molecular profiles as well as injury-induced early molecular changes in these chondrocytes.