
Supercomputers add new dimension to earthquake modeling

A 3D graphical representation of the Alum Rock magnitude 5.5 earthquake using the WPP code on BlueGene/L to simulate the shear waves. The colors indicate the strength of the shaking at one instant in time, with orange being the strongest and blue the weakest. The lighter blue indicates deep water. The Alum Rock earthquake in October 2007 was the strongest to hit the Bay Area since the Loma Prieta earthquake of 1989.

Potentially damaging earthquakes recur at intervals measured on geologic time scales. That rarity, coupled with their complex nature, makes it difficult to foresee, from experience and expert estimates alone, how a region can be expected to behave seismologically.

"Earthquakes are likely caused by chaotic processes, which are all but impossible to predict on short time scales, such as hours or days," points out Lab geophysicist and computer scientist Shawn Larsen. However, the research community is focused on better understanding how the ground will respond to strong forces such as fault ruptures.

That is the subject of three research papers appearing in the April edition of the Bulletin of the Seismological Society of America, on the 102nd anniversary of the 1906 San Francisco earthquake.

Six Laboratory researchers contributed to the reports. The first paper focuses on estimating the waves of ground motion that occurred across central and northern California in the 1906 quake, as well as ground motions from large hypothetical earthquakes that might occur in the future along the San Andreas fault. Its authors, led by Brad Aagaard of the U.S. Geological Survey (USGS), also include researchers from UC Berkeley and the engineering design firm URS Corp.

An accompanying paper, with Larsen as the Laboratory participant, details how observations of the 1989 Loma Prieta quake were used to validate modeling of the 1906 earthquake. A third paper, from five collaborators at the Laboratory, uses data from 12 recent moderate Bay Area earthquakes to evaluate the three-dimensional geologic and seismic model created by the USGS. The model was developed using geologic and geophysical data collected over years of study.

"The model was built for several purposes," said LLNL lead author Arthur Rodgers, "one of which is to compute ground motions for earthquakes with high-performance computing."

His group’s simulations were performed on two of the Laboratory’s computing clusters, and their WPP code was recently ported to the Laboratory’s BlueGene/L supercomputer. As part of an ongoing effort to extend the resolution of the calculations, the group performed a recent weekend run using 26 billion grid points on 16,384 computing nodes (32,768 CPUs).
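For a rough sense of that scale, the short Python sketch below simply divides the reported 26 billion grid points across the 32,768 CPUs; the per-point storage figure in the comment is an assumption for illustration and is not a property of WPP.

```python
# Back-of-the-envelope arithmetic for the run described above (illustrative only).
total_points = 26e9   # grid points in the calculation
nodes = 16_384        # BlueGene/L nodes used
cpus = 32_768         # CPUs (two per node)

points_per_cpu = total_points / cpus
print(f"~{points_per_cpu:,.0f} grid points per CPU")  # roughly 800,000

# Assuming, say, eight double-precision values stored per grid point
# (an assumed figure, not taken from WPP), that is on the order of 50 MB per CPU.
bytes_per_point = 8 * 8
print(f"~{points_per_cpu * bytes_per_point / 1e6:.0f} MB of state per CPU (assumed)")
```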

"Because the model is three-dimensional, every time we halve the grid point spacing, we increase the number of factors by eight," Rodgers said. "It escalates rapidly."

He adds: "The Geophysical Monitoring Program would like to be able to model any kind of seismic disturbance anywhere in the world." In addition to his work at the Laboratory, Rodgers has a USGS National Earthquake Hazards Reduction Program project and a faculty appointment at UC Santa Cruz.

A current project in his research group is to predict the seismic signals from earthquakes and explosions in the Middle East in support of the Laboratory’s nonproliferation mission.

The high-performance computing models of ground motion include animations of waves moving outward, color-coded by intensity. "You can start to see effects that aren’t discernible in a static view," Rodgers says.

Larsen, who worked on the other two papers and holds a joint appointment at UC Berkeley’s seismological lab, said the simulations yielded some epiphanies. He was surprised by the extent of ground motion that sweeps across the Bay Area from a simulated quake to the north.

"If the 1906 earthquake had occurred on the north end of the San Andreas fault, the shaking in San Francisco would have been much more devastating," he said. "The scientific benefit ultimately is to try to understand how the ground shakes during earthquakes. Ground motion predictions are used for public policy and engineering design."

One of the worst impacts of the Loma Prieta earthquake, the collapse of the Cypress freeway structure in Oakland, was attributed to a failure to anticipate the degree of up-and-down motion that occurred. "Engineering matters very much," Larsen affirmed.

Some of the first efforts in predicting ground motion through 3D computer models were undertaken at the Laboratory in 1998 through a Laboratory Directed Research and Development grant, right around the time three-dimensional modeling became possible. That project centered on the Hayward fault, which, Larsen said, "is kind of set to go again. It has the highest probability for an earthquake in the Bay Area, about a 25 percent chance of a 6.7 magnitude or greater earthquake within the next 30 years. It’s under a very populous area, so it is considered to be the most dangerous fault in the area. The last large quake there was 140 years ago in 1868."

Rodgers’ project was made possible through a supercomputing grand challenge allocation aimed at enhancing three-dimensional seismic modeling. The computed seismograms were compared with observed recordings for 12 moderate quakes, which can be simulated as simple point sources at the frequencies modeled. The observations came from a dense network of about 20 seismic stations that record broadband data in the Bay Area.
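As a simplified illustration of that kind of comparison (a sketch only, not the metric used in the papers; the compare_traces function and the traces themselves are hypothetical), a synthetic seismogram can be scored against an observed one with a correlation coefficient, a peak-amplitude ratio and the difference in peak arrival time.

```python
import numpy as np

def compare_traces(t, observed, synthetic):
    """Crude goodness-of-fit between an observed and a synthetic seismogram:
    zero-lag correlation coefficient, peak-amplitude ratio, and difference in
    peak arrival time. Illustrative only; not the metric used in the papers."""
    obs = observed - observed.mean()
    syn = synthetic - synthetic.mean()
    cc = np.dot(obs, syn) / np.sqrt(np.dot(obs, obs) * np.dot(syn, syn))
    amp_ratio = np.max(np.abs(synthetic)) / np.max(np.abs(observed))
    # A negative value means the synthetic pulse arrives early, which is what
    # slightly-too-fast model wave speeds look like in practice.
    dt_peak = t[np.argmax(np.abs(synthetic))] - t[np.argmax(np.abs(observed))]
    return cc, amp_ratio, dt_peak

# Made-up traces: a synthetic pulse that is 20 percent too large and peaks 2 s early.
t = np.arange(0.0, 60.0, 0.05)
observed = np.exp(-((t - 30.0) / 3.0) ** 2)
synthetic = 1.2 * np.exp(-((t - 28.0) / 3.0) ** 2)
print(compare_traces(t, observed, synthetic))
```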

"The strongest motions are the late-arriving surface waves," he explained. "They have long periods and tell us a lot about the underlying structure as well as properties of the source." Basins, such as the Livermore Valley, behave like bowls of soft, pliant soil, which amplifies motion that is trapped by the contrast of material properties compared to the underlying hard rock. The simulations reproduce the observed amplification and long duration of shaking in the basins. Furthermore, "Some energy is reflected, refracted and bent from the 3D structure," Rodgers says.

Larsen’s team used the E3D wave propagation code to perform the largest and most physically complex simulations of both the 1906 earthquake and the scenario earthquakes along the San Andreas fault. They took advantage of a grand challenge computer allocation and three Laboratory computing systems, along with the Earth Simulator supercomputer in Japan.

E3D has been around since the late 1980s and was one of the first 3D physics codes to run on supercomputers. The code is currently used at more than 100 institutions worldwide and is applied to Laboratory-funded projects with industry and the intelligence community, to nonproliferation efforts, and to earthquake hazard studies. In fact, E3D is being used on a BlueGene/L system at IBM Rochester.

The collaborating groups that are publishing these papers conducted their work in a variety of places, Larsen added. "Everybody had different codes and machines to implement the problem best suited for their capabilities." He initiated the effort in conjunction with a couple of the USGS contributors.

The USGS continues to refine its three-dimensional model, based on these computing projects. Rodgers said the model evaluation with the WPP code was "an excellent first step. ... We found the seismic wave speeds are a bit fast, but the model accurately predicts the amplifications and the complex wave response."

Larsen sees the field moving continually toward incorporating more robust physical simulations. Rodgers agreed. "As computers get bigger and more capable," he predicted, "it will be possible to make these calculations at ever higher resolution."

March 28, 2008

Contact

Nancy Garcia
[email protected]