PARADYN, a breakthrough computer code for parallel machines, was ready for action--the real action that only the most powerful of supercomputers could provide. Action began late last year when the IBM SP2 arrived, the first of a series of massively parallel supercomputers for Lawrence Livermore. Within days, ParaDyn was loaded on the machine and used to simulate, for the first time, a one-million-element model of a ground shock--a very large problem. The first simulation immediately demonstrated the machine's great power and potential, as well as the efficacy of ParaDyn.
New codes like ParaDyn are an important component of Livermore's effort to ramp up to 100-teraflops (100 trillion floating-point operations per second) computing power, which is required to solve problems for DOE's Accelerated Strategic Computing Initiative. ParaDyn is the parallel-computing version of DYNA3D, one of Livermore's premier codes for modeling and predicting thermomechanical behavior.
The development of ParaDyn, ongoing for the past six years, was driven not just by anticipated parallel-computing needs but also by the forward momentum of computational science in the Methods Development Group of Lawrence Livermore's Engineering Directorate. The group was formed in the 1970s to develop modeling tools that were critically needed by Laboratory nuclear weapons projects but were commercially unavailable (see box below). The broad capabilities of these tools--for analyzing complex system deformations and nonlinear material interactions--made them valuable commercial products. Over the next 20 years, the group pursued cutting-edge code development and expanded the modeling technology base, producing an entire family of state-of-the-art software for the nonlinear analysis of thermomechanical systems.
And the work continues. As long as the Laboratory's scientific expertise is needed for advances in weapon systems and other extremely complex systems, the group must continue to develop and improve codes. The tools they produce must satisfy the Laboratory's analysis and computational performance requirements. Peter Raboin, leader of the group, puts it this way: "If we end up merely duplicating the work of others and, worse yet, they are solving the same problems faster and better, then we might as well close up the shop." So the group forges ahead, adding to the code family and making existing codes more robust, more accurate, faster, and capable of ever more applications.





Structural Problems, Computer Solutions

A Parallel DYNA3D (ParaDyn) simulation of weapons is only one example of finite-element analysis of structural behavior. Other finite-element problems include simulating car crashes and train accidents, falling nuclear waste containers, ground-shock propagation, aircraft-engine interaction with foreign debris, metal forming, biomedical interactions, and component designs for cars and aircraft.
The physics of structural behavior can be expressed in mathematical equations that, in turn, can be translated into forms practical to solve on a computer. Of the many computational techniques developed for solving structural mechanics problems, the one that became the most widely adopted by the 1970s--and the one that Laboratory computational scientists have advanced--is the finite-element method.
In the finite-element method, a complex, solid object is divided into an assemblage of simple elements that become the basic units on which the computer calculates the structural behavior. Each element can be made as small and as irregularly shaped as needed to model the object being analyzed. Visually, the collection of elements resembles a wire mesh; the element boundaries are defined by lines that intersect at junctions called nodes. The nodes at each element corner define the movement and deformation of the elements.
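To make the idea concrete, here is a minimal sketch in Python (purely illustrative; the array layout and names are not taken from DYNA3D): a finite-element mesh reduces to a table of node coordinates plus a table of elements, each element listing the indices of its corner nodes.

```python
import numpy as np

# Illustrative mesh: a unit cube represented as one 8-node hexahedral
# element. nodes[i] holds the (x, y, z) coordinates of node i.
nodes = np.array([
    [0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0],   # bottom face
    [0, 0, 1], [1, 0, 1], [1, 1, 1], [0, 1, 1],   # top face
], dtype=float)

# Each element is defined by the indices of its corner nodes.
elements = np.array([[0, 1, 2, 3, 4, 5, 6, 7]])

# Deformation is tracked as a displacement at every node; moving the
# nodes moves (and distorts) the elements they define.
displacements = np.zeros_like(nodes)
```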
The powerful versatility of finite-element codes is best exemplified in the numerous nonlinear material behaviors they can model: elasticity, plasticity, viscoplasticity, viscoelasticity, hyperelasticity, viscous flow, granular flow, thermomechanics, thermodynamic equations of state, composite behaviors, nonlinear foams and solids, void growth, and evolutionary damage accumulation. Simulating all of these behaviors is possible because the finite-element method uses constitutive equations to predict the element stress state based upon changing element deformations.
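The simplest constitutive equation is Hooke's law for linear elasticity; the nonlinear models listed above are far richer, but a brief sketch of the linear case (with illustrative values for steel, not code from the Livermore family) shows how a stress state follows from a deformation state.

```python
import numpy as np

def linear_elastic_stress(strain, E=200e9, nu=0.3):
    """Hooke's law for an isotropic solid: the simplest constitutive
    equation, mapping a 3x3 strain tensor to a 3x3 stress tensor.
    E (Pa) and nu are illustrative values for steel."""
    lam = E * nu / ((1 + nu) * (1 - 2 * nu))  # Lame's first parameter
    mu = E / (2 * (1 + nu))                   # shear modulus
    return lam * np.trace(strain) * np.eye(3) + 2 * mu * strain

# Example: a 0.1% uniaxial stretch along x yields the stress tensor.
strain = np.diag([1e-3, 0.0, 0.0])
print(linear_elastic_stress(strain))
```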






Two Classes of Codes
The Methods Development Group's most prominent mechanical codes are DYNA3D and NIKE3D, which address the behavior of structures as they deform and (possibly) fail, and TOPAZ3D, which addresses the thermal behavior of materials undergoing heating or cooling.
Finite-element codes solve problems by marching them forward step by step. There are two methods for calculating the solution at each new step. One method, called explicit, relies only on data from previous steps. Its equations involve no coupled unknowns, keeping to a minimum the computer memory and work required for a solution. DYNA3D, an explicit code, can handle a large number of mesh elements because its memory requirements are smaller than those of the counterpart implicit method. DYNA3D is suitable for analyzing rapidly changing events--a car collision, for example. However, because explicit methods require very small time steps, DYNA3D is suitable for calculating only brief phenomena (typically less than one second in duration).
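A minimal sketch of an explicit update for a single spring and mass, rather than a full mesh (all names and values are illustrative): every quantity on the right-hand side is already known, so each step is cheap, but stability forces the step size to stay small.

```python
def explicit_step(x, v, mass, force, dt):
    """One explicit time step: the new state depends only on
    already-known quantities, so no equations need be solved."""
    a = force(x) / mass      # acceleration from the current configuration
    v_new = v + dt * a       # update velocity
    x_new = x + dt * v_new   # update position
    return x_new, v_new

# Single spring-mass system. Stability requires dt < 2*sqrt(m/k),
# which is why explicit codes must take very small time steps.
k, m = 1000.0, 1.0
x, v, dt = 0.01, 0.0, 1e-3
for _ in range(1000):
    x, v = explicit_step(x, v, m, lambda x: -k * x, dt)
```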
NIKE3D and TOPAZ3D are implicit codes. At each step, the unknown future values of all variables must be solved for. The equations thus become coupled in a set of simultaneous equations, which are stored in matrix form and must be solved to reach the next step. Solving these sets of equations requires considerable computer memory, limiting the number of elements this method can handle. However, the implicit method can take much larger time steps than the explicit method, allowing events of very long duration (typically seconds to years) to be modeled. In addition to dynamic analysis, implicit codes such as NIKE3D and TOPAZ3D can be used to simulate quasistatic events, such as the stresses in a bridge as a car drives across it, and static events, such as the stress in a bridge under its own weight.
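By contrast, here is a sketch of an implicit (backward Euler) step for the same spring-mass system (again illustrative, not NIKE3D's formulation): the unknowns appear on both sides of the equations, so a coupled system must be solved at every step, but the step itself can be far larger.

```python
import numpy as np

def implicit_step(x, v, k, m, dt):
    """One implicit (backward Euler) step. The unknowns x_new and
    v_new appear on both sides, so a coupled (here 2x2) system must
    be solved -- the price paid for taking much larger time steps."""
    # x_new = x + dt*v_new  and  v_new = v - dt*(k/m)*x_new
    A = np.array([[1.0, -dt],
                  [dt * k / m, 1.0]])
    x_new, v_new = np.linalg.solve(A, np.array([x, v]))
    return x_new, v_new

x, v = 0.01, 0.0
x, v = implicit_step(x, v, k=1000.0, m=1.0, dt=0.1)  # a step 100x larger
```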
Over time, DYNA3D, NIKE3D, and TOPAZ3D have been improved and applied to an ever-widening range of engineering analyses. DYNA3D is also widely known and used throughout industry as a result of its dissemination through a collaborators' program (see Energy & Technology Review, September-October 1993, pp. 1-5). New applications for all of these codes are being developed nearly as fast as new computers become available to handle them.

Enhancing DYNA3D
DYNA3D, suitable for solving problems involving rapid change, has had many applications in safety analysis. Laboratory analysts have used DYNA3D in a number of vehicle safety studies of crashworthiness, in which models of complex vehicles impact roadside safety structures and other vehicles, deforming under the impact. Now, on a project led by physicist Rich Couch in collaboration with the Federal Aviation Administration, Boeing, AlliedSignal, and Pratt & Whitney, DYNA3D is being applied to a different kind of safety evaluation. Couch and his team are studying the dynamics of fan-blade failure in a jet engine and its effect on the airframe.
When an engine fan blade breaks off, it generates fragments and debris that could further damage the engine or the airframe unless they are confined by the engine's containment system. Although such breakage is rare, the Federal Aviation Administration nevertheless will not certify a new jet-engine design unless engine tests, in which a fan blade is intentionally broken during operation, show that the engine casing can contain the fragments. Modeling with DYNA3D reduces the number of expensive tests needed to certify a new design.
The model required for simulating a fan-blade failure is a complex one. The spinning rotor body inside the jet engine must first be simulated in its steady-state operation. The engine model then undergoes rapid change as a fan blade breaks off, hits other blades, and throws the engine rotor off balance, possibly causing critical vibrations in the entire aircraft. The simulation required many improvements in DYNA3D, which were designed and implemented by Edward Zywicz, a computational mechanics specialist in the Engineering Directorate.
To construct the model, computational analyst Greg Kay started with an engine mesh that had been used previously to simulate blade breakoff but had produced incorrect results. Its simulation had disagreed with observed test data, overestimating the consequences of fan-blade failure and predicting the loss of a substantial portion of the fan-blade assembly. These errors helped Kay determine what improvements the mesh needed and how DYNA3D should be modified and enhanced for this problem.
For the spinning rotor-body model, Kay performed a series of analyses to determine how stress was distributed over the fan-blade assembly and how it was affected by rotation, blade breakoff, and an unbalanced shaft. Before constructing a full spinning-body model, Kay first experimented with a simplified shaft-and-blade assembly with the same radial dimensions and material properties as the more detailed engine model. He simulated the spinning of this simplified, eight-blade engine for ten revolutions, which showed whether DYNA3D predicted the correct load stresses in the spinning blades. The DYNA3D simulation of the full blade assembly (Figure 1) shows that the modeled circumferential stresses during steady-state engine operation are consistent across the curved fan-blade assembly.
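A back-of-the-envelope version of the kind of check such a simplified model provides (the numbers below are assumed for illustration, not taken from the engine model): integrating the centrifugal body force along a uniform blade gives the expected stress at the blade root.

```python
import math

# Rough order-of-magnitude check with illustrative numbers:
# centrifugal stress at the root of a uniform blade spinning at
# angular speed omega, from integrating rho * omega**2 * r dr.
rho = 4500.0                        # titanium-alloy density, kg/m^3
omega = 3500.0 * 2 * math.pi / 60   # 3500 rpm in rad/s
r_root, r_tip = 0.3, 0.9            # blade span, meters
sigma = 0.5 * rho * omega**2 * (r_tip**2 - r_root**2)
print(f"root stress ~ {sigma / 1e6:.0f} MPa")
```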






The depiction of the blade fragments interacting with each other, as well as with following blades and the engine containment casing, posed special challenges. These "contact" problems required accurate mesh detail at the interaction points so that the material interactions there could be calculated precisely. Automatic contact algorithms were used for this large simulation; for his model, Kay explicitly defined the contact surfaces, although recent improvements in the code's ability to automatically define interacting parts would make that task easier today. Figure 2 shows the engine approximately one-third of a revolution after the fan blade has broken off.
Kay's simulations demonstrate that the DYNA3D code can provide high-confidence simulations of blade failure. Based on those simulations, additional tasks were begun in late 1997, with the help of partners AlliedSignal and Pratt & Whitney, to simulate the response of engine containment casings to blade breakage and the subsequent debris collisions.





NIKE3D for Biomechanics
Joint-replacement surgery, an increasingly common medical procedure, has been hampered by the inadequacies of today's prosthetic joint implants. Replacement prostheses sometimes do not reproduce normal joint movement and, because they have high failure rates, often cause their recipients to undergo additional costly, painful surgery. Attempts to improve the prosthetic designs have been based on limited information garnered from testing and failed implants. To obtain better information, implant designers have needed a way to quantitatively assess the stresses and loads on an implanted prosthesis and the wear and tear on its fabrication materials.
Nonlinear, three-dimensional, finite-element modeling is a powerful tool for performing such quantitative assessment. NIKE3D, developed for studying dynamic, finite deformations, can model the behavior of joint tissues and bones subjected to different loads and joint movement with and without prosthetic implants. It is being used by Karin Hollerbach, Scott Perfect, and Elaine Ashby, a team of scientists from Livermore's Engineering and Computations directorates, to model both prosthetic implants and human joints. Data from the prosthetic simulations are providing useful information for prosthetic design and, when integrated with data from simulations of human joints, will provide information for evaluating the performance of implanted devices.
Models of implants are based on computer-aided design (CAD) drawings of the prosthesis, from which the finite-element meshes are made. The models include numerical data that define the properties of the implant materials--including polyethylene and several metal alloys in the case of the prosthetic devices being studied--and the forces acting on the joints. For the thumb-joint models the researchers have constructed (Figure 3), the loads produced during commonly used grasps (the key pinch, screwdriver grasp, and tip pinch) were used as the basis for assessing the performance of three thumb-joint designs. (See S&TR, September 1996, for more information on this project.) The stress predictions from the finite-element simulations were compared with clinical findings. The agreement between the two validates the modeling approach and pinpoints the most failure-prone designs. The resulting modeling data can thus be used for selecting available implants as well as for designing new ones.





For the human joint models, the researchers had to develop a process for acquiring human data and converting it into a form suitable for finite-element analysis. They started by scanning a human joint, using computed tomography (CT) to acquire geometric data and bone surface definitions. To these data they added magnetic resonance imaging (MRI) data from collaborators that delineate soft joint tissues. Together, CT and MRI data define how bones and tissues attach to each other. Their resulting images were converted into three-dimensional surfaces using specially developed processing software, and those three-dimensional surfaces were then meshed for finite-element analysis (Figure 4).
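Livermore's conversion software was specially developed; as a rough open-source analogue of the surface-extraction step (the synthetic scan data and threshold below are illustrative), a marching-cubes pass turns a stack of scan slices into a triangulated surface ready for meshing.

```python
import numpy as np
from skimage import measure  # open-source stand-in, not the Livermore tool

# Illustrative stand-in for a CT scan: a 3D array of intensities in
# which bone shows up bright. Here, a synthetic sphere.
z, y, x = np.mgrid[-32:32, -32:32, -32:32]
volume = (x**2 + y**2 + z**2 < 20**2).astype(float)

# Marching cubes extracts a triangulated surface at an intensity
# threshold; the triangles can then seed a volumetric finite-element mesh.
verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)
print(f"{len(verts)} vertices, {len(faces)} triangles")
```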
The researchers are currently working to integrate the data from the human and prosthetic joint models. These integrated models will be used to calculate the bone stresses resulting from the loads exerted by the implants and make bone-implant interface analyses possible.





TOPAZ3D for Laser Heating
Before scientists at the National Ignition Facility (NIF) begin performing laser experiments, they must fully understand how the various NIF components will behave in operation so they can design their experiments accordingly. One crucial piece of information is how the NIF target chamber will expand and move as it heats up over the course of repeated laser shots. Attached all around the chamber are arrays of lenses that focus the laser beams onto the target.
Design analyses predicted the steady warming of the chamber and the equilibrium temperatures it would reach during various operating scenarios; the design of the lens assembly around the chamber allowed for these temperature increases. However, other transient temperature effects also must be taken into account, in particular, the periodic temperature spikes caused by individual laser shots.
These temperature spikes cause a rapid thermal expansion of the target chamber, which then slowly contracts back to its equilibrium shape in time for the next laser shot. The concern is whether these expansions and contractions affect the focus of the lenses, which just before a laser shot must be positioned to within tolerances of only a few millionths of a meter.
Wayne Miller, a thermal analyst, was called on to model and simulate the temperature changes of an operating target chamber. He used the heat transfer code TOPAZ3D, developed by Art Shapiro of the Thermo-Fluids Group, in conjunction with NIKE3D to create an analysis model.
The target-chamber model consists of an aluminum inner shell, a concrete outer shell, and aluminum ports through which the laser beams are delivered. First, the finite-element geometry was created using TrueGRID, a commercial mesh-generation code. Then TOPAZ3D was used to predict the transient thermal behavior of the model, driven primarily by laser energy deposited on the inner wall surfaces, which is then conducted through both shells and convected to the outside air. The thermal results were passed to NIKE3D, which predicted the thermal expansions of the chamber and the resulting motion of the laser lenses before the next shot (Figure 5).
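TOPAZ3D itself is far more capable, but a toy one-dimensional sketch (all material values are illustrative) captures the physics chain described here: energy deposited at the inner wall diffuses through the wall and leaves through a convective boundary to the outside air.

```python
import numpy as np

# Toy 1D heat-conduction model (not TOPAZ3D): a laser shot deposits a
# temperature spike at the inner wall; heat conducts through the wall
# and is convected to the outside air. Values are illustrative.
alpha, dx, dt = 9.7e-5, 0.01, 0.4       # diffusivity (aluminum), m, s
h, k_wall, T_air = 10.0, 237.0, 293.0   # convection coeff., conductivity, K
T = np.full(11, 293.0)                   # temperature profile across the wall
T[0] += 40.0                             # spike from the shot

for _ in range(10000):                   # relax toward equilibrium
    lap = (T[:-2] - 2 * T[1:-1] + T[2:]) / dx**2
    T[1:-1] += dt * alpha * lap          # explicit diffusion update
    T[0] = T[1]                          # insulated inner face after the shot
    # Convective boundary: conduction to the face balances loss to air.
    T[-1] = (k_wall / dx * T[-2] + h * T_air) / (k_wall / dx + h)
```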





The composite model was used to simulate thermal changes occurring in the chamber when laser shots are fired once every 4 or 8 hours. The 4-hour operating sequence, more challenging to the thermal stability of the chamber, was simulated under various conditions: with laser energies deposited uniformly and nonuniformly, with different outside air temperatures, and with different values for heat transfer from the chamber to the environment.
Because one major goal of the overall analysis was to evaluate two ways of cooling down the chamber, simulations were performed with boundary conditions for passive cooling, in which heat transfers out through radiation and convection, and for active cooling, which depends on the installation of a cooling system (Figure 6).
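A lumped-capacitance sketch (not the NIF analysis; the shot-to-shot temperature jump and time constants below are assumed) shows why the cooling option matters: each shot adds a temperature increment, the chamber relaxes toward ambient between shots, and the peak temperature converges to a level set by how quickly heat is removed.

```python
import math

def peak_after_n_shots(n, dT_shot, interval_h, tau_h):
    """Peak temperature rise after n shots: each shot adds dT_shot,
    and the chamber decays toward ambient with time constant tau
    between shots. The geometric series converges to a plateau."""
    decay = math.exp(-interval_h / tau_h)
    return dT_shot * (1 - decay**n) / (1 - decay)

# Passive cooling (long time constant) vs. active cooling (short one),
# for 4-hour shot intervals. All numbers are illustrative.
for tau in (24.0, 6.0):
    print(f"tau = {tau:4.0f} h: peak rise ~ "
          f"{peak_after_n_shots(20, 0.5, 4.0, tau):.2f} K")
```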
The effects of the target-chamber pedestal were also studied. A separate model provided data on how the pedestal expanded and contracted and how that movement affected the chamber and, ultimately, the focusing angle of the lenses.
The results from the simulations for the different operating scenarios, cooling-down options, and pedestal influences were evaluated together to provide details on lens displacements--tangential, radial, or rotational--under different time and temperature conditions.





DYNA3D Becomes ParaDyn
The availability of massively parallel supercomputers, such as the IBM SP2, means that analysts now can solve problems one to two orders of magnitude larger than was possible with other existing supercomputers. It is now possible and practical for engineering analyses to use meshes with up to ten million elements. The limiting factor for problem size and complexity is now the time analysts need to design the mesh rather than the raw speed of the computers.
That boost in computing power depends on capable, efficient parallel codes. Because of the work of a code-developer team led by Carol Hoover, such a code, ParaDyn, is now available. As a result, parallel computing in solid and structural mechanics is becoming a powerful tool in computational engineering.
To take advantage of the power of parallel computers, the mesh for a large problem must be partitioned so that the calculations for each part may be performed in parallel by the processors. The more processors working on the problem, the faster the solution is obtained. Multiprocessor calculations depend on the computer program to appropriately divide a problem, to logically sequence its calculations, and to allow communication among the processors so that their solutions can be integrated into a final correct answer.
For computing efficiency, the workload must be evenly distributed among processors so that no processor sits idle waiting on the others. Furthermore, calculation time and processor communication time must be balanced, with the latter limited to a small fraction of the overall computing time. Thus, for optimal efficiency and performance, a problem must be divided using a method that minimizes communication among the processors.
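One simple partitioning scheme that illustrates these trade-offs is recursive coordinate bisection (a sketch only; ParaDyn's actual partitioning methods are more sophisticated): repeatedly splitting the elements into equal halves along the longest spatial axis balances the workload, and keeping each part geometrically compact tends to limit the shared boundary that drives interprocessor communication.

```python
import numpy as np

def bisect(element_ids, centroids, n_parts):
    """Recursive coordinate bisection for n_parts a power of two.
    Equal halves balance the workload; splitting along the longest
    axis keeps parts compact, limiting boundary communication."""
    if n_parts == 1:
        return [element_ids]
    pts = centroids[element_ids]
    axis = np.argmax(pts.max(axis=0) - pts.min(axis=0))  # longest extent
    order = element_ids[np.argsort(pts[:, axis])]
    half = len(order) // 2
    return (bisect(order[:half], centroids, n_parts // 2) +
            bisect(order[half:], centroids, n_parts // 2))

# Partition 1000 random element centroids among 8 processors.
rng = np.random.default_rng(0)
centroids = rng.random((1000, 3))
parts = bisect(np.arange(1000), centroids, 8)
print([len(p) for p in parts])  # balanced workloads
```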
Partitioning may be needed more than once, not just at the beginning of a parallel calculation. Because the problems are dynamic, the mesh may change significantly as the deformation evolves, degrading the parallel efficiency. When this happens, repartitioning the mesh and boundary conditions is often necessary. Developing methodologies for these repartitioning tasks is challenging, and research in this area is still in progress.
ParaDyn's success has spun off several benefits to the weapons engineering programs at Livermore. Calculations that previously took several weeks are now performed in a day or less. New models are being generated for mesh sizes between one million and ten million elements--an order of magnitude larger than the largest models possible in the past.

The Next Goals
With the completion of ParaDyn, the development of parallel algorithms for NIKE3D and TOPAZ3D has begun. "This effort," Raboin explains, "is an even greater challenge. Developing efficient procedures for solving implicit matrix equations on single-processor computers is already difficult, so imagine the immensity of doing so for parallel processors. That's not to say that projects to couple the DYNA3D, NIKE3D, and TOPAZ3D codes to solve more complex physics problems will be any easier. Those are hard problems, too."
Through its experienced research and development, the Methods Development Group is helping to change the nature of engineering work and, in the long term, the very nature of problem solving.

--Gloria Wilt

Key Words: computational mechanics, computer modeling and simulation, DYNA3D, finite-element method, nonlinear behavior, NIKE3D, ParaDyn, parallel computing, solid and structural analysis, TOPAZ3D.

For more information contact Peter Raboin (925) 422-1583 (raboin1@llnl.gov).

PETER J. RABOIN joined Lawrence Livermore in 1989, when he became a member of Mechanical Engineering's Applied Mechanics Group. He has been serving as the group leader for Methods Development since 1995. Raboin received degrees in mechanical engineering from the Georgia Institute of Technology (B.S., 1982) and the Massachusetts Institute of Technology (M.S.M.E., 1985; Sc.D., 1989). He has published papers on thermomechanics and the finite-element method.





