'Bubbles' simulation on Sequoia wins Gordon Bell Prize
A new record for a high-performance computing calculation, set on Lawrence Livermore National Laboratory's Sequoia supercomputer, was awarded the Gordon Bell Prize for peak performance Thursday at SC13 in Denver, Colo.
Scientists at ETH Zurich and IBM Research, in collaboration with the Technical University of Munich and LLNL, have set a new supercomputing record in fluid dynamics using 6.4 million threads on LLNL's 96-rack Sequoia IBM Blue Gene/Q, one of the fastest supercomputers in the world.
Lawrence Livermore computer scientists Adam Bertsch, Blue Gene systems lead, and Scott Futral, group leader for the HPC development environment, were also members of the winning team. Livermore Computing enabled the simulation on Sequoia.
Members of the winning team include: lead author Diego Rossinelli, Babak Hejazialhosseini, Panagiotis Hadjidoukas and Petros Koumoutsakos of ETH Zurich; Costas Bekas and Alessandro Curioni of IBM Zurich Research; and Steffen Schmidt and Nikolaus Adams of the Technical University of Munich.
The prize was awarded for an 11-petaflops (11 quadrillion floating-point operations per second) simulation of cloud cavitation collapse.
The scientists performed the largest simulation ever in fluid dynamics, employing 13 trillion cells and reaching a sustained performance of 14.4 petaflops on Sequoia, 73 percent of the supercomputer's theoretical peak and an unprecedented figure for flow simulations. The simulations resolved unique phenomena associated with clouds of collapsing bubbles, with applications ranging from treating kidney stones and cancer to improving the efficiency of high-pressure fuel injectors.
The Gordon Bell Prize is awarded annually by the Association for Computing Machinery (ACM) at the Supercomputing conference.
For additional information, see the IBM announcement.