
Sequoia retains top ranking on Graph 500 for third year running

LLNL's 20-petaflop Sequoia supercomputer again retained its No. 1 ranking on the Graph 500 list, a measure of a system's ability to perform data-analytic calculations -- finding the proverbial needle in the haystack.

An IBM Blue Gene/Q system, Sequoia traversed 15,363 giga-edges per second (GTEPS) on a scale-40 graph (a graph with 2^40 vertices). The new Graph 500 list was announced Tuesday at the Supercomputing 2013 conference (SC13) in Denver, Colo. Sequoia has held the top ranking on the Graph 500 since November 2011.
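For a rough sense of scale: a scale-40 graph has 2^40 (about 1.1 trillion) vertices, and assuming the Graph 500 reference specification's default of 16 edges per vertex (the "edgefactor," an assumption here, not stated in this article), the graph holds roughly 17.6 trillion edges. At 15,363 GTEPS, one full traversal takes on the order of a second. A back-of-envelope sketch of that arithmetic:

```python
# Back-of-envelope sizing for the scale-40 run, assuming the Graph 500
# default edgefactor of 16 (edges per vertex); illustrative only.
vertices = 2 ** 40                     # ~1.1e12 vertices at scale 40
edges = 16 * vertices                  # ~1.76e13 edges with edgefactor 16
gteps = 15_363                         # Sequoia's reported giga-TEPS
seconds_per_bfs = edges / (gteps * 1e9)
print(f"{vertices:.3e} vertices, {edges:.3e} edges")
print(f"One full traversal at {gteps:,} GTEPS: ~{seconds_per_bfs:.2f} s")
```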

Scott Futral of Computation accepted the certificate for LLNL. Dave Fox, systems lead for Sequoia, led the LLNL work on the Graph 500 calculation and Fabrizio Petrini led the IBM team.

The Graph 500 list ranks the world's most powerful computer systems for data-intensive computing and gets its name from the graph algorithms at the core of many analytics workloads, such as those for cyber security, medical informatics and data enrichment. A graph is made up of interconnected sets of data with edges and vertices; in a social media analogy, picture a map of Facebook, with each vertex representing a user and each edge a connection between users. The Graph 500 ranking is compiled using a massive data set test: the speed with which a supercomputer, starting at one vertex, can discover all other vertices determines its ranking.
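The discovery process described above is essentially a breadth-first search that counts how many edges it traverses per second (TEPS). A minimal single-machine sketch of that idea, illustrative only -- the actual benchmark runs on a synthetic Kronecker-generated graph distributed across thousands of nodes:

```python
# Minimal sketch of the Graph 500 kernel idea: breadth-first search
# from a root vertex, counting traversed edges to compute TEPS.
import time
from collections import deque

def bfs_teps(adjacency, root):
    """Discover every vertex reachable from root; return (parents, TEPS)."""
    parent = {root: root}
    queue = deque([root])
    edges_traversed = 0
    start = time.perf_counter()
    while queue:
        v = queue.popleft()
        for w in adjacency[v]:
            edges_traversed += 1
            if w not in parent:      # first time we discover w
                parent[w] = v
                queue.append(w)
    elapsed = time.perf_counter() - start
    return parent, edges_traversed / elapsed

# Toy graph: vertex -> list of neighbors
graph = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
parents, teps = bfs_teps(graph, root=0)
print(f"Discovered {len(parents)} vertices at {teps:,.0f} TEPS")
```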

Blue Gene/Q systems have dominated the Graph 500 over the last two years, making Sequoia the world's highest-performing supercomputer for processing gargantuan, petabyte-size data sets (quadrillions of bytes).

Data-intensive computing, also called 'big data,' has become increasingly important to Lab missions as HPC platforms grow more powerful and produce enormous quantities of information.

The challenge for scientists is that the ability of HPC systems to produce and collect data far outstrips the rate at which those computers can analyze that information.

Extracting essential nuggets of information, sometimes called data mining, and recognizing patterns in large data sets are important to such LLNL missions as nonproliferation, atmospheric monitoring, intelligence, bioinformatics and energy. For example, computer scientists develop techniques for organizing data that allow nonproliferation analysts to pinpoint anomalous behavior in a huge quantity of data.

See the IBM announcement for additional information.

For more on data-intensive computing at LLNL, see the January 2013 edition of Science & Technology Review.