
Energy and cost savings from LLNL data center consolidation earn DOE award

(Image caption) The Laboratory effort to conserve energy and reduce costs by consolidating data centers at LLNL has received a Sustainability Award from the U.S. Department of Energy. Leading the effort are Lawrence Livermore employees (from left) Mariann Silveira, Anna Maria Bailey and Andy Ashbaugh.

The Laboratory effort to conserve energy and reduce costs by consolidating data centers at LLNL has received a Sustainability Award from the U.S. Department of Energy.

The program, begun in 2011 and guided by LLNL’s High Performance Computing Strategic Facility Plan, has so far shut down 26 data centers representing 26,000 square feet of space, yielding annual savings of $305,000 in energy bills and $43,000 in maintenance costs. Leading the effort are Anna Maria Bailey and Mariann Silveira of Computing and Andy Ashbaugh of Operations and Business Services.

"Our institutional HPC needs will grow in the future and it is important for the Lab to have a Data Center Sustainability Plan to ensure we manage our computing resources in an efficient and cost effective way," said Bailey, LLNL’s HPC Facilities manager and an organizer of the annual Energy Efficient Working Group workshop conducted at the Supercomputing Conference.

Bailey says an additional benefit of shutting down data centers is that doing so eliminates the need to install and maintain the electrical meters now required by DOE’s sustainability program to monitor power usage across the institution. That alone has produced cost savings to date of $348,000 and a cost avoidance of more than $10 million. DOE had required that all LLNL data centers be metered by FY15.

A "data center" is defined as a space that is at least 500 square feet and contains one or more servers. When the consolidation effort began three years ago, there were more than 60 data centers scattered around the main LLNL site.

The overarching goals of the effort were to increase efficiencies and decrease the number of staff required to maintain centers. Achieving these goals requires reducing the Lab’s computing footprint, water usage and energy intensity. By taking an institutional approach that looks at the entire Lab site, the team believes the Lab can reduce facility and network infrastructure while still meeting the growing computing needs of research programs.

Furthermore, redundant equipment, such as air handlers and cooling systems, can be repurposed for new facilities.

Energy savings from the data center consolidation effort also will help the Lab meet its FY15 goal of an average sitewide power usage effectiveness (PUE) rating of 1.4; PUE is a standard measure of power efficiency in data centers. Centralizing previously scattered data centers is a primary way to improve power efficiency. The collective space at LLNL dedicated to data centers is referred to as the Enterprise Data Center, or EDC, which consists of 15,620 square feet housing 2,500 mission-critical science, engineering, computational research and business computing systems.
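PUE is the ratio of a facility’s total energy use to the energy consumed by its IT equipment alone, so a value of 1.0 would mean every watt goes to computing. A minimal sketch of the calculation in Python, using hypothetical meter readings rather than actual LLNL figures:

def pue(total_facility_kwh, it_equipment_kwh):
    # Power usage effectiveness: total facility energy divided by IT energy.
    # A PUE of 1.0 is ideal; LLNL's sitewide FY15 target is 1.4.
    return total_facility_kwh / it_equipment_kwh

# Hypothetical monthly meter readings (illustrative only, not LLNL data).
total_kwh = 1_400_000  # IT load plus cooling, lighting and electrical losses
it_kwh = 1_000_000     # servers, storage and network gear only
print(f"PUE = {pue(total_kwh, it_kwh):.2f}")  # prints "PUE = 1.40"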

In addition to consolidating some 126 physical servers into the EDC, the team has created 140 "virtual" servers. Acting as a kind of private cloud, virtual servers put unused CPU capacity on physical servers to work for multiple clients, reducing the number of physical servers needed in the EDC. One physical platform can be subdivided to create as many as 50 virtual servers.
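The consolidation arithmetic is straightforward. The sketch below uses the article’s figures of 140 virtual servers and up to 50 virtual servers per physical platform; actual packing depends on each workload’s CPU and memory demands, so the result is illustrative only.

import math

def hosts_needed(virtual_servers, vms_per_host):
    # Physical platforms required to host a given number of virtual servers.
    return math.ceil(virtual_servers / vms_per_host)

# 140 virtual servers at up to 50 per physical platform (best case).
print(hosts_needed(140, 50))  # prints 3, versus 140 standalone physical boxes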

Now that the "low hanging fruit" has been picked for data center consolidation, the effort will evolve into a broader institutional phase under DOE’s Better Building Challenge program to be led by Doug East, LLNL chief information officer. The Better Building Challenge is a partnership — in this case between LLNL and DOE — under which the Laboratory commits to reducing the energy intensity of data centers by at least 20 percent within 10 years. DOE, for its part, provides technical expertise, training, workshops and a repository of "best practices" across the complex.

As the demand for computing grew over the past few decades, small data centers sprang up around the site to meet programmatic needs, often located in storage closets or buildings with inefficient heating and cooling, East said.

"Now we need to take a more institutional approach and to conduct an education campaign to show the advantages of consolidating into a professionally managed data center," he said. "Centralizing equipment in efficient data centers makes business sense and has the potential to save research programs money — resources that could be redirected to science."