A study recently completed by the Lawrence Berkeley National Laboratory (LBNL) Demand Response Research Center (DRRC) examines demand response opportunities and enabling technologies for data centers and reports findings from field studies.
Throughout the study, integrated monitoring data from IT equipment and site infrastructure, necessary for baseline and demand savings analyses, were collected and aggregated using Power Assure’s EM/4 software and tested on infrastructure provided by NetApp and the University of California, San Diego’s San Diego Supercomputer Center.
The study is built on the premise that energy use in data centers is increasing and, in particular, is affecting data center energy costs and electric grid reliability during peak and high-price periods. According to a 2007 U.S. Environmental Protection Agency (EPA) report, data centers in the Pacific Gas and Electric Company territory were estimated to account for 500 megawatts of annual peak electricity demand. Data from 2011 confirm the increase in data center energy use, although it is slightly lower than the EPA forecast. Previous studies have suggested that data centers have significant potential to integrate with supply-side programs to reduce peak loads.
In collaboration with California data centers, California utilities, and technology vendors, this LBNL study conducted field tests to improve the understanding of the demand response opportunities within the walls of the data center. The study evaluated an initial set of control and load migration strategies and their economic feasibility for four data centers. The findings show that, with minimal or no impact on data center operations, demand savings of 25% at the data center level, or 10% to 12% at the whole-building level, can be achieved through strategies for cooling, IT equipment, and load migration. These findings should accelerate the grid-responsiveness of data centers through technology development and integration with demand response programs, while also providing operational cost savings.
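To make the reported figures concrete, the sketch below shows one common way a demand savings percentage like the study's 25% can be computed: average metered demand during a demand response event compared against a baseline for the same interval. This is an illustrative calculation only; the study's actual baseline methodology is not described here, and the interval readings are hypothetical.

```python
def demand_savings_pct(baseline_kw, event_kw):
    """Percent reduction of average demand during a DR event,
    relative to baseline demand for the same interval."""
    base = sum(baseline_kw) / len(baseline_kw)    # average baseline demand (kW)
    event = sum(event_kw) / len(event_kw)         # average demand during event (kW)
    return 100.0 * (base - event) / base

# Hypothetical 15-minute interval readings for a data center
# drawing about 400 kW that sheds load to about 300 kW:
baseline = [398, 402, 401, 399]   # kW, baseline intervals
event = [302, 299, 300, 299]      # kW, during the DR event
print(round(demand_savings_pct(baseline, event), 1))  # → 25.0
```

A 25% reduction at the data center level corresponds to the smaller 10% to 12% figure at the whole-building level simply because the data center is only part of the building's total load.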