I strongly disagree that EUE is a better metric than PUE. Too many variables outside a data center's control make EUE impractical. First, power generation efficiency depends on weather conditions, fuel type, maintenance and operating practices, and site environmental controls, to name a few factors. So if a power plant generates and transmits power to a data center more efficiently, wouldn't EUE change even though operations at the data center have not? Second, if renewable sources like solar or wind are used to augment power, should their power conversion efficiencies be included in source energy calculations? Third, EUE will also yield different efficiencies depending on the time of calculation, whether at peak or off peak, and on which type of plant comes online and for how long; data centers will game the system by calculating EUE at the most favorable time. Fourth, using EUE is akin to calculating all the energy used to produce a box of cereal and then dividing it by the resultant caloric value: academic, but impractical in the real world of business decision making. Fifth, obtaining reliable real-time energy production data from power plants would be difficult at best. I think a better way of measuring efficiency is finding peak power consumption (IT and cooling) per square foot of the entire data center.
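The reader's first objection can be sketched numerically. In this illustration, PUE is the standard ratio of total facility energy to IT energy, and EUE is assumed to mean source (primary) energy divided by IT energy; the grid efficiencies and energy figures are made-up values, not data from any real facility.

```python
# Sketch of the objection: with EUE defined as source energy per unit
# of IT energy (an assumed definition for illustration), a change in
# grid generation efficiency moves EUE even when nothing changes
# inside the facility, while PUE stays fixed.

def pue(total_facility_kwh, it_kwh):
    # Power Usage Effectiveness: total facility energy / IT energy.
    return total_facility_kwh / it_kwh

def eue(total_facility_kwh, it_kwh, grid_efficiency):
    # Source energy = delivered energy / generation-and-transmission
    # efficiency; EUE = source energy / IT energy (assumption).
    return (total_facility_kwh / grid_efficiency) / it_kwh

it = 1_000_000        # kWh delivered to IT equipment in a year
facility = 1_500_000  # total facility kWh (IT + cooling + losses)

print(pue(facility, it))        # 1.5 regardless of the grid
print(eue(facility, it, 0.33))  # ~4.55 with a 33%-efficient grid
print(eue(facility, it, 0.40))  # ~3.75 if the grid improves
```

The data center's own ratio never changes between the last two lines; only the upstream grid does.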
Solutions Partner, CQM Systems.
The Energy Star program was developed in the 1990s by the U.S. Environmental Protection Agency to identify energy-efficient products. The original focus was computers and printers, but over the years the program expanded to include such products as kitchen appliances, office equipment, lighting, home heating systems, and air conditioners. What these applications have in common is that they are independent of location; an Energy Star refrigerator, for example, uses less energy than a refrigerator that does not carry the label regardless of where it is installed. Energy consumed in data centers, on the other hand, is highly dependent on location. A data center located in Chicago might use less energy for the same compute load than a more efficient data center in Dallas simply because there are more free cooling hours in Chicago.
Responding to your specific points:
While it is true that power generation efficiency is a function of weather conditions, fuel mix, and other variables beyond the control of a data center operator, the decision of where to locate a new data center should take into account both free-cooling opportunities and the local utility's generation portfolio mix.
If renewable sources are being used, they would be incorporated into the utility's generation portfolio and have a positive effect on the overall fuel efficiency calculation.
The logical time period for any data center energy calculation is one year, which takes into account the full weather cycle and use of all utility generation assets. It also is important to include fuel used in the course of testing and maintaining the diesel generators.
If one year is the time period of interest, it is not clear how data center operators can game the system as long as historic regional weather data is the basis for energy consumption projections.
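The annual-basis argument above can be sketched as a calculation: weight each fuel in the utility's portfolio by its source-to-delivered efficiency, credit renewables at unity, and fold in the fuel burned testing the diesel generators. The portfolio shares, efficiencies, and fuel figures below are illustrative assumptions, not data from any real utility.

```python
# Hedged sketch: annual source energy from a utility generation
# portfolio, plus diesel-generator test fuel. All numbers are
# illustrative assumptions.

PORTFOLIO = {
    # fuel: (share of delivered kWh, source-to-delivered efficiency)
    "coal":  (0.40, 0.33),
    "gas":   (0.35, 0.45),
    "wind":  (0.15, 1.00),  # renewables credited at unity (assumption)
    "solar": (0.10, 1.00),
}

def annual_source_kwh(delivered_kwh, portfolio, diesel_test_kwh=0.0):
    # Convert each fuel's share of delivered energy back to source
    # energy by dividing by its generation efficiency.
    source = sum(delivered_kwh * share / eff
                 for share, eff in portfolio.values())
    # Fuel burned testing and maintaining backup generators is
    # already source energy, so it is added directly.
    return source + diesel_test_kwh

facility = 1_500_000  # delivered kWh over a full weather year
it = 1_000_000        # kWh delivered to IT equipment
source = annual_source_kwh(facility, PORTFOLIO, diesel_test_kwh=20_000)
annual_eue = source / it  # EUE on an annual, portfolio-weighted basis
```

Because the calculation spans a full year of the utility's asset use, short-term swings in which plant happens to be online average out.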
It is common practice to rank engineering designs using standardized test protocols. For example, automobile buyers pay close attention to the results of mileage tests on the window sticker even though their personal driving habits might differ from the EPA's standard protocol. A similar approach, based on standardized compute loads and regional weather data, can be used to rank data centers by their relative energy efficiency.
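The ranking idea can be sketched as follows: apply one standardized IT load to each facility, estimate annual cooling energy from regional free-cooling hours, and rank by the result. The load, the free-cooling-hour figures, and the cooling model are all made-up assumptions for illustration, not a proposed test protocol.

```python
# Sketch of a standardized-protocol ranking. All figures below are
# illustrative assumptions.

STANDARD_IT_KW = 500   # standardized compute load (assumption)
HOURS_PER_YEAR = 8760

def annual_energy_kwh(free_cooling_hours, mech_kw_per_it_kw=0.6):
    # Simplistic model: mechanical cooling is off during free-cooling
    # hours and draws a fixed fraction of the IT load otherwise.
    mech_hours = HOURS_PER_YEAR - free_cooling_hours
    it_kwh = STANDARD_IT_KW * HOURS_PER_YEAR
    cooling_kwh = STANDARD_IT_KW * mech_kw_per_it_kw * mech_hours
    return it_kwh + cooling_kwh

# Hypothetical regional free-cooling hours per year.
sites = {"Chicago": 6000, "Atlanta": 3500, "Dallas": 2500}
ranking = sorted(sites, key=lambda s: annual_energy_kwh(sites[s]))
print(ranking)  # climates with more free-cooling hours rank first
```

Like the EPA mileage sticker, the point is relative comparison under one fixed protocol, not a prediction of any customer's actual consumption.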
It is entirely possible that, for convenience, data center colocation customers, whether in Atlanta or Chicago, might insist on a nearby facility. My point in writing the article, however, is that data center energy consumption is location dependent, which PUE ignores, and the EPA should have taken that into account when designing a universal energy efficiency program like Energy Star.