By the time this is published we will be in the middle of the dog days of summer. Data center cooling systems, as well as our energy budgets and patience, will be tested. Daily, weekly, and monthly energy use and PUEs will spike. Peak utility rates will be in effect and compressor runtime and electrical draw for chillers and CRACs will be at their highest, as exterior heat rejection systems face performance challenges from high ambient air temperatures and solar heating of rooftops.

Moreover, if your data center cooling system requires water, sooner or later it could face a variety of drought-related problems: economic, tactical, and even political. And if it is located in California, and you have not begun to feel any of them already, they could become issues in the very near future. The recent June 24 article in The Wall Street Journal (WSJ) titled “Data Centers and Hidden Water Use” proclaimed “In California, computer farms are under scrutiny for their large and growing use of water for cooling.”

While all industries, general businesses, and of course people use energy and consume water, data centers remain an ever popular target for a multitude of “green guardian” groups and the public media. The data center industry has focused primarily on energy efficiency since shortly after the power usage effectiveness (PUE) metric was introduced by The Green Grid (TGG) in 2007. Water was simply not considered part of the operating “efficiency” of the data center at the time. While I and others had been speaking about water usage in data centers for many years, it was not until 2011 that TGG introduced the water usage effectiveness (WUE) metric. Nonetheless, even today very few data centers publicly discuss or disclose their WUE, with a few notable recent exceptions, such as eBay and Facebook.

One of the primary reasons that many large scale cooling systems (data center or otherwise) use water is that it makes mechanical cooling (i.e., compressor based) more energy efficient. There are two general categories of data center water usage. The first is the traditional water cooled chiller coupled to an evaporative cooling tower, which is commonly used by many types of buildings. The second, more recent, is adiabatic cooling, such as the type used by Facebook, which employs a direct airside “free cooling” system augmented with adiabatic cooling (introducing moisture into the incoming air). This lowers the incoming air temperature, which reduces or eliminates the need for mechanical cooling, thus improving the PUE. To be clear, I am not comparing the energy efficiency of a traditional closed loop data center (with a water cooled chiller) to a direct airside “free cooling” system; however, I will show how energy usage is intertwined with water usage in a bit.


WUE: Site and Source

What is not well understood in most of the mainstream media articles that point fingers at data center water usage is that the production of power (which everyone uses) consumes a great deal of water. In fact, even those in our own data center industry are not completely aware of the details of the WUE metric. There are two categories: WUE site and WUE source. WUE site is the simpler metric; it directly correlates the water actually used by the data center to the energy used by the IT load (expressed as liters/kWh, based on annualized consumption). At the time of writing this article, the eBay dashboard indicated a WUE of 2.5 (3-month trailing average) for its entire fleet of data centers (a mix of old, new, and some containers). The Facebook dashboard was unavailable (it was being updated), but previously published information indicated a WUE of 0.22 for its adiabatically cooled site in Prineville, OR. Obviously, quite a significant difference in water usage, presumably due to adiabatic cooling. While mainstream data centers may not be comfortable with direct outside air free cooling, indirect airside free cooling systems have recently become more popular; they combine the water and energy efficiency of adiabatic cooling without introducing outside air into the white space.

Now comes the more complex “WUE source” metric, which is the sum of annualized site water plus the source water used to generate the total facility power, divided by the IT energy (again expressed as liters/kWh). To put it all into perspective, the production of power at the utility itself (“source” water usage) has a national average of 1.8 liters of water per kilowatt-hour, according to The Green Grid WUE white paper (the rate varies from 1.4 to 1.9 liters/kWh across geographic locations and 0.8 to 3.3 liters/kWh across power generation system types). These figures were based on a National Renewable Energy Laboratory (NREL) report and the U.S. Department of Energy’s 2006 report to Congress. So even though we are only now beginning to look at data center WUE “site” water usage, “source” water consumption by power plants is, by comparison, a global and long standing issue for all industries.
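Since total facility energy is PUE times IT energy, the WUE source definition above reduces to a simple formula: WUE site plus PUE times the plant-side water intensity. The sketch below assumes the TGG term EWIF (energy water intensity factor, the liters of water consumed per kWh generated at the power plant); the function name and variables are mine, not from the white paper:

```python
def wue_source(wue_site: float, pue: float, ewif: float) -> float:
    """WUE source in liters/kWh of IT energy.

    WUE_source = (site water + source water) / IT energy
               = WUE_site + PUE * EWIF
    since source water = (PUE * IT energy) * EWIF.
    """
    return wue_site + pue * ewif

# Example: water cooled chiller with a site WUE of 2.5 and PUE of 1.5,
# powered at the national-average EWIF of 1.8 liters/kWh:
print(wue_source(2.5, 1.5, 1.8))  # ~5.2 liters/kWh
```

Note that even an air cooled facility with a site WUE of 0 still carries its full source-water burden of PUE × EWIF.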

I did a bit more digging into power plant water usage by type of generation and was a bit surprised. The NREL report titled “Consumptive Water Use for U.S. Power Production” defines and analyzes water “consumption” in three categories: thermoelectric (i.e., coal, natural gas, nuclear), hydroelectric, and combined aggregate (a weighted national average of both). Thermoelectric power generation, for plants which use evaporative cooling, has an annualized average consumption rate of 0.47 gallons per kWh (1.8 liters/kWh). While the NREL report is from 2003, power generation cooling technologies have not changed significantly since that time (although the mix of thermoelectric vs. hydroelectric may have).

So how does that work when looking at the big picture holistically? Considering only site WUE, an air cooled mechanical system (CRAC or air cooled chiller) would have a “perfect” WUE of 0. However, it will most likely use more energy than a water cooled chiller system. In effect, no water is used by the data center, presumably taking it off the list of “guilty” parties and contributors to a localized water shortage.

Let’s take a look at two examples which include the source water used to produce energy (1,000 kW of IT load, or 8,760 MWh per year).

• Case 1: Air cooled chiller, PUE of 2.2 (19,272 MWh total facility). No site water; WUE site = 0

Energy generated by a coal fired plant (2.2 liters/kWh). Source water (42,398,400 liters) + site (0) = 42,398,400 liters total

• Case 2: Water cooled chiller (evaporative, WUE site of 2.5), PUE of 1.5 (13,140 MWh total facility), site water = 21,900,000 liters

Energy generated by a coal fired plant (2.2 liters/kWh). Source water (28,908,000 liters) + site (21,900,000 liters) = 50,808,000 liters total

You can see that the air cooled data center used 46% more energy (19,272 MWh vs. 13,140 MWh) than the water cooled site, yet the water cooled site’s total water usage (site + source) would be 20% higher than the air cooled facility’s (50.8 vs. 42.4 million liters).
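As a sanity check, the arithmetic in the two cases above can be reproduced in a few lines of Python (the figures are the article’s hypothetical ones; the variable names are my own):

```python
IT_KWH = 1_000 * 8_760   # 1,000 kW of IT load, annualized (8,760,000 kWh)
COAL_L_PER_KWH = 2.2     # source water intensity of the coal fired plant

def annual_water(pue: float, wue_site: float) -> tuple[float, float]:
    """Return (facility kWh, total liters of water site + source) per year."""
    facility_kwh = IT_KWH * pue
    site = wue_site * IT_KWH                 # site water scales with IT energy
    source = facility_kwh * COAL_L_PER_KWH   # source water scales with facility energy
    return facility_kwh, site + source

air_kwh, air_water = annual_water(pue=2.2, wue_site=0.0)  # Case 1: air cooled
wet_kwh, wet_water = annual_water(pue=1.5, wue_site=2.5)  # Case 2: water cooled

print(f"Case 1 total: {air_water:,.0f} liters")           # 42,398,400
print(f"Case 2 total: {wet_water:,.0f} liters")           # 50,808,000
print(f"Air cooled energy penalty: {air_kwh / wet_kwh - 1:.1%}")   # 46.7%
print(f"Water cooled water penalty: {wet_water / air_water - 1:.1%}")  # 19.8%
```

Swapping in a different generation mix is just a matter of changing `COAL_L_PER_KWH`, which is how the California figure below makes both cases look worse.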

While the numbers I used were hypothetical and simplified to highlight the water used by just one type of power generation, each data center’s PUE and WUE is unique, as is where and how its power is produced. Of course, data center owners do not see the water usage in their utility bills (nor does any other user), only the total cost. Therefore, just as in any other business, the price of power will be a factor, which in the U.S. can range from 2.5 to 25 cents per kWh. The price per kWh, coupled with the 46% difference in annualized energy, can represent substantial savings, which will impact many site and cooling design decisions.

In addition to the national averages, the NREL report breaks out water usage for every state, with California having a combined aggregate of 4.64 gallons/kWh (17.6 liters/kWh). This is well over twice the national average, which would only make the above WUE source examples worse. Moreover, California also has some of the highest energy costs, which further motivates data center owners toward water cooling to lower energy costs.

In case you were wondering, I have not forgotten carbon usage effectiveness (CUE) in this discussion. Life can get pretty complicated very quickly for data center operators if they begin to examine how their energy was generated. There are already many protests of data centers located where coal is used to produce power. Yet the protesters seem to ignore the fact that all the other local users (other industries, businesses, and people) draw from the same source, and everyone needs to consider that using less total energy will reduce their carbon footprint.


The Bottom Line

PUE serves an important purpose as a straightforward metric to measure facility energy efficiency. WUE site does the same thing in essentially the same manner. WUE source begins to help open our eyes to the environmental resources required to generate power, as does CUE. While PUE has now become a commonly used metric, we are just now beginning to see some discussion of water usage, perhaps because of the shortages. Regardless of the reason, it is time to look at PUE, WUE, and CUE collectively.

So what about golf courses? Coming back to the WSJ article, it also included data and a graphic indicating that a “typical” 15 MW data center uses between 80 and 130 million gallons per year (we can debate what type of data center cooling system this was based on, since there were no details); however, the article also cites that two 18-hole golf courses use 100 million gallons per year (which it did not seem to consider a problem).

Now perhaps I am being politically incorrect by focusing on the water used by a golf course, but based on this information it calculates to 2.78 million gallons per hole! Maybe, since I am not a golfer, I wind up spending too much time in the hot aisle, but at 2.78 million gallons a year per hole, it makes me wonder: why are they looking to place blame on data centers and not golf courses?

As a side note, in researching this I also discovered that apparently the golf course industry has had a WUE metric long before The Green Grid (according to a 2015 article by the Golf Course Superintendents Association of America “GCSAA”). While the GCSAA version of WUE is obviously calculated differently, the article contains a WUE chart covering 2000-2003.

On a more serious note, while in the U.S. potable water is taken for granted by most people, in some parts of the world safe drinking water is a very precious commodity. The WSJ article noted that the annual water usage of a “typical” 15 MW data center could irrigate 100 acres of an almond farm, a major crop in California. Clearly the ongoing drought in California is serious, as are water issues in other areas of the U.S. and other parts of the world, which in turn impact the overall food chain and ultimately the supply and prices of other items such as beef.

While we cannot stop mainstream media from inaccurately painting data centers as a target, we can do our own self-examination to see what design and operational practices we can implement to increase energy efficiency and reduce overall water usage. So stop putting around and let’s all treat clean water as the precious commodity it is, and continue optimizing your data center before you fire up another server. And if you are a golfer, consider miniature golf; at least it does not use water.