Back in ancient Egypt, servants kept the Pharaohs cool by waving palm fronds over water-soaked reeds. A couple of thousand years later (on April 19, 2012, to be exact) Greenpeace activists in Seattle unfurled an 800-ft banner that read: “Amazon, Microsoft: How clean is your cloud?”

For data center experts, there’s a clear connection between these two historical antecedents: evaporative cooling, a method as old as civilization that can keep computer servers comfortable and consistent while saving tremendous energy and operating costs.

And yes, Greenpeace, it’s pretty darn clean, and it can be used in more regions around the country than most people think.

DATA CENTERS DEMAND ENERGY

We all know that keeping our phones and mobile devices humming, not to mention meeting our growing appetite for internet TV and streaming video, takes enormous computing power.

In 2007, the Environmental Protection Agency estimated that national energy consumption by computer servers and data centers would nearly double from 2005 to 2010, to roughly 100 billion kilowatt-hours (kWh) at an annual cost of $7.4 billion. Recent studies suggest that the recession and virtualization held growth over those five years to a more moderate 36%.

Still, the power usage is enormous. Worldwide, data centers pull about 30 billion watts of electricity, roughly equivalent to the output of 30 nuclear power plants. In the United States, roughly 50% of electrical power is generated by coal power plants, creating a clear connection between data centers and concerns about global warming.

Traditionally, data centers have posted poor power usage effectiveness (PUE) scores of around 2.5, meaning the building uses two-and-a-half times as much energy as the IT equipment installed in it. This high PUE score is typically associated with a chiller unit running full time to keep a constant temperature and humidity in the facility regardless of the outside weather.
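For readers who like to see the arithmetic, here is a minimal sketch in Python; the kWh figures are hypothetical, chosen only to reproduce the 2.5 score mentioned above.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy over IT energy."""
    return total_facility_kwh / it_equipment_kwh

# A legacy facility drawing 2,500 kWh for every 1,000 kWh of IT load:
print(pue(2_500, 1_000))  # 2.5, i.e., 1.5 kWh of overhead per IT kWh
```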

A NEW WAY OF COOLING

The first steps in the evolution of data center cooling occurred when cool outside air was allowed to circulate among the aisles of server racks. When the outside temperature was, say, 95ºF, full refrigeration was still employed. But when the outside temperature dipped to 80º, outside air was let in with partial refrigeration to maintain an internal temperature of 70º. If the weather was 70º or below, outside air could handle all the cooling duties without any refrigeration.
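That staging logic is simple enough to sketch in a few lines of Python. This is a simplified illustration using the breakpoints from the example above, not an actual control sequence; real systems also weigh humidity, enthalpy, and equipment limits.

```python
def cooling_mode(outside_temp_f: float, setpoint_f: float = 70.0) -> str:
    """Pick a cooling stage from outside air temperature (simplified)."""
    if outside_temp_f <= setpoint_f:
        return "economizer"  # outside air handles all cooling duties
    elif outside_temp_f <= 80.0:
        return "economizer + partial refrigeration"
    else:
        return "full refrigeration"

for t in (60, 75, 95):
    print(t, "->", cooling_mode(t))
```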

The issue then became humidity, or the lack thereof.

Even school kids notice static electricity on cold, dry days, as their hair stands on end while they wait for the bus in winter. That same principle can occur within data centers, with potentially disastrous results. It’s called electrostatic discharge, a static electric shock that can ruin a server, disrupting the stream of data on which we’ve come to depend.

Engineers tinkered with the idea of adding a steam injection humidifier, but the energy needed to generate steam would offset the savings gained by using outside air in the first place. Other solutions proved problematic as well, including ultrasonic (limited and expensive), compressed air (needs lots of space, noisy), and compressed water (ditto). The best answer was the same one developed by our friends in ancient Egypt: running air through a wetted media, otherwise known as direct evaporative cooling. It’s cheap, simple, and takes almost no energy to operate.

For example, by mixing 40º dry outside air with warm air from the data center’s hot aisle through the fiberglass honeycomb of an evaporative cooling unit, engineers could create a stable 70º supply with enough humidity to reduce the risk of electrostatic discharge.
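A back-of-the-envelope version of that mixing calculation can be sketched as a simple sensible-heat balance. The 95º hot-aisle return temperature below is an assumed figure for illustration, and a real design would work on the psychrometric chart rather than dry-bulb temperature alone.

```python
def mix_fraction(outside_f: float, return_f: float, target_f: float) -> float:
    """Fraction of outside air needed so the blended stream hits target_f.

    Simple sensible-heat balance: target = x * outside + (1 - x) * return.
    Ignores moisture and the evaporative stage itself.
    """
    return (return_f - target_f) / (return_f - outside_f)

# 40 deg F outside air blended with an assumed 95 deg F hot-aisle return:
x = mix_fraction(outside_f=40.0, return_f=95.0, target_f=70.0)
print(f"{x:.0%} outside air")  # about 45%
```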

With the humidity problem solved, the thinking around data center cooling evolved further.

First, engineers examined changing the work practices and attire of data center crews to reduce the chances of electrostatic discharge. There was also further research into whether electrostatic discharge was truly as menacing as once assumed.

EVAPORATIVE COOLING RISES

By the early 2000s, evaporative cooling was starting to be used as a method to cheaply and effectively keep acceptable minimum humidity within data centers. Around 2006, the first engineers began to use evaporative cooling not just for humidity control, but to maintain appropriate temperatures.

In 2008, ASHRAE published its recommended humidity and temperature ranges for data center servers, as well as a wider set of allowable limits at which the machinery could run for short periods of time.

However, server manufacturers such as Dell, IBM, Hewlett-Packard, and Sun devised even wider allowable envelopes, with dry-bulb temperatures in the 90s at 80% relative humidity (rh), down to dry-bulb temperatures as low as 60º at 10% rh.

For some old-school thinkers, these parameters were mind-blowing. The legacy practice was to peg data center temperature at 65º day and night, regardless of the weather or computing load, and regardless of the price or the cost to the planet.

A CASE STUDY

I worked on one of the first major data center projects to take advantage of the new ASHRAE ranges and the new technology.

A major software company wanted to consolidate servers from its research and development labs scattered across its campus into an optimized environment. The project had big goals: support a critical load of 17 megawatts (MW), in a high-density environment (above 500 W/sq ft), with low operating costs and rapid delivery, less than a year from concept to occupancy.

Using the expanded inlet temperature and humidity range for the servers allowed us to use an airside economizer and evaporative cooling only — no refrigeration, no chillers.

In other words, we eliminated a 7,500-ton chiller plant with a potential capital cost savings of $17 million. And that doesn’t include energy savings from lower operating costs.

A few other notable characteristics of the project: there was overhead air distribution and no raised floors. Hot and cold aisles were contained, meaning that there was no “pollution” of the cold-aisle air from the hot aisle. In addition, we maintained a slight negative pressure in the hot aisle, so any breach in the containment would bring cold air into the hot aisle, and not vice versa.

Temperature was maintained at a steady 70º for 99% of the hours; the worst hour of the year was about 77º.

Even though the 48 evaporative cooling units use about 20,000 gallons of water on a peak cooling day, that is still at least 25% less than a traditional water-cooled chiller plant. In other words, 20,000 gallons of water are evaporated into the atmosphere each day, where they eventually return to earth as rain.

The water isn’t treated chemically but is run through a high-efficiency softener to remove the calcium and magnesium that could cause problems over time. Liquid chlorine is added to the storage tanks only to maintain the same chlorine level as city water.

The results are incredible. Models show an annualized PUE of 1.19 at 100% load, and an annualized PUE of 1.15 at the expected 65% load.

THE EVAPORATIVE COOLING MAP

Using outside air and evaporative cooling makes sense in the Northwest, where 97% of the annual hours are under 75º and cooling servers is not that tall of a hurdle (Figure 1).

In fact, businesses currently running data centers with outside air economizers and evaporative cooling systems in the Northwest include Microsoft, Facebook, Google, Apple, T-Mobile, and AT&T.

Playing weatherman, you can see that using outside air and evaporative cooling is possible 100% of the time in Seattle, San Francisco, and Denver, regions where the climate is moderate with low humidity. Even in desert towns such as Phoenix, the heat and humidity index shows you could use outside air and evaporative cooling 80% of the time. Add a chiller to the mix, and you could use all three an additional 17% of the time.
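Those percentages come from binning a year of hourly weather data against the allowable envelope. Here is a minimal sketch of that kind of bin analysis, assuming a hypothetical file tmy_hourly.csv of 8,760 hourly readings with a drybulb_f column; a real feasibility study would also bin on wet-bulb or dewpoint, since evaporative cooling depends on the moisture content of the air.

```python
import csv

def economizer_hours(temps_f: list[float], limit_f: float = 75.0) -> float:
    """Fraction of annual hours at or below the dry-bulb limit."""
    return sum(t <= limit_f for t in temps_f) / len(temps_f)

# 'tmy_hourly.csv' is a hypothetical file of 8,760 hourly readings.
with open("tmy_hourly.csv") as f:
    temps = [float(row["drybulb_f"]) for row in csv.DictReader(f)]

print(f"{economizer_hours(temps):.0%} of annual hours suit free cooling")
```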

The picture doesn’t change that much even on the humid East Coast. In Baltimore, for example, 84% of the annual hours are less than 75º.

But when you get to the deeper South, the problem for data centers turns from not having enough humidity (and risking electrostatic discharge) to the challenge of too much humidity, raising concern about conductive anodic filaments that could grow within circuit boards along the fibers of the composite material. There is ongoing research to assess whether these filaments actually form in real-world conditions or only in lab experiments.

Even in Atlanta, data centers could use outside air and evaporative cooling 64% of the time, with the remaining hours split between a chiller supplemented by outside air and evaporative cooling and, at higher temperatures and humidity, a chiller alone.

Most server manufacturers say their equipment works fine with 78º supply air. That further extends the opportunity for evaporative cooling in Atlanta, to 85% of annual hours.

Evaporative cooling makes sense for many types of facilities, including: enterprise data centers, colocation data centers, research-and-development labs, low-rise office buildings, high-rise office buildings, telecommunications centers, smaller server closets, and container data centers.

DIRECT EVAPORATIVE COOLING IS NOT FOR EVERYONE

Evaporative cooling doesn’t make sense in tropical climates, or in coastal areas where salt mist is a concern. It doesn’t work well in regions with severe pollution. But, perhaps most important, it doesn’t work for clients who demand that their data centers be pegged at 65º with constant humidity. This psychological hurdle is an important one to overcome if evaporative cooling is to be used in all the geographic locations where it makes sense.

As its track record grows and minds change, evaporative cooling will likely be the go-to method for data centers in all but a few semi-tropical areas in the coastal Southeast.

A few issues to consider:

  • Establish the client’s expectations for cold-aisle temperatures on the “worst” design day.

  • Plan for systems operations when weather exceeds design conditions.

  • Watch the dewpoint of the supply air; if it gets too high, it could condense on cold building surfaces (see the sketch after this list).

  • If your system depends on evaporative cooling to complement a chiller plant, water becomes a critical utility, so watch water consumption rates.

  • When using outside air, be aware of radiator and/or diesel exhaust when the chiller plant starts up.

  • Make sure you are treating the water appropriately, and don’t use chemicals that may poison the air. High-efficiency softeners work well.
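On the dewpoint item above, here is a minimal sketch of one common way to estimate it, the Magnus approximation; the constants are one widely used parameter set, and the 70º/60% rh supply condition is an assumed example.

```python
import math

def dew_point_c(temp_c: float, rh_pct: float) -> float:
    """Dew point via the Magnus approximation (roughly +/- 0.4 deg C)."""
    b, c = 17.625, 243.04  # one common parameter set for -40..50 deg C
    gamma = math.log(rh_pct / 100.0) + (b * temp_c) / (c + temp_c)
    return (c * gamma) / (b - gamma)

# Assumed example: 70 deg F (21.1 deg C) supply air at 60% rh.
dp = dew_point_c(21.1, 60.0)
print(f"dew point: {dp:.1f} C ({dp * 9 / 5 + 32:.1f} F)")  # ~13 C / ~55.5 F
```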

Here’s the bottom line: Using outside air and evaporative cooling can reduce or eliminate the need for a refrigeration plant, which cuts capital and maintenance costs, systems complexity, and physical space requirements.

For those concerned that our insatiable appetite for data will end up driving up energy costs and toasting the planet, evaporative cooling offers some solace. It helps flatten the curve for energy demand. But an increasing number of data centers will come online, and costs and energy use will continue to present tough challenges, even with the latest technology and thinking.