In 2016, 82% of the S&P 500 companies published corporate sustainability reports — up from less than 20% in 2011. This trend in sustainability reporting is a response to the growing public interest in large organizations’ environmental impact. Hyperscale data center developers are no exception.

Data center executives typically talk about their use of renewable energy systems to demonstrate their commitment to sustainability, referencing highly visible projects such as wind farms or solar arrays. Sometimes there is discussion of carbon offsets through the purchase of renewable energy credits, accompanied by a projection of becoming carbon neutral someday. In fact, this system of energy credits also provides substantial state and federal tax and financial incentives.

Environmental offsets such as wind farms, solar farms, and renewable energy credits are great, but data center sustainability is also about the energy consumed, the water used, and the e-waste generated within a data center’s locked gates. Claims of energy efficiency that rest on substituting water evaporation for mechanical refrigeration, or on shifting central fan power to rack fans, may make the metrics look better but have little to do with sustainability.
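
A quick back-of-the-envelope illustration of that metric-gaming point (the numbers below are assumed, round figures, with PUE standing in for "the metrics"): shifting fan energy from the facility side of the meter to the rack side improves the ratio even though total energy use is unchanged.

```python
# Illustrative sketch only: assumed, round numbers showing how moving fan
# power from the facility side to the server side lowers PUE while total
# energy stays the same. PUE = total facility energy / IT energy.

it_load_kw = 1000          # assumed IT electronics load
central_fans_kw = 150      # assumed central air-handler fan power
other_facility_kw = 250    # assumed chillers, lighting, losses, etc.

# Case 1: fans are metered on the facility side.
pue_central_fans = (it_load_kw + central_fans_kw + other_facility_kw) / it_load_kw

# Case 2: the same fan power moves into the racks, so it now counts as "IT".
pue_rack_fans = (it_load_kw + central_fans_kw + other_facility_kw) / (it_load_kw + central_fans_kw)

total_kw = it_load_kw + central_fans_kw + other_facility_kw  # identical in both cases

print(f"PUE with central fans: {pue_central_fans:.2f}")  # 1.40
print(f"PUE with rack fans:    {pue_rack_fans:.2f}")     # ~1.22
print(f"Total power in both cases: {total_kw} kW")
```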


ENERGY

Data centers use a lot of energy, and about half of it goes to maintaining the operating temperatures required by today’s compute and network electronics. The root cause is a technology mismatch: air cooling. Since the invention of the computer, air has been the standard cooling medium, but air is a poor heat-transfer fluid; compared with liquids it is closer to an insulator. Air must be filtered and cooled for today’s server farms to operate. Central air-handling systems in some hyperscale data centers move more than one million cubic feet of air per minute, and the energy required to run central and rack-mounted fan systems can account for more than 25% of total data center energy use. Liquids have over a thousand times the heat removal capacity of air, and pumps are far more efficient than fans. Sustainable data centers use liquid-cooled racks.
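
To put rough numbers on the liquid-versus-air point, the sketch below compares the pumped volume needed to carry away the same heat load. The property values are textbook approximations, and the 1 MW load and 10-degree coolant temperature rise are assumptions chosen purely for illustration.

```python
# Rough sketch comparing the volume of air vs. water needed to carry away
# the same heat load. Property values are textbook approximations; the
# 1 MW load and 10 K temperature rise are assumptions for illustration.

heat_load_w = 1_000_000        # assumed IT heat load (1 MW)
delta_t_k = 10.0               # assumed coolant temperature rise

# Volumetric heat capacity (J per cubic metre per kelvin), approximate:
air_rho_cp = 1.2 * 1005        # ~1,200 J/(m^3*K) at room conditions
water_rho_cp = 998 * 4186      # ~4,180,000 J/(m^3*K)

# Volumetric flow needed: flow = heat / (rho * cp * deltaT)
air_flow_m3s = heat_load_w / (air_rho_cp * delta_t_k)
water_flow_m3s = heat_load_w / (water_rho_cp * delta_t_k)

air_flow_cfm = air_flow_m3s * 2118.88   # cubic feet per minute

print(f"Air flow needed:   {air_flow_m3s:,.0f} m^3/s (~{air_flow_cfm:,.0f} CFM)")
print(f"Water flow needed: {water_flow_m3s*1000:,.1f} L/s")
print(f"Ratio (air/water): {air_flow_m3s / water_flow_m3s:,.0f}x more volume of air")
```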


WATER

Water is a resource that costs virtually nothing until there is none, and then no amount of money is enough. Data centers use water in two ways: directly, to reject the heat generated by the electronic equipment inside, and indirectly, through the water consumed to generate the electricity they buy. Many hyperscale data centers rely on adiabatic cooling or cooling towers for the first. Even where free cooling is used when ambient temperatures are low, these facilities need water on hot days, precisely when everybody else needs it too. One cloud data center is seeking a permit to draw over 1,000 gallons of water per minute, for free, from an ancient aquifer, about half the flow currently drawn to serve a community of 80,000 people. The indirect use is just as real: if electric energy comes from the grid and has a fossil fuel component, over 60% of the energy in the fuel is rejected in cooling towers at the power plant, so wasted energy at the data center translates to excess water use wherever that energy is generated. Sustainable data centers use no water onsite and significantly less electric energy.
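
The arithmetic below is illustrative only; the continuous-draw assumption, the 20 MW load, and the rule-of-thumb figure of roughly 1.8 liters evaporated per kilowatt-hour of heat rejected are assumptions, not measurements from any specific facility.

```python
# Back-of-the-envelope water arithmetic. All inputs are assumptions chosen
# for illustration, not measured data for any specific facility.

permit_gpm = 1000                       # gallons per minute, the permit figure above
gallons_per_day = permit_gpm * 60 * 24  # if drawn around the clock
print(f"1,000 gpm drawn continuously = {gallons_per_day:,} gallons/day")

# Assumed evaporative-cooling rule of thumb: ~1.8 litres of water
# evaporated per kWh of heat rejected through a cooling tower.
litres_per_kwh = 1.8
it_load_mw = 20                          # assumed facility IT load
hours_per_year = 8760
heat_rejected_kwh = it_load_mw * 1000 * hours_per_year
water_litres_per_year = heat_rejected_kwh * litres_per_kwh
water_gallons_per_year = water_litres_per_year / 3.785

print(f"Assumed {it_load_mw} MW load, cooled evaporatively all year:")
print(f"  ~{water_gallons_per_year/1e6:,.0f} million gallons of water per year")
```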


E-WASTE

Today’s cloud data centers operate on a model where 30% of the servers and storage units are upgraded every three years. Some of this equipment is replaced because of technological obsolescence, although not every server in a large data center needs the latest processor. Other equipment is destroyed by conditions it was never intended for: harsh environments, physical abuse, high operating temperatures, or dust that blocks cooling air from reaching components. Adding a few years to the average server life makes sense both environmentally and economically. Sustainable data centers are designed to prolong the life of IT hardware.
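
The leverage in extending server life is easy to see with a simple steady-state replacement model; the fleet size and lifetimes below are assumptions used only to show the shape of the saving.

```python
# Simple replacement-rate model. Fleet size and lifetimes are assumptions
# for illustration only.

fleet_size = 100_000            # assumed number of servers in a large fleet

def annual_replacements(avg_life_years: float) -> float:
    """Steady-state units retired (and purchased) per year for a given life."""
    return fleet_size / avg_life_years

for life in (3, 4, 5, 6):
    print(f"Average life {life} yr -> ~{annual_replacements(life):,.0f} servers replaced/year")

# Moving from a 3-year to a 5-year average life cuts the annual
# replacement stream, and the matching e-waste, by 40%.
reduction = 1 - annual_replacements(5) / annual_replacements(3)
print(f"Reduction from 3 -> 5 years: {reduction:.0%}")
```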


THE FIRST STEP TO SUSTAINABILITY

Liquid cooling, the first step to sustainability, takes three forms:

  • Cold plates
  • In-row cooling systems
  • Total immersion in a dielectric fluid

Cold plates were originally developed for high-performance mainframe computers. The technology was later adapted to gaming computers as kits to cool overclocked processors. During the past few years, cold plate technology has been adapted to traditional data center servers by introducing a sealed, flow-through circulating loop fed from a central water distribution system. About 40% of rack heat is removed by the circulating water and rejected through a cooling tower, but the rest is blown into the room by inefficient server fans. This technology enables higher rack power density but does not save much energy.
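
To make the 40% figure concrete, the sketch below splits an assumed 50 kW rack between the cold-plate loop and the room air; the rack load is an assumption for illustration, and only the 40% capture share comes from the discussion above.

```python
# Where the heat goes with flow-through cold plates, per the ~40% figure
# in the text. The rack load is an assumption for illustration.

rack_load_kw = 50          # assumed per-rack IT load
water_captured = 0.40      # share of heat captured by the cold-plate loop (from text)

heat_to_water_kw = rack_load_kw * water_captured
heat_to_room_kw = rack_load_kw * (1 - water_captured)

print(f"Heat carried off by circulating water:          {heat_to_water_kw:.0f} kW")
print(f"Heat still blown into the room by server fans:  {heat_to_room_kw:.0f} kW")
# The remaining 30 kW per rack must still be handled by the room's
# air-handling plant, which is why cold plates raise rack density more
# than they cut total energy.
```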

In-row cooling systems reduce the volume of air that must be cooled, but they contain all the elements of a small air conditioning system, including fans and water-to-air heat exchangers. Cost and space are issues with this technology.

Total immersion in a dielectric fluid is a technology in which all electronics are submerged in a fluid that conducts heat but not electricity. It addresses problems that cold plates and in-row cooling systems ignore:

  • No fans within the server or storage enclosure
  • Dramatic reduction in thermal fluctuations, improving reliability
  • Lower server CPU core operating temperatures
  • Isolation of electronics from contaminated air

It is indisputable that immersing heat-generating components in a heat-removal medium, with no thermal barrier between them, is the most efficient way to dissipate heat from servers. Despite its operational and environmental advantages, liquid cooling technology has been slow to catch on. In part this might be attributed to the large investment required to retool data center architecture, but some approaches to total immersion have also presented real problems of cost, maintenance, and scalability.

Having addressed the perception of sustainability by investing in highly visible renewable energy projects, U.S. cloud operators should now focus on operations, as Chinese operators are doing, and address the reality of energy use, water consumption, and e-waste generation inside their locked gates.