On the first Earth Day, April 22, 1970, the Internet existed only in a handful of research labs, “texting” was something you did in a notebook, and the closest thing to a data center was the public library. On that day, millions of people gathered at events around the country to learn, share and work toward innovative solutions to protect our natural resources.

In 1970, we worried about landfills and paper waste, some 44,310,000 tons of it annually [1]. Today, much of what once lived on paper is digital, with an estimated 639,800 gigabytes of that digital information [2] passing through data centers every minute. Forty-four years later, Earth Day 2014 reminds us to examine the impact data centers have on our natural resources, and the strides businesses and governments are making to reach new levels of data center energy efficiency.

According to the U.S. Environmental Protection Agency, data centers account for up to 2.5% of the total electricity generated in the United States. A number of technology innovations contribute to the overall energy efficiency of data centers. Some examples include:

  • Capturing Free Cooling: About 30% of all data center energy goes to cooling servers and information technology equipment with power-hungry air conditioning chiller units, which operate at both the server rack and printed circuit board levels. Many companies today are investing in “free cooling” technology that draws in outside air to keep data center systems cool and reduce energy use. In 2013, Facebook unveiled a new data center in Sweden whose free cooling system harnesses an average exterior temperature of 34°F [3].
  • A Few Percent Matters: According to Frost & Sullivan, raising the energy efficiency of uninterruptible power supply (UPS) systems in data centers from 90% to 98% can save the United States $3 billion annually in energy costs; the back-of-the-envelope sketch after this list shows why those few percentage points matter. CoreSpace is saving an estimated $25,000 a year with its eBoost*-equipped, multi-mode UPS systems operating at 99% efficiency.
  • Overnight Ice: To lower the temperature at its 538,000-sq-ft data center in Phoenix, i/o Data Centers engineered a set of cooling tanks filled with a mix of ice balls and glycol that are chilled at night, when electricity is less expensive, and used to cool data center equipment during the day [4].
  • Modular Power: With data center capacity expected to expand by 33% in each of the next five years [5], squeezing more capacity and efficiency out of existing facilities is vital to sustaining that growth. Containerized, or modular, data center and power protection units connected to existing facilities let companies scale capacity and energy use as they need it. These outdoor units use ambient air to reduce excess heat and energy consumption.
  • Mining Energy: To save energy costs, Iron Mountain built its data center in a former limestone mine in Pennsylvania. It keeps the data operation cool by letting the subterranean walls absorb as much as 1.5 British thermal units of energy per square foot [4].
  • Powering the World’s Cell Towers: Powering the world’s 640,000 off-grid cell towers with diesel generators burns more than 11 billion liters of diesel a year. Hybridizing a cell tower with sodium nickel chloride batteries cuts fuel consumption at the site by up to 50% and can enable more towers worldwide to be powered by renewable sources such as solar. That reduction delivers significant cost savings for the industry and cuts greenhouse gas emissions from each off-grid cell tower; the second sketch below puts rough numbers on the scale of those savings.
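
To see why a few percentage points of UPS efficiency add up, here is a minimal back-of-the-envelope sketch. The 90% and 98% efficiency figures come from the Frost & Sullivan comparison above; the 1 MW IT load and the $0.10-per-kWh electricity price are illustrative assumptions, not figures from this article.

```python
# Back-of-the-envelope look at why "a few percent matters" for UPS efficiency.
# The 90% and 98% efficiencies come from the Frost & Sullivan comparison above;
# the 1 MW IT load and $0.10/kWh electricity price are illustrative assumptions.

IT_LOAD_KW = 1_000        # hypothetical critical IT load carried by the UPS (kW)
PRICE_PER_KWH = 0.10      # assumed average commercial electricity price ($/kWh)
HOURS_PER_YEAR = 8_760

def annual_ups_loss_cost(efficiency: float) -> float:
    """Cost of the energy dissipated in the UPS over a year of continuous operation."""
    input_power_kw = IT_LOAD_KW / efficiency   # power drawn to deliver the IT load
    loss_kw = input_power_kw - IT_LOAD_KW      # power lost as heat in the UPS
    return loss_kw * HOURS_PER_YEAR * PRICE_PER_KWH

cost_at_90 = annual_ups_loss_cost(0.90)
cost_at_98 = annual_ups_loss_cost(0.98)

print(f"Annual UPS loss cost at 90% efficiency: ${cost_at_90:,.0f}")
print(f"Annual UPS loss cost at 98% efficiency: ${cost_at_98:,.0f}")
print(f"Savings from the eight-point gain:      ${cost_at_90 - cost_at_98:,.0f}")
```

Under those assumptions, the eight-point efficiency gain cuts UPS losses by roughly 80%, worth on the order of $80,000 a year for a single megawatt of IT load, which is how savings in the billions become plausible across the country's data center fleet.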

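For a sense of scale on the cell tower figures, here is a rough arithmetic sketch. The 640,000 towers, 11 billion liters of diesel per year and the up-to-50% reduction are taken from the item above; the roughly 2.7 kg of CO2 per liter of burned diesel is a commonly cited emission factor and should be read as an approximation rather than a figure from this article.

```python
# Rough arithmetic behind the off-grid cell tower item above.
# Tower count, annual diesel use and the up-to-50% reduction come from the article;
# the CO2-per-liter emission factor (~2.7 kg) is an approximate, commonly cited value.

TOWERS = 640_000
DIESEL_LITERS_PER_YEAR = 11_000_000_000
HYBRID_FUEL_REDUCTION = 0.50       # "up to 50%" with sodium nickel chloride batteries
CO2_KG_PER_LITER_DIESEL = 2.7      # approximate combustion emission factor (assumption)

liters_per_tower = DIESEL_LITERS_PER_YEAR / TOWERS
liters_saved = DIESEL_LITERS_PER_YEAR * HYBRID_FUEL_REDUCTION
co2_tonnes_avoided = liters_saved * CO2_KG_PER_LITER_DIESEL / 1_000

print(f"Average diesel use per off-grid tower:    {liters_per_tower:,.0f} liters/year")
print(f"Diesel avoided if every tower hybridizes: {liters_saved:,.0f} liters/year")
print(f"CO2 avoided (approximate):                {co2_tonnes_avoided:,.0f} tonnes/year")
```

At the upper end of that range, hybridization would avoid on the order of 5.5 billion liters of diesel and roughly 15 million tonnes of CO2 per year across the industry.
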
As we create, share and use more and more data in our business and personal lives, data center energy efficiency will remain one of the great challenges facing the industry. The innovations we’re deploying today not only provide real and immediate benefits but also lead the way to new solutions for tomorrow.