Do data centers deserve to be called “energy hogs”? Or do they deserve a more accurate nickname?

Given the rapid growth of the industry, managing data is both complex and energy-intensive. For data center owners and managers, as well as their construction project teams, the challenge is matching an energy-reducing solution to the needs of the customer.

Stanford University professor Jonathan Koomey, a widely cited authority on data center energy use, reports that data centers account for approximately 1.7% to 2.2% of all electricity used in the United States. With the growth of public cloud computing companies such as Facebook and Google, scientific computing centers, colocation facilities, and on-site data centers, delivering energy-related solutions is both a challenging and an ongoing effort.


Worldwide, energy consumption by digital warehouses amounts to approximately 30 billion watts of electricity, roughly the equivalent of the output of 30 nuclear power plants, according to industry experts interviewed by James Glanz of The New York Times in an article titled “Power, Pollution and the Internet.” That’s a lot of energy, and the industry has become a target for sustainability advocates and government agencies seeking to assess and harness usage, much of which is difficult to determine because of the highly secretive and protected nature of data centers themselves.

So how best to tackle a giant? Start with a shovel.

Many solutions begin with best practices stemming from the construction of new facilities and the renovation of existing facilities.

In terms of design/build projects, the “short list” might include utilizing:

  • Lean construction concepts/methods and BIM with the ability to meet fast-track construction schedules
  • Intelligent building solutions that enable whole-building energy performance monitoring
  • Green building expertise, including LEED®, Energy Star, and other environmentally responsible processes
  • Experience in providing integrated project delivery that engages the entire project team for innovative solutions
  • Pre-construction planning for commissioning process
  • Quality assurance programs

Companies have begun to embrace design/build practices that not only expedite the delivery of new facilities and the expansion of existing data centers but also ensure greater energy management control.

BEST PRACTICES: CONSTRUCTION SKILLS NECESSARY IN TODAY’S ENVIRONMENT

Outside-the-box thinking is a priority. Data center growth is happening so rapidly that clients expect innovative thinking along with aggressive, continuous improvement. For MEP specialty contractors, this means understanding the priorities of the client. Most data center owners have built facilities throughout the U.S. and the world, so their experience may drive the process. When considering innovative solutions with which they are not familiar, owners typically fall into three camps: conservative, moderate, and progressive. Few are progressive in this market, but when the team’s track record shows it can deliver real value, some conservative owners may move into the moderate category and implement outside-the-box solutions. Everybody wants a win/win situation, and this is one way to achieve it. The owner may be able to deliver the data center ahead of schedule at a lower dollar per kW, and the innovative contractor is rewarded with the customer’s next project. While reliability and familiarity with specific equipment and system types are still dominant themes, performance-based, outside-the-box solutions with positive results can quickly gain the attention of the client.

Being able to work in a team environment. Many commercial jobs today are being fast-tracked. This is especially true of data centers within office buildings and university campus facilities, where schedules are very compressed because the owner has servers that must be stored and/or ready for use — pronto! Some data center owner/operators may penalize the project team if the schedule is missed. With data centers, those liquidated damages can be significant, which is why collaboration among the project team is essential. Integrated project delivery (IPD) relies heavily on team interaction to drive success for each member of the construction team and the owner. Lean construction concepts, such as pull plans and Last Planner®, help the team identify critical project milestones and the flow of information needed at specific times to meet those milestones. Nobody wants to let somebody else down or hold up the schedule, so accountability is really ratcheted up. These methods are becoming more acceptable for mission critical facilities and their owners because they can expedite construction completion.

Relying on technology with mechanical and plumbing design/construction. Investing in technology helps the design and construction team consistently deliver accurate and cost-effective installations. MEP systems are drawn electronically using 3-D software, which facilitates coordinating them with the architectural/structural 3-D model and performing clash detection to ensure that all the systems fit within the structure. This process eliminates the material and labor waste commonly found on projects where coordination is not done. Once a clash-free model is signed off on by the construction team and the owner, prefabrication of trade-specific systems can begin.

Sheet metal ductwork drawn using 3-D software can be input into a format recognized by shop fabrication equipment and built off-site in a controlled environment. This method increases productivity and eliminates delays typically experienced in performing the work on-site.

Piping and plumbing systems are prefabricated and delivered to the project site as well. Total Station® is an electronic/optical technology used to locate spatial coordinates at the jobsite from electronic shop drawings generated by the contractor. Using 3-D drawing methods during design enables engineers to quickly put together “what if” scenarios for the owner to review in a 3-D environment, which speeds up the decision-making process with little or no impact on the schedule. Technology will always be advancing, and specialty contractors must embrace it and become proficient with it to keep up with the industry.

Implementing other technologies. Existing technologies are gaining increased use for data center cooling due to revised ASHRAE data center guidelines. Waterside and airside economizers have expanded hours of operation due to the increased allowable drybulb temperatures and expanded relative humidity recommendations. For example, in a data center in Austin, TX, airside economizers may provide 100% of the cooling needs for approximately 20% of the yearly hours of operation based on the 2004 ASHRAE recommended data center temperatures. Under the 2011 ASHRAE recommended temperatures, 100% economizer cooling increases to approximately 27% of the yearly hours, and partial economizer use can be implemented for 70% of the yearly hours.
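To make the economizer-hour estimate concrete, here is a minimal sketch of the kind of bin analysis involved. It assumes a simple dry-bulb cutoff, ignores humidity limits, and uses fabricated weather data rather than Austin, TX, records; the 80.6°F threshold reflects the upper end of the 2011 ASHRAE recommended range, while the 95°F partial-economizer limit is an assumption for illustration.

```python
# Hedged sketch: estimate airside economizer hours from hourly outdoor
# dry-bulb temperatures. Humidity limits, fan heat, and approach effects
# are ignored; the weather data below are fabricated.
import random

COLD_AISLE_F = 80.6   # supply setpoint at the 2011 ASHRAE recommended upper limit
HOT_AISLE_F = 95.0    # assumed return (hot aisle) temperature

def economizer_fractions(hourly_drybulb_f):
    """Fractions of yearly hours with full and partial airside economization."""
    total = len(hourly_drybulb_f)
    full = sum(t <= COLD_AISLE_F for t in hourly_drybulb_f)                   # outside air alone can cool
    partial = sum(COLD_AISLE_F < t < HOT_AISLE_F for t in hourly_drybulb_f)   # outside air offsets part of the load
    return full / total, partial / total

random.seed(1)
fake_year = [random.gauss(68, 14) for _ in range(8760)]   # stand-in for TMY weather data
full, partial = economizer_fractions(fake_year)
print(f"Full economizer: {full:.0%} of hours, partial: {partial:.0%}")
```

Raising the cold aisle setpoint in this sketch immediately increases the full-economizer fraction, which is exactly the effect the expanded ASHRAE recommendations have on real bin analyses.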

Containment techniques are a common way of improving energy efficiency. Hot aisle containment eliminates inefficiencies caused by intermittent mixing of the cold aisle and hot aisle airflows. Completely separating the hot aisle allows return air temperatures to be substantially increased, which lets the cooling equipment operate more efficiently.

Containment typically utilizes rack-based temperature sensing, as well as underfloor temperature sensing, to provide energy-efficient equipment capacity control. Emerson Network Power/Liebert Corporation, a provider of data center solutions, indicates that containment systems typically provide a 25% increase in equipment capacity and a 30% increase in cooling system efficiency. However, according to the Uptime Institute’s 2013 survey, only 53% of companies with fewer than 1,000 servers are using this method of heat containment to reduce mixing of hot and cold air.
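The capacity effect can be illustrated with the standard sensible-cooling rule of thumb, Q ≈ 1.08 × CFM × ΔT: at a fixed airflow and supply temperature, a warmer return air stream means a larger ΔT across the coil and therefore more usable capacity. The sketch below is a simplified illustration with hypothetical temperatures, not Liebert performance data.

```python
# Simplified illustration: sensible capacity of a cooling unit scales with the
# air-side temperature difference, so hot aisle containment that raises return
# air temperature increases capacity at the same airflow. Numbers are hypothetical.

CFM = 20_000        # assumed unit airflow
SUPPLY_F = 68.0     # supply air temperature

def sensible_capacity_btuh(cfm, return_f, supply_f):
    """Approximate sensible capacity using Q ≈ 1.08 * CFM * (T_return - T_supply)."""
    return 1.08 * cfm * (return_f - supply_f)

mixed = sensible_capacity_btuh(CFM, 85.0, SUPPLY_F)       # return air diluted by cold-aisle bypass
contained = sensible_capacity_btuh(CFM, 90.0, SUPPLY_F)   # hot aisle fully separated
print(f"Capacity gain from containment: {contained / mixed - 1:.0%}")
```

With these assumed temperatures the gain is roughly 29%, in the same neighborhood as the figures cited above; the real improvement depends on how much mixing the containment actually eliminates.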

Several data center equipment suppliers have developed in-row cooling options where the cooling cabinets are strategically placed in the same row as the server racks. These systems commonly deliver air parallel to the racks in two directions, providing efficient air distribution. In-row cooling is typically utilized for rack power densities exceeding 5 to 7 kW.

Indirect evaporative solutions can significantly reduce cooling energy when compared to conventional cooling solutions. Munters Oasis™ indirect airside economizer (IASE) factory selections provide annual performance information regarding predicted annual ton-hours of mechanical cooling, indirect evaporative cooling, and dry heat exchanger cooling (water spray disabled) for various climates and operating parameters.

For a data center in North Texas with a 68°F cold aisle temperature and a 91°F hot aisle temperature, utilizing 2011 ASHRAE recommended data center humidity levels, mechanical cooling is only required 18% of the year. For the remaining hours, the cooling load can be met by the indirect evaporative cooling unit or by the heat exchanger alone with the water sprays turned off.
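The mode split drives those annual numbers, so a rough sketch of the selection logic may help. This is not Munters’ selection method; the approach temperatures and example conditions below are invented for illustration, and a real selection would come from manufacturer performance data.

```python
# Hedged sketch: choose the least-energy operating mode of an indirect
# evaporative cooling unit from outdoor conditions. Approach temperatures
# are assumptions, not manufacturer data.

COLD_AISLE_F = 68.0     # supply setpoint from the North Texas example
APPROACH_DRY_F = 10.0   # assumed dry heat-exchanger approach to outdoor dry-bulb
APPROACH_WET_F = 12.0   # assumed approach to outdoor wet-bulb with sprays on

def cooling_mode(outdoor_db_f, outdoor_wb_f):
    """Return the first mode that can hold the cold aisle at setpoint."""
    if outdoor_db_f + APPROACH_DRY_F <= COLD_AISLE_F:
        return "dry heat exchanger (sprays off)"
    if outdoor_wb_f + APPROACH_WET_F <= COLD_AISLE_F:
        return "indirect evaporative (sprays on)"
    return "mechanical cooling trim"

print(cooling_mode(55.0, 48.0))   # mild hour -> dry heat exchanger only
print(cooling_mode(78.0, 55.0))   # warm, dry hour -> evaporative cooling
print(cooling_mode(98.0, 75.0))   # hot, humid hour -> mechanical trim required
```

Running logic like this against 8,760 hours of weather data is what yields figures such as “mechanical cooling only 18% of the year.”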

Data center infrastructure management (DCIM) uses controls, processes, monitoring, and real-time data analysis to determine when additional equipment should be purchased. Equipment power consumption, historical trends, and a physical model indicating equipment types and locations can be used to drive efficient facility operations. DCIM does not focus on a single aspect of a data center but considers every aspect and how they are interrelated.
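As one narrow example of the kind of analysis a DCIM tool automates, the sketch below fits a linear trend to measured monthly IT load and projects when installed capacity will be reached. The data, field names, and threshold are hypothetical, and commercial DCIM platforms use far richer models.

```python
# Illustrative sketch of a DCIM-style capacity projection: fit a linear trend
# to monthly peak IT load and estimate months until installed capacity is hit.
from statistics import mean

def months_until_full(monthly_peak_kw, capacity_kw):
    """Project time to capacity from a simple least-squares trend of monthly peaks."""
    n = len(monthly_peak_kw)
    xs = list(range(n))
    x_bar, y_bar = mean(xs), mean(monthly_peak_kw)
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, monthly_peak_kw))
             / sum((x - x_bar) ** 2 for x in xs))
    if slope <= 0:
        return None   # load is flat or falling; no projection needed
    return (capacity_kw - monthly_peak_kw[-1]) / slope

history_kw = [410, 425, 450, 470, 488, 510]   # hypothetical monthly peak IT load, kW
print(months_until_full(history_kw, capacity_kw=750))   # ~12 months at this trend
```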

REGULATORY TRENDS

Of course, many of the biggest changes are being driven by the customer, the Department of Energy, the EPA, OEMs, and IT industry trade groups. Much has already been prescribed by federal and local jurisdictions, including municipal green building codes already in place and the sustainability guidelines of organizations such as the U.S. Green Building Council.

In addition, the IT industry’s sustainability group, The Green Grid, introduced its PUE™ — power usage effectiveness metric — in 2007, and it is widely accepted today. This metric is the ratio of all the energy utilized by the data center to the portion of that energy being consumed by the IT equipment.

The energy goal among users is to achieve an efficiency rating as close to 1 as possible. This can be accomplished by reducing the energy consumption of the cooling equipment serving the facility. A fairly aggressive PUE for a climate like North Texas would be 1.25 when using multiple energy-saving strategies. Southern California, on the other hand, could possibly obtain a PUE of 1.1 with similar strategies.
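To make the metric concrete, here is a minimal sketch of the PUE arithmetic as defined above; the annual meter readings are hypothetical and chosen to land on the North Texas target mentioned in the previous paragraph.

```python
# Minimal sketch of the PUE calculation: total facility energy divided by
# IT equipment energy. A PUE of 1.0 would mean every kWh goes to IT gear.

def pue(total_facility_kwh, it_equipment_kwh):
    return total_facility_kwh / it_equipment_kwh

annual_it_kwh = 8_000_000        # servers, storage, network equipment (hypothetical)
annual_cooling_kwh = 1_600_000   # chillers, CRAHs, pumps, towers (hypothetical)
annual_other_kwh = 400_000       # UPS losses, lighting, office loads (hypothetical)

print(pue(annual_it_kwh + annual_cooling_kwh + annual_other_kwh, annual_it_kwh))
# -> 1.25
```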

The National Renewable Energy Laboratory (NREL) in Golden, CO, has published a PUE of 1.06 for its newly constructed HPC data center. That’s a huge driver in achieving greater energy savings. What can we innovate to reduce energy consumption in the mission critical sector? The focus on sustainability also involves using renewable resources as sources of power, such as solar, wind, and geothermal, among others, along with improved best practices, such as DCIM programs/tools.

EQUIPMENT INNOVATION

In 2004, the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) recommended a temperature upper limit of 77°F for data centers; an upper limit of 80.6°F was then recommended in 2008. The data center guidance was further refined with expanded classes: Classes A1 through A4 were added in 2011, but the recommended temperature and humidity requirements remained as published in the 2008 version. These temperature changes helped open the door to critical thinking about data center thermal environments.

Microsoft has reported that it allows the server rooms at its Dublin data center to rise up to 95°F using only outside air for cooling.1 Some fear that these elevated temperatures may also tax the cooling equipment and shorten its life expectancy. Several years ago, Intel performed a 10-month study in which the temperature at a location in New Mexico was allowed to vary between 64°F and 92°F using only outside air for cooling. Intel concluded that no consistent increase in equipment failures resulted from this temperature variation.2

It may be some time before data center operators are comfortable with such elevated temperatures, but we can be pretty certain that server manufacturers are developing equipment to meet the need.

Moreover, different users have different needs: nearly 50% of data centers still operate in the 71° to 75°F range, and 37% operate in the 65° to 70°F range, according to the Uptime Institute’s 2013 Data Center Industry Survey. Some data center operators fear that higher temperatures may lead to more hardware failures for expensive IT equipment, although there are guidelines to assess and mitigate that risk.

Equipment manufacturers are always seeking solutions to these issues. Companies such as Liebert have a significant presence in the industry due to the many design solutions they provide for mission critical facilities. Today, variable-speed fans, EC motors, and digital scroll compressors enable a facilities manager to meet the cooling requirements of the facility in the most cost-effective way. Predictive and adaptive digital control systems combined with equipment geared toward saving energy can pack a powerful energy-reducing punch.

THE FUTURE: WATER?

The trend toward liquid immersion cooling of data center servers, which uses vats of non-corrosive, non-conductive liquid, may be the next best practice. Engineered fluids, such as 3M Novec™, have been developed for use with immersion solutions, and, when coupled with ASHRAE’s goal for new commercial buildings to be net zero energy by 2030, this trend may prevail.

In the future, energy is going to be expensive, but water is going to be at a premium. The 2012 International Green Construction Code (IgCC) focuses heavily on measures that reduce water consumption and eliminate waste. Condensate from cooling equipment and humidifier flush water are required to be collected and reused. It is only a matter of time until municipalities adopt green building codes that focus on business consumption of water, and that focus may well include data centers.

In 2011, The Green Grid released a white paper titled “Water Usage Effectiveness (WUE™): A Green Grid Data Center Sustainability Metric.” The paper describes the metric as “an assessment of the water used on-site for operation of the data center. This includes water used for humidification and water evaporated on-site for energy production or cooling of the data center and its support systems.” Although WUE is relatively new, it will probably become as widely accepted by the industry as PUE.
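As a companion to the PUE sketch above, and assuming The Green Grid’s formulation of site WUE as annual water use divided by IT equipment energy (liters per kWh), the arithmetic looks like this; the figures are hypothetical.

```python
# Companion sketch to the PUE example: site water usage effectiveness in L/kWh,
# assuming The Green Grid's ratio of annual site water use to IT energy.

def wue(annual_site_water_liters, it_equipment_kwh):
    """Water usage effectiveness in liters per kWh; lower is better."""
    return annual_site_water_liters / it_equipment_kwh

annual_water_l = 14_000_000   # tower evaporation, evaporative cooling, humidification (hypothetical)
annual_it_kwh = 8_000_000     # same hypothetical IT load as the PUE example

print(f"WUE = {wue(annual_water_l, annual_it_kwh):.2f} L/kWh")   # -> 1.75 L/kWh
```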

No doubt the trade-offs between energy and water consumption and costs will be affected by the physical location of the facility/climate and its energy source, as well as the facility’s MEP infrastructure, IT equipment, and business objectives.

Given that most IT managers don’t have to see or pay the bills for their facilities (that expense usually ends up in the facilities manager’s or operations manager’s budget), implementing best practices in the design/build of a data center or its renovation becomes an important next step to surviving the future and changing the moniker from “energy hogs” to “thermal management experts.”