Figure 1. High-density environments can create hot spots that surpass 30 kW per rack.


Following a green path, once a concept relegated to a small group of environmentalists and the U.S. Environmental Protection Agency (EPA), is turning into a significant activity for organizations worldwide. With soaring energy prices, shrinking floor space, and ever-larger volumes of data being generated, the green data center is fast becoming a focal point of interest.

Evolving to a green data center while optimizing operational efficiency is a complex endeavor. Taking a holistic perspective on managing the data center infrastructure and following data center best practices can move the process along in a step-wise manner, with measurable benefits realized along the way. A strong data center infrastructure resource management system can help organizations “green” their data centers by providing the building blocks for managing the data center as a single entity and for rolling out these best practices.

Figure 2. Site costs will reach almost 300 percent of 1U server costs by 2012 in a best-case scenario derived by The Uptime Institute.

The Growing Challenges

IT customers are consistently demanding better performance from data centers and high reliability at lower prices. Meeting these demands has resulted in faster servers, lower-cost storage, and more flexible networking equipment. While technology advancements, such as high-density blade servers, have succeeded in providing greater performance, the operational costs associated with such improvements have reached astronomical levels. As long ago as August 2007, Ken Brill, founder of The Uptime Institute Inc., forecast a potential crisis. “The benefits of [Moore’s Law] are eroding as the costs of data centers rise dramatically,” he stated. “Increasing demand for power is the culprit, driven by both higher power densities and strong growth in the number of servers in use.” In fact, predictions by a variety of industry analysts have indicated that severe problems will occur if energy efficiency is not achieved, including:

  • According to a 2008 Digital Realty Trust survey of senior data center decision-makers, power use of data centers (average kW use per rack) jumped 12 percent from 2007 to 2008.
  • The increase in high-density IT equipment, the cost and scarcity of power, and the focus on the reduction of greenhouse gas emissions are requiring new technology to meet these challenges. (Gartner, July 2009)
  • In a recent survey of 100 data center operators, 40 percent reported running out of space, power, and cooling capacity without sufficient notice. (Aperture Research Institute)


Figure 3. Total growth in electricity use from 2000 to 2005.

The increase in the number of installed servers has driven up power consumption more than any other element within the data center. With high-density servers filling the racks, hot spots are being created that surpass 30 kilowatts per rack. Consequently, many data center managers are finding they cannot obtain enough power to distribute to the racks or that the power utility is unable to deliver additional capacity.

To further compound the issue, the costs associated with electricity are rising significantly. Currently, power and cooling costs represent up to 44 percent of a data center’s total cost of ownership. The Uptime Institute estimates that the three-year cost of powering and cooling servers is approximately one-and-a-half times the cost of purchasing the server hardware (see figure 2). Extending these projections out to 2012 shows the multiplier increasing to 22 times the cost of hardware under the worst-case scenario and to almost three times under the best-case scenario. Figure 2, developed by The Uptime Institute, maps out the cost projections for powering and cooling average servers in a data center (OpEx + amortized CapEx) relative to the cost of buying a server.
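To see how such multipliers arise, the back-of-the-envelope sketch below compares a three-year electricity bill for powering and cooling a server with its purchase price. The server price, power draw, PUE, and electricity rate are illustrative assumptions, and the sketch covers electricity only; The Uptime Institute’s projections also include amortized site CapEx, which pushes the multiplier higher.

    # Illustrative three-year power-and-cooling cost versus server purchase price.
    # All inputs are assumed values for demonstration, not Uptime Institute figures.
    server_price = 2500.0     # 1U server purchase price, USD (assumed)
    avg_draw_kw = 0.5         # average IT power draw per server, kW (assumed)
    pue = 2.0                 # facility overhead multiplier (assumed)
    rate_per_kwh = 0.10       # electricity price, USD/kWh (assumed)
    hours_per_year = 24 * 365

    annual_energy_cost = avg_draw_kw * pue * hours_per_year * rate_per_kwh
    three_year_cost = 3 * annual_energy_cost
    print(f"Three-year electricity cost: ${three_year_cost:,.0f}")
    print(f"Multiple of server price: {three_year_cost / server_price:.2f}x")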

Planting the Seeds

These challenges are not being ignored, as corporations, utilities, and governments worldwide are developing measures to address not only sky-high energy bills, but also new and expanding regulations for the disposal of IT equipment and the growing concern over global warming. The EPA distributed a preliminary Tier 2 draft specification for servers on September 24, 2009, and is progressing on standards for storage and networking equipment as well. Utility companies have begun initiating programs offering rebates for increasing energy efficiency. A research study by Digital Realty Trust reported that 60 percent of companies expect green data center strategies to be a part of future capital spending. A new directive in the European Union has been established to reduce energy usage by 20 percent by 2020. Some high-profile efforts to showcase green data centers are already in place, including the first “green data center,” built in August 2005 for Fannie Mae, and Sun’s new data center, built in 2007.

Green Data Centers in the Making

The first green data center was built in August 2005 for the Fannie Mae Urbana Technology Center (UTC), an environmentally sustainable 247,000 square-foot data center and office building. The UTC has earned a Leadership in Energy and Environmental Design (LEED) certification from the U.S. Green Building Council. Creative design that boosted sustainability and provided adequate support for computers, security, lighting, and power redundancy reduced overall energy consumption by 20 percent.

Figure 4. Green IT awareness levels among executives worldwide. Source: Studies show green IT awareness, but little planning, June 20, 2007, Penn, Schoen and Berland Associates, ItBusiness.ca

Sun built a high-profile green data center in 2007, which has resulted in a dramatic decline in electricity use. Sun deployed new server technology and a state-of-the-art cooling system, consolidated its Silicon Valley data centers, and halved the footprint of its facilities, contributing to a nearly 61 percent reduction in power consumption. Sun reduced the number of servers from 2,177 to 1,240, yet computing power reportedly increased by 456 percent, accomplished primarily by investments in virtualization.

In addition, a survey conducted by Penn, Schoen & Berland Associates questioned more than 400 corporate leaders from the U.S., the United Kingdom, Canada, and China regarding their awareness of Green IT. Although few had put any strategic initiatives in place, the vast majority was very aware of the environmental issues (see figure 4).

Confusion over what constitutes the best approach is one of the major reasons corporations have not implemented a green strategy; another is that it is not clear how to measure whether specific strategies and technologies actually work. There is no established industry standard for measuring energy efficiency within data centers, but it is quite evident that a roadmap to assist companies in these efforts is desperately needed.

In response to this confusion and lack of standards, a consortium of information technology companies and professionals formed The Green Grid (www.thegreengrid.org) to promote energy efficiency and lower the overall consumption of power in data centers. By providing data and guidance, The Green Grid expects to assist data center managers in making better decisions with respect to design, planning, deployment, and day-to-day operations.

The Green Grid encourages data center managers to use one of two fundamental metrics for measuring data center effectiveness and efficiency: power usage effectiveness (PUE) and its reciprocal, data center efficiency (DCE).

PUE (also called the energy efficiency ratio, or EER, by the EPA) is the total energy entering the data center facility divided by the power delivered to (and consumed by) computing resources. The smaller the ratio, the greater the proportion of power entering the facility that is used directly by computing (IT) resources. Many industry analysts have cited a PUE of 2.0 as a reasonable target; some large companies are racing toward PUEs approaching 1.0. In order to manage operations to the PUE ratio, data center managers and operators must fully understand the infrastructure components affecting total facility power, such as whether the chillers or the power distribution chain are the sources of inefficiency.
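As a simple illustration of the metric, the sketch below computes PUE and DCE from two power readings; the readings are assumed values used only to show the arithmetic.

    # Minimal PUE / DCE calculation from two meter readings (assumed values, kW).
    total_facility_kw = 1800.0   # everything entering the facility: IT load plus cooling, UPS losses, lighting
    it_equipment_kw = 1000.0     # power delivered to and consumed by IT equipment

    pue = total_facility_kw / it_equipment_kw   # lower is better; 1.0 is the theoretical floor
    dce = it_equipment_kw / total_facility_kw   # reciprocal of PUE

    print(f"PUE: {pue:.2f}")                                    # 1.80 for these readings
    print(f"DCE: {dce:.0%} of facility power reaches IT gear")  # 56%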

Figure 5. Decreasing PUEs represent increasing energy efficiency. Ultra efficient facilities are achieving PUEs approaching 1.0.

Steps to a Greener Data Center

Without established industry standards, what is the best way to evolve to a green data center that is focused on energy efficiency? How do data center managers and facilities managers determine the most cost-effective means of implementing a green data center? Although there is no single solution, there are many steps that can be taken to help improve productivity and efficiency. Implementing data center best practices and taking a holistic perspective on managing operations can create a robust decision support system to monitor and measure productivity and efficiency.

1. Eliminate “Ghost” Servers. Approximately 8 to 10 percent of servers in many of today’s data centers have no identifiable function. Though technically dead as far as serving the organization is concerned, these servers can still haunt it by consuming IT resources. These forgotten, and usually undocumented and unprotected, pieces of equipment take up valuable floor space, waste power, and become a ripe target for hackers. Current estimates indicate that removing just one physical server from service can save $560 annually in electricity costs, assuming 8 cents per kilowatt-hour.
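That estimate can be reproduced with a back-of-the-envelope calculation like the one below; the assumed 0.8 kW of average facility draw per server (server plus cooling overhead) is an illustrative figure, not part of the original estimate.

    # Rough annual electricity savings from decommissioning one ghost server.
    # The 0.8 kW facility-level draw (server plus cooling overhead) is an assumption.
    avg_facility_draw_kw = 0.8
    rate_per_kwh = 0.08            # 8 cents per kilowatt-hour, as cited above
    hours_per_year = 24 * 365

    annual_savings = avg_facility_draw_kw * hours_per_year * rate_per_kwh
    print(f"Estimated savings per decommissioned server: ${annual_savings:,.0f}/year")  # roughly $560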

2. Improve Resource Utilization. Organizations must manage a data center infrastructure according to its specific power and cooling profiles. At the time of a facility design or build-out, organizations typically pre-determine specific operating guidelines for racks, including power and cooling limits, in order to properly design the power and cooling infrastructure. Currently, many organizations lack quality information on power consumption and heat output, which causes operations to run well below the designed levels, effectively underutilizing these critical resources.

Whether a data center is composed of high-density or low-density servers, a data center infrastructure resource management system can help data center managers and facility managers design the physical layout to meet specific capabilities, avoid disruptions, and reach optimal utilization. The system combines real-time measured values and manufacturer specifications with a detailed model of the data center, enabling organizations to “recapture” unused resources.
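A minimal sketch of the “recapture” idea, under assumed rack budgets and readings, is to compare each rack’s designed power budget with its measured draw to expose stranded capacity; in practice the measured values would come from the management system’s meters and the budgets from the facility design.

    # Hypothetical example: find unused ("stranded") power capacity per rack.
    racks = [
        # (rack id, designed budget in kW, measured peak draw in kW) -- assumed values
        ("A01", 6.0, 2.1),
        ("A02", 6.0, 5.4),
        ("B01", 12.0, 4.8),
    ]

    for rack_id, budget_kw, measured_kw in racks:
        headroom_kw = budget_kw - measured_kw
        utilization = measured_kw / budget_kw
        print(f"{rack_id}: {utilization:.0%} utilized, {headroom_kw:.1f} kW available for new equipment")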

3. Eliminate the barriers between IT and Facilities. In many companies, the IT and data center facilities groups continue to operate as two separate and distinct organizations with little communication and interaction. As stated by the EPA in its report to Congress, in many data centers those responsible for purchasing and operating IT equipment are not the same as those responsible for the power and cooling infrastructure, leading to split incentives. Thus, those most capable of controlling the use of energy have very little incentive to do so. A single business model in which IT and facilities work together, let economics drive the solutions, and implement chargeback for power will be critical to harnessing the power and cooling resources within the data center.
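One way to implement power chargeback is to allocate the monthly facility power bill to business units in proportion to their metered IT consumption, as in the hypothetical sketch below; the unit names, meter readings, and bill amount are assumptions for illustration.

    # Hypothetical power chargeback: split the monthly bill by metered IT kWh per business unit.
    monthly_power_bill = 42000.0                                       # USD, assumed
    metered_kwh = {"Trading": 120000, "ERP": 80000, "Email": 40000}    # assumed meter readings

    total_kwh = sum(metered_kwh.values())
    for unit, kwh in metered_kwh.items():
        charge = monthly_power_bill * kwh / total_kwh
        print(f"{unit}: {kwh:,} kWh -> ${charge:,.0f}")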

4. Implement standardized performance measurements. Assessing and reporting on energy performance, including power distribution and cooling, as well as benchmarking, will help data center managers better understand the relationships between power distribution and consumption.

In energy-efficiency management, measurement goes far beyond simply calculating the ratios; it requires truly understanding what lies behind them. Better analysis, planning, and execution of a green data center can only occur when the underlying relationships are understood and the data center is managed from a holistic perspective.
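Getting behind the ratio means attributing the non-IT portion of facility power to its sources. The sketch below breaks an assumed facility load into components and reports how much each contributes to PUE; all values are illustrative.

    # Attribute facility overhead to its components to see what drives PUE (assumed values, kW).
    it_load_kw = 1000.0
    overhead_kw = {"chillers": 450.0, "CRAC fans": 150.0, "UPS losses": 120.0, "lighting": 30.0}

    total_kw = it_load_kw + sum(overhead_kw.values())
    print(f"PUE: {total_kw / it_load_kw:.2f}")   # 1.75 for these values
    for component, kw in sorted(overhead_kw.items(), key=lambda item: item[1], reverse=True):
        print(f"  {component}: adds {kw / it_load_kw:.2f} to PUE")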

5. Evaluate investment alternatives to upgrade the data center equipment. Many data centers in operation today were built prior to 2002 and were not designed to support high-density operation or to deliver power and cooling to racks operating at over 2 kilowatts. By assessing the needs of the organization, evaluating design alternatives that dramatically reduce power requirements within the infrastructure, and applying innovative technologies, significant gains in computing power per kilowatt can be realized. Simply replacing legacy equipment with newer, energy-efficient models can reduce the overall power and cooling requirements of a data center.
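When weighing such an upgrade, a simple payback estimate like the hypothetical one below can frame the investment decision; the capital cost, power savings, and electricity rate are assumed values.

    # Simple payback estimate for replacing legacy servers with energy-efficient models.
    capital_cost = 60000.0     # cost of the replacement hardware, USD (assumed)
    power_saved_kw = 25.0      # reduction in average facility draw, kW (assumed, includes cooling overhead)
    rate_per_kwh = 0.10        # electricity price, USD/kWh (assumed)
    hours_per_year = 24 * 365

    annual_savings = power_saved_kw * hours_per_year * rate_per_kwh
    print(f"Annual energy savings: ${annual_savings:,.0f}")              # about $21,900
    print(f"Simple payback: {capital_cost / annual_savings:.1f} years")  # about 2.7 years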

Conclusion

Following a greener path has become much more than an altruistic endeavor. Organizations are focusing on developing green data centers in order to reduce exorbitant power and cooling costs as well as improve operational efficiency. It is quite apparent that the need to construct and operate green buildings will become more and more important, both for shareholder value and for the environment. New and emerging tactics and technologies that address today's critical business needs and environmental requirements will deliver economic benefits in an environmentally sound manner, while addressing power and cooling issues in the data center. Decision support systems can help data centers become “greener” by allowing data center managers to manage the physical infrastructure from a holistic perspective and in an integrated process.