Fill a large space with machines powered by electricity, and one quickly observable result is massive heat generation. For edge data center operators, removing the heat emanating from rows and aisles of servers, storage, telecom equipment, and more is one of their most formidable challenges. Done poorly, the cost of heat removal can easily exceed the cost of running all of that expensive equipment. And as most data center operators see it, the complexity of removing heat often eclipses the issues related to powering the equipment.

 

EFFICIENCY VS. EFFECTIVENESS

There are two fundamental dimensions to improving edge data center cooling: efficiency and effectiveness.

Not surprisingly, any discussion of data center cooling will likely include talk about efficiency and power usage effectiveness (PUE). In this case, efficiency simply means taking every possible step to minimize the power consumed in cooling the IT load, so that as much of the facility's power as possible remains dedicated to the IT load itself.
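
For reference, PUE is the ratio of total facility power to the power delivered to the IT equipment, so a lower number means less cooling and distribution overhead. The brief sketch below shows the calculation; the figures are purely illustrative, not measured values.

    # Hypothetical example: PUE is total facility power divided by IT power.
    # The numbers below are illustrative assumptions only.
    total_facility_power_kw = 1200.0   # IT load plus cooling, power distribution, lighting
    it_load_power_kw = 800.0           # power delivered to servers, storage, and network gear

    pue = total_facility_power_kw / it_load_power_kw
    print(f"PUE = {pue:.2f}")          # 1.50 -> 0.50 kW of overhead per kW of IT load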

In contrast, the effectiveness of cooling efforts is often simply assumed. In this sense, effectiveness means ensuring that the chilled air generated to cool the IT load is doing the most productive work possible. In other words, the cold air reaches the equipment intakes it is meant to cool rather than leaking out in places where it is wasted.

 

ENTERPRISE VS. MULTI-TENANT DATA CENTERS

How a data center views its cooling needs can vary with how the data center itself is defined.

Enterprise data centers — those that provide all the equipment used either by single or multiple clients — typically are more concerned with efficiency. That’s because they have full say in what equipment is deployed there.

Multi-tenant data centers provide their customers with controlled data center environments, saving them the expense of finding their own space and providing necessities such as heating, cooling, and even racks. But because the tenants largely supply their own IT infrastructure and architecture, the primary cooling concern for multi-tenant sites generally relates to effectiveness.

 

EFFICIENCY CONCERNS FOR ENTERPRISE DATA CENTERS

Cooling efficiency worries can cause enterprise data center operators to lose sleep. Even a small reduction in PUE at very large scale can have a material impact on operating costs. The good news for operators, however, is that they have the advantage of controlling the power and cooling infrastructure in addition to all of the IT equipment and applications.
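
To see why even a small PUE improvement matters at scale, consider a rough back-of-the-envelope calculation. The IT load, PUE values, and electricity price below are assumptions chosen only for illustration.

    # Rough annual-savings estimate for a PUE improvement at a fixed IT load.
    # All inputs are illustrative assumptions.
    it_load_kw = 5000          # average IT load
    pue_before = 1.60
    pue_after = 1.50
    price_per_kwh = 0.10       # USD
    hours_per_year = 8760

    kwh_saved = it_load_kw * (pue_before - pue_after) * hours_per_year
    print(f"Energy saved: {kwh_saved:,.0f} kWh/year")
    print(f"Cost saved:   ${kwh_saved * price_per_kwh:,.0f}/year")

Under these assumed figures, shaving 0.10 off the PUE of a 5 MW IT load saves on the order of 4.4 million kWh, or roughly $440,000, per year.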

Having end-to-end control over the elements that make up the data center means enterprise operators have much more say in design and systems architecture, giving them the opportunity to tease out every last bit of efficiency while ensuring everything works in harmony.

 

EFFECTIVENESS CONCERNS FOR MULTI-TENANT DATA CENTERS

In a multi-tenant data center, there are tens, if not hundreds, of customers with different IT equipment specifications and technology refresh cycles, and they all want to follow their own deployment standards and guidelines. Unlike in an enterprise center, the multi-tenant center’s operator has less (or even no) control over the individual pieces of equipment coming into the center, which means their cooling tasks can grow in complexity very quickly.

That’s why cooling effectiveness is generally the bigger concern — and also the larger opportunity — for the multi-tenant data center operator. While no one wants to set standards based on the lowest common denominator, the multi-tenant edge data center operator must remain aware that some or many of its customers might not have IT equipment that meets the latest American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) standards.

 

COOLING CONSIDERATIONS

One of the primary steps in ensuring the most effective cooling is educating the data center’s tenants about good deployment standards. For example, alerting them about simple steps and strategies they can take to ensure good air management can help customers protect their brand and get the most out of the services they are paying for.

Delivering cooled air to a customer who is blocking the airflow with poor cable management leaves that customer exposed to technical and business risk. In raised-floor data centers, temperatures can vary significantly between the bottom of a rack and the top of tall equipment, so options for maintaining more consistent cooling throughout the space are important. Implementing cold aisle containment, either as a standard design element for a new build or as a retrofit to an existing data center, is a smart idea. It works by providing customers with blanking panels for their IT cabinets and placing doors at the ends of every cold aisle, driving both efficiency and effectiveness in data center cooling.

Leveraging good air management strategies remains the easiest “low-hanging fruit” in a colocation data center because it benefits both the data center service provider and the customers.

Many customers can benefit from being educated about these issues as they are adding equipment into their data center space. Simply having a data center manager meet with customers and review their equipment placement and layout can help alleviate many of the setup and ongoing problems.

While there is certainly a great deal of focus on using green power to make building and operating data centers environmentally friendly and carbon neutral, water is another natural resource that should be top of mind when thinking about efficiency.

Water-cooled chiller plants with complex controls have been a data center staple for many years. While these systems can be very efficient, newer technologies are available that are more modular, more efficient, more intelligent, and that use no water at all.

These waterless systems lend themselves very well to different physical deployment scenarios. For instance, some systems use pumped-refrigerant economization and can be deployed in data centers located in multi-story, multi-tenant buildings as well as single-story, single-tenant footprints. Systems like these offer good upgrade options and are typically selected to provide like-for-like replacements for end-of-life (EOL) legacy DX systems. This type of upgrade gives operators a viable path to improving the operational efficiency and resiliency of those data centers to meet today's increasing IT demands.

Some operators focus on cooling effectiveness by containing the warm air and keeping it separated from the cooler, conditioned air. Containment systems like this typically use a physical barrier to separate the cool air entering the equipment intakes from the heated air leaving via the exhausts.

Another way to improve energy effectiveness, in addition to replacing EOL systems, is to take advantage of utility incentives. Many regional utility companies offer exclusive deals for customers who implement intelligent cooling management systems. The utilities see these smart systems, which deliver monitoring, dynamic control, and analytics, as improving the resiliency and efficiency of an existing data center and putting less stress on the energy grid, all while saving their customers money.
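
As a rough illustration of what the "dynamic control" piece of such a system can look like, the sketch below nudges a cooling unit's fan speed based on the warmest cold-aisle inlet reading. The sensor values, setpoint, deadband, and step size are hypothetical, and a real product would layer analytics, alarms, and safeguards on top of logic like this.

    # Hypothetical sketch of a dynamic cooling control loop.
    # Sensor values, thresholds, and step size are illustrative only.
    def adjust_fan_speed(current_speed_pct, inlet_temps_c,
                         target_c=24.0, deadband_c=1.0, step_pct=5):
        """Nudge fan speed up or down based on the hottest cold-aisle inlet."""
        hottest = max(inlet_temps_c)
        if hottest > target_c + deadband_c:
            return min(100, current_speed_pct + step_pct)   # more airflow
        if hottest < target_c - deadband_c:
            return max(30, current_speed_pct - step_pct)    # save fan energy
        return current_speed_pct                             # within deadband

    # Example: inlets read 23.1, 24.8, and 26.2 C, so speed rises from 60% to 65%.
    print(adjust_fan_speed(60, [23.1, 24.8, 26.2]))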

Being smart about cooling efficiency and effectiveness translates into better customer service, improved ROI, and brand protection for customers of multi-tenant and enterprise data centers.