Those lazy, hazy days of summer are upon us again, and while I have written about this before, many small and midsize firms will still see their data center's cooling systems pushed to their limits, and even far beyond. So you can either put sunburn lotion on your servers, or use some of these cooling tips to keep them from overheating.

This is especially true for rooms located in mixed-use buildings that lack large dedicated cooling systems with enough extra capacity for those very hot summer days. Many IT departments are "sweating" out the summer (again), hoping that their servers will not suddenly crash from over-temperature shutdowns.

Here are a few tips, tricks and techniques that may not solve the long-term problem, but may help enough to get you through the summer. In many cases, when the heat load of the equipment does not severely exceed the actual capacity of the cooling system, optimizing the airflow may improve the situation until a new or additional cooling system is installed.

1. If it feels warm, don't panic - even if you see 80°F in the cold aisle! Yes, this is hotter than the proverbial 70-72°F data center "standard" you were used to (and you may not enjoy working in the room), but it may not be as bad for the servers as you think. If the highest temperature reading in the front of the rack is 80°F or less, you are still within the latest "Recommended" guidelines from ASHRAE TC 9.9. Even if the intake temperature is somewhat higher (up to 90°F), it is still within the "Allowable" guidelines. (A quick threshold check is sketched below.)
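If you want a quick way to sanity-check your readings against those bands, here is a minimal sketch in Python (a hypothetical helper, not any official tool; the 80.6°F and 89.6°F cutoffs are simply the Fahrenheit equivalents of ASHRAE's 27°C recommended and 32°C Class A1 allowable maximums):

    # Classify server intake temperatures against ASHRAE TC 9.9 bands.
    # 80.6F (27C) = recommended max, 89.6F (32C) = Class A1 allowable max.
    RECOMMENDED_MAX_F = 80.6
    ALLOWABLE_MAX_F = 89.6

    def classify_intake(temp_f: float) -> str:
        """Return the ASHRAE band for one intake reading, in degrees F."""
        if temp_f <= RECOMMENDED_MAX_F:
            return "recommended"
        if temp_f <= ALLOWABLE_MAX_F:
            return "allowable"
        return "over-temperature - act now"

    # Example readings from the cold aisle (hypothetical values):
    for rack, temp in [("rack-01", 78.0), ("rack-02", 84.5), ("rack-03", 91.2)]:
        print(f"{rack}: {temp:.1f}F -> {classify_intake(temp)}")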

2. Take temperature measurements at the front of the servers. This is where the servers draw in the cool air, so it is really the only valid and most important measurement. Take readings at the top, middle and bottom of the front of the racks (assuming that you have a Hot Aisle - Cold Aisle layout). The top of the rack is usually the warmest. If the bottom areas of the racks are cooler and you have open rack space, try to relocate the servers nearer the bottom (or coolest area) of the racks.
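If your servers expose an inlet temperature sensor over IPMI, you can also poll the readings instead of walking the room with a thermometer. A minimal sketch, assuming ipmitool is installed and that you substitute your own hostnames and credentials (the ones below are placeholders):

    # Poll each server's inlet (front intake) temperature sensor over IPMI.
    # Hostnames and credentials are hypothetical placeholders; the sensor
    # is usually labeled "Inlet" or "Ambient" (vendor naming varies).
    import subprocess

    HOSTS = ["rack01-node1.example.com", "rack01-node2.example.com"]

    for host in HOSTS:
        result = subprocess.run(
            ["ipmitool", "-I", "lanplus", "-H", host,
             "-U", "admin", "-P", "secret", "sdr", "type", "Temperature"],
            capture_output=True, text=True, timeout=30,
        )
        # Keep only the inlet/ambient lines, which reflect front intake air.
        for line in result.stdout.splitlines():
            if "Inlet" in line or "Ambient" in line:
                print(f"{host}: {line.strip()}")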

3. Make sure that you use blanking panels to block off ANY and ALL unused open spaces in the front of the racks. This will prevent hot air from the rear recirculating into the front of the racks.

4. Don't worry about rear temperatures - even if they are at 100°F or more (this is not unusual)! Do not place random fans blowing at the rear of racks to "cool them down" - this just causes more mixing of warm air into the cold aisles (I wish I had a dollar for every time I have seen this)!

5. If you have a raised floor, make sure that the floor grates or perforated tiles are properly located in front of the hottest racks. If necessary, rearrange them or change to different floor grates to match the airflow to the heat load. Be careful not to locate floor grates too close to the CRACs; this will "short circuit" the cool airflow immediately back into the CRACs and rob the rest of the room/row of sufficient cool air.

6. Check the raised floor for openings inside the cabinets. Cable openings in the floor allow air to escape the raised floor plenum where it is not needed, which lowers the cold air available to the floor vents in the cold aisles. Use air containment brush-type collar kits to minimize this problem.

7. If possible, try to redistribute and evenly spread the heat loads across every rack to avoid or minimize "Hot Spots". Remember to check the temperature at the top, middle and bottom of the racks before you move anything; then relocate the warmer servers (again, based on the front temperatures) to a cooler area. Then use blanking panels to fill any gaps. Recheck all the rack temperatures again to make sure that you have not just created new hot areas. A simple way to rough out a balanced placement is sketched below.
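To rough out that redistribution, a simple greedy balance over estimated per-server heat loads gets you close: place the hottest servers first, each into the currently coolest rack. A sketch in Python, with hypothetical wattages you would replace with measured or nameplate figures:

    # Spread server heat loads across racks to minimize hot spots.
    # Wattages are hypothetical; use measured draw where you can.
    import heapq

    servers = {"db1": 900, "db2": 900, "app1": 600, "app2": 600,
               "web1": 450, "web2": 450, "backup": 300}
    racks = ["rack-A", "rack-B", "rack-C"]

    # Min-heap of (current_load_watts, rack): least-loaded rack pops first.
    heap = [(0, r) for r in racks]
    heapq.heapify(heap)
    placement = {r: [] for r in racks}

    # Biggest loads first, each into whichever rack is coolest so far.
    for name, watts in sorted(servers.items(), key=lambda kv: -kv[1]):
        load, rack = heapq.heappop(heap)
        placement[rack].append(name)
        heapq.heappush(heap, (load + watts, rack))

    for rack, names in placement.items():
        print(f"{rack}: {sum(servers[n] for n in names)} W -> {', '.join(names)}")

This only balances electrical load, which is a proxy for heat; still verify the front temperatures after the move, as noted above.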

8. Check the rear of racks for cables blocking exhaust airflow. This will cause excessive back pressure for the IT equipment fans and can cause the equipment to overheat - even when there is enough cool air in front. This is especially true of racks full of 1U servers with a lot of long power cords and network cabling. Consider purchasing shorter (1-2 foot) power cords to replace the longer OEM cords shipped with most servers, and use the shortest practical network cables as well. Use cable management to unclutter the rear of the rack so that the airflow is not impeded.

9. If you have an overhead ducted cooling system, make sure that the cool air outlets are directly over the front of the racks and the return ducts are over the hot aisles. I have seen sites where the ceiling vents and returns are poorly located and the room is very hot, yet the capacity of the cooling system has not been exceeded - simply because the cool air is not all getting directly to the front of the racks, or the hot air is not being properly extracted. The most important issue is to make sure the hot air from the rear of the cabinets can get directly back to the CRAC return without mixing with the cold air. If you have a plenum ceiling, consider using it to capture the warm air, and add a ducted collar going into the ceiling from your CRAC's top return air intake. Some basic ductwork will have an immediate impact on the room temperature. In fact, the warmer the return air, the higher the efficiency and actual cooling capacity of the CRAC - the quick arithmetic below shows why.
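The reason is the standard sensible-cooling approximation: Q (BTU/hr) is roughly 1.08 x CFM x delta-T in °F, so a bigger temperature split between return and supply air means more heat removed per pass. A quick worked example (the airflow figure is hypothetical; use your CRAC's rated CFM):

    # Sensible cooling: Q (BTU/hr) ~= 1.08 * CFM * delta-T (deg F).
    # 12,000 BTU/hr = 1 ton of cooling. The CFM below is hypothetical.
    CFM = 8000

    def sensible_btu_per_hr(cfm: float, delta_t_f: float) -> float:
        return 1.08 * cfm * delta_t_f

    mixed = sensible_btu_per_hr(CFM, 10)       # return only 10F above supply
    separated = sensible_btu_per_hr(CFM, 18)   # well-ducted return, 18F split

    print(f"10F split: {mixed:,.0f} BTU/hr ({mixed / 12000:.1f} tons)")
    print(f"18F split: {separated:,.0f} BTU/hr ({separated / 12000:.1f} tons)")
    print(f"Capacity gained by ducting the return: {separated / mixed - 1:.0%}")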

10. Consider adding temporary "roll-in" type cooling units, but only if you can exhaust the heat into an external area. Running the exhaust ducts into a ceiling that returns to the CRAC does not work; the roll-in unit's exhaust ducts must discharge into an area outside of the controlled space.

11. When the room is not occupied, turn off the lights. This can save 1-3% of the electrical and heat load (for example, 500 watts of lighting in a room drawing 25 kW is 2% of the load), which, in a marginal cooling situation, may lower the temperature 1-2 degrees.

12. Check to see if there is any equipment that is still plugged in and powered up, but is no longer in production. This is a fairly common occurrence and has an easy fix: just shut it off! (A quick way to spot candidates is sketched below.)
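If you are not sure which boxes are abandoned, sampling their activity can build a candidate list. A rough sketch using the psutil library, run on each suspect machine (the idle thresholds are hypothetical judgment calls, and nothing beats confirming with the owner before you pull the plug):

    # Rough "zombie server" check: sample CPU and network over one window
    # and flag machines that look abandoned. Thresholds are hypothetical.
    import psutil

    SAMPLE_SECONDS = 60

    net_start = psutil.net_io_counters()
    cpu = psutil.cpu_percent(interval=SAMPLE_SECONDS)  # blocks for the window
    net_end = psutil.net_io_counters()

    total_bytes = ((net_end.bytes_sent + net_end.bytes_recv)
                   - (net_start.bytes_sent + net_start.bytes_recv))
    kb_per_sec = total_bytes / 1024 / SAMPLE_SECONDS

    print(f"CPU: {cpu:.1f}% avg, network: {kb_per_sec:.1f} KB/s")
    if cpu < 2.0 and kb_per_sec < 5.0:
        print("Looks idle - confirm with the owner, then power it down.")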

The Bottom Line

While there is no true quick fix when your heat load totally exceeds your cooling system's capacity, sometimes just improving the airflow may increase the overall efficiency by 5-20%. This may get you through the hottest days until you can upgrade your cooling systems. In any event, it will lower your energy costs, which is always a good thing.

Plan ahead. If all else fails, have a fallback plan to shut down the least critical systems so that the more critical servers can remain operational (e.g. email vs. financial systems). Make sure to locate the most critical systems in the coolest area. This is better than having the most critical systems unexpectedly shut down from overheating.

This way you may actually be able to enjoy the weekend on the beach with your pina colada, instead of worrying about whether you will start getting (or perhaps worse, not getting) high-temperature warning email messages on your smartphone - in which case you may need to quickly update your resume.