Technology’s Impact On The Data Center Environment
An evaluation of the total cost of ownership is needed.
Over the last year I have been writing about the new role of the CIO. Columns have covered such topics as program rollouts, TCO and the cloud, metadata, the CIO’s role in driving profits, and technology innovation. This year, I will discuss the impact of technology on the data center environment, both through infrastructure technology and through data center technologies. Both will require an evaluation of total cost of ownership (TCO), which includes system efficiencies and life-cycle costs. From the server and software side of technology, I will discuss the benefits of new technology as well as the requirements to deploy it within the data center.
FOCUS ON PUE
Three years ago, approximately 70% of existing data centers operated with a power usage effectiveness (PUE) of 2.0 or greater. Today, that number has been driven to less than 50%. The primary reason so many operated at 2.0 or greater was that the data center mechanical system’s design calculation was typically at 1.8 of the electrical usage. This was the case for data centers constructed eight or more years ago. Today’s average design calculation is between 1.4 and 1.5 PUE, and in many cases we are able to operate at 1.2 to 1.35 PUE. This is due to new technologies that are more efficient than systems of past generations.
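To make the metric concrete: PUE is simply total facility energy divided by the energy delivered to the IT equipment, so a value of 2.0 means the facility draws one watt of overhead (cooling, distribution losses, lighting) for every watt of IT load. The sketch below uses illustrative loads, not figures from any specific site.

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness: total facility power over IT power."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# An older facility: 1,000 kW of IT load plus another 1,000 kW of
# mechanical and electrical overhead yields a PUE of 2.0.
print(pue(2000.0, 1000.0))  # 2.0

# A current design: the same IT load with only 450 kW of overhead
# lands in today's typical 1.4-to-1.5 design range.
print(pue(1450.0, 1000.0))  # 1.45
```

A PUE of 1.0 would mean every watt goes to IT equipment, which is the theoretical floor the newer systems are approaching.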
QTS AND THE KYOTO SYSTEM
Recently, ESD was awarded the design of QTS’s Chicago data center. One of the exercises in the design was to create a comparison of multiple mechanical systems. By the end of the study, the Kyoto system proved to be one of the most cost-effective options in both first cost and operating cost, showing a PUE of 1.15 annually. In addition to a low PUE, the system does not use water, which is a plus as more and more companies look to reduce water consumption. Be sure to read our article in this month’s issue titled “From Paper to Digital” on page 48. Table 1 identifies the PUE system comparisons conducted for QTS.
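The operating-cost side of that comparison comes down to simple arithmetic: at a constant IT load, annual energy spend scales linearly with PUE. The figures below are a hedged illustration only; the 1,000-kW load and $0.08/kWh utility rate are assumptions, not QTS numbers, and the 1.5 and 1.15 values stand in for a typical design PUE versus the Kyoto system’s result.

```python
IT_LOAD_KW = 1000.0      # assumed constant IT load, kW
RATE_PER_KWH = 0.08      # assumed utility rate, $/kWh
HOURS_PER_YEAR = 8760

def annual_energy_cost(pue: float) -> float:
    """Annual facility energy cost: IT load scaled by PUE, priced per kWh."""
    return IT_LOAD_KW * pue * HOURS_PER_YEAR * RATE_PER_KWH

for pue in (1.5, 1.15):
    print(f"PUE {pue}: ${annual_energy_cost(pue):,.0f} per year")
# PUE 1.5:  $1,051,200 per year
# PUE 1.15: $805,920 per year
```

On these assumptions, moving from a PUE of 1.5 to 1.15 saves roughly $245,000 per year on a single megawatt of IT load, which is why the mechanical system comparison matters so much to TCO.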
A LOOK AT CLOUD PROCESSING IN 2016
Above and beyond data center infrastructure, I will do an in-depth review of new processing technologies that affect data center infrastructure. In the past, virtualization on blade technologies led the way in reducing both space and cooling demands in the data center. In 2016, however, the primary impact on data center processing will also include cloud processing, through either hybrid or complete outsourcing. As in 2015, it is anticipated that a majority of large enterprise companies will continue to outsource 7% to 10% of their processing to the cloud while keeping critical processing in house. That share may shift upward, however, due to additional cloud offerings being created by providers such as Microsoft and Amazon.
NEW YEAR’S RESOLUTION
While there are several technology gurus within the industry, when Mark Zuckerberg makes a resolution, he generally keeps it (unlike me). His resolution for 2016 is to pioneer artificial intelligence (AI). To achieve this, data input will be structured at a meta level with mass storage capability similar to high-performance computing (HPC). While the impact on data center processing in 2016 may be minimal, the overall impact 10 years from now may be significant if he is able to pull it off.
For me? I will just try to stick to exercising more often in 2016.