By virtue of its name, and since the days of the mainframe, we have traditionally thought of the “data center” as the central point for data processing and storage, as well as the nexus of the data network. While this was clearly the case for most data centers in the last century, it began to change with the advent of the internet. The trend gained momentum as software-centric architectures, such as virtualization, coupled with data replication and distributed content networks, became widespread. And of course, last but not least, there is the “cloud,” which has now become the metaphorical data center of today’s universe.

In some cases, data centers are not really storing or processing data as such; instead they store and deliver content, such as music, videos, and movies. Moreover, for the past few years we have repeatedly heard warnings of the coming tsunami of data from the Internet of Things (IoT). Furthermore, the terms edge, hybrid, and modular, as well as micro, hyper, and mega, have crept into today’s buzzwords used in conjunction with data centers. So what about 5G, and how will it impact the role and design of the data center power and cooling infrastructure?



While 5G is still a work in progress, it has moved from concept to field trials, and carriers such as Verizon and AT&T expect to start delivering early deployments in the next 24 to 36 months. What ultimate capacities 5G will deliver once the standards are finalized and the 5G wireless network is fully built out remains to be seen. What is known so far is that there will be a 10 to 20X increase in per-channel speed, coupled with a much greater number of channels per node, resulting in much higher overall data throughput than current 4G LTE per-cell-site capacities.

This, when coupled with more applications such as IoT devices, will generate an order-of-magnitude increase in data. While fully deployed 5G will be the next upsell point for your next smartphone, its primary focus is the myriad of new applications, such as autonomous vehicles. At a February 2017 investor meeting, Intel projected that by 2020 these applications will generate:

  • Autonomous vehicle: 4 TB data/day

  • Connected plane: 5 TB data/day

  • Smart factory: 1 PB (petabyte) data/day
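To put those projections in perspective, a quick back-of-envelope conversion shows the sustained throughput each source would imply. The assumption that data is generated evenly over 24 hours, and the use of decimal (base-10) units, are mine, purely for illustration:

```python
# Convert Intel's projected daily data volumes into the average sustained
# throughput each source would need, assuming (for illustration only) that
# data is generated evenly over a 24-hour day.

TB = 10**12  # terabyte, in bytes (decimal)
PB = 10**15  # petabyte, in bytes (decimal)
SECONDS_PER_DAY = 24 * 60 * 60

sources = {
    "Autonomous vehicle": 4 * TB,
    "Connected plane": 5 * TB,
    "Smart factory": 1 * PB,
}

for name, bytes_per_day in sources.items():
    # bytes/day -> bits/day -> bits/second -> gigabits/second
    gbps = bytes_per_day * 8 / SECONDS_PER_DAY / 10**9
    print(f"{name}: ~{gbps:.2f} Gbps sustained")
```

Even under this generous averaging, a single smart factory works out to roughly 90 Gbps of sustained traffic, which hints at why so much of this data will need to be processed and cached at the edge rather than hauled back to a core data center.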

To put things into a time-scale perspective, Verizon and equipment manufacturers Ericsson and Qualcomm are not waiting for 5G to be completely finalized; they are continuing to develop and test faster 4G LTE networks, increasing speed and capacity by bonding multiple channels together to deliver gigabit speeds in working trials.

According to a September 11th announcement, Verizon envisions that 5G will support increased use of video surveillance and drones deployed in “smart cities” applications:

  • 5G enhanced intelligent video surveillance: Today, 4G LTE is used to connect surveillance cameras across cities in the U.S. With 5G, there will be significantly more capacity to stream the captured video back to the network and support large numbers of cameras, enabling new levels of intelligence. With all streams coming to a central, video optimized repository in the core of the 5G network, additional analysis can be applied, providing actionable intelligence.

  • Drone powered by edge compute: The Verizon and Ericsson Distributed Edge Cloud amplifies the power of simple drones to match or exceed the capabilities of complex drones, which are much more expensive. This proof of concept demonstration shows that as intelligence and processing are moved to the 5G core and the very edge of the network, existing device constraints will be lifted, enabling advanced applications with low cost devices.

Of course, all of this expanded surveillance could prove very useful for recording incidents in smart cities when early deployments of fully autonomous vehicles begin to interact with NYC taxi drivers.

Nonetheless, what seems relatively clear today is that 5G will require a massive number of wireless nodes, since so far it is expected to operate in the 28/39 GHz fixed wireless access bands used in trials and deployments, according to the Verizon 5G Technology Forum (formed in cooperation with ecosystem partners Cisco, Ericsson, Intel, LG, Nokia, Qualcomm, and Samsung). While there will be much greater bandwidth, the coverage range will be much smaller, because the higher frequencies cannot penetrate buildings and other objects as well as current 4G LTE systems can.
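The range penalty at those higher frequencies can be illustrated with the standard free-space path loss (Friis) formula, before even accounting for building penetration, foliage, or rain. The 1.9 GHz LTE comparison band and 200 m cell radius below are illustrative assumptions, not carrier figures:

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB (Friis): 20 * log10(4 * pi * d * f / c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

d = 200.0                      # hypothetical small-cell radius, meters
loss_lte = fspl_db(d, 1.9e9)   # a typical 4G LTE band (assumed)
loss_5g = fspl_db(d, 28e9)     # 28 GHz mmWave trial band

print(f"Extra free-space loss at 28 GHz vs. 1.9 GHz: {loss_5g - loss_lte:.1f} dB")
```

Because the frequency term enters the formula logarithmically, the roughly 23 dB gap holds at any distance; making up that loss requires some combination of antenna gain, beamforming, and, above all, many more, much closer nodes.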

Therefore, there will be a need for much greater network processing power and high-speed data caching (e.g., multi-terabyte flash, SSD, or whatever develops) at each 5G node or in-building micro-cell. Each 5G node or micro-cell will effectively act as a micro data center. While conceptually similar to the small telecom shacks at each cell tower, in many situations they will need to be condensed into very small, weatherproof, self-contained enclosures the size of a single rack, a half-height rack, or even a suitcase. They will need to be installed in locations with no generator back-up power and, most likely, no external source of mechanical cooling. I believe some of these smaller units will need to be based on internal liquid cooling, capable of operating continuously in ambient temperatures of 140°F (60°C) or possibly higher. In addition, some locations will most likely be supported by lithium batteries (or whatever new battery chemistry is developed in the next few years) capable of eight to 24 hours of back-up time.
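As a rough sketch of what an eight- to 24-hour back-up window implies for such a node, consider the battery sizing arithmetic below. The load, depth-of-discharge, and conversion-efficiency figures are hypothetical assumptions of mine, not carrier or manufacturer specifications:

```python
# Rough battery sizing for a hypothetical self-contained 5G micro node.
# All figures are illustrative assumptions, not published specifications.

node_load_w = 1500        # assumed steady-state draw of radio, compute, cooling
backup_hours = (8, 24)    # back-up window discussed in the text
depth_of_discharge = 0.8  # usable fraction of lithium battery capacity (assumed)
conversion_eff = 0.9      # power conversion losses between battery and load

for hours in backup_hours:
    usable_kwh = node_load_w * hours / 1000
    installed_kwh = usable_kwh / (depth_of_discharge * conversion_eff)
    print(f"{hours} h backup: ~{installed_kwh:.0f} kWh installed battery")
```

Under these assumptions, the 24-hour case lands around 50 kWh of installed battery, comparable to a small electric-vehicle pack, which is a lot of energy storage to fit, cool, and maintain inside a rack-sized weatherproof enclosure.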



I am writing this article on the heels of major hurricanes Harvey and Irma, which tragically impacted millions of people in Texas, Florida, and other areas of the U.S., as well as the many others devastated across the Caribbean islands. While there is nothing meaningful I can say here to help those who have suffered from these and other natural catastrophes, I can hope that recovery occurs as quickly as possible.

Notwithstanding the above, I have seen many posts and articles about U.S. data centers that remained operational because they were well designed, solidly built, located above the flooded areas, and had stored enough fuel for their generators. In other cases, some well-built sites with high levels of redundancy were located in areas that also lost power, but where roads and bridges were so badly damaged that trucks could not reach them to refuel. In still other cases, sites may have remained “operational” on generator power but were essentially off-line due to lost connectivity caused by massive, widespread damage to the communication infrastructure. We saw all these conditions in New York City, and especially in the Financial District, which shut down for two days in the aftermath of Hurricane Sandy. In the case of Texas, over 50 inches of rain fell in the Houston area, and a new rainfall color had to be added to the National Weather Service maps. We are living in an age where the 100-year flood is the new normal. Clearly, it is no longer prudent to depend on any single facility, even if it is a Tier 5 data center.



The internet hyper-scalers, Amazon, Google, Facebook, and Microsoft, have long built their “availability” on widespread geographic diversity and IT redundancy. This article is about 5G’s potential impact on future data center design, which will require continuously available network coverage based on many autonomous, self-supporting nodes capable of meshed communication via wired networks and wireless links. Their size, form factor, and capacity may originally be designed by the wireless carriers; however, as the various new applications develop, the role of edge and hybrid data centers will expand, as will the need for highly resilient micro-grids.

My last column, “Tier Wars,” discussed the issue of data center “standards” from differing organizations, which directly or indirectly focus on the redundancy levels of facility power and cooling infrastructure. The true purpose is to ensure that applications and data are protected and “available.” While there will still be a need for more traditional data centers with high levels of redundant infrastructure for critical core operations, the “edge” data center will become more pervasive. As evidence of this, on September 19th the Uptime Institute, which normally focuses on traditional UI Tier (I-IV) certification of large data centers, announced its “TIER-Ready” certification for edge deployments, aimed at smaller prefabricated modular, micro, and container data centers. Most of these units are expected to be Tier I or II.

“The TIER-Ready program enables manufacturers to work with Uptime Institute to validate the specific designs of their pre-built solutions. Customers of these TIER-Ready solutions will enjoy reduced time and cost for a Tier Certified data center. TIER-Ready solutions are available now from a wide variety of manufacturers,” according to the press release.

This distributed edge computing model will only grow once 5G becomes fully developed and widely deployed. Hyper-scale data centers, with their multiple communications paths, will play an integral role, tightly intertwined with the edge and mesh 5G nodes.

While 5G applications go far beyond delivering more HD Netflix streams to smartphones, everyone will want a 5G phone once they become available. For now, the new iPhone X is not 5G capable, despite being the first phone to break the kilo-buck price barrier. Nonetheless, I expect those who purchased one will be more than happy to upgrade to the first “5G ready” iPhone Y or Z when it becomes available. However, I would highly recommend they get the ultra, really-ultra “unlimited” data plan.