5G promises to move massive amounts of data at unprecedented speeds, enabling innovation that, so far, has only been imagined. Visions of the IoT connecting machines and their learnings through everyday AI remain elusive because the 5G ecosystem is not fully realized, but it's coming. The big wireless carriers are stepping up the necessary infrastructure investments in fiber, small cells, and high-frequency spectrum, as evidenced by Verizon's $45.5 billion and AT&T's $23.4 billion auction wins. But the network and its pipes are just one piece of the IoT universe; uber-fast computing is how data turns to gold. Shifting computing closer to the end user and building out edge data centers is when 5G truly comes to life and the IoT is finally delivered. So the question is, how can data centers evolve to accommodate the next generation? While clients are preparing for 5G technologies and device offerings, the industry is still working on a definition of what exactly an edge data center is.

 

As data centers evolve, they will be less defined by size and more by proximity to end users and their nimbleness to process, move, and store data. (Photo courtesy of Black & Veatch)

High-Density Impact

5G is accompanied by machine-to-machine applications and real-time data analytics driven by software-defined networks (SDNs) and network virtualization, which are powered by high-density servers housed in today's core data centers. Hyperscale data centers and cloud facilities are purpose-built to handle this network of increased computing power. Enterprise legacy data centers, on the other hand, will have to strategize (if they haven't already) on migrating to a cloud provider or upgrading their infrastructure to process future 5G workloads.

5G's impact on core data center infrastructure will remain centered on optimizing efficiency, reliability, resiliency, and security while meeting increased data and network demands.

The conditioning challenges for all of these next-gen solutions have remained the same for the past 50 years: high-density racks consume more power, generate more heat, and require a different approach to how equipment is configured and cooled. Cooling systems bear a significant share of the burden from greater data processing.

The industry has experimented for decades with how to keep these energy-intensive systems cool. Today, liquid immersion and direct-to-chip liquid cooling are emerging as go-to solutions for high-density computing. A knock-on effect will be greater emphasis on water usage effectiveness (WUE), which is already a hot topic. An alternative on the horizon is chips that simply need less cooling.

Data Center Decarbonization

Power works hand-in-glove with cooling to keep power usage effectiveness (PUE) in check. 5G and the IoT will put even more demand on an already power-hungry industry. Hyperscale data centers with their eye on 5G and beyond are already focused on decarbonization to responsibly manage the expected rise in computing power. The green data center market is expected to grow from $49.2 billion in 2020 to $140.2 billion by 2026. Beyond renewable energy credits, other initiatives include integrating renewable power sources, such as microgrids, battery energy storage systems, and fuel cells running on blue and green hydrogen. Interest in nuclear power is also rising as a mature, clean technology for large baseloads, such as crypto mining operations and data center campuses seeking proven technologies for innovation hubs, research facilities, and the like. This may include locating data centers near nuclear power plants or using advanced small modular reactors (SMRs) to increase speed to market in an effort to meet demand.
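
To make these two headline metrics concrete, here is a minimal sketch (in Python) that computes PUE and WUE from annual facility figures; the numbers are hypothetical, chosen only to illustrate the arithmetic, not to describe any particular facility.

    # Hypothetical annual figures for a single facility (illustrative only).
    it_energy_kwh = 10_000_000      # energy delivered to IT equipment (kWh/year)
    total_energy_kwh = 14_000_000   # total facility energy, including cooling and power losses (kWh/year)
    site_water_liters = 18_000_000  # water consumed on-site, largely for cooling (liters/year)

    # Standard definitions:
    #   PUE = total facility energy / IT equipment energy  (ideal = 1.0)
    #   WUE = site water usage / IT equipment energy        (liters/kWh, ideal = 0.0)
    pue = total_energy_kwh / it_energy_kwh
    wue = site_water_liters / it_energy_kwh

    print(f"PUE: {pue:.2f}")         # 1.40 -> 40% overhead beyond the IT load
    print(f"WUE: {wue:.2f} L/kWh")   # 1.80 liters of water per kWh of IT energy

Anything that trims the non-IT share of total energy (more efficient cooling, better power distribution) improves PUE, while WUE improves only when the cooling approach itself consumes less water per unit of IT work.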

The Rise of the Edge

As data centers evolve, they will be less defined by size and more by proximity to end users and their nimbleness to process, move, and store data. Gartner predicts 75% of enterprise-generated data will be created and processed outside of traditional centralized data centers or the cloud by 2025. Network philosophy and architecture need to prepare for storing, processing, and converging data near end users to deliver the latency, resiliency, and accessibility consumers expect from their digital experiences.

The proliferation of 5G will, undoubtedly, stimulate the need to manage huge amounts of data. The pull of data gravity will strengthen significantly, making data more difficult and costly to move and giving rise to the edge.

At the edge, computing is evolving faster than infrastructure can support, and its proliferation demands smart responses now. To realize machine learning, edge data curation, management, and security need to grow significantly in scale and sophistication.

Infrastructure hubs take advantage of resources already in place. For example, geographically interconnected telecommunication towers with access to power and fiber are a natural choice for deploying edge data centers. Companies like American Tower and SBA are actively working to accommodate edge data centers at their tower sites. At the end of 2020, Switch, Dell, and FedEx announced they would work together to deploy edge data centers and cloud solutions at strategically placed FedEx distribution centers. This hub approach will likely become common, and necessary, as 5G brings applications closer to sustainable power sources and the fiber needed to support advanced use cases, like autonomous vehicles and the industrial IoT.

Accelerating the Edge

Resources already in place put the IoT within reach. However, deploying one edge data center at a time will most likely not get the job done. To accelerate deployment, owner/operators need to think strategically about how to design and build edge data centers in ways that capture both economies of scale and speed to market. Each edge data center will be unique, customized to an organization and its location's requirements. Flexible and scalable solutions will be key to a successful edge deployment. Some key considerations include the following.

  1. Understand data requirements — How an enterprise designs its edge facilities depends heavily on latency requirements and how data will be consumed, secured, stored, and transported. Will the data be processed and stored locally at the enterprise for real-time analytics and response, or transported to and stored in the cloud? All of this informs the design, location, and size of a data center (a simple sketch of this trade-off follows this list).
     
  2. Modular, scalable, and flexible design — As 5G continues to evolve, so will data requirements. It's widely accepted that edge data centers will require modular facilities to accommodate the scalability that data growth demands. Current supply chain crunches only heighten this need.
     
  3. Large-scale deployment capabilities — Efficiently rolling out edge infrastructure across multiple sites requires the ability to coordinate a complex network of interdependencies to remain on time and within budget. It takes a large team of partners and vendor relationships to perform site design, permitting and zoning, prefabrication and logistics, installation, and commissioning.
     
  4. Systemwide perspective — For multisite enterprises, visibility into the entire network of edge facility deployments and ongoing maintenance will be necessary to drive consistent customer experiences across all locations. Remote monitoring and diagnostics, along with rapid and efficient maintenance, will be key focus areas for edge owner/operators.
     

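As a rough illustration of the first consideration above, here is a minimal sketch (in Python) that classifies a few hypothetical workloads as edge-local or cloud-bound using a latency budget and a daily data volume; the workload names and thresholds are assumptions chosen purely to show the reasoning, not recommended values.

    # Hypothetical workloads and thresholds, for illustration only.
    workloads = {
        # name: (latency budget in ms, data produced in GB/day)
        "factory-robot-control": (5, 40),
        "retail-video-analytics": (50, 500),
        "nightly-model-training": (10_000, 2_000),
    }

    EDGE_LATENCY_MS = 20       # budgets tighter than this are hard to meet from a distant cloud region
    BULK_TRANSPORT_GB = 1_000  # above this, daily backhaul to the cloud gets costly (data gravity)

    for name, (latency_ms, gb_per_day) in workloads.items():
        # Keep the workload at the edge if the response must be near-instant
        # or if the data is too heavy to move economically.
        if latency_ms <= EDGE_LATENCY_MS or gb_per_day >= BULK_TRANSPORT_GB:
            placement = "process and store locally at the edge"
        else:
            placement = "transport to and store in the cloud"
        print(f"{name}: {placement}")

Real deployments weigh many more factors (security, cost, regulatory constraints), but even a simple rule like this shows why design, location, and size follow directly from the data requirements.
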
Consider the entire edge data center deployment process as a continuum, from site identification and due diligence through permitting and zoning, engineering and design, and installation and commissioning. Owner/operators may want to engage a project manager and an engineering and construction partner who can help manage the entire deployment and mobilize at the speed and scale required.

For the foreseeable future, the extreme dynamics of 5G will remain complex. The need to navigate this fluid, interlinked web of dependencies is not going to pass. Because it is impossible to fight off obsolescence, everyone involved in this undertaking will need to decide on a standard technology philosophy and architecture that allows for agility and growth. We need to accept that we will always be, to a greater or lesser extent, behind the technology curve, always chasing the next big thing, the next generation.