Some may be surprised to learn that the benefits of chassis-level liquid cooling can be leveraged for deployments anywhere today. How an organization deploys liquid cooling technologies might vary, but the key advantages are now flowing through to colocation implementations, enterprise infrastructures, legacy data centers, and the edge.

Several factors are driving rising demand for liquid cooling solutions — not least because tried and trusted strategies are no longer enough.

High Performance With Maximum Efficiency

The earliest adopters of liquid cooling, such as IBM and Lenovo, have often been large enterprises engaged in high-performance computing (HPC) or supercomputing projects, targeting the highest performance at the lowest possible temperatures across their entire infrastructure, in some cases with an additional focus on energy efficiency.

Meanwhile, the hyperscale companies of the world, like Microsoft, Facebook, and Google, require data center support for increasingly advanced analyses, monitoring and management, and data handling. Consumers rely on these companies' platforms and apps, which must enable data-intensive capabilities such as facial recognition and the analysis of behavior and trend patterns.

In addition, a broader range of organizations has begun targeting artificial intelligence (AI), machine learning, data analytics, and other resource-hungry applications, such as advanced imaging. Yet traditional air cooling strategies haven't kept pace with these developments: heat sinks are constrained by volume and footprint, and fans can no longer deliver the airflow that high-performance processors require.

Today's systems are likely budgeting as much as 20% of server power for the air-cooling circuit, up from roughly 10% in the past.
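To put that share in perspective, here is a minimal back-of-the-envelope sketch of what a 10% versus 20% fan budget means per server and per rack. The server wattage and rack density below are illustrative assumptions, not figures from this article.

```python
# Illustrative sketch only: rough arithmetic behind the claim that the
# air-cooling circuit now consumes ~20% of server power, up from ~10%.
# The server draw and rack density are assumed values for illustration.

def fan_overhead_watts(server_power_w: float, cooling_fraction: float) -> float:
    """Power budgeted to fans/air movers for a single server."""
    return server_power_w * cooling_fraction

SERVER_POWER_W = 800      # assumed average draw per server
SERVERS_PER_RACK = 40     # assumed rack density

legacy = fan_overhead_watts(SERVER_POWER_W, 0.10)  # ~10% historical budget
today = fan_overhead_watts(SERVER_POWER_W, 0.20)   # ~20% cited today

print(f"Per server: {legacy:.0f} W then vs. {today:.0f} W now")
print(f"Per rack:   {(today - legacy) * SERVERS_PER_RACK / 1000:.1f} kW of "
      "additional power spent just moving air")
```

Under those assumptions, the jump from 10% to 20% means several kilowatts per rack consumed purely by air movement, power that liquid cooling can largely reclaim for compute.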

It's clear the industry has reached a point of diminishing returns with air cooling for data centers, especially as disruptive, intelligent applications find new deployments and use cases across almost every market segment, from enterprise, health care, and financial services to oil and gas, warehousing, and logistics.

The overarching trend is toward using data in more intelligent ways to generate valuable outcomes, and that requires better cooling. Fortunately, liquid cooling is better at both removing and recovering heat; warmed liquid, for example, can be piped elsewhere and put to other uses, such as site heating.

The Emerging Edge

A myriad of applications are also moving closer to the edge of the network. Software is deployed alongside smart IoT sensors that collect or process data in situ, to support process automation, for instance. Edge computing brings a growing need to handle, manipulate, communicate, store, and retrieve data quickly, efficiently, and cost-effectively whenever required.

Organizations can no longer afford to rely completely on centralized data centers and their related latencies; it has become critical to be able to analyze and manipulate data in near real time to improve outcomes.

Gartner projected that, by 2025, most cloud service platforms would provide at least some distributed cloud services that execute at the point of need. Distributed cloud can replace private cloud and enables edge cloud and other new use cases for cloud computing. In 2018, perhaps 10% of enterprise-generated data was processed outside a centralized data center or cloud; that share is expected to reach as much as 75% by 2025.

So, the stage has been set. It's clear that worlds are colliding in a race for high-performance computing that can meet the needs of disruptive, data-hungry applications. Everyone wants these benefits; it doesn't matter whether you're an enterprise or a hyperscaler, a financial services provider or a hospital.

Data center professionals might argue that this sort of technology doesn't belong at the edge: the risks are just too great. That's where offerings like chassis-level liquid cooling platforms can provide secure, high-performance, environmentally sealed solutions that can be monitored and managed remotely to help mitigate risk, even at the perimeter.

Managing Risk Trajectories

Legacy data centers pursuing digital transformation stand to gain, as do colocation facilities, because chassis-level liquid cooling can mitigate challenges around serviceability, density, and restricted floor space.

For colocation providers, mitigating risk can be even more critical, both from their own perspective and from that of their tenants. As a result, there will be a major shift to liquid cooling, whether the solution chosen is direct-to-chip or immersion-based.

Chassis-level immersion technology gives many organizations the highest-performance cooling available in a form factor that is both rackable and vertically scalable, unlike the "bathtub" style solutions of the past. Because it is serviceable and sits neatly inside the rack, it still delivers many of the advantages and conveniences of air-cooled strategies.

Chassis-level immersion cooling technology is market-agnostic; it can be deployed anywhere for resiliency and efficiency at high density while remaining ready to meet the power and cooling requirements of future CPUs and GPUs. Right now, organizations are struggling to work out which platform they need, which technology to use, how fast their processing should be, what kind of memory and networking will suit, and so on, so a clear path for cooling can ease some of that pain.

The industry is at a crossroads; however, you can have your cake and eat it too when it comes to increasing cooling performance, energy efficiency, and density in your environment while mitigating risks.