We are a decade into the software-defined data center (SDDC) concept, driven by virtualization and the increased utilization of compute, storage, and networking through shared platforms. While that journey has proved successful, it is not yet comprehensive. A critical foundational element is missing: infrastructure management. To achieve maximum efficiency and efficacy in a virtualized ecosystem, data center managers must enable software to intelligently manage the facilities themselves.

By all accounts, data center automation must continue to mature over the next decade to match other advances.

“The market is heading toward the fact that everything is going to be software-defined, everything is going to be software-controlled, and the internal spending of the IT department is increasingly shifting to software rather than hardware now,” said Roy Illsley, principal analyst at Ovum.  

This shift in customer spending puts additional price pressure on the companies delivering the foundational data center services those IT organizations depend on.

A Paradigm Shift Is Coming

Bottom line, data center providers cannot effectively compete on a dollar-per-kW basis without adopting software-defined architectures. In addition to being more efficient and economical, data centers of the future will take on different forms, with a higher concentration of hyper-converged infrastructure and distributed edge computing workloads. As more data moves to the edge — think AI, gaming, autonomous vehicles, and smart cities — there will be intensifying pressure on enterprises and data center providers to deploy modular, distributed data centers that offer greater security, interconnectivity, and operational efficiency.

Chetan Sharma is a telecom industry consultant and futurist who projects the edge economy to reach $4.1 trillion by 2030.

“I think we’re entering a seismic paradigm shift,” he said. “A number of technologies are coming together to create this experience the world has never seen before.”

In edge data centers, power efficiency will be at a premium — and a defining factor in the success or failure of the operation. In 2017, U.S.-based data centers consumed more than 90 billion kWh of electricity, and, as noted by Forbes Technology Council member Radoslav Danilak, it would take 34 coal-powered plants generating 500 MW each to equal those power demands. If power is overprovisioned, which has long been the norm in traditional IT environments, data center profitability will continue to face an uphill battle. On the other hand, insufficient power allocation will come with a hefty price if surges caused by workload peaks adversely impact customer experience or business performance.
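As a rough sanity check on that comparison (a sketch, not from the article), the figure lines up if each plant is assumed to run at roughly a 60% capacity factor:

```python
# Back-of-the-envelope check on the "34 coal plants" comparison.
# Assumption (not stated in the article): each 500 MW plant runs at ~60% capacity factor.

ANNUAL_CONSUMPTION_KWH = 90e9   # U.S. data center consumption, 2017
HOURS_PER_YEAR = 8760
PLANT_NAMEPLATE_MW = 500
CAPACITY_FACTOR = 0.60          # assumed average utilization per plant

average_demand_mw = ANNUAL_CONSUMPTION_KWH / HOURS_PER_YEAR / 1000   # kW -> MW
output_per_plant_mw = PLANT_NAMEPLATE_MW * CAPACITY_FACTOR

print(f"Average demand: {average_demand_mw:,.0f} MW")
print(f"Plants needed:  {average_demand_mw / output_per_plant_mw:.0f}")   # ~34
```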

In centralized data centers, there’s a constant balancing act between managing energy supply and demand. Built capacity is often overallocated but underused, which leads to “robbing Peter to pay Paul” — taking energy allocated for one workload and using it to cover a spike or peak created by another. Unfortunately, in distributed edge data centers, there is typically no surplus to support energy reallocation. Nor is there as much control over edge power sources, which may not be as clean, consistent, or reliable as those feeding a traditional data center.

Challenging the Status Quo

With these factors in mind, the industry must widen the scope of SDDC to include software-defined power (SDP). The best way to optimize power utilization in edge data centers is to combine software virtualization with AI, machine learning, and edge hardware to automate data center operations and power management. The result is higher utilization of installed capacity: unnecessary buffers are removed and service level agreements (SLAs) can be dynamically matched to actual workloads, all while mitigating risk and improving uptime (Figure 1).
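As a minimal illustration of that idea, and not a description of any particular SDP product, the sketch below assumes a hypothetical controller that compares each rack's provisioned power against a forecast of its peak draw and reports how much buffer could be reclaimed:

```python
from dataclasses import dataclass

@dataclass
class Rack:
    name: str
    provisioned_kw: float     # power reserved for the rack (breaker/SLA allocation)
    forecast_peak_kw: float   # e.g., an ML forecast from recent telemetry

def reclaimable_buffer(racks, safety_margin=0.10):
    """Per-rack power (kW) provisioned above forecast peak plus a safety margin."""
    return {r.name: round(max(0.0, r.provisioned_kw - r.forecast_peak_kw * (1 + safety_margin)), 2)
            for r in racks}

racks = [Rack("edge-01", provisioned_kw=15, forecast_peak_kw=5),
         Rack("edge-02", provisioned_kw=15, forecast_peak_kw=11)]
print(reclaimable_buffer(racks))   # {'edge-01': 9.5, 'edge-02': 2.9}
```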

Beyond the lost responsiveness and efficiency, failing to automate facility infrastructure will hurt the bottom line. In today’s environment, where hybrid portfolios are less dependent on physical infrastructure, data center operators need to anticipate a new cost reality and adjust accordingly. In a recent interview at the Pacific Telecommunications Council’s annual meeting, power was singled out as an element that needs to be included in every SDDC. Operators that adopt a software-defined power methodology can increase margins, capacity, and resiliency.

Is there a data center anywhere that doesn’t need to reduce capital investments and operating expenses? No data center operator or provider wants to continue paying for, allocating, and being responsible for delivering 15 kW per rack when workloads are drawing 5 kW. It is counterintuitive to invest twice as much capital for capacity customers aren’t actually using — nor is it good for the planet.

When power and cooling consume approximately 40% of data center operating budgets, it’s hard to accept the status quo. Software-defined power automatically identifies, aggregates, and pools all sources of stranded power within a data center and routes energy on demand to racks, nodes, workloads, or circuits in real time. Using machine learning and predictive analytics, cutting-edge SDP platforms reallocate data center power according to capacity and availability demands while reducing power-related capital and operational expenses by up to 50% annually.
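In software terms, the pooling-and-routing step might look roughly like the following sketch (a simplification with hypothetical names, not a depiction of any shipping SDP platform): stranded headroom across circuits is aggregated into a single virtual pool, and spiking racks draw against it, largest request first:

```python
def pool_stranded_power(circuits):
    """Sum unused headroom (kW) across circuits into one virtual pool."""
    return sum(c["rated_kw"] - c["draw_kw"] for c in circuits)

def route_from_pool(pool_kw, requests_kw):
    """Grant requests (kW) largest-first until the pool is exhausted."""
    grants = {}
    for name, needed in sorted(requests_kw.items(), key=lambda kv: -kv[1]):
        grants[name] = granted = min(needed, pool_kw)
        pool_kw -= granted
    return grants

circuits = [{"rated_kw": 15, "draw_kw": 5}, {"rated_kw": 15, "draw_kw": 12}]
pool = pool_stranded_power(circuits)                      # 13 kW of stranded capacity
print(route_from_pool(pool, {"rack-7": 8, "rack-9": 6}))  # {'rack-7': 8, 'rack-9': 5}
```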

Moreover, these efficiency and financial gains are not confined to new builds. Virtual Power Systems found that greenfield data centers originally designed for 20 MW can now be built and provisioned with only 14 MW when leveraging SDP. With CapEx averaging $10 million per MW, that’s $60 million in savings on the original design alone. Meanwhile, SDP enables brownfield facilities to reuse 30% to 60% of their stranded, otherwise unusable power, making modernization possible regardless of structure. With these optimizations in mind, it’s clear that until power is added to the party, SDDCs fall short of delivering the full benefits operators and providers have been promised.
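The arithmetic behind those numbers is simple enough to restate directly; the sketch below uses the article's figures, with the brownfield range applied to a hypothetical facility that has 3 MW of stranded power:

```python
# Greenfield: a 20 MW design provisioned with only 14 MW under SDP
design_mw, sdp_mw = 20, 14
capex_per_mw = 10e6                                   # $10M per MW (article average)
print(f"CapEx avoided: ${(design_mw - sdp_mw) * capex_per_mw / 1e6:.0f}M")   # $60M

# Brownfield: 30%-60% of stranded power becomes usable again
stranded_mw = 3.0                                     # hypothetical facility
low, high = stranded_mw * 0.30, stranded_mw * 0.60
print(f"Recovered capacity: {low:.1f}-{high:.1f} MW")                        # 0.9-1.8 MW
```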

And don’t forget the impact of the IoT. International Data Corp. estimates that, by 2025, there will be 175 zettabytes of data created annually, with almost 50% generated at the edge by more than 41 billion connected IoT devices, or “smart things.” This could result in data centers consuming up to 5% of the world’s electricity.

Intelligent software capable of driving real-time decisions will be the key to efficiently operating both core data centers and new edge deployments. These facilities will require orchestration of compute, network, storage, and power resources. SDP is the only practical way to increase power utilization while improving data center scalability, flexibility, resiliency, programmability, and intelligence. It puts data center operators on an accelerated path to designing, building, and maintaining next-gen infrastructure with “power aware” workload orchestration. As a result, enterprises and colo providers will get more out of their capital deployments by matching IT loads to dynamic SLAs.
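To make “power aware” orchestration concrete, the sketch below (all names hypothetical) shows a scheduler that will only place a workload on a node with enough power headroom as well as enough spare compute:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Node:
    name: str
    free_cpu: int              # available vCPUs
    power_headroom_kw: float   # remaining power budget

@dataclass
class Workload:
    name: str
    cpu: int
    est_power_kw: float        # estimated draw, e.g., from a power model

def place(workload: Workload, nodes: List[Node]) -> Optional[Node]:
    """Choose the feasible node with the most power headroom, or None."""
    feasible = [n for n in nodes
                if n.free_cpu >= workload.cpu and n.power_headroom_kw >= workload.est_power_kw]
    return max(feasible, key=lambda n: n.power_headroom_kw, default=None)

nodes = [Node("core-a", free_cpu=32, power_headroom_kw=2.0),
         Node("edge-b", free_cpu=16, power_headroom_kw=0.4)]
chosen = place(Workload("inference", cpu=8, est_power_kw=0.8), nodes)
print(chosen.name if chosen else "no capacity")   # core-a
```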

It’s time to bring data center power provisioning into the 21st century.