Fire up the Wayback Machine, and let’s travel back to the year 1996 — the time when the Telecommunications Act paved a competitive pathway for traditional phone companies and internet businesses. It’s also the year the term “carrier hotel” was coined. Carrier hotels marked the transition point from telcos to data centers, thus opening the new frontier of data processing outside the walls of traditional enterprises.
Fast-forward to today, and carrier hotels have evolved into hybrid IT environments, where workloads are processed on-premises, across different cloud formations, and through virtual realms. For the foreseeable future, the hybrid IT model is where the workloads will remain. Katy Huberty, head of North American technology hardware equity research at Morgan Stanley, underscored this workload processing duality.
“While companies plan to migrate a larger share of their workloads to the cloud, they aren’t abandoning on-premise computing,” she said. “Instead, many are adopting a hybrid IT model in which applications move between a public cloud and their own internal data centers.”
However, as workloads are stretched across these environments, it becomes increasingly difficult to optimize processing and gain a holistic view of end-to-end systems. Even more challenging is determining what will impact business services. No matter where the workloads reside, IT administrators will always need to align with regulatory compliance standards, consolidate assets, detect network vulnerabilities, eliminate downtime, and resolve issues with more refined controls and policies.
Take Off the Blinders
Pick your flavor of hybrid IT, and you’ll always find the need for full transparency across the entire infrastructure for effective workload monitoring. In fact, today’s workload sprawl elevates the reliance on in-depth IT discovery to ensure all parts — including virtualization — are profiled. Only through an in-depth view of the entire IT infrastructure can organizations define and control all layers of the IT stack while adding greater support for IT Infrastructure Library (ITIL) processes. This is where the value of data center infrastructure management (DCIM) solutions really shines.
DCIM solutions offer holistic visibility into the entire workflow process, including historical trends, forecasting, security, incident reporting, and service level agreement (SLA) breaches. These contractual SLA obligations between colocation providers and their tenants outline expected services and various provisions the provider must undertake and deliver. DCIM solutions allow colocation providers to capture data and effectively translate it into customized reports for each tenant to prove services are delivered as contractually outlined.
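The per-tenant SLA reporting described above boils down to comparing measured availability against a contracted target. A minimal sketch of that calculation follows; the function names, data shapes, and the 99.9% uptime target are illustrative assumptions, not drawn from any specific DCIM product.

```python
# Hypothetical sketch: turning raw availability polls into a per-tenant
# SLA summary. The 99.9% target and all names are assumptions for
# illustration, not any vendor's actual API.

def uptime_percent(samples):
    """Return availability as a percentage from boolean up/down poll results."""
    if not samples:
        raise ValueError("no samples collected")
    return 100.0 * sum(samples) / len(samples)

def sla_report(tenant, samples, target=99.9):
    """Summarize one tenant's measured uptime against its contracted target."""
    measured = uptime_percent(samples)
    return {
        "tenant": tenant,
        "uptime_pct": round(measured, 3),
        "sla_target_pct": target,
        "breach": measured < target,
    }

# One failed check out of 2,000 polls is 99.95% uptime, within a 99.9% SLA.
samples = [True] * 1999 + [False]
print(sla_report("tenant-a", samples))
```

A real DCIM platform would of course aggregate far richer telemetry (power, cooling, incidents, trends) over rolling reporting windows, but the contractual question each tenant report answers is essentially this comparison.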
The largest case study in “fractured IT” environments — and the most pronounced need for end-to-end visibility — is found in the federal government’s data centers. Although not a colocation scenario, these sprawling hybrid IT environments had become such a drain on taxpayer dollars that, in August of 2016, Tony Scott, federal chief information officer of the U.S., mandated that the Data Center Optimization Initiative (DCOI) go into effect.
The federal government’s mandate is intended to help agencies execute their “cloud-first” IT modernization priorities while also consolidating and optimizing existing data centers. As recently as April 2019, the Government Accountability Office (GAO) released its DCOI report card (GAO-19-241), highlighting how far along agencies are regarding mandated efficiencies. Twenty-four participating DCOI agencies had achieved $1.94 billion in savings as of August 2018, and identified an additional $430 million in planned savings — a difference of $370 million between planned and achieved savings from fiscal years 2016 through 2018. DCIM solutions were critical for identifying IT areas in need of consolidation while also benchmarking agencies’ progress against mandates and goals.
Given the sheer scope of the federal government’s hybrid IT footprint, proving adherence to DCOI mandates would be almost impossible if not for DCIM solutions providing deep-dive visibility into the past baseline and present advancements.
How Far We Have Come
In 1996, when we were all impressed with Alcatel Telecom’s announcement of the new, ultra-fast modem speed of 4 million bits per second, who could have predicted that the Telecommunications Act would pave the way toward hybrid IT environments? ISDN and the much-hyped cable TV modems gave way to smartphones that can hold and stream an entire season’s worth of video entertainment. Behind all these capabilities are various types of data centers that must effectively process all those workloads.
The common denominator among all these hybrid IT environments is the fact that you can’t manage and monitor what you can’t see. If IT administrators lack visibility across colocation facilities, all forms of cloud services, and on-premises devices, there will be performance issues. The migration of U.S. federal government workloads into the cloud creates perhaps one of the largest hybrid IT footprints. The solid DCIM foundation easing the path to consolidate and optimize federal IT performance should be a template for enterprise organizations transforming and spreading their data through multiple IT initiatives. Even the most advanced, present-day hybrid IT offerings will look like ISDN dial-up services to IT administrators 20 years from now. The most effective way to ensure smooth operations today and in the future is to create real-time dashboard visibility into the end-to-end hybrid IT architecture.