Tackling Overprovisioning in Enterprise Data Centers
It's not the cost of doing business
Tech literature is full of commentary on the ever-changing landscape of data centers: everything is moving to the cloud, enterprise data centers are dead, colocation is the only way. A hybrid model mixes and matches these services, and, indeed, that works for some enterprises. What gets lost in this messaging is that on-premises data centers, built to support single organizations, aren't going away anytime soon. For some enterprises, especially those with significant legacy data, strict compliance requirements, or security needs, an on-premises solution may be the only option.
With all that necessary compute power and options for expanding the data center footprint outside the organization’s walls, it’s worth asking a few questions. Is your enterprise data center the right size for your needs? Are you getting the full value from your investment?
Many data center operators build redundancy and extra storage or computing power into their data centers just in case. Research from Future Facilities shows that operators build three data centers for every two they need, roughly 50% more capacity than required. As a risk-avoidance tactic, overprovisioning makes sense to some, given the importance of uptime and smooth operations. But it is a lot of investment in extra capacity that may never be used.
In light of a few statistics, overprovisioning may seem to be just the cost of doing business. Approximately 77% of organizations report increased demands on their infrastructure and pressure to maximize space and resources, and 29% say they routinely have to compromise on capacity planning decisions.
Yet, in reality, overprovisioning is uneconomical. Recent research from AFCOM shows that more than half of data center operators own between two and nine data center facilities, and many plan to increase that number. At that scale, overprovisioning is an extremely expensive risk-avoidance tactic, costing organizations as much as 136% of their operating budget. Enterprises managing their data centers this way are not getting full value from their investment.
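The cost of overbuilding is easy to see in a back-of-envelope calculation. The facility count below follows the "three built for every two needed" finding; the per-facility cost is a purely illustrative assumption, not sourced data.

```python
# Back-of-envelope overprovisioning cost, per the "three for every two" ratio.
# The cost figure is an illustrative assumption, not a sourced number.
facilities_needed = 2
facilities_built = 3
cost_per_facility = 10_000_000  # assumed build cost in dollars

excess = facilities_built - facilities_needed
overspend = excess * cost_per_facility
spend_vs_need_pct = facilities_built / facilities_needed * 100

print(f"Excess facilities: {excess}")            # 1
print(f"Capital overspend: ${overspend:,}")      # $10,000,000
print(f"Spend vs. need: {spend_vs_need_pct:.0f}%")  # 150%
```

In other words, a 3-for-2 build ratio means spending 150% of what the workload actually requires, before any of the ongoing power and cooling costs of the idle capacity.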
By using a digital twin, a three-dimensional virtual model of the physical data center, organizations can improve the performance of their data centers and increase capacity without additional operating costs for computing equipment or power and cooling infrastructure. "Using physics data on how the components of a thing operate … a digital twin can be used to analyze and simulate real world conditions, respond to changes, improve operations and add value," according to Gartner.
With a digital twin, owner-operators can:
- Test scenarios before purchasing equipment or rolling out new projects.
- Maximize data center capacity.
- Identify and upgrade or replace outdated infrastructure.
A physics-based digital twin is not just a one-time tool for the design phase or for planning expansions and major projects. Data center operators can use it continuously, testing changes and adjustments before any new equipment is installed. With ever-increasing demands being placed on data centers, digital twins can help businesses meet the needs of their enterprise while remaining within budget.
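The "test before you install" workflow can be sketched in miniature. A real digital twin simulates airflow and thermodynamics in detail; the toy model below only checks a proposed rack against power and cooling budgets, and every name, budget, and power figure is an illustrative assumption.

```python
# Minimal sketch of a pre-deployment "what-if" check, in the spirit of a
# digital twin. A real twin models physics (airflow, thermodynamics);
# this toy version only sums rack power against two budgets.
# All names and figures are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Rack:
    name: str
    power_kw: float

def can_deploy(racks, new_rack, power_budget_kw, cooling_budget_kw):
    """Return True if adding new_rack stays within both budgets."""
    total_kw = sum(r.power_kw for r in racks) + new_rack.power_kw
    # Toy assumption: cooling load tracks IT power 1:1.
    return total_kw <= power_budget_kw and total_kw <= cooling_budget_kw

existing = [Rack("row1-r1", 8.0), Rack("row1-r2", 6.5)]

# A 7 kW rack fits (21.5 kW total vs. 25/24 kW budgets)...
print(can_deploy(existing, Rack("row1-r3", 7.0), 25.0, 24.0))   # True
# ...but a 12 kW rack would blow the power budget (26.5 kW).
print(can_deploy(existing, Rack("row1-r4", 12.0), 25.0, 24.0))  # False
```

The point of the sketch is the workflow, not the model: the proposed change is rejected or approved virtually, before any hardware is purchased or installed.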
With the visibility a digital twin gives into existing enterprise data center performance, organizations can improve operational efficiency and maximize existing capacity without running the risk of downtime. Overprovisioning really shouldn’t be the cost of doing business.