Recent Survey Highlights Flaws in Data Center Capacity Planning
Lack of centralized toolsets and increasing rack densities are challenging IT and facilities teams’ ability to make capacity planning decisions
IT and facilities teams are struggling with capacity planning: 90% compromise on capacity planning at least some of the time because they do not receive quality data about IT equipment or system performance. More than two in five believe they could make better decisions if they received quality data in time (41%), and 43% say upgrading infrastructure would have a greater impact on operations.
With 85% of the organizations in the survey reporting outages or downtime in the last 12 months, at an average cost of $150,619 per downtime incident, the problem is real. Many teams cannot obtain quality data about data center performance, and miss deadlines as a result.
A digital twin offers a collaborative platform that can calculate what-if scenarios, help prevent network failures, and ease the move to a high-density environment. Companies using computational fluid dynamics (CFD) reported significantly fewer downtime episodes in the past 12 months.
Half of the organizations studied are considering adopting CFD in the next year as a planning tool that virtually represents the physical data center and keeps track of their systems. For many organizations, limited knowledge of CFD, cost, and trust in the system remain barriers to adoption.
“This study highlights some of the key challenges with capacity planning and points to the digital twin as a solution,” said Akhil Docca, director of marketing at Future Facilities. “It’s a significant issue for IT and facilities teams when companies are reporting they have to compromise capacity planning because they don’t have visibility into their equipment and performance. Simulations like CFD offer that visibility and a way to model any data center environment to understand issues like networking, cooling, power, and configuration.”