Over the last 30 years, the IT industry — data centers in particular — has watched the pendulum swing from centralized to decentralized and back again. Is it any wonder that mission-critical operations are so often driven by the buzzword of the day? Over the last decade, as workloads decentralized into the cloud, we all learned that it isn't cost-effective to build data centers for less than 2 MW of power capacity, right? The economies of scale simply wouldn't let smaller facilities compete. But with the edge demanding nearline data and processing to reduce latency for data-intensive applications, we're now rethinking how our facilities can best support dense processing and storage.
So why are edge data center applications breaking the mold? If you ask me, the familiar chain — business requirements driving IT requirements, which in turn drive data center facility requirements — is once again in vogue. If the business application performs better by processing cached data at the edge, then that's what must be done. In 2018, Gartner reported that 80% of traditional enterprise data centers as we know them today will be closed by 2025, and some of that transition will be driven directly by edge applications. Global Market Insights forecasts the edge data center market to grow from $4.5 billion in 2018 to $16 billion by 2025.