Think back to 2000. What did the data center look like? Newer, higher-density server technology was trending, but adoption rates weren't particularly impressive. Data centers typically supported an average density of 150 watts per square foot (W/sq ft). Cloud computing and server virtualization were still just that, concepts, as these strategies hadn't yet been put into practical application. Data centers were still seen as cost centers, not as a means of generating profit. The digitization of data wasn't nearly as widespread as it is today, and the world's Internet population was still fairly small. Significant improvements in data center processing, storage, and networking had yet to be achieved.
Fast-forward to today, and distributed systems, cloud computing, and virtualization are ubiquitous. It's not unusual to find data centers that support an average density of 300 to 500 W/sq ft. Operating expenses, rather than capital costs, now carry more weight in planning decisions as efficiency has become a higher priority.