The problem with dire predictions is that they can never account for the unexpected. History is replete with examples of the inevitable being derailed by some unforeseen, or unplanned, event. The Roman army was unbeatable until it met Hannibal at Cannae. Hitler was going to overrun the Soviets, but then it started snowing outside Stalingrad. Standard Oil was going to be a monopoly of one before some Texans started drilling at a place called Spindletop. And speaking of oil, in 1970 many experts said that we had hit the peak and oil production had nowhere to go but down, a theory that didn't anticipate fracking. I bring all these instances up because it looks like we might be experiencing the same type of phenomenon regarding data center energy usage.
For those of you who don't remember, in 2010 a report by Jonathan Koomey found that data centers use one heck of a lot of energy. (Aside: For those of you who don't know Jonathan, he is the furthest thing from draconian. He's a scientist: data, not emotion, wins. He is a great researcher and very thorough in testing all assumptions.)