In our far-reaching discussion I suggested that a true game changer would transform data centers in the same way that LEDs may come to replace the incandescent bulbs that still dominate lighting. Interestingly, data center professionals and lighting professionals face much the same challenge. More than a century after Thomas Edison's invention, incandescent bulbs still give off far more heat than light; the heat is simply waste, and removing it often requires space cooling. Linear fluorescents, CFLs, and metal halide sources pose the same problem, to a lesser extent. LED light sources promise to change the game.
Similarly, data centers run hot and must be cooled because their servers employ processors and chips that produce more waste heat than useful work. One industry figure likens the data center energy picture to 10 ovens cooking at high temperature inside a standard refrigerator.
Obviously, I don't yet see the cloud changing data centers fundamentally enough to make them truly sustainable, but I'd like to stimulate thinking on the subject.
The industry is not lacking for imagination: economization, clouds, containers, and blades provide more than ample proof that the technology used to run data centers changes constantly.
But the challenge remains: What's going to be better than the technologies available today? The next generation of IT must use less energy, be sustainable, provide greater reliability and uptime, and be immune to physical and electronic attacks.
I'd like to hear about efforts to reach this goal.