Network World recently published a report in which Denise Dubie calculated that fantasy football costs U.S. businesses $7.4 billion in lost productivity, in just five months. That $7.4 billion happens to match the Environmental Protection Agency's (EPA) estimate of the annual energy cost of running data centers in the United States in 2006.
Coincidence? Most assuredly. But which figure is more easily reduced? Americans, much to their bosses' chagrin, are undoubtedly going to be reluctant to put their fantasy football obsessions (or March Madness pools, for that matter) on hold until after work hours. Barring a player strike in the National Football League or a complete clampdown on certain websites, the $7.4 billion productivity hit is a given. Fortunately, the outlook is brighter for data center energy bills, provided companies take a proactive approach in this age of ever-increasing power costs.
There is a growing number of solutions for stemming data center expenses, ranging from energy-efficient servers to cooling tiles to simply buying land in Sioux Falls, SD. While just about every money-saving solution has merit, the future of the data center will ultimately be grounded in a new way of operating corporate IT systems, called utility computing, because it has the greatest potential to cut costs while enabling IT to deliver business results.
Utility computing promises to help businesses turn their current data centers into utilities that automatically adapt to their business goals, running at optimum capacity while dramatically reducing costs. Just as electricity and water bills come from metered supply, so will computing power.
Today, however, fully implemented utility computing environments are rare, and the industry is at least a couple of years away from a significant tipping point. As with any large, fundamental change, implementation is a process, not an event. The cyclical life of an expensive data center does not permit instantaneous overhauls. Along this road, however, there are still actions to take that directly prepare the data center to reap the benefits of utility computing. And, ironically, one of the biggest initial steps has a very relevant side benefit: it can slash data center energy costs immediately.
Mid-Life Crisis

Data centers are facing mid-life crises. Many have been around for a while now, sometimes for decades, but the strain that rapid growth and complexity have put on them is leaving IT departments looking for ways to reduce cost without sacrificing performance. The financial problems data centers create fall into three categories: space constraints, energy consumption, and design. Space is an inevitable dilemma: data centers get bigger when more is asked of them. While they continue to multiply, new real estate is either expensive or nonexistent (at least in populated areas, and particularly along the east coast of the U.S.). Huge, rural data centers may not be bursting with convenience, but they have become an undeniably appealing solution to the real estate predicament. In part because of these efforts to curb land costs, location is no longer the biggest financial burden on data centers; energy costs are. Just as data center managers have learned to adapt to out-of-control real estate expenses, the immediate future will focus on energy efficiency.
Carbon emissions are another driver of data center energy efficiency. While products ranging from automobiles to light bulbs get most of the mainstream press for their need to be energy conscious, data centers are the behind-the-scenes power guzzlers that deserve a share of that attention. A quick look at the numbers illustrates why. According to a recent Environmental Protection Agency report, data center energy consumption has doubled in the last five years, and it will double again in the next five, a staggering 300 percent growth over a decade. Data centers now account for a full 1.5 percent of all power usage in the United States. And as stated, all that energy comes with a rather hefty $7.4 billion price tag.
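The growth arithmetic above is worth checking: doubling twice over a decade is a fourfold increase, which is indeed 300 percent growth over the starting point. A two-line sketch makes it explicit:

```python
# Per the EPA report: consumption doubles every five years.
base = 1.0                        # consumption at the start of the decade (normalized)
after_decade = base * 2 * 2       # two doublings over ten years

growth_pct = (after_decade - base) / base * 100
print(growth_pct)                 # 300.0 -> a 300 percent increase over the decade
```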
It is naïve (and surely just wrong) to speculate that data centers will ever be fully green. Perhaps they don't have to be. There are plenty of smaller-scale green initiatives that can add up to make a huge difference. Simply upgrading to more energy-efficient servers can put a dent in energy costs, although swapping out servers is an expensive endeavor up front. Other options include improving heating, ventilation, and air conditioning (HVAC) efficiencies and installing smart power distribution systems. Every bit helps; there is no silver bullet for the energy-cost problem in data centers.
One approach currently at the height of its popularity is server consolidation and virtualization. Because the land that data centers consume is still very much an issue, reducing server sprawl without compromising performance is an important goal for any IT department. Fewer servers also mean reduced cooling needs, making consolidation a win-win.
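The consolidation math can be sketched as a simple packing exercise: lightly loaded dedicated servers become virtual machines packed onto fewer physical hosts. The utilization figures and the 80 percent capacity target below are hypothetical, purely for illustration; they are not drawn from the article.

```python
# Illustrative sketch of server consolidation via first-fit-decreasing placement.
# Workload utilizations and the host capacity target are assumed values.
vm_loads = [0.10, 0.15, 0.20, 0.05, 0.30, 0.25, 0.10, 0.35]  # CPU fraction per workload
host_capacity = 0.80   # target peak utilization per physical host

hosts = []             # each entry is the total load packed onto one host
for load in sorted(vm_loads, reverse=True):
    for i, used in enumerate(hosts):
        if used + load <= host_capacity:
            hosts[i] += load   # fits on an already-running host
            break
    else:
        hosts.append(load)     # no existing host has room; power on another

print(len(vm_loads), "dedicated servers ->", len(hosts), "consolidated hosts")
```

Eight dedicated boxes, each mostly idle, collapse onto two well-utilized hosts; the six machines that can be powered off are where the space and cooling savings come from.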
Consolidation and virtualization efforts are big challenges and therefore attract their fair share of skeptics. High-tech is a largely metaphysical world, and virtualization takes that abstraction one step further. The need to control all of these newly proliferating, non-physical servers makes data centers even more complex environments. Still, these techniques have a lot to offer on both the space and energy fronts. They are also a conceptually necessary precursor, the enabler, in moving from static to dynamic IT, and on to utility computing.
End of the Tunnel

Utility computing is the light at the end of the metaphorical tunnel for data center efficiency.
There are three broad areas in which utility computing excels. The first, and arguably the most noteworthy, is reducing total cost of ownership. Utility computing is a true commodity acquisition model: incremental computing power can be added on demand. Use what is needed, when it is needed. Metering brings cost transparency and prevents over-buying. IT managers are free to choose the best hardware at any given time without locking into a specific vendor, creating more autonomy than exists in today's data centers. As an added bonus, a utility-run IT environment is self-configuring, self-healing, and self-optimizing, which cuts labor costs.
In addition to the pesky energy-cost problem that utility computing helps alleviate, IT managers also deem availability a necessary component of any new data center process, and rightfully so. Running IT as a utility means improving application reliability and availability. By automatically correcting software failures and pulling in extra servers to compensate for serious hardware faults, utility computing has a dependable troubleshooting component essentially built in. Another significant advantage is the combination of application availability and efficiency: user-defined policies and priorities can assign computing capacity to priority applications, pulling resources from less important systems.
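The policy-driven allocation described above can be sketched as a priority ordering: higher-priority applications draw from the capacity pool first, and whatever remains goes to less important systems. The application names, priorities, and demands below are hypothetical, invented to illustrate the idea.

```python
# Sketch of policy-driven capacity allocation: priority applications are
# served first; lower-priority systems get whatever capacity is left.
# All names and numbers are hypothetical.
capacity = 100  # total server units in the pool

apps = [
    {"name": "order-processing", "priority": 1, "demand": 60},
    {"name": "reporting",        "priority": 2, "demand": 50},
    {"name": "batch-archive",    "priority": 3, "demand": 40},
]

allocation = {}
remaining = capacity
for app in sorted(apps, key=lambda a: a["priority"]):  # 1 = most important
    granted = min(app["demand"], remaining)
    allocation[app["name"]] = granted
    remaining -= granted

print(allocation)
```

Under contention, the top-priority application gets its full demand while the batch system is starved, which is precisely the "pull resources from less important systems" behavior the policies encode.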
Data centers are the energy-guzzling behemoths responsible for everything from connecting fantasy football opponents through endless instant messaging to processing the credit card purchase of a vintage Joe Montana jersey. Energy costs and overall inefficiency mar the current data center picture. But as these problems (and their potential solutions) gain increasing attention, IT budgets will get some breathing room and the environment will be better off for the efficiency gains. Utility computing is the future of the data center. In the meantime, preparation means taking important, practical steps now to cut costs and conserve energy.