Using a Tier level to predict data center availability has been an industry standard for so long, it’s easy to forget there was a time before this benchmarking system existed. In the mid-1990s, data center operators had no simple, effective, non-technical way to convey to senior management decision makers the differences among data center investments. In response to this demand, the technical staff of the Uptime Institute developed the four-Tiered approach that is ubiquitous today.
Through a thorough review of case studies, they determined that data center infrastructure designs had evolved through at least four distinct stages, now captured in the Institute’s Tier Classification System. Historically, Tier I first appeared in the early 1960s, Tier II in the 1970s, Tier III in the late 1980s, and Tier IV in 1994. The Institute’s technical staff participated in the development of Tier III concepts and pioneered the creation of Tier IV electrical power distribution systems. Following the conceptual development, United Parcel Service’s Windward data center project was the first Tier IV implementation, and in subsequent years, facilities around the world have continued to aim for this achievement.
Since its inception in 1995, the Institute’s Tier Classification System has been used broadly in industry dialogue. That Tier language is so pervasive is a mixed blessing for the Institute and its Certification licensee, ComputerSite Engineering. On one hand, it clearly demonstrates that the marketplace has thoroughly embraced the concept; on the other, this popularity invites erroneous interpretation and misuse. It is difficult to police understanding and recognize client achievement when your intellectual property becomes the de facto industry standard.
According to Julian Kudritzki, the Institute’s certification manager, misunderstanding of Tier Classifications is surprisingly rampant: “A lot of clients are ascribing characteristics to Tiers which have more to do with Operational Sustainability (UPS battery run times, number of utility feeds) and not paying attention to characteristics of infrastructure solution that are heavily vested in Tiers. The fundamental concepts of Concurrent Maintenance and Fault Tolerance are receiving relative inattention. This leads to a shortfall in the final design or constructed facility, which could result in millions of dollars in squandered investment and a facility that will not deliver the uptime requirements over the long term.” It is of deep concern to the Institute and its Certification licensee that misunderstanding still persists around a benchmarking system originally developed to prevent exactly this kind of confusion and disappointment.
The Tier objective of any data center is determined by the uptime needs of the business. Organizations that require 24xforever availability, such as those in the banking and transportation industries, depend most critically on a higher Tier level to support their uptime needs and are the most likely to cite their Tier level when describing their data center infrastructure. It is imperative that the managers of these mission critical facilities thoroughly understand the official distinctions among the Tiers as defined by the authors of the Tier Classification System.
The rapid pace of equipment change and the exponential growth curves reported throughout the industry are overwhelming to understand, let alone act on. The recent Symposium on Green Enterprise Computing in Orlando, FL, presented startling statistics on expected power consumption increases, raising the question of how those increases will be powered and supported. With such fluctuating demands on data center infrastructure, owners and operators must be clear on the objectives and expectations of their facility in terms of its Tier, but there is also an onus on the authors of the Tier Classification System to keep information current and responsive to a dynamic industry.
In an effort to address both atrophy in industry understanding of what constitutes each Tier and industry changes since the first publication of the white paper defining Tier levels, the original Institute authors collaborated on an update to Tier Classifications Define Infrastructure Performance following last year’s Design Charrette in Santa Fe, NM, incorporating peer reviews from key industry professionals in the process. “This Tier white paper revision addresses industry feedback about organization and specific topics, such as engine generators,” says Kudritzki. It is also intended to clarify and build on the original fundamental concepts.

The Institute released a second white paper in tandem. The term “Operational Sustainability,” used earlier by Kudritzki in reference to Tiers, is defined in the new publication, Operational Sustainability and Its Impact on Data Center Uptime Performance, Investment Value, Energy Efficiency, and Resiliency, as the design and operating factors that affect a site’s resiliency through infrastructure performance, effectiveness, and long-term value. As an enhancement to Tiers, these factors are evaluated, and a rating is added to the Tier level to indicate the longevity of the Certification. Upon their release in early 2008, both white papers were downloaded and read by thousands of data center professionals.
This re-energizing of interest in Tiers has resulted in, or coincided with, a number of precedents in recent years. Nationwide Insurance and Bank of Montreal were both officially rated Tier IV facilities by the Institute. OnePartner became the only data center outsource provider to have its Design Documents rated Tier III. In June, Fujitsu’s North London site became the first European data center to be Certified, achieving a Tier III rating for the new facility. The licensee firm, ComputerSite Engineering, has also collaborated with many clients in promoting their recent Tier Certifications.
OnePartner, in particular, has succeeded in using its Tier rating to promote business, and even though the Certification process was eye opening, the data center management team is happy with the impact on the organization. Tom Deaderick, director, expressed concern about the self-certifications so prevalent in the data center world. “Certification is in-depth and almost certain to identify features or components required to achieve the desired Tier level. Many people would rather make claims than go through the process knowing they are going to face potentially expensive retrofit. The problem with that kind of thinking is that our industry needs a consistent measure so the public really knows what they are buying. The customers of data center services can’t actually see inside the operations, electric systems design, etc., so without a consistent reference point, how will they know if a center meets their business needs?” His remarks echo those of Kudritzki, who warns that a lack of true understanding of Tiers leads to serious business consequences.
With all this attention to Tiers, mission critical operations are increasingly aware of an official Tier Certification’s value as a market differentiator. “The Tier Certification process is by nature rigorous and exclusive,” says Kudritzki. Also exclusive is the Institute’s licensing agreement with ComputerSite Engineering, a data center management and engineering consulting firm that performs the fieldwork for official issuance of a Tier Certification. Design Documents of planned facilities are rated through a home-office review or after an extended on-site evaluation, while constructed facilities are awarded a Tier Certification with Institute authorization. Organizations with official Certification are presented with a plaque for their facility and a “seal” or “foil” image that can be used for promotional purposes. See computersiteengineering.com/certifiedsites for a list of Certifications.
As the only recognized evaluator of Tiers, ComputerSite Engineering has developed other resources to help clear up misunderstandings about Tier levels. Prior to the release of the new white papers, a “Technical Users Guide” was posted online, bulleting the definitive elements of each Tier in a brief five pages, as opposed to the longer, more in-depth white paper. Many of the firm’s consultants also serve as Institute faculty, co-authored the Tiers white paper, and conduct an annual seminar series intended to coach data center professionals in Tier concepts through practical application and evaluation of attendees’ one-line drawings. Pitt Turner, Senior Tier Certification Authority, Institute Faculty member, and Principal Consultant for ComputerSite Engineering, highly recommends seminar attendance for those keen on understanding Tiers: “As data center availability grows more critical to business success, it is essential that the data center industry understands the official Tier system directly from its authors.” Registration information can be found on the Institute’s website (uptimeinstitute.org).
As strategic data center projects today can top $100M in site infrastructure costs (not including IT, migration, hardware refresh, etc.), the consequences of premature obsolescence take on paramount significance. Given the magnitude of the capital outlay, project managers must demonstrate to upper management and the marketplace that this significant investment is protected and optimized: protected by confirmation that the performance objective has been achieved (Tier), and optimized by ensuring that the facility can deliver the performance objective over the long term (Operational Sustainability).