State-of-the-Art Data Center, Center of Attention
Identifying the Need
Plans for the attention-grabbing facility started about 3½ years ago when Jackson, with its growing assets and clientele, outgrew the basement level of the company’s headquarters building, built in 2000. In addition, the data center floor had suffered a couple of water leaks, one when a sprinkler line broke, convincing Jackson that keeping the data center below grade wasn’t worth the risk. Thus, the decision was made to build a remote, windowless facility just 1,500 feet from the existing building, according to Dennis A. Blue, Jackson’s director of corporate support services.
To ensure top protection for the server farms, which are distributed over 6,480 square feet, the team developed a concept that located all of the electrical and mechanical equipment within isolated service corridors. This way, no one needs to set foot on the data center floor, as all maintenance can be performed in the corridors.
On the topic of power reliability, primary power is supplied through a separate utility line with a double-ended substation for redundancy. Two 750-kilovolt-ampere (kVA) UPS units back up the computer loads, and the entire site is backed up by three 1,000-kilowatt (kW) on-site power generators with sufficient fuel storage to run for five days.
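As a rough sanity check on what five days of autonomy implies for tank sizing, the following sketch works from the article’s three 1,000-kW generators; the specific fuel consumption figure is an assumption for illustration, not from the project:

```python
# Back-of-the-envelope fuel-autonomy estimate for the generator plant.
# Assumptions (not from the article): all three units run at full
# nameplate load, and diesel consumption is ~0.07 gal per kWh generated.

GEN_COUNT = 3
GEN_KW = 1_000          # nameplate rating per generator (from the article)
GAL_PER_KWH = 0.07      # assumed specific fuel consumption

def fuel_required_gal(days: float, load_fraction: float = 1.0) -> float:
    """Gallons of diesel needed to run the plant for `days` at `load_fraction`."""
    kwh = GEN_COUNT * GEN_KW * load_fraction * 24 * days
    return kwh * GAL_PER_KWH

print(f"Tank sizing for 5 days at full load: {fuel_required_gal(5):,.0f} gal")
```

At partial load the requirement scales linearly, so the same function can bracket the realistic range between typical and worst-case site loads.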
“With N+1 redundancy, there is no chance of going down,” states Elliott Krieger, PE, a senior associate and assistant director of electrical engineering with Kahn. While the facility is classified as Tier 2 under the Uptime Institute’s redundancy-based classification system, a relatively new product, Liebert’s Static Transfer Switch power distribution unit, with its high switching speed, provides an extra level of redundancy. “With the leveraging of this new technology, [Jackson’s center] is one of the most efficient data centers that I’ve ever seen at this level of reliability,” claims Howerth.
Getting down to details, the power distribution unit is a streamlined package that includes dual-transformation, four 42-pole panels, input source switching, and two isolation transformers with an integral static switch for an additional level of redundancy at the 120-volt level within the data room.
This way, says Krieger, “It becomes easier to protect critical equipment by giving single-cord loads the reliability of dual-cord power, or, by using the power distribution units in pairs, we can provide added redundancy to the dual-cord network switches and servers that have dual power supplies.” With its packaged-system approach, the unit saves space and requires less installation time and cost compared with conventional systems that use multiple interconnected components.
Impenetrable
To meet Jackson’s strict requirement that the center be insulated as much as possible from water, the facility is fitted with three roofs: a conventional exterior roof and two concrete roofs, none with any penetrations.
“The roof is made up of liquid waterproof membrane and protection board sandwiched with concrete,” explains Pankaj Patel, AIA, LEED AP, a Kahn senior associate and project architect. “The surface is topped with a sacrificial roof made up of insulation and Firestone membrane roofing, rated to I-90 Factory Mutual requirements.”
The facility’s two walls provide additional security and protection from weather events. Furthermore, the large M/E equipment is located inside courtyards protected with grated tops, both to allow free airflow and to keep out debris and deter vandalism.
As for the site itself, a former farm field, 30,000 cubic yards of soil were brought in to raise the ground eight feet and further distance the facility from any groundwater.
Jackson’s tight requirements even precluded the fire protection system from using water. Thus, Kahn worked with Healey Fire Protection, Orion, MI (one of the few contractors with a product capable of protecting a room of this size) on the installation of an FM200 system manufactured by Phoenix.
In addition, an AnaLASER II HSSD early-warning smoke detection system alerts security personnel at the neighboring headquarters facility in the event of smoke or smoldering. That way, a potential fire event can be immediately addressed prior to activation of the fire suppression system.
Top Security
Naturally, a facility of this nature requires a high level of security. Consequently, the data center was designed with multiple layers of access to the building and support spaces. The building’s single outside entrance is outfitted with a Johnson Controls card access system, and another card swipe is then required to enter the electrical and mechanical areas.
As for the data room itself, a dual-door, sally-port arrangement requires a biometric palm reader through the first door, and a card reader protects the second door. Exiting the data center floor requires another card swipe to prevent pass-backs, and door position switches monitor exterior doors and a number of interior doors.
In the realm of visual coverage, CCTV digital cameras on the roof observe the building’s exterior, while internal cameras monitor the entrance, sally-port and other sensitive areas. The cameras, which record digitally 24/7, also have programmable motion-sensing capabilities and will notify guards if motion occurs during unoccupied hours.
Keeping Cool
Another essential element of data center design is accommodating its massive cooling requirements, relates Lee Sun, PE, CPD, LEED AP, an associate and mechanical engineer at Kahn. “Without a system that properly distributes the supply air and quickly removes the heat generated in the computer equipment racks, the system will shut down in a matter of seconds.”
This being the case, Kahn’s general strategy was to create alternating hot and cold air aisles between the rows of racks; however, it took time before the final rack layout was available. In the meantime, an underfloor air system with air delivery via perforated floor tiles was utilized, since relocating floor tiles in response to the changing rack layout is a simple procedure.
Once again, Jackson’s policy of zero tolerance for water penetration affected the HVAC design. Consequently, a full-height wall, extending all the way down to the raised-floor plenum, secures the data room from any air conditioning system leaks. In addition, leak-detection cable on the floor slab and around the mechanical corridors detects any leakage from the glycol/water cooling system piping in the plenum.
In general, 11 nominal 30-ton down-flow air conditioning units cool the data room, coupled with eight remote air-cooled dry fluid coolers located outdoors for heat rejection. Two glycol/water piping distribution loops with an emergency cross-connection pipe were specified, with each loop serving every other AC unit for additional redundancy. In addition, the two 100-percent-capacity recirculation pumps serving each loop are connected to two different power supplies to further protect the center against downtime.
And to maximize the life of the facility’s temperature-sensitive backup batteries, a 100-percent-outdoor-air makeup unit with DX cooling and electric heat maintains the battery room at 72 degrees F, while continuous exhaust ventilation prevents hydrogen gas buildup.
Staying Connected
Even though Jackson had an existing, secure connection to its telecommunications provider, the decision was made to run a separate line for the new center and to tie into the old data room via an underground, concrete-encased duct bank with spare ducts for future additions and service upgrades. The ducts have also conveniently been used to route BAS, security, fire alarm and voice communications cabling from Jackson’s headquarters to the remote data center.
As for the cabling in the data center itself, cable baskets, supported by bracket assemblies in the plenum floor, were specified, and the power feeds were installed using metallic flexible conduit. The plenum floor is also connected to a water detection system and is free of condenser water piping.
Big Plans
To accommodate Jackson’s growth requirements, the team has also designed an expansion for the complex, to be completed in January 2008. The expansion will increase the size of the computer floor by 5,040 square feet and add 18,520 square feet of support space to the existing 23,300-square-foot center, including computer floor and courtyard space.
As proof of the original design’s effectiveness, very little was changed for the expansion, according to Blue, who also retained the services of the same A/E team.
“The team was just outstanding; they have the knowledge, they have the experience,” testifies Blue. “I couldn’t have asked for a better set of professionals.”
One small change in the expansion design calls for a new Liebert hybrid UPS system with flywheel generation on the output. Specifically, three Liebert FS Flywheel DC Stored Energy Systems, each rated 632 kW, provide 15 seconds of ride-through at 50 percent load and 7 seconds at full load.
The device “takes care of smaller electric anomalies that come through, so the battery lasts longer,” explains Howerth. Additionally, flywheel hybrid technology provides higher levels of fault current than a conventional UPS, to more effectively clear downstream branch breakers should a fault occur. This can prevent the entire UPS from going off-line.
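The relationship between load and ride-through time follows from a simple constant-power model: stored energy E delivered at power P lasts t = E / P, so the shorter of the two quoted times corresponds to the higher load. Taking the 7-second figure as the full-load ride-through at the 632-kW rating, a sketch:

```python
# Constant-power flywheel ride-through model: t = E / P.
# The 632 kW rating and 7 s full-load figure come from the article;
# usable energy is backed out from them, which is a simplification
# (real units also have inverter and minimum-speed limits).

RATED_KW = 632.0
FULL_LOAD_RIDE_S = 7.0
USABLE_KJ = RATED_KW * FULL_LOAD_RIDE_S   # ~4,424 kJ usable per flywheel

def ride_through_s(load_fraction: float) -> float:
    """Estimated ride-through time at a given fraction of rated load."""
    return USABLE_KJ / (RATED_KW * load_fraction)

print(f"100% load: {ride_through_s(1.0):.0f} s")
print(f" 50% load: {ride_through_s(0.5):.0f} s")
```

The simple model gives about 14 seconds at half load, close to the quoted 15 seconds; the small difference is consistent with manufacturers rating usable energy conservatively at full load.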
As for overall power redundancy, the expansion promises to be just as reliable.
“The double-ended substations for both the data center and its addition have been sized to handle double the load of each data room,” explains Krieger. “This will allow automatic throw-over, placing the entire load on one transformer should a transformer failure occur. Similarly, the four 750-kVA UPS units were sized so the load from each data room can be supplied by one unit of each pair.”
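Krieger’s sizing rule, that either unit of a pair must be able to carry its data room’s entire load alone, can be expressed as a simple check. The load figures below are placeholders for illustration, not from the project:

```python
# Sketch of the N+1-within-a-pair sizing rule described above: each data
# room is served by a pair of 750 kVA UPS modules, and the room's load
# must fit on any single module so the other can fail or be serviced.
# The example loads are hypothetical.

UPS_KVA = 750.0

def pair_is_redundant(room_load_kva: float, units_per_pair: int = 2) -> bool:
    """True if any single unit of the pair can carry the whole room load."""
    return units_per_pair >= 2 and room_load_kva <= UPS_KVA

print(pair_is_redundant(600.0))   # one 750 kVA module can carry 600 kVA
print(pair_is_redundant(900.0))   # 900 kVA exceeds a single module
```

The same logic applies one level up to the double-ended substations, where each transformer is sized for the combined load of both data rooms so an automatic throw-over can shift everything to the surviving unit.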
While designing the expansion, one hurdle the team encountered was that some of the new data cabinets employ edge server technology approaching a 20-kW electrical load per rack, making heat removal from such a dense area quite challenging. The solution was to relocate cabling to an overhead cable basket system, freeing up space in the underfloor plenum for cooling airflow.
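To see why 20 kW per rack is so demanding, the standard sensible-heat relation for air, CFM ≈ (kW × 3,412) / (1.08 × ΔT°F), converts the heat load into required airflow. The 20 °F supply-to-return delta below is an assumed design value, not from the article:

```python
# Sensible-heat airflow estimate for a high-density rack. The constants
# are the usual rules of thumb for standard air: 3,412 BTU per kWh and
# 1.08 BTU/(hr * CFM * degF). The 20 F delta-T is an assumption.

BTU_PER_KWH = 3412
AIR_FACTOR = 1.08

def rack_cfm(load_kw: float, delta_t_f: float = 20.0) -> float:
    """Airflow (CFM) needed to remove `load_kw` of heat at a given air delta-T."""
    return load_kw * BTU_PER_KWH / (AIR_FACTOR * delta_t_f)

print(f"20 kW rack at 20 F delta-T: {rack_cfm(20):,.0f} CFM")
```

A 20-kW rack needs on the order of 3,000 CFM, several times what a perforated floor tile typically delivers, which is why clearing the underfloor plenum of cabling mattered.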
At the same time, Jackson still wanted to maintain a finished look via a lay-in ceiling, even though data rooms where cabling is routed overhead typically don’t have ceilings. Consequently, in order to accommodate the hanging cable basket system, “a preformed channel was integrated into the lay-in ceiling system on a pattern which will allow for modifications and additions of new cable baskets,” says Krieger.
A Job Well Done
Up and running for more than 3½ years, the data center has proven as secure and efficient as Jackson could have asked for. “What’s getting people so excited is the fact that Jackson decided to build a data center that met their needs without cutting corners,” relates Howerth.
“We are extremely proud of our facility,” echoes Blue. “We’ve been providing tours to educational and governmental agencies such as Michigan State University and the Michigan State Police. It shows a level of accomplishment that they are willing to take time out of their busy schedules to come and walk through a site that they are considering mirroring.”
Others who played a key role in delivering the project, which took just one year from start to finish, include Liebert manufacturer’s representative Hedrick & Associates, Grand Rapids, MI, which supplied all of the data center’s mechanical and electrical systems, with the exception of a pump package from Trane.
In addition, Granger Construction, Lansing, MI, served as construction manager for both the initial data room and the expansion. But besides the building professionals, Blue contends that it was the active involvement of Jackson’s own IT department that enabled the smooth and successful project delivery process.
“The primary key to the success of this project was communication and collaboration between Corporate Support Services and Information Technology. From the date that we decided to construct the new site, both groups performed all aspects of the project hand-in-hand. Although design and construction are not IT’s specialty, they took a deep interest in what was being decided and how it would impact their operations,” says Blue.
As a matter of fact, it is this – the thorough thought process that went into building the data center – that has impressed Jackson’s British parent company, Prudential, the most, according to Blue.
Currently processing more than one petabyte (1,000 terabytes) of data per day, Jackson is confident that its hardened and secure data center is up to the task of maintaining its 24/7 operations, come what may.