Pen and paper were more than sufficient for record keeping when the hospital in Licking County, OH, was operating with nine donated beds. But a lot has happened in the more than 100 years since the facility opened in a nine-room house in this growing community about 30 miles east of the state’s capital. Today, the 227-bed Licking Memorial Hospital (LMH) supports a staff of more than 1,600 professionals and boasts a 21st century electronic record-keeping system. Anchoring that system is a new 2,520-square-foot (sq ft) data center, completed in 2011.

“The new data center is the proactive solution that resulted from a comprehensive review of our ever-increasing digital storage needs,” said Rob Montagnese, Licking Memorial Health Systems president and CEO. “We felt that it was imperative to invest financial resources to ensure the safety of the Health Systems’ digital files and patient information.”

“Our new data center is definitely one of the most advanced in central Ohio,” explains Sallie Arnett, vice president, Information Systems. “The level of sophistication we are using is at, or above, what you’d typically find in most major health-care facilities and is on par with other demanding industries, such as banking.”

EVER-INCREASING DEMAND

The health-care industry’s demand for data and data storage has more than kept pace with advances in computer technology, and data managers within the industry are hard-pressed to keep up.

“A couple of years ago, we began to realize that it wouldn’t be long before we’d be running out of space in our existing data center,” points out Chris Keck, director of Contracts Management, “so we began discussing a variety of options. But, with the existing data center nearly out of physical space, we didn’t take long to settle on building a new facility.”

“Not only were we out of space, but our data center was really taxing our cooling equipment,” explains Jeff Kovalik, director of Engineering Services. “As we continued to add components to that room, we added more heat and required more cooling. We were using computer room air conditioning (CRAC) units and had installed an additional 22 tons of cooling, just to keep up.

“As we planned for our new data center, we took into account these demands and what might be expected in the long-term,” Kovalik adds. “We have planned for a load of 350 kilovolt-amperes (kVA) when the room is completely populated, but we’re running at only about 25 percent of that right now—and it will be a long time before we approach that mark.”
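For readers who want to put those figures in context, a rough back-of-the-envelope sketch follows. The power factor and the assumption that essentially all IT power ends up as heat are editorial assumptions for illustration, not values supplied by LMH.

# Rough sketch of the load figures quoted above.
# Assumptions (not from the article): a power factor of roughly 0.9,
# and essentially all IT power ending up as heat to be removed.

DESIGN_LOAD_KVA = 350        # planned load for the fully populated room
UTILIZATION = 0.25           # "about 25 percent of that right now"
POWER_FACTOR = 0.9           # assumed typical value for IT loads
KW_PER_TON = 3.517           # 1 ton of refrigeration removes 3.517 kW of heat

current_kva = DESIGN_LOAD_KVA * UTILIZATION     # ~87.5 kVA
current_kw = current_kva * POWER_FACTOR         # ~79 kW of heat today
design_kw = DESIGN_LOAD_KVA * POWER_FACTOR      # ~315 kW at full build-out

print(f"Current load: ~{current_kw:.0f} kW (~{current_kw / KW_PER_TON:.0f} tons of cooling)")
print(f"Design load:  ~{design_kw:.0f} kW (~{design_kw / KW_PER_TON:.0f} tons of cooling)")

Under those assumptions, a quarter of the design load works out to something on the order of 80 kW of heat, or a little over 20 tons of cooling, in the same neighborhood as the 22 tons of supplemental CRAC capacity the old room eventually required.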

With the room up and running for about a year, the group has been able to take advantage of every efficiency at its disposal, including what has proven to be an abundance of free cooling.

“The data center expanded exponentially during the 10 years that I have worked at LMH,” said Justin Sturgill, senior systems analyst. “Just 10 years ago, it had only six racks, and at the time we began the switch to the new facility it was almost completely full. That’s because so much has changed in the medical field. We have to store all kinds of images for radiology, for example. Images that were once stored as film. And, we keep records for all physicians of Licking Memorial Health Systems.

“Even though all these files are compressed,” Sturgill adds, “when you go ‘paperless,’ you trade your stacks of paper for racks of hard drives.

“And today’s components have a tendency to generate more heat,” he adds. “But, because we aren’t populating the entire data center from day one, we are able to keep our cooling demands in check. We’re running our room temperature at 68°F right now and could easily be keeping it at 70°—at the current load, we could do just fine at 72°.”

“The ability to operate efficiently at higher temperatures is another example of our original design plan,” explains Arnett. “Knowing we were going to phase in the new equipment over time—this is designed to be a ‘20-year’ room—we are progressing in a modular fashion. And, the truth is, we didn’t want to have two CRAC units in the room just to cool two rows that may not be fully populated for some time. That’s expensive and inefficient.”

“You can only hold back a CRAC unit so much,” adds Kovalik, “and even if we had wanted to go that route, it would be very inefficient for us at this stage.”

“Right now, we’re waiting until the first row is a little more populated before we determine exactly what temperature standard we want to establish for the room,” Sturgill points out. “Once that’s done, there’s no doubt in my mind that we’ll be moving that number up from 68°.”

BEGINNING THE PROCESS

When the decision was made to create a new data center, the staff met with all individuals who might be involved in the project to address all possible options and concerns. That team included architects, designers, engineers, and those who would be end users of the final system. The plan was to create a data center that would be ready to take on the demands of today yet flexible enough to continue to be viable for the next 20 to 30 years. With that in mind, the new data center came to life with only a portion of its available space filled.

Two critical components were addressed immediately: making sure the infrastructure had the redundancy required to protect all of its valued files and making sure the connection to its existing infrastructure would be seamless, dovetailing the new systems with the existing to maximize previous investments.

“One goal was to make certain that we didn’t build something that was outdated upon completion,” recalls Kovalik, “and it was about this time that we were introduced to the concept of inline cooling and introduced to Rittal Corporation. The concept of inline cooling really opened our eyes to what could be done to address our cooling concerns. It’s more efficient and more cost-effective than the traditional CRAC. In addition, for redundancy and backup support, we would have required two huge CRAC units to do the same thing we’re doing with the Rittal inline systems.”

GAME CHANGER

“As we learned more about what was ahead for the data center, we realized we wanted to have virtualization in the new room,” adds Rob Thompson, IS manager, “and we learned that with the inline cooling technology we could accommodate the additional heat generated independently without dropping the whole room temperature another 10° just to cool one pocket.”

The solution that Rittal offered was what one employee called a “game changer.” It provided efficiency, redundancy, and the modularity that Licking Memorial Hospital required. And, the data center wasn’t going to be built around a “monster” in the room taking up valuable floor space.

“At the level of use we’re at so far, we’re not taxing the cooling system at all,” Sturgill explains. “The water supply going into the system is at 52.7°, and it’s coming out at 53.7°, so it’s barely picking up any Btus on the way. The system is very, very efficient.

“We’re only running at about 25 percent of what we expect, but even the racks that are fully populated aren’t putting any stress on the system,” he adds. “In our configuration, a ‘full rack’ holds from eight to 10 servers; that’s another part of our original design. Our ‘magic number’ is 10 servers, but anything with eight or more servers is considered fully populated here.”
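The supply and return temperatures Sturgill cites translate into heat pickup through the standard water-side relation. The sketch below applies that relation with an assumed flow rate, since the article does not report one.

# How a 1°F rise in chilled-water temperature translates to heat pickup.
# The flow rate is an assumed figure for illustration, not a value from LMH.

SUPPLY_F = 52.7
RETURN_F = 53.7
ASSUMED_GPM = 100.0        # assumed total chilled-water flow, gallons per minute

delta_t = RETURN_F - SUPPLY_F                 # 1.0 °F rise across the loop
# Standard water-side relation: Btu/hr = 500 * gpm * delta-T
# (500 comes from 8.34 lb/gal * 60 min/hr * 1 Btu per lb-°F)
btu_per_hr = 500 * ASSUMED_GPM * delta_t
kw = btu_per_hr / 3412                        # 1 kW is about 3,412 Btu/hr

print(f"Heat picked up at {ASSUMED_GPM:.0f} gpm and a {delta_t:.1f}°F rise: "
      f"~{btu_per_hr:,.0f} Btu/hr (~{kw:.0f} kW)")

The exact number depends entirely on the assumed flow; the point is that a one-degree rise across the loop represents a light load on the chilled-water system.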

Because of the extremely mild winter, LMH has been able to make use of free cooling on a regular basis so far in 2012, and expects that trend to continue no matter what the weather has in store.

“Last summer was very warm,” said Kovalik, “so we were able to get a good test in those conditions, too. Even then, free cooling was a plus.

“Just yesterday I was on the roof and happened to notice the data center chiller was completely off. This would send many people into full panic mode, but because of the design that our mechanical engineering group put together, which includes a 1,000-gallon water storage tank, there’s often enough chilled water in the system to recirculate and cool the room with the compressors shut down—we’re only looking at 52° water.”
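A quick estimate shows why the 1,000-gallon tank gives the room meaningful ride-through time. The allowable temperature rise and the heat load in the sketch are assumed figures for illustration, not design values from LMH or its mechanical engineers.

# Rough estimate of how long 1,000 gallons of stored chilled water can carry
# the room with the chiller compressors off. The allowable temperature rise
# and the heat load are assumed values, not figures from LMH.

TANK_GALLONS = 1000
LB_PER_GALLON = 8.34          # weight of a gallon of water
ASSUMED_RISE_F = 10.0         # assumed usable rise, e.g. 52°F supply up to 62°F
ASSUMED_LOAD_KW = 80.0        # assumed current IT heat load

stored_btu = TANK_GALLONS * LB_PER_GALLON * ASSUMED_RISE_F   # ~83,400 Btu stored
load_btu_per_hr = ASSUMED_LOAD_KW * 3412                     # ~273,000 Btu/hr removed

ride_through_min = stored_btu / load_btu_per_hr * 60
print(f"Estimated ride-through: ~{ride_through_min:.0f} minutes")

Under those assumptions the stored water carries the room for roughly 15 to 20 minutes, ample time for a chiller to cycle back on.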

“When we did the commissioning of the whole system, and had heaters in the racks, we simulated a completely full room and we’ve made no changes since then,” adds Sturgill. “And, with four of the cooling units shut down, the temperature in the room only went up by 2°. So we don’t have to wait to see if it works; we know it works.”

“One of the highlights of that commissioning process was when we could hear all the fans begin to ramp up,” Kovalik said. “It was really impressive to see how the system reacts to temperature change.”

“We have no hot spots, none at all,” Sturgill said, grinning. “You don’t fully appreciate what the cooling system is doing until you open the back of a cabinet and you feel the heat pouring out—it’s literally like a blow-dryer coming on you from inside the cabinet. Then, you shut the door and it’s gone.

“Even better, maybe, is that there’s no sound. The loudest pieces of equipment in our new room are the switches,” he added.

The planning team from Licking Memorial Hospital took advantage of the close proximity of the Rittal U.S. headquarters in nearby Urbana, OH, to tour the manufacturing facility. During that visit, the group witnessed a series of tests that helped to make believers of them.

That decision was clearly the right one, reinforced by a test of the installed system in the Newark facility. The installation includes 36 Rittal enclosures supported by 16 LCP air-to-water heat exchangers. During the in-house tests, four of the LCP units were shut down completely and the overall room temperature rose just 2°.

“From the very first, we worked hard to consider all the contingencies we could imagine encountering,” Arnett explains. “And that included making sure we have backup and redundancy at every level. We have N+1 at the very minimum, and N+2 when it comes to cooling.”

“Our power redundancy, for example, is very robust,” said Kovalik. “We have two separate 1,200-amp breakers coming from two different substations within the building, and our A and B feeds are color-coded for clarity. The entire facility is backed up by three 2,000-kilowatt (kW) generators, and the data center has a dedicated 1,500-kW generator that backs up the center regardless of the status of the three large generators.

“The two independent electrical feeds are each backed up by an uninterruptible power supply (UPS) as well,” he continues. “We were able to realize a financial savings by not putting the chillers on the UPS; instead, we rely on the 1,000-gallon chilled water storage tank and place only the pumps on UPS. That way we never stop circulating chilled water through the system.”
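The savings Kovalik describes come from the size of the load the UPS has to carry. The comparison below uses assumed wattages and an assumed bridging time; none of the figures come from LMH.

# Illustration of the design choice described above: keep only the
# chilled-water pumps on UPS and let the storage tank bridge the chiller
# restart. All wattages and times below are assumed, not figures from LMH.

ASSUMED_PUMP_KW = 10          # assumed circulating-pump load
ASSUMED_CHILLER_KW = 120      # assumed chiller compressor load
RIDE_THROUGH_MIN = 15         # assumed bridging time the UPS must cover

def ups_energy_kwh(load_kw, minutes):
    """Energy the UPS must supply for the given load and duration."""
    return load_kw * minutes / 60

print("Pumps only:        ", ups_energy_kwh(ASSUMED_PUMP_KW, RIDE_THROUGH_MIN), "kWh")
print("Pumps plus chiller:", ups_energy_kwh(ASSUMED_PUMP_KW + ASSUMED_CHILLER_KW, RIDE_THROUGH_MIN), "kWh")

Because the tank, not the UPS, bridges the cooling load, the UPS only has to keep a few pumps turning rather than an entire chiller plant.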

“When we completed this installation we literally threw breakers that we would have never dared even consider touching,” Kovalik recalls, “and, nothing bad happened! We never would have had confidence enough to even test them, before.”

MEETING COMMITMENTS

The older data center hasn’t yet been phased out completely. As hardware and software are replaced and upgraded, they move into the new data center. While the process may seem slow, it has allowed the hospital to transition to the new data center without any shutdowns or disruptions in systems operation, a critical consideration when dealing with health-care records and diagnostic technology.

Today, more than 2,500 sq ft of data processing space sits atop a 36-in. raised floor—with half the room awaiting its call to duty—and there’s an additional 426 sq ft dedicated to the electrical system and a separate 480-sq-ft room just for receiving.

The move from one data center to the next went smoothly, and the overall construction and completion of the new facility was a juggling act driven by the hospital’s project management mantra of four key words: On Budget, On Schedule.

“Everyone was aware from the beginning of our tight schedule to make this whole project come together,” explains Keck, “and everyone was confident we’d make our January 1, 2011 target date. And we did!”