Liquid Immersion Cooling And Modular Data Centers
Combined, these two technologies can deliver dramatic cost savings
As high-powered computing pushes to the most rugged geographies on the globe, liquid immersion cooling firms are finding new opportunities for modular data centers.
Imagine a sand-swept military forward base, a sea of tents in the Afghanistan desert. One of these tents is the command post, which usually houses the data center, the “brain” of the camp. Here there are rack shelves of tightly packed blade servers, powered by diesel generators and cooled by submersion in protective fluid-filled compartments. There is no need to worry about contaminating the electronics with sand or rain. There are no breakable moving parts, like fans, heat exchangers, and pumps inside the server chassis. There is no need for high ceilings, raised floors, or aisles to circulate air. And there is no extra infrastructure like chillers, air conditioners, and compressors to drive the cooling process. With liquid immersion, the size of fluid coolers, where heat is recycled or rejected, can be reduced by about half.
“I’m surprised the world has taken so long to figure this out because it’s not that complicated,” said Herb Zien, a thermal engineer and CEO of Liquid Cool Solutions (LCS). “Liquid immersion cooling is the most elegant solution: the simplest, the least costly, and the least likely to fail.”
Immersion cooling has focused on high-powered computing, but companies are starting to realize that it also has the quotidian benefits of withstanding rugged environments and saving on power, energy, and space. This is creating new opportunities on the battlefield, on oil rigs, onboard vehicles and ships, and near cheap renewable energy sources.
This technology goes hand-in-hand with modular data centers, which are much smaller than brick-and-mortar designs. Modular data centers use compact units, or modules, often made to fit in ISO shipping containers. Modules can support all data center functions: cooling, power, and IT. They are ideal for retrofitting expensive data centers that have run out of power or space, and they allow clients to build what they need now rather than what they need based on, say, a 10-year projection. In addition, built-in security software may offer higher levels of protection against hacking attacks. However, site selection and preparation can pose challenges: clients may have to procure land, obtain permits and planning permission, and pay high security costs. Nonetheless, the cost savings are substantial. One of the biggest draws is lower power usage and higher efficiency. When equipped with liquid immersion cooling, modular data centers can save up to 20% on overall cost, up to 40% on power, and up to 60% on space, by some estimates. In one case, Zien found the total cost of ownership, which incorporates upfront capital, energy, and maintenance, to be 30% less per year over 12 years.
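To see how a per-year difference like Zien's compounds, consider a minimal sketch of the arithmetic. The $1 million baseline annual cost is an assumed figure chosen for illustration; only the 30% reduction and the 12-year horizon come from the article.

```python
# A rough sketch of the total-cost-of-ownership (TCO) comparison:
# a 30% lower annual cost, sustained over 12 years. The $1M baseline
# annual cost is an assumed number for illustration only.
baseline_annual_tco = 1_000_000                      # conventional, $/year
immersion_annual_tco = baseline_annual_tco * (1 - 0.30)
years = 12
savings = (baseline_annual_tco - immersion_annual_tco) * years
print(f"Cumulative {years}-year savings: ${savings:,.0f}")  # $3,600,000
```

At this assumed scale, the 30% annual advantage adds up to several times the baseline's yearly cost over the facility's life.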
“These are small bets,” Zien said. “Alternatively, a $100 million conventional data center is a big gamble because IT technology is sure to change during the life of the facility.”
The market forecast is fairly promising, according to TechNavio, a UK-based technology research and advisory company. The data center server market will likely grow at a compound annual growth rate of about 3% from 2012 to 2016. Data center cooling solutions will likely grow at about 16% through 2019, while the modular data center market will grow at 29% through 2018. TechNavio expects the military to be one of the “major drivers” of modular data centers because it operates in remote locations and requires security and mobility. It also sees a market trend toward modular data centers in the oil and gas industry, but doesn’t consider it a market driver.
Liquid immersion-cooled centers are rugged, efficient, and easily maintained. They reduce the power and energy required for cooling by more than 95% compared with air cooling, while allowing higher power densities because there is no need for extra space and air conditioners to manage airflow. Most systems have been found to have a cooling power usage effectiveness (PUE) of between 1.01 and 1.03 (the ideal is 1). (Some firms make their numbers artificially low by counting fans in the server chassis as doing computing work.) Liquid immersion systems can scale up to handle higher server heat loads simply by increasing the coolant flow rate. They are used either in a rack-mounted system — a unit of electronic equipment housed in a metal framework — or in fluid-filled tubs. A rack usually contains multiple “bays,” each of which holds one piece of equipment, such as a blade server. A blade server is a thin electronic circuit board holding microprocessors and memory, used for one dedicated application, such as serving Web pages.
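The PUE metric is a simple ratio, and the sketch below makes the cited range concrete. The wattages are assumed for illustration, not measurements from any of these companies.

```python
# Power usage effectiveness (PUE) is total facility power divided by the
# power delivered to the IT equipment; an ideal facility scores 1.0.
# The wattages below are assumed for illustration only.
def pue(total_facility_kw: float, it_kw: float) -> float:
    return total_facility_kw / it_kw

# A hypothetical 100 kW IT load with 2 kW of pumping overhead lands in
# the 1.01-1.03 range cited for immersion systems.
immersion = pue(102.0, 100.0)   # 1.02
# The same load with 70 kW of chillers, fans, and air handlers gives a
# figure more typical of conventional air cooling.
air_cooled = pue(170.0, 100.0)  # 1.7
print(immersion, air_cooled)
```

This also shows why counting chassis fans as "computing work" deflates the metric: moving power from the numerator's overhead into the denominator pushes the ratio toward 1.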
Alternative technologies underperform liquid immersion cooling in several crucial ways. Cold plates usually cool only the processing chips (which generate about 60% of a server's heat) and require fans to cool the rest of the system, so they save only a fraction of the energy and do not isolate the electronics from air. (Fans also generate their own heat.) And cold plates require humidity control.
“The world traded the horse and carriage for a horseless carriage, not to go thirty miles per hour but to get rid of the horse,” Zien said. “Fans are the horses of information technology that liquid immersion cooling replaces.”
Single-phase liquid immersion cooling — in which the fluid stays a liquid — has for decades seen applications in high-voltage transformers, industrial capacitors, and overclocked gaming PCs, but little has been done with data centers. A pump moves fluid to the encapsulated blade server and then to a fluid cooler. Only three companies build or license this type of immersion cooling: LCS, Iceotope, and Green Revolution Cooling (GRC). They sell custom racks of servers and computers equipped with liquid cooling technology, and all can take advantage of cooler outside temperatures to save energy.
LCS, an American company, makes cooling systems for the most extreme conditions. It can run its sealed servers underwater or under desert sand for extended periods. It has deployed on filthy factory floors and in a tannery, where previous cooling systems contaminated the electronics; however, it has not yet officially deployed a modular setup. LCS pumps Core Coolant, a synthetic oil with antioxidants (to prevent corrosion), to a rack shelf of servers. A standard rack measures about 1x1x2 meters and contains 72 individually sealed servers stacked in nine shelves, eight per shelf. It offers liquid-submerged servers (LSSs) and liquid-submerged computers (LSC Rugged Terrain: a standalone computer that rejects heat via a passive radiator), both of which benefit from a patented directed flow technology in which the coolest fluid touches the hottest spots first. The system can cool 85 kW of server output per rack. The company is looking to expand into the oil and military markets and has already submitted several proposals for upgrades and refits.
“If one spot is running 30ºC or 40ºC hotter than everything else, that becomes a limiting factor and the reality is you have to get to that spot first,” Zien said of his directed flow technology.
Iceotope, a UK-based company, builds a system resembling the LCS design, except that it rests on 10 years of university research and uses natural convection to expel heat. Its Petagen two-coolant system cools servers through a liquid jacket and a cheap secondary coolant circulating around the rack. It can use 3M’s Novec, among four other coolants, as a primary coolant. The primary coolant circulates by natural convection, but the secondary coolant, called Iceotope Blue, is pumped to each server. Heat moves from the Novec to the secondary loop through a heat exchanger (the server wall). The system doesn’t need antioxidants because its coolant is nonflammable. It can cool about 60 kW per rack. A standard rack has 72 servers on eight shelves, nine per shelf. In addition, it achieves a high power density when deployed, up to 30 kW per square meter, because it doesn’t need “racks of supporting gear like everyone else.” Iceotope declined to speak of its activities in military operations or oil exploration on grounds of confidentiality. However, all sectors have seen “very strong” growth in “opportunities,” according to Peter Hopton, founder and CVO.
GRC, an American company, manufactures the CarnotJet system, which resembles a rack of LCS servers tipped onto its back, except that it doesn’t seal its servers and uses big coolant-filled tanks. And whereas LCS servers are custom, GRC racks support servers from many OEMs. Each tank holds 250 gallons of a fluid called Green DEF, essentially mineral oil, with servers inserted vertically into slots. The system can cool up to 100 kW per rack. GRC tries to keep the system as simple as possible and buys off-the-shelf equipment domestically wherever possible to keep costs low, because its systems are deployed in seven countries. GRC started with a grant from the National Science Foundation. Intel conducted a year of testing and reported “groundbreaking” results. GRC has done some work with the U.S. Air Force, and oil and gas projects are its “longest running deployments,” according to Dhruv Varma, interim director of marketing and business development.
Two-phase immersion cooling has existed for about five years, with government-sponsored research predating the first commercial systems. Open Bath Immersion (OBI), one implementation, removes heat by evaporation. A pump floods a horizontal tank with fluid, into which servers are inserted from the top. The fluid boils into a vapor to expel heat, condenses, and drips back into the tank. Companies using OBI can locate their servers near renewable energy sources like greenhouses or hydroelectric dams to put waste heat to good use. Two companies practice it — Allied Control and Silicon Graphics, Inc. (SGI) — and their approaches are similar.
The Hong Kong-based company Allied Control, which recently signed an agreement to be acquired by Bitcoin mining giant BitFury, emphasizes that it is building its systems for the supercomputers of the future, which it foresees carrying much larger heat loads than today’s. It can cool racks exceeding 240 kW; most supercomputers today use around 20 to 30 kW per rack, while the largest are in the 150 kW range, according to Alex Kampl, VP of engineering. 3M engineers taught him how to use Novec. There are 200 servers per steel tank, each drawing less than 1.2 kW. Two-phase systems can absorb much more heat than single-phase immersion because they exploit boiling. The company pioneered the world’s first commercial application of two-phase cooling less than three years ago, in early 2012. The servers are not sealed and have a horizontal configuration because the tanks require condensers on their glass lids. It does not use antioxidants. The company has worked on data centers in Hong Kong, which has limited space and some of the highest rents in the world. Its systems can use natural cooling sources, which reduces energy needs: Allied Control took advantage of river water to cool IT equipment in a steel-making factory, expelling the heated water into the sea. The company is active in the oil industry; it is not currently involved in any military projects, but has been deeply involved in Bitcoin mining.
“Mining is often conducted by industrial mining farms, with densely packaged hardware that becomes outdated within months of coming online as electricity costs exceed profits,” Kampl said in an Electronics Cooling Magazine article.
There are various concerns involving Novec. One of the main obstacles is expense. Current hardware densities, designed for air-cooled systems, can leave as much as 95% of a tank’s volume as free space, which means more coolant and higher cost. Novec cannot ignite like the hydrocarbon oils used by other liquid immersion companies, but unfavorable reactions can produce toxic residue, according to Hopton. The toxicity level is very low, which allows for a “large margin of safety,” according to 3M. Notably, only a small amount of Novec is needed for cooling.
“We use as little as a glass full of soda to cool a few kilowatts. Now that costs less than a pizza or a double espresso at Starbucks," said Kampl. "That's the magic.”
Deployment to the field is quick for modular immersion-cooled technology. Setting up is comparable to opening a refrigerator (upright or turned on its side), sliding in racks of server blades, closing the door, and attaching some piping. Most companies can deploy in a couple of weeks if the project isn’t custom. LCS outsources its manufacturing to channel partners, whom it charges a small license fee, seeing itself as akin to Microsoft, which started out licensing its technology to IBM. Most projects are custom, with varying demands on memory and processing speed, so all companies usually take a matter of months to deploy. As for maintenance, most liquid immersion-cooled blade servers can be hot-swapped (replaced without shutting down the system) in minutes. However, sealed servers can sometimes make it difficult to access components and change configurations. For their part, GRC and the two-phase systems can be somewhat messy to maintain because servers must be removed by hand from a tub of fluid.
Liquid immersion companies are already starting to deploy modules. Allied Control builds the Immersion 2 Container Data Center, which fits in a 12-meter-long ISO container holding 20 racks, each with three separate tanks or “baths.” Kampl said smaller versions can fit in ground-based vehicles. Iceotope, which is partly owned by Schneider Electric, has a division devoted to modular systems; the company has seen interest mainly in smaller modular solutions, not full containers. GRC has sold two modular systems to the U.S. Air Force and several to oil companies; its modular offering comes in a standard ISO container. LCS sold a test system to a military base, which consolidated one of its 610-square-meter, 25 kW air-cooled data centers into a roughly five-square-meter rack shelf while cutting input energy in half. The rack system had 64 servers on eight shelves with a protective container/carrying case around it. (The newest air-cooled systems can reduce such a data center to about two racks.) Dell has built a Tactical Mobile Data Center aimed particularly at military and government applications; the container itself is equipped with temperature, humidity, airflow, and fire suppression controls, according to Wired. Armag Corporation, an American company, builds a modular data center system called the A.R.C. Vault SCIF, an armored rapid-deploy vault constructed of quarter-inch-thick steel.
Oil and gas companies are increasingly using high-powered computing to quickly analyze geological data. The standard practice is to fly data in and out of the project area to avoid exposing equipment to the elements, Zien said. With liquid immersion cooling, data centers can stay in the field permanently, since the environment is no longer a factor. LCS is working through channel partners to reach oil and exploration companies, but declined to speak about specific projects. Allied Control has seen increasing revenue from oil projects, up from zero less than three years ago, when the company was founded. Iceotope has also seen interest in oil exploration for “tactical and strategic reasons.” Total, the French oil and gas company, uses SGI’s ICE X liquid-cooled supercomputer (the world’s ninth largest) for exploration. Schneider Electric recently completed a modular data center for a large oil and gas company in Angola and added data center capacity for another company expecting to double in size within two years.
Liquid immersion companies have taken on various military and government-related projects. The National Security Agency (NSA) has been testing GRC immersion cooling for its brick-and-mortar data centers; the NSA says its $1.5 billion Utah facility and another still under construction will be used partly to provide intelligence about cyber threats. The U.S. Navy engaged Lawrence Berkeley National Laboratory (LBL) to test liquid immersion technology for an ongoing project to save energy in tropical military expeditionary camps within the U.S. Pacific Command, which demonstrates these technologies in military exercises in Thailand and the Philippines. This is all part of the Transformative Reductions in Operational Energy Consumption testing platform.
Military expeditionary camps are like small tent cities: they use portable equipment and are often located in remote environments to which transporting fuel for generators is expensive. LBL received applications from three liquid immersion-cooling companies, but LCS was the only one it chose to examine; it may test, or may already have tested, similar companies such as SGI as part of ongoing research. LBL began evaluating LCS’s system last autumn and issued its final report this summer. It ran the system through a series of environmental conditions representing the range of temperature and humidity in the representative climate of Bangkok, and also varied the server load. From this, it calculated a coefficient of performance 30 times better. It did not, however, test the ruggedness or space-saving ability of the equipment. Immersion cooling technology can also recycle heat for heating barracks, cooking, and heating showers. Richard Brown, an LBL researcher, said he believes there is a trend in the U.S. military toward moving servers out of the command tent and into containers.
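A cooling coefficient of performance (COP) is simply the heat removed divided by the electrical power spent removing it, and the sketch below shows how a 30-fold ratio like the one LBL reported can arise. Every number here is assumed for illustration; none are LBL's measurements.

```python
# Cooling coefficient of performance (COP): heat removed divided by the
# electrical power spent removing it. All numbers below are assumed for
# illustration; they are not LBL's measured values.
def cop(heat_removed_kw: float, cooling_power_kw: float) -> float:
    return heat_removed_kw / cooling_power_kw

# A hypothetical 25 kW air-cooled tent data center spending 10 kW on
# fans and air conditioning:
air = cop(25.0, 10.0)             # COP = 2.5
# The same heat load removed by a small circulation pump drawing 1/3 kW:
immersion = cop(25.0, 1.0 / 3.0)  # COP = 75
print(immersion / air)            # a 30x ratio
```

The point is that immersion shrinks the denominator: a pump moving liquid coolant consumes far less power than the fans, compressors, and air handlers an air-cooled system needs to move the same heat.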
Kampl sees the acceptance of liquid cooling as following the three stages of reaction to a revolutionary idea described by visionary and science writer Sir Arthur C. Clarke: (1) It’s completely impossible. (2) It’s possible, but it’s not worth doing. (3) I said it was a good idea all along.