Data center liquid cooling market heats up
As organizations increase server rack density, more C-level execs investigate liquid cooling technology. Industry experts see these systems as an operational and cost necessity.
To meet the demands of technologies such as artificial intelligence, the internet of things and edge computing, data centers are denser and dissipate more kilowatts per rack than ever. This energy uptick brings organizations new challenges in keeping servers cool.
With the latest CPU and server advancements, data center liquid cooling may be the answer. Processing units draw more power than ever, and many data centers must now support higher rack densities, which can be a barrier for IT departments that want to adopt more cutting-edge processing technologies.
"At some point, air cooling just runs out of performance capability. You can cool something, but [there is a line where] the efficiency just drops to the floor, in terms of how much power you need to push into air cooling. The main benefit [of liquid cooling] is to help us ship what we need to in terms of cutting-edge CPU or accelerator technology," said Jayarama Shenoy, vice president of technology at Hyve Solutions.
At the facility level, Shenoy noted, data centers use three main types of liquid cooling: closed-loop liquid cooling, water-based cooling and liquid immersion cooling. These technologies use dielectric liquid or water to draw heat away from servers. At the individual server level, organizations can use direct-to-chip liquid cooling products.
Liquid-based cooling technology conducts heat more effectively than air, especially with high-density racks that reach 30 to 40 kW per rack or use GPUs. Experts say data center liquid cooling adoption will grow as more IT departments look to use GPU-based servers and high-density racks.
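A back-of-the-envelope calculation shows why air cooling strains at those densities. The sketch below assumes an intake-to-exhaust temperature rise and standard air properties -- both illustrative assumptions, not vendor figures -- and estimates how much airflow a rack needs; the requirement scales linearly with rack power.

```python
# Rough estimate of the airflow needed to remove rack heat with air alone.
# The temperature rise and air properties are illustrative assumptions.

AIR_DENSITY = 1.2   # kg/m^3, approximate for data center conditions
AIR_CP = 1005       # J/(kg*K), specific heat of air

def required_airflow_cfm(rack_kw: float, delta_t_c: float = 12.0) -> float:
    """Volumetric airflow (CFM) needed to carry rack_kw of heat at a
    delta_t_c rise between cold-aisle intake and hot-aisle exhaust."""
    watts = rack_kw * 1000
    mass_flow_kg_s = watts / (AIR_CP * delta_t_c)   # Q = m_dot * cp * dT
    volume_flow_m3_s = mass_flow_kg_s / AIR_DENSITY
    return volume_flow_m3_s * 2118.88               # m^3/s to CFM

for kw in (10, 30, 40):
    print(f"{kw} kW rack: ~{required_airflow_cfm(kw):,.0f} CFM of air")
```

Under these assumptions, a 10 kW rack needs roughly 1,500 CFM while a 40 kW rack needs nearly 6,000 CFM, and the fan power and noise required to push that much air are where the efficiency "drops to the floor," as Shenoy put it.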
Liquid cooling also brings cost savings, especially if the data center has a high-power output or requires a lot of energy to run.
"Take countries with very high-power costs: They saw water cooling as a way to drive down their power bill. Because they're paying so much per kilowatt hour, any time you can reduce the kilowatt [total], it's fairly big savings," said Scott Tease, executive director of high-performance computing (HPC) and AI at Lenovo.
Acoustics are another factor as IT departments add more fans to data centers. Past a certain point, admins must wear hearing protection on the data center floor. Liquid cooling drops the noise level significantly, which protects admins' hearing.
The state of data center liquid cooling adoption
The HPC sector has consistently adopted liquid cooling, and Tease estimated that at least 70% of HPC systems were liquid cooled as of 2020. Now, IT teams are interested in adopting the technology for non-HPC systems.
For servers, the average energy output is around 8 to 10 kW per rack, which is within air-cooling technology's capabilities, noted Steve Madara, vice president of thermal and data centers at Vertiv. As more IT departments support AI and machine learning, even on a small scale, they will require technology beyond air cooling to handle these high-density racks.
Adoption rates for data center liquid cooling depend on the technology type, Shenoy said. Certain liquid cooling systems are more mature than others across the market.
Closed-loop liquid cooling can be expensive, but the cost is often justified for critical use cases, which has cemented the technology's adoption across the industry.
The long-established option of using facility water to cool infrastructure holds a smaller part of the market, due to concerns about placing water near IT infrastructure.
Liquid immersion cooling, which is a newer technology, is in the early adoption phase, as it is expensive and presents maintenance hurdles.
Cooling infrastructure change brings challenges
IT teams must adopt a new management framework for liquid-based cooling equipment; traditional air-cooling options have benefits for usability and upkeep.
IT departments have more flexibility with maintenance for air-based cooling, and components are easier to service. This is a big positive, especially if IT teams do an equipment refresh approximately every three to five years.
"An air-cooled [setup] is really easy. I change one server for another server and they're cool [and the data center] doesn't really change too much. Most of the hurdles are associated with changing the infrastructure from what I have and understanding [that] if I change the infrastructure to support [liquid cooling], is that the same infrastructure that I'm going to use when I refresh the equipment in three years, four years or five years?" Madara said.
Liquid cooling systems can bring vendor lock-in concerns. Because IT teams use multiple vendors for server hardware, they must account for what liquid cooling options their data center hardware supports.
In a typical free air-cooling setup, there are standard size specifications for whichever system or rack size IT teams use, and admins can simply plug the technology and cabling into the rack, regardless of server vendor, said Vik Malyala, senior vice president of field application engineering at Super Micro Computer Inc.
Admins must check whether their desired liquid cooling offering can support a heterogeneous data center or whether they must go through a specific vendor, especially if an organization decides to use direct-to-chip liquid cooling.
IT departments must ensure that admins can even implement liquid cooling technology; this preparation can bring extra expenses.
"Truly adopting liquid cooling also means that you have to support a 60 kw, a 70 kw or higher-power density per rack. There [can be] a secondary cost of retooling your data center to support higher power densities," Shenoy said.
There's also a mental hurdle, according to Tease. Industry professionals familiar with mainframes have been around liquid cooling technology for a long time and may wonder why it wasn't adopted sooner, but some organizations and IT teams remain wary of having water or other liquids near their infrastructure.
To address such concerns, admins can investigate liquid cooling setups that use smaller amounts of water to cool individual servers, run internal tests on the hardware, consider closed-loop liquid cooling systems, and evaluate hardware serviceability and accessibility.
There are also coolants certified for safe use around electronics if an IT team prefers to avoid water altogether.
Processing power needs drive liquid cooling adoption
IT departments should investigate data center liquid cooling sooner rather than later. For teams that invest in higher-density server racks, more data-intensive use cases and higher processing power, liquid cooling is a necessity to run a data center efficiently.
"I think the next year or two are going to be the decisive years for how people adapt. The industry is moving toward a direction where, in data centers, chillers are going to eventually become the norm, which wasn't the case a few years ago. I also think that there's going to be consolidation and collaboration between companies to come up with [technology] that actually helps adoption," Malyala said.
He stated that IT teams will either decide they must use liquid cooling technology to support the higher-density racks, or they'll evaluate their technology and realize that their servers don't emit enough heat to justify the investment.
With the processors and GPUs expected to come out in the next two years, it may be a challenge to fit the newer hardware into traditional air-cooled servers, Tease said.
"In this upcoming generation of servers, you will see liquid inside of nearly every sink [and] nearly every device that's out there, because the parts are going to be generating so much heat that fans will be unable to move the heat away from that part quick enough. You're going to need a liquid system to move the heat away from that part," he said.