Hybrid Cooling: What It Is and How to Prepare for It
Some operators would prefer to stick with air cooling because it is simpler and cheaper. However, the extreme densities of some new equipment require liquid cooling. The truth is that both are needed. What is emerging is a hybrid cooling infrastructure: liquid cooling to handle extreme densities, and air cooling to support the bulk of the equipment.
Data center operators have grown accustomed to air-based cooling. They could do it a lot better (as we will discuss in this blog), but they are familiar with it and typically don’t regard it as a serious problem.
The shift toward more and more liquid cooling, however, is destined to bring about disruption. The introduction of water, plumbing, proprietary liquids and their associated infrastructure calls for a change of approach. And this runs directly into several fundamental challenges that have existed for some time.
It is one of the banes of data center operations that IT and facilities personnel have rarely worked in close harmony. Granted, there have been improvements in this area over the last decade. But these are often still two very different worlds: one grew up with building systems and protocols, the other with IT systems and protocols. Gradually, the two fields are coming together, but there is a way to go. As a result, planning and coordination can sometimes go awry.
If anything, the rise of the colocation sector has made the IT/facilities divide worse in some cases. In those areas where the IT department of the business leaves almost all the physical labor and hosting to another provider, there can often be a serious lack of understanding of the facility viewpoint.
These factors are especially important when you consider just how much plumbing, cabling, and facility rework will be required to bring liquid cooling to existing data centers. IT and facility personnel need to work closely together to ensure they get it right.
Challenges with Air Cooling
Before the difficulties of liquid cooling are faced, it might be advisable for both sides to collaborate on raising the current level of data center efficiency by addressing weaknesses related to air cooling.
Cooling units can often be optimized to better match IT needs. However, poor airflow management – such as placing switches with rear-exhaust fans so they blow hot air into the cold aisle – can undermine cooling efficiency. Other issues, like gaps in containment or missing blanking panels, can also allow hot air to recirculate into cold aisles, reducing overall effectiveness. Rather than simply pointing out their counterparts' failings, both sides should take an objective look at how each can contribute to airflow management, and thereby improve both cooling effectiveness and overall efficiency.
Remember that it took more than a decade for power usage effectiveness (PUE) numbers to reach respectable levels. The industry average still hovers around 1.6, which means there is plenty of room for improvement: some hyperscalers report PUEs of around 1.15. Many data centers can start moving toward such figures if liaison and cooperation between IT and facilities step up to a new level.
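As a refresher, PUE is simply total facility energy divided by the energy consumed by the IT equipment alone. A minimal sketch of the arithmetic, using purely illustrative load figures alongside the averages quoted above:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    A value of 1.0 would mean every watt drawn by the facility reaches
    the IT equipment; everything above 1.0 is overhead (cooling, power
    distribution losses, lighting, etc.).
    """
    return total_facility_kw / it_load_kw

# Illustrative numbers only: a 1,000 kW IT load at the ~1.6 industry
# average versus the ~1.15 some hyperscalers report.
print(pue(1600, 1000))  # 1.6
print(pue(1150, 1000))  # 1.15
```

The gap between those two figures represents overhead energy, which is why airflow and cooling optimization pay off directly on the utility bill.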
Hybrid is Imminent
Only with close interaction in place can liquid cooling expect a fast return on investment. The point of liquid cooling is to enable higher rack densities, more demanding AI workloads, and greater efficiency. For that to be done in a financially viable way, the basics of air cooling must be firmly in place and both sides must be involved from day one about how liquid is going to be introduced or expanded.
Hybrid cooling is coming, and data centers need to prepare. Here are some key steps to take:
- Understand that there is no such thing as 100% liquid cooling. There will always be some air cooling. Don't swing wholesale from one to the other.
- The introduction of liquid cooling will not be done all at once in a complete data center transformation. It might be wise to get familiar with liquid cooling by first deploying a few liquid cooled enclosures so that you are ready when large scale deployments start showing up on the roadmap.
- For now, avoid immersion cooling. This technology is experiencing teething problems and, in any case, is both expensive and more disruptive. The smart thing to do is to let the large colos and hyperscalers iron out the kinks. Wait for them to ready this technology for prime time.
- Develop a plan that coordinates the efforts of IT and facilities in improving data center PUE and related SLAs. This should include key areas such as:
  - Potential upgrades to CRAC and CRAH units that increase efficiency and lower running costs. This might include variable frequency drives for the fan motors, unit refurbishment, or even replacement.
  - A joint inspection of airflow management best practices to ensure that every supply tile is properly placed, excess supply tiles are replaced with solid tiles, open spaces in cabinets are filled with blanking panels, raised floor penetrations are sealed, and IT equipment is not blowing hot air into the cold aisle. If not already in place, consider deploying aisle end doors and aisle containment.
- Execute a joint plan to address these areas and optimize the cooling infrastructure to bring about an improvement in PUE.
- Plan out how best to add liquid cooling to the data center and work together to ensure initial projects achieve success.
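To see why some air cooling always remains, consider a direct-to-chip liquid-cooled rack: the cold plates capture heat from the processors, but memory, drives, power supplies, and NICs still shed heat into the room air. A rough heat-budget sketch, where both the rack power and the 75% liquid capture fraction are illustrative assumptions rather than vendor specifications:

```python
# Rough heat-budget sketch for one direct-to-chip liquid-cooled rack.
# Both figures below are illustrative assumptions, not specs.
rack_power_kw = 40.0      # assumed total rack load
liquid_capture = 0.75     # assumed fraction of heat removed by the liquid loop

# Whatever the liquid loop does not capture must be handled by air cooling.
air_load_kw = rack_power_kw * (1 - liquid_capture)
print(f"Residual air-cooling load: {air_load_kw:.1f} kW")
```

Even at higher capture fractions, that residual load multiplied across rows of racks is why the air-side basics described above still matter in a hybrid facility.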
The industry's easiest to install containment!
AisleLok® solutions are designed to enhance airflow management,
improve cooling efficiency and reduce energy costs.
Drew Robb
Writing and Editing Consultant and Contractor