Blog: Keeping Cool with Containment as Data Center Densities Increase

Olaf de Jong, R&D Manager at Minkels:

Liquid cooling may be a buzz topic right now, as data centers prepare for the AI application explosion, but air cooling can cope with significant rack density increases, as Minkels proves to its customers.

Some of our customers are skeptical about Minkels's claims regarding the airflow capacity of the Nexpand cabinets we propose to supply for their data center environments. They question whether the cooling capacity we promise for a specific heat load will actually be achieved once the cabinets are installed. For example, a recent customer doubted our claim that a door with 80% perforation provides enough ventilation to cool the specified number of kilowatts.

Proven Performance in Real-world Testing

Fortunately, we’ve carried out several studies examining the differences between door perforation levels. As long ago as 2010, we worked with an air conditioning supplier, using their climate chamber to install a complete containment system with cooling equipment and a heat load, and we documented the test results. We tested a 65% perforated door, which at the time was the minimum perforation level specified by server manufacturers, and compared the results with an 80% perforated door. We saw a significant difference: the total power per cabinet reached up to 16 kilowatts, and it was cooled quite easily. This was what we set out to prove, that for Minkels cabinets this was normal. With our airflow package, combined with a CRAC unit, it was possible to cool this heat load with air in the data center.
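To put such figures in context, the sensible-heat equation P = ρ · cp · Q · ΔT gives a rough feel for how much airflow a given cabinet load demands. The sketch below is illustrative only; the 12 K temperature rise across the servers, the air density, and the specific heat are assumed values, not figures from our test reports.

```python
# Rough airflow estimate for an air-cooled cabinet, from the sensible-heat
# equation P = rho * cp * Q * dT. All constants are illustrative assumptions.

RHO_AIR = 1.2    # air density, kg/m^3 (at roughly 20 degrees C)
CP_AIR = 1005.0  # specific heat of air, J/(kg*K)

def required_airflow_m3h(heat_load_kw: float, delta_t_k: float) -> float:
    """Volumetric airflow (m^3/h) needed to remove heat_load_kw at a delta_t_k air temperature rise."""
    q_m3s = (heat_load_kw * 1000.0) / (RHO_AIR * CP_AIR * delta_t_k)
    return q_m3s * 3600.0

for kw in (16, 20, 24):
    print(f"{kw} kW at a 12 K rise: ~{required_airflow_m3h(kw, 12):,.0f} m^3/h")
# 16 kW -> ~3,980 m^3/h; each extra kilowatt adds roughly 250 m^3/h,
# all of which has to pass through the door perforation.
```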

Meeting the Challenges of Increasing Cooling Needs

Fast forward to today, and the need for cooling is increasing rapidly due to AI deployments; I’ve even heard talk of one megawatt per cabinet in the future! Of course, at those levels, air cooling’s use will be very limited. For the foreseeable future, however, the question concerns increases to 20 kW or more, and customers are asking Minkels, ‘Can we cool these densities with air, or do we have to go direct-to-chip?’ That’s why customers are asking us to prove what an optimally designed containment and air cooling set-up can achieve, and we are convinced that we can meet the demands of higher densities for now; it’s doable.

So, more and more customers are upgrading the kilowatts per cabinet (as long as the extra power is available), but they want to stick with air cooling for as long as possible: it’s less expensive than liquid cooling, it’s generally much easier to install and use, and, of course, existing data centers have already invested significantly in air cooling infrastructure.

Addressing Power and Cooling Challenges

Typically, in higher density environments, customers will be installing fewer cabinets, since each one draws more power, so there is a combined power and cooling challenge: more power per cabinet and more cooling to each cabinet. Part of the required approach might involve wider containment solutions, moving away from the standard 120 cm aisle width towards 180 cm. We are even seeing 240 cm aisle widths; for example, a while ago the University of Groningen used 240 cm aisles to ensure enough air in front of their high-performance computer.
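A quick sketch shows why wider aisles help: for the same delivered airflow, the bulk velocity in the cold aisle falls in proportion to the cross-section. The airflow and containment height below are assumptions for illustration, not figures from the Groningen project.

```python
# Bulk cold-aisle air velocity: volumetric flow divided by the aisle
# cross-section (width * height). All values are illustrative assumptions.

def aisle_velocity_ms(flow_m3h: float, width_m: float, height_m: float) -> float:
    """Mean air velocity (m/s) through the cold-aisle cross-section."""
    return (flow_m3h / 3600.0) / (width_m * height_m)

FLOW_M3H = 5000.0   # airflow delivered into the aisle section (assumed)
HEIGHT_M = 2.2      # containment height in metres (assumed)

for width_m in (1.2, 1.8, 2.4):
    v = aisle_velocity_ms(FLOW_M3H, width_m, HEIGHT_M)
    print(f"{width_m:.1f} m aisle: ~{v:.2f} m/s")
# Doubling the aisle from 1.2 m to 2.4 m halves the bulk velocity
# (~0.53 m/s down to ~0.26 m/s), giving a calmer, more uniform supply
# of air in front of the servers.
```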

https://www.minkels.com/images/TqXdt/XL-CONTAINMENT-2400-collage.png

Considerations for High-density Cooling

When it comes to creating high density cooling areas, there are different solutions available. Heat exchangers, for example, can help where space is limited, but the higher air speeds required in that situation can lead to problems such as hotspots, and, of course, you will consume more power. We always advocate keeping air speeds low, which brings us back to the need for more space, so that there is enough air in front of the servers. Different types of data center business models will or won’t allow for this ‘extra’ space, and that’s when decisions need to be made about overall data center design and liquid cooling versus air cooling solutions.
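The extra power consumption follows from the fan affinity laws: for a given system, fan power scales roughly with the cube of airflow. The baseline figure below is an assumption used purely to illustrate the scaling.

```python
# Fan affinity laws for a fixed system: flow ~ fan speed, pressure ~ speed^2,
# power ~ speed^3. The 1 kW baseline fan power is an illustrative assumption.

BASELINE_POWER_KW = 1.0  # fan power at baseline airflow (assumed)

def fan_power_kw(flow_ratio: float) -> float:
    """Approximate fan power when airflow is scaled by flow_ratio."""
    return BASELINE_POWER_KW * flow_ratio ** 3

for ratio in (1.0, 1.25, 1.5, 2.0):
    print(f"{ratio:.2f}x airflow -> ~{fan_power_kw(ratio):.2f} kW fan power")
# Pushing 50% more air through the same space costs roughly 3.4x the fan
# power, which is why we prefer more space and lower air speeds.
```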

In terms of using air cooling for servers, as in the example shared earlier, customers may be concerned that even an 80% perforated door will obstruct the airflow and may lead to hotspots. However, we emphasize the importance of ensuring not just maximum airflow efficiency but also maximum airtightness in any solution. Our Nexpand cabinets, used in conjunction with our containment solutions, deliver minimal, if not virtually zero, air loss. We’ve published a white paper on the approach we take (Reducing data center costs and environmental impact by deploying six effective cabinet airflow management measures), and it’s amazing to me that our competitors still have not followed our approach.
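Why does airtightness matter so much? Any air that leaks past the IT equipment removes no heat. The back-of-the-envelope model below makes that concrete; the airflow, temperature rise, and leakage fractions are illustrative assumptions, not measured Nexpand figures.

```python
# Usable cooling after containment leakage: air that bypasses the servers
# does no useful work. All input values are illustrative assumptions.

RHO_AIR = 1.2    # air density, kg/m^3
CP_AIR = 1005.0  # specific heat of air, J/(kg*K)

def usable_cooling_kw(flow_m3h: float, delta_t_k: float, leak_fraction: float) -> float:
    """Cooling (kW) delivered to the servers after leakage is subtracted."""
    effective_flow_m3s = (flow_m3h / 3600.0) * (1.0 - leak_fraction)
    return RHO_AIR * CP_AIR * effective_flow_m3s * delta_t_k / 1000.0

for leak in (0.0, 0.10, 0.25):
    print(f"{leak:.0%} leakage: ~{usable_cooling_kw(5000.0, 12.0, leak):.1f} kW usable")
# 0% -> ~20.1 kW, 25% -> ~15.1 kW: losing a quarter of the supplied air to
# bypass erases exactly the headroom a high-density cabinet needs.
```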

https://www.minkels.com/images/4Fq2N/Women reading whitepaper 12 small.png

Additionally, as we discovered during the tests mentioned earlier, it is important to be able to measure, and therefore demonstrate, air loss as a function of the air pressure differential across the airflow path. With the exception of ourselves, there’s very little information provided by cabinet vendors when it comes to such pressure-differential data, critical information which allows engineers to calculate the losses in their airflow circuit.
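For readers who want to work with such data: leakage through gaps is commonly modelled with the power law Q = C · ΔP^n, where n is typically around 0.5 for clean orifice-like openings and up to about 0.65 for narrow cracks. The coefficient values below are hypothetical, chosen only to show the shape of the curve.

```python
# Power-law leakage model widely used in airtightness testing:
# Q_leak = C * dP**n, with dP the pressure differential (Pa), C a flow
# coefficient, and n the flow exponent. C and n below are hypothetical.

def leakage_m3h(dp_pa: float, c: float = 10.0, n: float = 0.6) -> float:
    """Leakage airflow (m^3/h) at a pressure differential of dp_pa pascals."""
    return c * dp_pa ** n

for dp_pa in (5, 10, 20, 50):
    print(f"dP = {dp_pa:>2} Pa -> leakage ~{leakage_m3h(dp_pa):.0f} m^3/h")
# Given measured (C, n) pairs for each cabinet and containment component,
# an engineer can sum the losses around the complete airflow circuit.
```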

Demonstrating Performance and Reliability

Ideally, if customers remain wary of our (tested and proven) air cooling claims, we can carry out a pilot test for them and demonstrate, for example (as we have done in another of our own test projects), that we can cool 24 kilowatts in a cabinet with air in an existing data center. To re-emphasize: achieving such a performance level requires deploying a high-quality containment solution to minimize, if not eliminate, air losses.

Alternatively, modelling or simulation, whether the ‘old’ CFD or the ‘new’ digital twin, can, if used correctly, help demonstrate to customers how any air cooling and containment solution performs across a variety of data center layouts, helping to identify the optimum result.

Minkels uses the same test mandated in the Netherlands to measure the energy performance of any new building (by measuring heat leakage) to develop its latest cabinet and containment solutions, so the performance metrics we provide to CFD and digital twin providers for their data center equipment libraries are independently verified and accurate. Problems can occur when the performance metrics provided by other manufacturers have not been as rigorously tested and validated.

This is perhaps a topic for another day, but it is worth mentioning to highlight that customers might find a variation between the performance parameters indicated by software simulation and those measured in their actual, physical data center environment.

https://www.minkels.com/images/HG9tw/Echthetty_HR-3222 no interconnect logo.png

Making Informed Cooling Decisions

In conclusion, there seems to be general agreement that many, if not all, data centers will arrive at some kind of hybrid cooling model in the future. Making the right decisions about how and where to run different workloads within a data center, together with the equally important choice of the cooling (and, where necessary, containment) solution to match the needs of the IT hardware serving those applications, depends on a clear understanding of what any proposed cooling solution can deliver at the anticipated cabinet density.

The current hype around AI and liquid cooling ignores the fact that, for the vast majority of data center workloads, air cooling continues to have a major role to play, even as power densities continue to increase. However, not all air cooling and containment solutions provide the same level of performance, or the independent verification that provides true peace of mind, so customers would do well to understand exactly what they are buying before making a potentially very expensive mistake.
