Liquid Cooling in Air Cooled Data Centers on Microsoft Azure
With the advent of artificial intelligence and machine learning (AI/ML), hyperscale datacenters are increasingly accommodating AI accelerators at scale, demanding higher power at higher density than is customary in traditionally air-cooled facilities.
As Microsoft continues to expand our growing datacenter fleet to enable the world’s AI transformation, we need methods for providing liquid cooling capabilities for new AI hardware within air-cooled datacenters. Additionally, increasing per-rack power density for AI accelerators necessitates standalone liquid-to-air heat exchangers, because legacy datacenters are typically not equipped with the infrastructure for direct-to-chip (DTC) liquid cooling.
A solution: standalone liquid cooling heat exchanger units.
Microsoft’s Maia 100 platform marked the first introduction of a liquid cooling heat exchanger into existing air-cooled data centers for direct-to-chip liquid cooling. Since then, we have continued to invest in novel cooling techniques to accommodate newer, more powerful AI/ML processors. Today at OCP 2024, we are sharing contributions for designing advanced liquid cooling heat exchanger units (HXUs). By open sourcing our design approach through the Open Compute Project, we hope to share our HXU development work to enable closed-loop liquid cooling in AI datacenters across the entire computing industry.
Heat Exchanger Unit Design Principles
Our designs for HXUs focus on enabling advanced cooling capacity for modern AI processors, improving operating efficiency to reduce power demand, and enabling AI accelerator racks to operate in traditionally air-cooled data centers.
Microsoft’s approach centers on reusing the same chilled air that legacy datacenters already supply for air-cooled platforms. Our engineering spec for HXUs targets the relative liquid and air flow rates required to supply the cooling liquid to the IT equipment at the required temperature.
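To make the flow-rate relationship concrete, here is a minimal, illustrative sketch (not Microsoft's actual spec; the rack load and temperature deltas below are hypothetical example values) that sizes the liquid and air flow rates an HXU would need for a given heat load, using the steady-state energy balance Q = ṁ · c_p · ΔT on each side of the exchanger:

```python
def required_mass_flow(heat_load_w: float, specific_heat_j_per_kg_k: float,
                       delta_t_k: float) -> float:
    """Mass flow (kg/s) needed to absorb heat_load_w with a coolant
    temperature rise of delta_t_k, from Q = m_dot * c_p * dT."""
    return heat_load_w / (specific_heat_j_per_kg_k * delta_t_k)

# Assumed example values (hypothetical, not from the post):
RACK_LOAD_W = 100_000.0  # 100 kW AI accelerator rack
CP_WATER = 4186.0        # J/(kg*K), water-based coolant
CP_AIR = 1005.0          # J/(kg*K), air at room conditions
RHO_AIR = 1.2            # kg/m^3, air density in the cold aisle

# Liquid side: coolant warms 10 K across the cold plates.
liquid_kg_s = required_mass_flow(RACK_LOAD_W, CP_WATER, delta_t_k=10.0)

# Air side: the HXU rejects the same heat into air allowed to rise 12 K.
air_kg_s = required_mass_flow(RACK_LOAD_W, CP_AIR, delta_t_k=12.0)
air_m3_s = air_kg_s / RHO_AIR  # volumetric airflow the rear fans must move

# ~1 kg/L for water-based coolant when converting to L/min.
print(f"liquid: {liquid_kg_s:.2f} kg/s (~{liquid_kg_s * 60:.0f} L/min)")
print(f"air:    {air_m3_s:.1f} m^3/s")
```

The same balance explains why the air side dominates the design: air's far lower specific heat and density mean the HXU must move several cubic meters of chilled air per second to reject heat that only a few liters per minute of liquid carry in.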
The design principles for HXUs are the result of a close partnership with Delta and Ingrasys. Working with these partners has helped us evolve our approach, including a double-wide rack design to increase heat dissipation capacity and specialized packaging to ensure leak-free transport. A modular design allows field servicing of key components, including pumps, fans, filters, printed circuit board assemblies, and sensors. Quick disconnects and strategically placed leak detection ropes, along with drip pans that guide liquids to the base of an HXU, help mitigate and contain liquid leaks. Fans are placed at the rear to avoid pre-heating within an HXU and to eliminate entrainment issues in the cold aisle. Modular fluid connections between HXUs and server racks allow for various configurations.
We welcome further collaboration from the broader OCP community in enabling the future of datacenter power and cooling innovation with state-of-the-art infrastructure engineering capabilities.