
In the race for dominance in the tech sector, two distinct categories have traditionally defined success and innovation: software and hardware. These categories are also shaping the AI era, with companies developing outstanding AI applications, operating systems, and algorithms that are redefining how we conduct business.
Supporting this unprecedented drive is AI infrastructure, particularly the data centers needed to sustain the pace of innovation. AWS is one of the leaders in AI infrastructure projects and is building specialized physical infrastructure for companies like Anthropic. AWS has also developed custom chips for AI and machine learning workloads, and it offers managed GenAI infrastructure, most notably Amazon Bedrock.
Now, AWS has developed hardware to cool next-generation NVIDIA AI GPUs.
Problem Solving
NVIDIA’s GPUs have been a critical resource powering the GenAI boom. But while they are incredibly powerful, they are also resource-hungry, drawing large amounts of power and requiring dedicated cooling components.
Instead of resorting to traditional cooling techniques that would consume significant floor space and water, Amazon engineers devised a novel liquid cooling solution: the In-Row Heat Exchanger (IRHX). Custom-built for NVIDIA AI GPUs, the IRHX lets AWS rethink how these chips are housed and cooled, and the system can be integrated into both existing and new data centers.
Developed in just 11 months in collaboration with NVIDIA, the cooling technology is a testament to the efficiency and agility of the AWS-NVIDIA partnership. The system combines liquid and air-based components: coolant circulates through cold plates on the GPU chips, and fan-coil arrays carry the absorbed heat away.

Closing Thoughts
For Amazon, this innovation delivers a series of significant wins. Among other things, the new cooling system has enabled AWS to quickly and efficiently make P6e-GB200 UltraServers, powered by NVIDIA Grace Blackwell Superchips, generally available.
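For teams that want to confirm where the new UltraServer instances can actually be launched, a quick programmatic check against EC2's instance-type offerings is one option. The sketch below is illustrative only: the "p6e*" name pattern and the us-east-1 Region are assumptions, so confirm the exact instance-type names and supported Regions in AWS's documentation.

```python
# Minimal sketch: list EC2 instance-type offerings matching a name pattern in one Region.
# Assumptions: the new UltraServer instances surface under a "p6e*" naming pattern and
# us-east-1 is a supported Region -- verify both against AWS documentation.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# DescribeInstanceTypeOfferings is paginated, so iterate through all pages.
paginator = ec2.get_paginator("describe_instance_type_offerings")
pages = paginator.paginate(
    LocationType="region",
    Filters=[{"Name": "instance-type", "Values": ["p6e*"]}],
)

for page in pages:
    for offering in page["InstanceTypeOfferings"]:
        print(f"{offering['InstanceType']} is offered in {offering['Location']}")
```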
By creating its own infrastructure instead of relying on third-party solutions, AWS is strategically positioning itself in the AI arms race. The cooling system, designed to fit within the company’s existing data centers, enables rapid scalability and readiness for the most powerful AI technologies.
Another significant aspect of the AI Era is the unprecedented degree of partnership and co-creation taking place. NVIDIA and AWS worked closely together to develop this bespoke solution, highlighting the collaborative nature of the industry.
This trend of tech companies collaborating to create customized solutions for customers is increasingly prevalent, from notable partnerships like the one between Oracle and Microsoft to the joint development of hardware to support specific technologies. This collaborative approach is driving the industry forward.
The AI Era is ushering in a new way of working, one in which simply outdoing one another is becoming a thing of the past. Of course, competition remains, but the focus is shifting away from trying to build a bespoke product for every organization and every use case, and toward what is best for customers, what leads the field, and how to incorporate the technology into one's own stack. This shift in focus is what makes the possibilities of AI seem limitless.