
It has become increasingly common for companies in the Cloud Wars Top 10 to partner with providers of large language models (LLMs) to enhance their offerings to customers. Anthropic is a key player in this space, and many companies are leveraging the capabilities of its LLMs to deliver next-generation AI services.
Occasionally, the tables turn, and it is Anthropic that reaches out to major companies for their services, as demonstrated by recent news from Google Cloud: Anthropic will expand its use of the company's TPUs and other services.
What’s New?
In a major expansion of its use of Google Cloud's TPU chips, which it relies on to train its Claude models, Anthropic will tap more than a gigawatt of compute capacity, with up to one million TPU chips available. It is, to date, the most substantial increase in the company's TPU usage.
To put that number into perspective, one gigawatt is enough energy to power roughly 900,000 to 1,000,000 homes in the U.S., or the output of a large-scale nuclear power station. It's huge.
“Anthropic and Google have a longstanding partnership and this latest expansion will help us continue to grow the compute we need to define the frontier of AI,” said Krishna Rao, CFO of Anthropic.
“Our customers — from Fortune 500 companies to AI-native startups — depend on Claude for their most important work, and this expanded capacity ensures we can meet our exponentially growing demand while keeping our models at the cutting edge of the industry.”
The latest expansion of the partnership also includes increased utilization of Google Cloud services to support Anthropic’s research teams by providing AI-optimized infrastructure. Anthropic has been leveraging Google Cloud’s AI infrastructure to train its models since 2023.

Final Thoughts
“Anthropic’s choice to significantly expand its usage of TPUs reflects the strong price-performance and efficiency its teams have seen with TPUs for several years,” said Thomas Kurian, CEO, Google Cloud.
“We are continuing to innovate and drive further efficiencies and increased capacity of our TPUs, building on our already mature AI accelerator portfolio, including our seventh-generation TPU, Ironwood.”
Google Cloud is addressing the astronomical demands of model developers by continually investing in and refining its AI infrastructure. Combined with its offerings for marketing, delivery, training, and testing, this makes the company an incredibly strong candidate for organizations leading the LLM space that require a technology partner.
This news also highlights the power of the AI Revolution to support companies across every realm of the industry. While many organizations wish to leverage the phenomenal power of models, others have the infrastructure in place to make that power a reality. These relationships stem from some of the most robust strategic alliances in the history of business, and the biggest winner? The customer.