Anthropic to Use 1 Million Google TPUs 

Anthropic has announced plans to expand its use of Google Cloud services, including deploying one million Tensor Processing Units (TPUs). 

The expansion is valued at ‘tens of billions of dollars,’ with over a gigawatt of capacity expected to come online in 2026. 

“Anthropic’s choice to significantly expand its usage of TPUs reflects the strong price-performance and efficiency its teams have seen with TPUs for several years,” said Thomas Kurian, CEO at Google Cloud.

The expansion builds on a partnership between the two companies that began in early 2023, in which Anthropic has used Google’s TPUs and cloud services to train and deploy its AI models. 

In addition to TPUs, Anthropic also uses Amazon’s Trainium chips, alongside NVIDIA’s GPUs. “This multi-platform approach ensures we can continue advancing Claude’s capabilities while maintaining strong partnerships across the industry,” Anthropic stated.

The expanded compute capacity will help Anthropic meet growing demand for its Claude models, particularly from business and enterprise customers. 

The company reported serving over 300,000 business customers, and the number of large accounts—those generating more than $100,000 in annual revenue—has increased nearly sevenfold in the past year.

Over the past few months, Anthropic has rolled out significant updates to its Claude model lineup, including the launch of Claude Sonnet 4.5, Claude Haiku 4.5, and the larger Claude Opus 4.1. 

TPUs are custom chips developed by Google specifically for AI workloads. They are available through Google Cloud, and Google also uses them internally to train and serve its Gemini AI models. 

Google recently said that its latest Gemini 2.5 models were trained on a massive cluster of its fifth-generation TPUs. Its newest TPU, Ironwood, is the seventh generation of the chip. 

Additionally, Google has participated in various funding rounds for Anthropic, and a report from The New York Times earlier this year, citing legal filings, stated that it owns 14% of the company.

Besides Anthropic, Apple is another prominent company that has used Google’s TPUs for AI workloads. 

In 2024, Apple announced it used 8,192 TPU v4 chips within Google Cloud to train its ‘Apple Foundation Model’, a large language model supporting its AI projects.

