Anthropic to Expand Google TPU Usage to One Million Chips, Reaching Over 1GW of Computing Power by 2026

Oct 30, 2025

Anthropic plans to expand its use of Google Cloud technology, increasing its fleet of TPUs (Tensor Processing Units), accelerators purpose-built for AI workloads, to one million. The move is intended to deepen the company's compute reserves and sustain research breakthroughs and product development in the AI field. Anthropic says the expanded collaboration is worth tens of billions of dollars and is expected to bring more than 1GW of computing capacity online by 2026.

On the compute side, Anthropic maintains a diversified strategy: alongside Google's TPUs, it also uses AWS (Amazon Web Services) Trainium chips and NVIDIA GPUs, spreading its workloads across three major chip platforms.

According to industry estimates, a 1GW data center costs roughly $50 billion to build, with computing chips accounting for the largest share at approximately $35 billion. Competitor OpenAI's Stargate project aims to build out 10GW of computing power, and Anthropic's expansion signals a determined effort to keep pace, both to keep its Claude models at the leading edge and to reinforce its ecosystem of industry partners.

Notably, Amazon remains Anthropic's primary training partner and cloud provider and continues to advance Project Rainier, a massive computing cluster spanning multiple data centers across the United States and equipped with hundreds of thousands of AI chips. Anthropic has explicitly stated its intention to deepen that partnership further.

Anthropic currently serves more than 300,000 business customers worldwide, and its number of large accounts (those contributing over $100,000 in annual revenue) has grown nearly sevenfold over the past year. The added computing power is meant to keep up with this rapidly growing customer demand while also supporting more thorough model testing, alignment research, and responsible deployment at scale.

“Anthropic and Google have a long-standing partnership, and this latest expansion will help us continue to grow, providing the computing power we need to advance frontier AI,” said Krishna Rao, CFO of Anthropic. “From Fortune 500 companies to emerging AI startups, our customers rely on Claude for core tasks, and this expanded capacity ensures we can keep our models at the leading edge of the industry amid exponential growth in demand.”

“Anthropic’s decision to significantly increase its TPU usage is a testament to the superior performance and cost-effectiveness our team has achieved in TPU development over the years,” said Thomas Kurian, CEO of Google Cloud. “Going forward, we will continue to innovate, further improving TPU efficiency and productivity, and expanding our mature AI accelerator portfolio, including the seventh-generation Ironwood TPU.”

Anthropic and Google Cloud first announced their partnership in 2023. At the time, Anthropic used Google Cloud’s AI infrastructure for model training and delivered its models and technologies to enterprise customers through Google Cloud’s Vertex AI platform and Google Cloud Marketplace. Now, amid surging global demand for AI computing power, the two companies are extending that collaboration to additional AI accelerators, including the Ironwood TPU.
