Google Plans Up to $40 Billion Investment to Deepen Anthropic Partnership

Google parent company Alphabet announced a $10 billion investment in Anthropic, with a commitment to add up to $30 billion more if performance targets are met, for a total investment cap of $40 billion. Anthropic confirmed the initial $10 billion is priced at a $350 billion valuation, consistent with its February financing round.

This deal is not just about capital: it ties the two companies together at the compute infrastructure level.

Compute Commitment: 5GW + 1 Million TPUs

Under the disclosed terms, Google Cloud will provide Anthropic with compute capacity equivalent to 5 gigawatts (GW) over five years, and up to 1 million TPU chips. The scale of this commitment is notable: 5 GW of power capacity is roughly the output of five large nuclear reactors, and 1 million TPUs means Anthropic's future model training and inference will be heavily dependent on Google's infrastructure.

Element                            Scale
Initial Investment                 $10 billion
Additional (Performance-Based)     $30 billion
Compute Commitment                 5 GW / 5 years
TPU Chips                          Up to 1 million
Valuation Anchor                   $350 billion

The “Circular Deal” Controversy

This investment has sparked discussion about “circular deal” structures in the industry: Google invests in Anthropic, and Anthropic uses that capital to purchase Google Cloud compute. Critics argue this structure inflates both parties’ financial metrics—Google gains investment returns and cloud service revenue, while Anthropic gets funding and compute lock-in, but actual new external cash flow is limited.

Similar patterns have appeared in Microsoft’s investment in OpenAI (for Azure compute) and Amazon’s investment in Anthropic (for AWS compute). Now Google joins with $40 billion, meaning all three major cloud providers—Microsoft, Google, and Amazon—are deeply tied to leading model companies through capital bonds.

Impact on Industry Dynamics

Anthropic's compute demand is rising rapidly with the adoption of Claude Code, as token consumption for coding-agent tasks grows exponentially.

For Google, locking in Anthropic means securing one of the largest AI inference and training workloads for years to come. For Anthropic, scaling up its use of the TPU ecosystem could reduce its dependence on Nvidia GPUs and give it greater control over compute costs.

Primary Sources