Microsoft, Nvidia deepen AI bets with $15 bn investment into Anthropic

Microsoft and Nvidia will invest a combined $15 billion in Anthropic. The Claude-maker has, in turn, committed to spend $30 billion on Microsoft’s Azure cloud services.


Microsoft and Nvidia are deepening their bets on generative AI with a joint investment in Anthropic, the company behind the Claude family of models. The multi-year arrangement, announced ahead of Microsoft’s Ignite developer conference, will see Anthropic purchase $30 billion of compute capacity on Microsoft Azure in return for investments totalling up to $15 billion from the two technology giants.

According to Reuters, Nvidia will provide as much as $10 billion of investment, with Microsoft contributing up to $5 billion. The agreement positions Microsoft as Anthropic’s primary cloud partner, while Nvidia’s contribution ties Anthropic’s infrastructure directly to its next-generation Grace Blackwell and Vera Rubin chips.

Anthropic said it will deploy up to one gigawatt of compute power on Nvidia hardware under the plan — a scale comparable to some of the world’s largest data-centre operations. Through the partnership, customers of Microsoft’s Azure AI Foundry will gain access to Anthropic’s latest models, including Claude Sonnet 4.5, Claude Opus 4.1, and Claude Haiku 4.5.

The collaboration means Anthropic’s Claude models will be available across all three major clouds (Microsoft, Amazon, and Google), making Claude the only frontier model family currently accessible through each of the big providers. Microsoft chief executive Satya Nadella said the move underscored the company’s strategy of broadening AI access, explaining that it was “all about deepening our commitment to bringing the best infrastructure, model choice, and applications to our customers.”

While Microsoft remains closely tied to OpenAI, the Anthropic deal signals a clear intention to diversify its AI portfolio. The company is aiming to secure access to a range of frontier models as competition among cloud and AI providers accelerates.

For Nvidia, the partnership extends its position at the centre of the AI supply chain — not only as a chip designer but as a strategic investor in model developers. Locking in Anthropic as a long-term customer helps ensure demand for its latest hardware at a time when AI data-centre capacity is expanding faster than ever.

Analysts see the scale of the commitment as evidence that the AI arms race is shifting from software innovation to infrastructure dominance. A $30 billion compute purchase highlights how power generation, chip supply, and cloud real estate are now as critical to the AI economy as algorithms themselves.

The announcement comes amid growing scrutiny of what some economists call “circular” AI deals — where cloud operators, chipmakers, and model developers invest in each other, blurring lines between supplier and client. Shares in Microsoft and Nvidia reportedly dipped on the news, reflecting investor caution around the long-term returns of such capital-intensive arrangements.

Still, the partnership positions the three companies at the core of the AI ecosystem’s next phase — where compute scale, energy access, and model diversity will shape competitiveness more than any single breakthrough. For enterprise customers, it could translate into greater choice of advanced models through Azure, and potentially more predictable access to compute resources as demand grows.

As Anthropic expands its partnerships across competing clouds, the balance of power in AI infrastructure appears to be tilting from exclusive alliances toward a multi-platform model. For Microsoft and Nvidia, the message is that AI leadership now depends not just on innovation — but on control of the physical and digital foundations beneath it.

