OpenAI secures $38bn Amazon AI deal

OpenAI secures $38 billion AWS deal for AI infrastructure. The agreement grants OpenAI access to extensive computing resources, marking a shift from Microsoft Azure and highlighting the fierce competition among cloud providers in the AI sector.


OpenAI has signed a landmark $38 billion agreement with Amazon Web Services (AWS) to secure the immense computing power required to train and deploy its next generation of artificial intelligence systems, marking one of the biggest technology infrastructure deals ever struck.

The partnership, announced this week, will give the maker of ChatGPT access to vast fleets of graphics processors, including hundreds of thousands of Nvidia chips, hosted within Amazon's cloud network. OpenAI will begin using AWS infrastructure immediately, with full deployment expected by the end of 2026 and room for further expansion in 2027 and beyond.

This move represents a significant shift for OpenAI, which until now has heavily relied on Microsoft’s Azure platform to power its models. The deal also underscores the intensifying race among cloud giants to dominate the lucrative AI infrastructure market. Following the announcement, Amazon shares surged to a record high, briefly valuing the company at more than $2.74 trillion, as investors hailed the agreement as a strong endorsement of AWS’s capabilities.

Sam Altman, OpenAI’s chief executive, stated that the partnership was crucial to scaling what he termed “frontier AI”. “Scaling frontier AI requires massive, reliable compute,” he said. “Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”

For Amazon, the deal serves as a powerful vote of confidence in AWS at a time when some analysts had questioned whether the cloud division was falling behind rivals Microsoft and Google in the AI race. The agreement ensures AWS remains central to the next phase of AI development — an arena now defined by unprecedented hardware and capital demands.

OpenAI has been racing globally to secure greater compute capacity. In addition to the Amazon deal, it has signed agreements with Nvidia, AMD, and Oracle to access more powerful processors and cloud data centres. Altman has previously said the company plans to invest around $1.4 trillion in computing resources over the coming years, targeting 30 gigawatts of infrastructure, enough to power roughly 25 million American homes.

The agreement also follows OpenAI's internal restructuring with its largest backer, Microsoft, which valued the company at $500 billion and paved the way for it to evolve from a non-profit research outfit into a profit-driven business. The reorganisation transferred some control to a new non-profit foundation that holds equity in OpenAI's commercial arm, and it removed Microsoft's right of first refusal to supply OpenAI's compute services, effectively clearing the path for the new partnership with Amazon.

However, the deal has reignited debate about whether the AI sector is heading into a speculative bubble. Nvidia, whose chips underpin most AI systems, recently became the world’s first $5 trillion company, with its market value now roughly half the size of Europe’s entire benchmark equities index. Analysts warn that the rapid rise in valuations, coupled with vast capital outlays by AI developers, may prove difficult to sustain if the promised productivity gains do not materialise.

Despite these concerns, the OpenAI–Amazon deal sends a clear message: AI’s future will be shaped by those with the deepest computing resources. As cloud titans jostle for position, the partnership not only secures OpenAI’s access to power on an unprecedented scale but also cements AWS’s place at the heart of the global AI infrastructure boom.

For now, Altman appears undeterred by warnings of overheating. His stated ambition is to add one gigawatt of compute capacity every week, with each gigawatt carrying an estimated capital cost of more than $40 billion. If that pace continues, OpenAI's collaboration with Amazon may be only the beginning of an even larger technological arms race redefining how artificial intelligence is built, trained, and delivered worldwide.
