Amazon Inks $38 Billion Deal With OpenAI For Nvidia Chips
For Amazon, which has struggled to compete in the AI age, the deal is an endorsement of its ability to build and run enormous networks of data centers.

Amazon.com Inc.’s cloud unit has signed a $38 billion deal to meet a slice of OpenAI’s bottomless demand for computing power. Amazon shares surged.
Amazon Web Services will provide the ChatGPT maker with access to hundreds of thousands of Nvidia Corp. graphics processing units as part of a seven-year deal, the companies announced on Monday.
The arrangement is the latest indication of OpenAI’s transition from research lab to artificial intelligence powerhouse that has reshaped the technology industry. The company has committed to spending $1.4 trillion on infrastructure to build and power its AI models, an unprecedented binge that has prompted concerns about an investment bubble.
For Amazon, which has struggled to compete in the AI age, the deal is an endorsement of its ability to build and run enormous networks of data centers. “As OpenAI continues to push the boundaries of what’s possible, AWS’s best-in-class infrastructure will serve as a backbone for their AI ambitions,” AWS Chief Executive Officer Matt Garman said in a statement.
Amazon shares jumped about 6% in early trading on Monday. Nvidia rose as much as 3%.
Amazon is the world’s largest seller of rented computing power. But until Monday, AWS had been an outlier, having watched as just about every other significant cloud-computing company in the US joined the ranks of those building or retrofitting data centers to back OpenAI.
Microsoft Corp. — OpenAI’s largest investor and previously its exclusive cloud-computing provider — recently announced a new commitment from OpenAI to spend some $250 billion on its Azure cloud unit. Oracle Corp. has inked a $300 billion deal to provide OpenAI with data centers, and earlier this year OpenAI disclosed that Alphabet Inc.’s Google Cloud Platform was among the companies powering ChatGPT. The startup also has a $22.4 billion deal with CoreWeave Inc., a leader among a new crop of so-called neo clouds pitching their services to AI developers.
Under the deal announced Monday, OpenAI will immediately start using AWS computing power, with all of the targeted capacity to be provided before the end of 2026 and the option to expand its work with AWS in subsequent years. Amazon will deploy hundreds of thousands of chips, including Nvidia’s GB200 and GB300 AI accelerators, in clusters designed to serve ChatGPT’s responses to user prompts or to train next-generation models.
“Scaling frontier AI requires massive, reliable compute,” OpenAI CEO Sam Altman said in a statement. “Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”
Amazon is also among the biggest backers of Anthropic PBC, the AI developer founded by OpenAI veterans. The Seattle-based company announced last week that a data center complex built for the startup, powered by hundreds of thousands of AWS’s homegrown Trainium2 AI chips, was operational. Last month, Google said it would supply up to 1 million of its specialized AI chips to Anthropic, a deal worth tens of billions of dollars.
