In a move that reverberated through the enterprise technology industry, OpenAI’s models went live on Amazon Web Services’ Bedrock platform Tuesday — less than 24 hours after Microsoft’s exclusive license to distribute OpenAI’s artificial intelligence products officially expired. OpenAI’s arrival on AWS Bedrock marks one of the most significant realignments in the cloud industry since the original Microsoft-OpenAI partnership was forged in 2019.
The Microsoft-OpenAI exclusivity arrangement, which had given Azure the sole right to serve OpenAI’s API-based products to enterprise customers, was renegotiated over the weekend in a sweeping new agreement between the two companies. Under the revised terms, Microsoft retains a non-exclusive license to OpenAI’s models and products through 2032, remains the company’s “primary cloud partner,” and continues as a major OpenAI shareholder. However, Microsoft will no longer pay a revenue share to OpenAI — and crucially, OpenAI is now free to serve all of its products across any cloud provider.
Amazon moved with remarkable speed to capitalize. AWS CEO Matt Garman took the stage at a San Francisco launch event Tuesday to announce that OpenAI’s newest models, its Codex code-writing agent, and a new product called Amazon Bedrock Managed Agents powered by OpenAI were all available in preview. Garman was blunt about the years of pent-up demand the integration finally addresses: “Their production applications run in AWS. Their data is in AWS. They trust the security of AWS, and we’ve forced them for the last couple of years, to get great OpenAI models, to go to other places.”
OpenAI CEO Sam Altman, appearing via recorded video — his schedule, as he put it, having been “taken away” by the Elon Musk trial unfolding across the Bay Bridge in Oakland — expressed enthusiasm for the partnership. OpenAI Chief Revenue Officer Denise Dresser had previously written internally that the company’s Microsoft-exclusive arrangement had “limited our ability to meet enterprises where they are,” with inbound demand for AWS access described as “frankly staggering.”
The financial architecture underpinning the expansion is substantial. Amazon invested $50 billion in OpenAI earlier this year, and OpenAI has committed over $100 billion in cloud spending on AWS through a multi-year agreement. The company also pledged to utilize approximately two gigawatts of Amazon’s custom Trainium chip capacity for model training and inference — a significant bet on Amazon’s in-house silicon.
The newly launched Amazon Bedrock Managed Agents product is perhaps the most strategically significant piece. Co-developed by AWS and OpenAI, the service enables enterprises to build AI agents capable of maintaining context across interactions, executing multi-step workflows, and integrating with existing AWS security and compliance infrastructure. For enterprise customers who have resisted migrating AI workloads away from AWS, the product removes a longstanding obstacle.
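As a rough illustration of what calling an OpenAI model on Bedrock might look like from the enterprise side, the sketch below assembles a request in the shape of Bedrock's existing Converse API. Note the assumptions: the `openai.gpt-5` model identifier is a placeholder, and the Managed Agents product's own API surface was not detailed at launch.

```python
# Hypothetical sketch: shaping a request for Amazon Bedrock's Converse API.
# The model identifier below is a placeholder assumption, not a documented ID.

def build_converse_request(model_id: str, prompt: str, max_tokens: int = 512) -> dict:
    """Assemble the keyword arguments for a bedrock-runtime converse() call."""
    return {
        "modelId": model_id,
        "messages": [
            {"role": "user", "content": [{"text": prompt}]},
        ],
        "inferenceConfig": {"maxTokens": max_tokens, "temperature": 0.2},
    }

# In practice these kwargs would be passed to a boto3 client, e.g.:
#   client = boto3.client("bedrock-runtime", region_name="us-east-1")
#   response = client.converse(**build_converse_request(...))
request = build_converse_request("openai.gpt-5", "Summarize this compliance report.")
print(request["modelId"])
```

The appeal for enterprises already on AWS is that such calls inherit existing IAM policies, VPC boundaries, and audit logging rather than requiring a separate vendor integration.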
For Microsoft, the loss of OpenAI exclusivity represents a material shift — though analysts note the arrangement had already become complicated. Microsoft has quietly diversified its AI model portfolio, integrating Anthropic’s Claude into Microsoft 365 Copilot and investing in its own in-house MAI model program. Azure remains deeply integrated with OpenAI’s consumer and enterprise products, and Microsoft’s shareholder stake ensures it participates in OpenAI’s growth regardless of cloud exclusivity. Still, the end of Microsoft’s exclusive hold on OpenAI closes a chapter that defined enterprise AI procurement for nearly seven years.