For nearly seven years, if you wanted to run OpenAI’s models in production at hyperscale, you had exactly one choice: Microsoft Azure. That ended this week. As of April 2026, GPT-5.5 and GPT-5.4 are now available in preview on Amazon Bedrock — and the implications go far beyond “more cloud options.” This is one of the most consequential restructurings in the history of enterprise AI.
What Just Happened (The Quick Version)
On April 27, OpenAI and Microsoft announced a fundamental restructuring of their partnership. The exclusive license that gave Microsoft sole hyperscaler rights to OpenAI’s models — in place since 2019 and reinforced in the 2023 deal — is gone. In its place: a non-exclusive license that runs through 2032. The revenue-sharing arrangement is also winding down, with the remaining payments capped and running through 2030.
One day later, on April 28, AWS and OpenAI announced their own partnership. OpenAI’s frontier models (GPT-5.5 and GPT-5.4) and the Codex coding agent are now in preview on Amazon Bedrock, with general availability “in the next few weeks.” The deal is reportedly worth around $38 billion in committed AWS spend.
Translation: in a single 48-hour window, OpenAI ended its exclusive cloud relationship with Microsoft and signed up its largest direct competitor as a primary distribution channel.
Why Microsoft Let Go
The temptation is to read this as Microsoft “losing” — but that’s not quite right. Microsoft was sitting on a contract that gave it exclusivity “until OpenAI achieves AGI,” with no firm date. That sounds great until you realize what it actually meant: an open-ended legal entanglement with a non-profit-affiliated entity that was simultaneously trying to restructure into a for-profit, raising more capital than any private company in history, and feuding with its own board. Every quarterly earnings call brought new questions about Microsoft’s exposure if OpenAI’s governance blew up again.
By converting exclusivity into a non-exclusive license through 2032 and capping its revenue share through 2030, Microsoft just bought regulatory and financial certainty. It also keeps the IP rights it actually cares about — the ability to embed OpenAI tech across Microsoft 365, Azure, GitHub, and Dynamics — without the legal landmines.
Honestly, I think Microsoft got the better end of the trade. Exclusivity sounds powerful in headlines but creates massive antitrust risk. A fixed-term, non-exclusive license is much easier to defend in front of regulators in Brussels and Washington.
Why OpenAI Needed This
OpenAI’s compute hunger is the worst-kept secret in tech. The company has been signaling for over a year that Azure alone could not meet its training and inference demands. Sam Altman has been talking about $7 trillion in chip infrastructure since 2024. You don’t get to that scale running on one cloud.
The AWS deal solves three problems simultaneously:
Compute diversification: AWS brings Trainium2 silicon, massive Nvidia GPU inventory, and global data center capacity that Azure cannot match in certain regions.
Distribution: Bedrock is the default AI procurement path for tens of thousands of AWS-native enterprises that have refused to spin up Azure accounts just to access OpenAI. Those customers are now reachable for the first time.
Negotiating leverage: Having a real second hyperscaler partner means OpenAI can finally negotiate Azure pricing and capacity from a position of strength rather than dependence.
What This Means for Anthropic and Claude
This is where the story gets spicy. Bedrock has been Claude’s home turf since 2023. Anthropic’s entire enterprise distribution strategy was built around the premise that AWS customers who wanted a frontier model would pick Claude — because OpenAI literally wasn’t available. That moat just evaporated.
Now, when an AWS enterprise architect opens Bedrock to choose a foundation model, they’ll see Claude Sonnet 4.6, Claude Opus 4.6, GPT-5.5, GPT-5.4, plus Amazon’s own Nova models, Mistral, Cohere, and others. Claude is no longer the default frontier choice on AWS. It’s one of three or four.
I expect Anthropic to respond by leaning even harder into its Google partnership (the $40 billion deal we covered separately today) and pushing differentiation around Claude’s actual capabilities — particularly long-context reasoning, code review, and safety guarantees that enterprise buyers care about. But the easy wins on AWS are over.
What This Means for Azure
Microsoft is not in a panic, but Azure’s most powerful sales pitch — “come to us if you want OpenAI” — just lost its teeth. Microsoft will need to lean harder on its other differentiators: deep Office 365 integration, Copilot embedding across the productivity stack, the Azure OpenAI Service’s enterprise security features, and the fact that Microsoft has spent years building OpenAI-specific tooling that AWS has not yet matched.
Don’t be surprised if Azure responds with aggressive pricing on OpenAI inference in Q3, plus accelerated rollout of Microsoft’s own Phi and MAI model lines.
What This Means for You
If you’re an AWS-native shop that has been wrangling Azure access just to use GPT-4 or GPT-5: your procurement headache is about to go away. You can call OpenAI models through Bedrock using the same IAM roles, VPCs, and billing infrastructure you already have. This is genuinely transformational for enterprise AI procurement.
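To make that concrete, here is a minimal sketch of what calling an OpenAI model through Bedrock could look like with boto3's Converse API. The model identifier is a placeholder of my own invention — AWS had not published official model IDs for the OpenAI preview at the time of writing — and the request shape shown is the standard Bedrock Converse format, which works the same way for any model on the platform:

```python
# Sketch: invoking a Bedrock-hosted model via the Converse API, using
# whatever IAM credentials your environment already resolves.
# NOTE: "openai.gpt-5-5-v1:0" is a HYPOTHETICAL model ID, not an official one.
HYPOTHETICAL_MODEL_ID = "openai.gpt-5-5-v1:0"


def build_converse_request(model_id: str, prompt: str, max_tokens: int = 512) -> dict:
    """Build the kwargs dict for bedrock-runtime's converse() call."""
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": max_tokens},
    }


def ask(prompt: str) -> str:
    """Send a single-turn prompt and return the model's text reply."""
    import boto3  # standard AWS SDK; credentials come from the usual chain

    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    response = client.converse(**build_converse_request(HYPOTHETICAL_MODEL_ID, prompt))
    return response["output"]["message"]["content"][0]["text"]
```

The point is the plumbing, not the model: the same IAM role, VPC endpoint, and CloudTrail audit trail you use for Claude or Nova on Bedrock applies unchanged.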
If you’re a developer using OpenAI’s API directly: nothing changes immediately. The OpenAI platform continues to operate the same way. But over time, expect to see better global latency as OpenAI inference gets distributed across both Azure and AWS regions.
If you’re picking a foundation model today: this is now genuinely a horse race on AWS. Run your own evals. Don’t let “Claude is the AWS default” drive your decision anymore — that default is gone.
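"Run your own evals" can be simpler than it sounds. A minimal, model-agnostic harness is just a loop over test cases with an injectable `invoke` callable — the callable is whatever wraps your actual model call (the Bedrock sketch above, a direct API client, anything). Everything here is illustrative, including the toy cases:

```python
# Minimal model-agnostic eval harness: score each candidate model on the
# same test cases and compare. `invoke` is any prompt -> answer callable.
from typing import Callable


def run_eval(invoke: Callable[[str], str], cases: list[tuple[str, str]]) -> float:
    """Return the fraction of cases where the model's answer exactly matches."""
    hits = sum(1 for prompt, expected in cases if invoke(prompt).strip() == expected)
    return hits / len(cases)


if __name__ == "__main__":
    # Stand-in model for demonstration; swap in a real Bedrock-backed callable.
    cases = [("2+2=", "4"), ("Capital of France?", "Paris")]
    fake_model = lambda p: {"2+2=": "4", "Capital of France?": "Paris"}[p]
    print(run_eval(fake_model, cases))  # 1.0
```

Exact-match scoring is deliberately crude — for real workloads you'd swap in a task-appropriate scorer — but even this much, run against your own prompts, beats choosing a model by platform default.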
The Codex Wildcard
The detail that didn’t get enough attention: OpenAI’s Codex coding agent is launching on Bedrock alongside the chat models. Codex on AWS puts OpenAI in direct competition with AWS’s own CodeWhisperer/Q Developer offering, AND with Anthropic’s Claude Code. AWS has effectively turned Bedrock into a coding-agent battleground where it sells all three options and lets the market pick.
For developers, this is fantastic. For the agent vendors, it’s brutally competitive. Watch for aggressive pricing on agentic coding tools in Q3.
What I’m Watching Next
1. The Bedrock pricing matrix: AWS will publish official pricing in the coming weeks. The gap between OpenAI and Claude pricing on Bedrock will tell us a lot about who has the better deal terms.
2. Microsoft’s response: Build 2026 is around the corner. Expect Microsoft to roll out a wave of Azure-only differentiators — deeper Office integration, exclusive Copilot features, custom MAI models — to remind customers why Azure is still the right choice.
3. The Google move: If Anthropic is now sharing Bedrock with OpenAI, can Google somehow get OpenAI’s models onto Vertex AI too? Probably not in 2026, but the door is no longer locked.
Bottom Line
The headline is “OpenAI on AWS,” but the real story is the death of exclusivity in frontier AI distribution. From here on out, every major model will be available on every major cloud. Differentiation moves from “who do you have a deal with” to “which model is actually best for your workload.” That’s healthier for the industry, harder for incumbents, and an unambiguous win for the rest of us.
Microsoft built the original moat. Microsoft just chose to drain it. And in 48 hours, OpenAI went from a one-cloud company to a two-cloud company — with the leverage that comes with it. Welcome to the multi-cloud frontier AI era.