It might be just our region, but for a long time we couldn't access current frontier models at all, only old GPT-4-level models. Meanwhile, Anthropic rolls out every new model to Bedrock within 24 hours of announcement.

Not sure how much Azure OAI has changed, but when I last used it 2-3 years ago, it was just a farce designed to push you onto provisioned throughput. The throughput quotas were small, the process to request more was bureaucratic, and the Azure SAs were

It was also very clear that the OAI and MS teams held each other in contempt (not relevant, but it was interesting, and it grew in the immediate aftermath of the Altman drama).

So why were we using it? OpenAI didn't really have an enterprise go-to-market, Bedrock still relied on Claude 2, and we weren't willing to YOLO on clickthrough agreements.

Once Claude 3 came out, we jumped ship. That sucked too, although I hear it's gotten better since.

I have been out of OpenAI Azure deployments for a while now (1+ years), but we often had spikes in latency. We escalated to the Head of Azure Europe and still got no official explanation, while meanwhile they were trying to rope us into some kind of collaboration announcement. That was the only reason we got a few meetings with that guy at all.

So yeah, Azure sucked ass: plenty of outages and latency spikes, like 3 minutes to first byte when it was usually 30 seconds to 1 minute at most, if not faster (memory is a bit fuzzy).