They had a first-mover advantage for sure.

It used to be revolutionary, but the landscape has changed: there's plenty of competition now, and a growing number of high-quality models that can run offline (for free!) or more cheaply (Gemini Flash, for example).

They are in some ways the Nokia of AI: "we have the distribution, the product will sell." But that's not enough if innovation is weak.

They are even lagging behind in places (GPT-5 is a weaker coder than Claude, Sora is a toy compared to Seedance 2.0, etc.).

Once Apple releases an AIPhone running offline models with 32 GB of unified memory and optional cloud requests, it's going to be super tough for OpenAI.

Local AI is cool and all, but the models that run on typical consumer hardware don't really compare to the breadth of knowledge available from the likes of ChatGPT, let's be real.