The small ones that you can run on a MacBook are quite useless for programming. Once you have access to a state-of-the-art model, it's difficult to accept any downgrade. That's why I think AI-driven programming will always rely on data centers and the best models.
> if Anthropic is down, there is still Google, OpenAI, Mistral, Deepseek and so on
No company is going to pay for subscriptions to all of them. Either way, we'll see a new layer of fragility caused by overdependence on AI, though presumably we'll adapt by learning from each major outage.
> The small ones that you can run on a MacBook are quite useless for programming.
That really depends on your MacBook :). If you throw enough RAM at it, something like qwen3-coder will be pretty good. It won't stack up to Claude, Gemini, or GPT, of course, but it's certainly better than nothing, and better than useless.
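If you want to try that, here's a minimal sketch, assuming qwen3-coder has already been pulled and is being served locally by Ollama on its OpenAI-compatible endpoint (the model tag and prompt are just placeholders):

    # Point the OpenAI client at a local Ollama server instead of a hosted API.
    # Assumes something like `ollama pull qwen3-coder` has already been run.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
        api_key="ollama",                      # any non-empty string; Ollama ignores it
    )

    resp = client.chat.completions.create(
        model="qwen3-coder",  # placeholder: use whatever tag your local install has
        messages=[{"role": "user", "content": "Write a function that reverses a linked list."}],
    )
    print(resp.choices[0].message.content)

Same client code you'd use against a hosted provider, just a different base URL, which is part of why local models are usable as a fallback at all.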
> No company is going to pay for subscriptions to all of them.
They don't have to; every lab offers API-based pricing. If Anthropic is down, I can hop straight into Codex and use GPT-5 via the API, switch to Gemini via Vertex, or just keep using Claude through AWS Bedrock, etc.
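A rough sketch of what that failover looks like in practice (model IDs are placeholders, API keys come from the usual ANTHROPIC_API_KEY / OPENAI_API_KEY env vars, and real code would classify errors and retry rather than failing over on any API error):

    # Fallback sketch: try Anthropic first, fall back to OpenAI if the call fails.
    import anthropic
    from openai import OpenAI

    def complete(prompt: str) -> str:
        try:
            msg = anthropic.Anthropic().messages.create(
                model="claude-sonnet-4-5",  # placeholder model id
                max_tokens=1024,
                messages=[{"role": "user", "content": prompt}],
            )
            return msg.content[0].text
        except anthropic.APIError:
            # Anthropic is unreachable or erroring: fall back to OpenAI.
            resp = OpenAI().chat.completions.create(
                model="gpt-5",  # placeholder model id
                messages=[{"role": "user", "content": prompt}],
            )
            return resp.choices[0].message.content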
I don't think this is an issue in practice, honestly.