>People will be able to buy those used GPUs cheap and run small local LLMs perhaps.
That's not really how SaaS works these days. It will be either a "cheap" subscription or an expensive offering focused on enterprise. Both of those carry ongoing maintenance costs, which defeats the point of cheap, small, locally run LLMs.
And they sure aren't going to sell local copies. They'd rather go down with the ship than risk hackers dissecting the black box.
Exactly. SaaS does not enter into it.
People will locally run the open models which are freely released, just like they do today with Llama and Whisper.
Most of the AI SaaS companies won't be around to have any say in it, because they'll be casualties of the bust that follows the boom. A few survivors with really excellent models will remain, and some people will pay for those, while many others simply use the good-enough, freely available ones.