The renewable power infrastructure for those datacenters will still exist.
People will be able to buy those used GPUs cheap and run small local LLMs perhaps. A 10-year-old computer today won't run state-of-the-art games or models, but it is entirely acceptable for moderate computing use.
> People will be able to buy those used GPUs cheap and run small local LLMs perhaps.
Maybe; I find it unlikely though, because unlike CPUs, there's a large difference in compute/watt in subsequent generations of GPUs.[1]
I would imagine that, from an economics PoV, the payback period for a newer-generation GPU over a previous-generation one, in terms of energy usage, is going to be on the order of months, not years. So anyone needing compute for more than a month or two would save money by buying a new one at knockdown prices (because the market collapsed) rather than by getting old ones for free (because the market collapsed).
[1] Or maybe I am wrong about this - maybe each new generation is only slightly better than the previous one
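The payback argument above can be sketched as a back-of-envelope calculation. Every figure here (wattages, electricity price, card price) is an illustrative assumption, not a real spec:

```python
# Back-of-envelope payback estimate for replacing an older GPU with a newer,
# more power-efficient one delivering the same throughput.
# All numbers below are assumptions for illustration only.

POWER_OLD_W = 700                # assumed draw of an older-generation card
POWER_NEW_W = 350                # assumed draw of a newer card, same throughput
ELECTRICITY_USD_PER_KWH = 0.15   # assumed electricity price
NEW_CARD_PRICE_USD = 500         # assumed post-crash "knockdown" price

def payback_months(power_old_w, power_new_w, price_usd, usd_per_kwh):
    """Months until energy savings cover the purchase price, at 24/7 load."""
    saved_kw = (power_old_w - power_new_w) / 1000
    saved_per_month_usd = saved_kw * 24 * 30 * usd_per_kwh
    return price_usd / saved_per_month_usd

print(round(payback_months(POWER_OLD_W, POWER_NEW_W,
                           NEW_CARD_PRICE_USD, ELECTRICITY_USD_PER_KWH), 1))
```

With these particular assumptions the payback works out to a bit over a year rather than a few months, which shows how sensitive the conclusion is to the assumed prices and wattages.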
> People will be able to buy those used GPUs cheap and run small local LLMs perhaps.
That's not really how SaaS works these days. It will either be a "cheap" subscription or an expensive, enterprise-focused one. Both of those carry maintenance costs, which defeats the point of "cheap, small, locally run LLMs".
And they sure aren't going to sell local copies. They'd rather go down with their ship than risk hackers dissecting the black box.
Exactly. SaaS does not enter into it.
People will locally run the open models which are freely released, just like they do today with Llama and Whisper.
Most of the AI SaaS companies won't be around to have anything to say about it, because they will be casualties of the bust that will follow the boom. There will be a few survivors with really excellent models, and some people will pay for those, while many others simply use the good-enough freely available ones.
But in terms of compute/watt, those 10-year-old data centers are going to suck, and compute/watt is what counts for a data center.