Honestly, I think AI will eventually be a good thing for the environment. If AI companies are expanding renewables and nuclear to power their datacenters for training, then that massive amount of renewable generation and battery storage becomes available once training winds down and the main workload is inference. I know they are constantly training new things at small scale, but from what I've read the big training runs only happen once they've proven out what works at small scale.
Also, one has to imagine that all this compute will help us run bigger, more powerful climate models, and Google's AI is already helping them identify changes to be more energy efficient.
The need for more renewable power generation is also going to help us optimize the deployment process, e.g. modular nuclear reactors, in-situ geothermal taking over old stranded coal plants, etc.
I find this take overly optimistic. First, it's based on the assumption that training will stop and that the energy will then become available for other, more useful purposes. That's not guaranteed.
Besides, it completely disregards the fact that the energy is being used today and tomorrow. We will keep emitting CO2 for certain, and maybe, in the future, this will produce a surplus of energy? That's a bet I wouldn't take, especially since LLMs need lots of energy for inference as well as for training.
But in any case, I wouldn't want Microsoft, Google, Amazon and OpenAI to be the ones owning the energy infrastructure in the future, and if we collectively decide that building renewable sources is what we need, we should simply tax them and use that wealth to build collective resources.