> It's not clear how much energy all of our AI systems will use, and while it's tempting to outright believe they'll simply use more and more, even that's not yet clear based on arguments presented.
For the last 20 years, the power density of HPC systems has increased as they were miniaturized and packed more tightly. Computing capacity grew faster than power use, but that doesn't mean we didn't invent more inefficient ways to undo a significant part of that improvement.
It's the same for AI. Inference-oriented hardware like Groq's doesn't consume as much power as training-oriented cards, but that doesn't mean total power use will fall. On the contrary: it will keep growing rapidly. And since AI companies don't care much about efficiency yet, we're wasting tons of energy on top of that.
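The point above, that per-chip efficiency can improve while total power draw still grows, comes down to simple arithmetic: if deployed compute capacity grows faster than performance-per-watt, power use rises anyway. A back-of-envelope sketch, with purely hypothetical numbers:

```python
# Hypothetical illustration: efficiency (perf/W) improves, yet total
# power draw still grows, because deployed capacity grows faster.
# None of these figures are real measurements.

capacity_growth = 100.0  # assumed: total compute capacity grew 100x
efficiency_growth = 20.0  # assumed: performance-per-watt improved 20x

# Power = capacity / efficiency, so the growth factors divide out:
power_growth = capacity_growth / efficiency_growth
print(f"Total power draw still grew {power_growth:.0f}x")
```

With those assumed numbers, a 20x efficiency gain against 100x more deployed compute still means five times the power draw.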
I won't even get into the water consumption debacle; open-loop cooling systems waste enormous amounts of water.
All in all, we're wasting a lot of water and energy that could sustain large cities and a great number of people.
With regards, your friendly HPC admin.