Yeah, a couple of things we can be reasonably confident about:
a) training is where the bulk of an AI system's energy usage goes (based on a report released by Mistral), and
b) video generation is very likely a few orders of magnitude more expensive than text generation (rough numbers below).
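To make (b) concrete, here's a rough sketch in Python. The per-query figure is an assumption for illustration, not from the Mistral report: ~0.3 Wh per text response is a commonly cited ballpark, and "a few orders of magnitude" is taken as 10^3.

    # ASSUMPTIONS (illustrative only, not from the report):
    # ~0.3 Wh per text response; video taken as 10^3x more expensive.
    text_wh_per_query = 0.3
    video_wh_per_clip = text_wh_per_query * 10**3
    print(f"Text:  ~{text_wh_per_query} Wh per response")
    print(f"Video: ~{video_wh_per_clip:.0f} Wh per clip (~{video_wh_per_clip/1000:.1f} kWh)")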
That said, I still believe that data centres in general - including AI ones - don't consume a significant amount of energy compared with everything else we do, especially heating, cooling, and transport.
Pre-LLM data centres consume about 1% of the world's electricity. AI data centres may bump that up to 2%.
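A quick back-of-envelope on those percentages (the world-electricity figure is my own assumption, roughly the IEA's early-2020s number, not something from this thread):

    # ASSUMPTION: global electricity generation ~27,000 TWh/year.
    # The 1% and 2% shares are the claim above.
    WORLD_ELECTRICITY_TWH = 27_000
    pre_llm_dc = 0.01 * WORLD_ELECTRICITY_TWH   # pre-LLM data centres
    ai_era_dc  = 0.02 * WORLD_ELECTRICITY_TWH   # with AI workloads
    print(f"Pre-LLM data centres: ~{pre_llm_dc:,.0f} TWh/yr")
    print(f"With AI workloads:    ~{ai_era_dc:,.0f} TWh/yr")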
You mean this Mistral report? https://mistral.ai/news/our-contribution-to-a-global-environ...
I don't think it shows that training uses more energy than inference over the lifetime of the model - they don't appear to publish that ratio.
> don't consume a significant amount of energy compared with everything else we do, especially heating and cooling and transport
Ok, but heating and cooling are largely non-negotiable; we need those technologies to make places liveable.
LLMs are not remotely as crucial to our lives.
You gotta start thinking about the energy used to mine and refine the raw materials that go into the chips and GPUs. Then take into account the infrastructure and data centers.
The amount of energy is insane.
And yet it's still tiny in relation to transportation's energy requirements, and transportation itself is still mostly stuck on fossil fuels.
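Rough numbers to back that up (the transport figure is my own assumption, roughly the IEA ballpark for global transport final energy, not from this thread):

    # ASSUMPTION: transport final energy ~110 EJ/year (rough ballpark).
    # 540 TWh is the ~2% of world electricity figure from upthread.
    EJ_PER_TWH = 0.0036
    dc_ej = 540 * EJ_PER_TWH        # ~1.9 EJ/yr
    transport_ej = 110
    print(f"Data centres: ~{dc_ej:.1f} EJ/yr; transport: ~{transport_ej} EJ/yr")
    print(f"Data centres are ~{dc_ej / transport_ej:.0%} of transport's energy use")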
At the end of the day, green energy is a perfect fit for AI workloads.