Hard to say. Sam wrote that on June 10th this year: https://blog.samaltman.com/the-gentle-singularity
GPT-5 came out on 7th August.
Assuming the 0.34 value was accurate in the GPT-4o era, is the number today still in the same ballpark or is it wildly different?
the "AI" industry have identified that energy usage is going to be used as a stick to beat them with
if I were Altman then I'd release a few small numbers to try and get influencers talking about "how little energy ChatGPT uses"
and he can never be accused of lying, since without any methodology for how it was calculated it's unverifiable and completely meaningless
win-win!
I would bet that it's far lower now. Inference is expensive, but we've made extraordinary efficiency gains through techniques like distillation. That said, GPT-5 is a reasoning model, and those are notorious for high token burn. So who knows, it could be a wash. But the selective pressures to optimize for scale/growth/revenue/independence from MSFT/etc make me think that OpenAI is chasing those watt-hours pretty doggedly. So 0.34 is probably high...
...but then Sora came out.
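For a sense of scale, here's a rough back-of-envelope sketch of what a figure like 0.34 Wh/query adds up to. The query volume is an assumption picked purely for illustration, not anything OpenAI has published:

    # back-of-envelope: what does 0.34 Wh per query add up to over a year?
    # ASSUMPTION: 1 billion queries/day is a made-up illustrative number,
    # not anything OpenAI has published
    wh_per_query = 0.34
    queries_per_day = 1e9
    annual_wh = wh_per_query * queries_per_day * 365
    print(f"{annual_wh / 1e9:.0f} GWh/year")  # ~124 GWh/year under these assumptions

Inference only, of course - training and the embodied energy of the hardware are a separate bucket.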
Yeah, a couple of things we are confident about:
a) training is where the bulk of an AI system's energy usage goes (based on a report released by Mistral)
b) video generation is very likely a few orders of magnitude more expensive than text generation.
That said, I still believe that data centres in general - including AI ones - don't consume a significant amount of energy compared with everything else we do, especially heating and cooling and transport.
Pre-LLM data centres consume about 1% of the world's electricity. AI data centres may bump that up to 2%.
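Turning those percentages into rough absolute numbers (the ~30,000 TWh/year figure for global electricity generation is my own approximation, not something from the thread):

    # rough conversion of the 1% / 2% claims into absolute terms
    # ASSUMPTION: ~30,000 TWh/year of global electricity generation
    global_twh = 30_000
    print(f"pre-LLM data centres (~1%): ~{global_twh * 0.01:.0f} TWh/year")
    print(f"with AI data centres (~2%): ~{global_twh * 0.02:.0f} TWh/year")
    # ~300 vs ~600 TWh/year under these assumptions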
You mean this Mistral report? https://mistral.ai/news/our-contribution-to-a-global-environ...
I don't think it shows that training uses more energy than inference over the lifetime of the model - they don't appear to share that ratio.
> don't consume a significant amount of energy compared with everything else we do, especially heating and cooling and transport
Ok, but heating and cooling are largely not negotiable. We need those technologies to make places liveable.
LLMs are not remotely as crucial to our lives.
You gotta start thinking about the energy used to mine and refine the raw materials used to make the chips and GPUs. Then take into account the infrastructure and data centers.
The amount of energy is insane.
And yet it's still tiny in relation to transportation's energy requirements, and transportation itself is mostly stuck on fossil fuels.
At the end of the day, green energy is a perfect fit for AI workloads.