I think you're overestimating how much power LLMs consume. Let's say one video pegs a top-of-the-line Blackwell chip at 100% utilization for 10 minutes. I think a Blackwell chip (plus cooling and other data center overhead) draws somewhere around 3000 watts when running at 100%, so that's about 0.5 kilowatt-hours per video. I suspect this is a severe overestimate, because there's probably a significant amount of batching that cuts down on amortized power usage, and non-pro Sora 2 videos might be generated with weaker, smaller models, but I'm not very confident.
Data centers seem to have wholesale rates of around 4 cents per kilowatt-hour on the higher end.
That works out to 2 cents per video. If you're generating 50 million videos per day (an estimate on the high side of how many TikTok videos are uploaded daily), that costs you a million dollars a day.
So even if you subsidized the generation of every single TikTok video for free using LLMs, I don't think the energy costs would exceed 365 million dollars a year (and I suspect this severely overestimates costs, though there are some large error bars here).
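Here's a minimal sketch of that arithmetic in Python, using only the rough figures already stated above (~3 kW effective draw, 10 GPU-minutes per video, $0.04/kWh wholesale, 50 million videos per day); these are the same ballpark assumptions, not measured values:

```python
# Back-of-the-envelope: energy and electricity cost per AI-generated video.
chip_power_kw = 3.0          # Blackwell chip + cooling + datacenter overhead, ~3000 W
runtime_hours = 10 / 60      # 10 minutes at 100% utilization per video
energy_kwh = chip_power_kw * runtime_hours        # 0.5 kWh per video

price_per_kwh = 0.04         # wholesale rate, higher end
cost_per_video = energy_kwh * price_per_kwh       # $0.02 per video

videos_per_day = 50_000_000  # high-side estimate of daily TikTok uploads
daily_cost = videos_per_day * cost_per_video      # $1,000,000 per day
annual_cost = daily_cost * 365                    # $365,000,000 per year

print(f"energy/video: {energy_kwh} kWh, cost/video: ${cost_per_video}, "
      f"daily: ${daily_cost:,.0f}, annual: ${annual_cost:,.0f}")
```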
I'm pretty sure OpenAI (or any other company) would be happy to pay 365 million dollars a year for the soft social power of something like TikTok. The influence this buys in politics and social discourse would be worth the price tag alone.
And that's of course leaving aside any form of monetization whatsoever (where in reality you'd likely be charging the heaviest users the most).
N.B. I'm also not sure it's actually more power-efficient in absolute terms for users to post their own content. It seems not unlikely that the amount of energy it takes to produce, edit, and process a TikTok video exceeds half a kilowatt-hour. But maybe you're focused solely on the video host.
> It seems not unlikely that the amount of energy it takes to produce, edit, and process a TikTok video exceeds half a kilowatt-hour.
That would be really remarkable, considering the total energy capacity of a phone battery is in the neighborhood of 0.015 kWh.
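For reference, that 0.015 kWh figure follows from typical phone battery specs; the numbers below are illustrative (a hypothetical ~4,000 mAh cell at a ~3.7 V nominal voltage), not a specific model:

```python
# Energy capacity of a typical phone battery (illustrative figures).
capacity_mah = 4000          # ~4,000 mAh is common for recent phones
nominal_voltage = 3.7        # volts, typical Li-ion nominal voltage
energy_wh = capacity_mah / 1000 * nominal_voltage   # ~14.8 Wh
energy_kwh = energy_wh / 1000                        # ~0.0148 kWh
print(f"{energy_kwh:.4f} kWh")  # roughly 0.015 kWh, ~1/30th of the 0.5 kWh estimate
```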
Yeah, I should clarify. This is a very rough estimate of the total energy spent making a video that wouldn't otherwise exist, which includes things like lighting, transportation, video transcoding on the server, script writing, actor coordination, etc. E.g. if someone drives somewhere solely to make a video they otherwise wouldn't, that trip gets included.
I hedged with "not unlikely" because I'd need to think harder about how to amortize the more energy-expensive videos against the less expensive ones, and about how much energy you can actually attribute to a video versus the video being an add-on to an activity that would have happened anyway.
But it's not just the energy expenditure of a phone.
(I also think that 0.5 kilowatt-hours overestimates the energy expenditure by potentially up to two orders of magnitude, depending on how much batching is done, but my original comment did say 0.5 kWh.)
> Let's say one video pegs a top of the line Blackwell chip at 100% utilization for 10 minutes.
Where do you get this from?
You didn't include the amortized cost of a Blackwell GPU, which is an order of magnitude larger than the electricity expense.
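A rough sketch of that comparison, assuming (purely for illustration) a ~$35k Blackwell-class GPU amortized over three years of continuous use; neither figure is a quoted price or lifetime:

```python
# Compare amortized GPU capex against electricity cost for the same runtime.
# GPU price and lifetime below are assumptions for illustration only.
gpu_price = 35_000                     # USD, assumed Blackwell-class GPU price
lifetime_hours = 3 * 365 * 24          # assume 3 years of continuous operation
capex_per_hour = gpu_price / lifetime_hours        # ~$1.33/hour

electricity_per_hour = 3.0 * 0.04      # 3 kW * $0.04/kWh = $0.12/hour
print(capex_per_hour / electricity_per_hour)       # ~11x, roughly an order of magnitude
```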
Yeah that's fair (although the original comment was only talking about energy costs).
But this is kind of a worst-case cost analysis. I fully expect the average non-pro Sora 2 video to use one to two orders of magnitude less GPU time than I listed here (because I think those video tokens are probably generated at a batch size of around 100).
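A quick sketch of how that batching guess changes the per-video numbers, assuming the ~100-videos-per-batch figure above (an assumption, not a measured batch size):

```python
# If ~100 videos share the same GPU time via batching, the per-video energy
# and cost from the worst-case estimate drop by roughly two orders of magnitude.
worst_case_energy_kwh = 0.5            # per-video energy with no batching (from above)
batch_size = 100                       # assumed videos generated per batch

amortized_energy_kwh = worst_case_energy_kwh / batch_size    # 0.005 kWh per video
amortized_cost = amortized_energy_kwh * 0.04                 # $0.0002, i.e. 0.02 cents
print(amortized_energy_kwh, amortized_cost)
```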