It's not really a stable measure of compute, but it's a good indicator of burn rate: energy cost is something we already track closely across economies, and it dominates a lot of the cost of operating data centers, at least in the short term. Over time we'll get more tokens per unit of energy and spend fewer dollars on the hardware needed per unit of energy. Tokens are currently too abstract for a lot of people; they have no sense of the relationship between tokens per unit of time and cost. Long term, there's going to be a big shift from op-ex to cap-ex in energy as we move from burning methane and coal to renewables with storage.
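
To make that tokens-to-cost relationship concrete, here's a minimal back-of-the-envelope sketch in Python. Every number in it (power draw, throughput, electricity price) is an assumed placeholder for illustration, not a measured figure:

```python
# Back-of-the-envelope: energy cost per million tokens.
# All inputs below are hypothetical placeholders, not measured figures.

accelerator_power_kw = 0.7   # assumed average draw of one accelerator, kW
tokens_per_second = 100      # assumed serving throughput per accelerator
price_per_kwh = 0.08         # assumed industrial electricity price, $/kWh

# Energy consumed per token: power divided by tokens generated per hour.
kwh_per_token = accelerator_power_kw / (tokens_per_second * 3600)
energy_cost_per_million = kwh_per_token * 1_000_000 * price_per_kwh

print(f"{kwh_per_token * 1e6:.3f} kWh per million tokens")
print(f"${energy_cost_per_million:.4f} energy cost per million tokens")
```

With these placeholder numbers that works out to roughly 1.9 kWh and about $0.16 of electricity per million tokens; swap in your own measurements to see how throughput and power price trade off.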

We need a Moore's law for tokens, and one for energy.