It's a supply-demand gap, but since the reasons for it are very apparent, it's completely reasonable to describe it as "consumers paying for [the existence of] datacenters".
I don't see how? It's much more reasonable to state "all electrical consumers are paying a proportionate amount to operate the grid based on their usage rates". This is typically spelled out by the rate commissions and designed to make sure one power consumer is not "subsidizing" another.
In the case of your quoted article - taking it at face value - this means "everyone" is paying $0.02/kWh more on their bill. A datacenter is going to be paying thousands of times more than the average household, as it should.
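To make the proportionality concrete, here's a back-of-envelope sketch. The only figure taken from the article is the $0.02/kWh adder; the household usage (~900 kWh/month) and the 50 MW datacenter running around the clock are assumed numbers purely for illustration:

```python
# Back-of-envelope: a flat per-kWh adder scales with usage, so heavy
# users pay proportionally more. All consumption figures below are
# assumptions for illustration, not from the article.

ADDER_PER_KWH = 0.02  # the $0.02/kWh increase from the article

household_kwh = 900                 # assumed typical monthly household usage
datacenter_kwh = 50_000 * 24 * 30   # assumed 50 MW datacenter, 24/7 for 30 days

household_extra = household_kwh * ADDER_PER_KWH
datacenter_extra = datacenter_kwh * ADDER_PER_KWH

print(f"Household pays  ${household_extra:,.2f}/month extra")    # $18.00
print(f"Datacenter pays ${datacenter_extra:,.2f}/month extra")   # $720,000.00
print(f"Ratio: {datacenter_extra / household_extra:,.0f}x")      # 40,000x
```

Under those assumptions the datacenter's share is tens of thousands of times the household's, which is the "proportionate to usage" point.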
I don't see a problem with this at all. Cheap electricity is required to have any sort of industrial base in any country. Paying a proportionate amount of what it costs the grid to serve you seems about as fair of a model as I can come up with.
If you need to subsidize some households, then having subsidized rates for usage under the average household consumption level for the area might make sense?
I don't really blame the last watt added to the grid for the incremental uptick in costs. It was coming either way due to our severe lack of investment in dispatchable power generation and transmission capacity - datacenters simply brought the timeline forward a few years.
There are plenty of actually problematic things going into these datacenter deals. That they expose how fragile our grid is after 50 years of underinvestment is about the least interesting one to me. I'd start with local (and state) tax credits/abatements myself.