> consumers paying for electricity used by server farms
Wait, what? Consumers are literally paying for server farms? Isn't this just a supply-demand gap?
It's a supply-demand gap, but since the reasons for it are very apparent, it's completely reasonable to describe it as "consumers paying for [the existence of] datacenters".
I don't see how. It's much more reasonable to state that "all electrical consumers pay a proportionate share of operating the grid based on their usage." This is typically spelled out by the rate commissions and designed to make sure one power consumer is not "subsidizing" another.
In the case of your quoted article - taking it at face value - this means "everyone" is paying $0.02/kWh more on their bill. A datacenter is going to be paying thousands of times more than your average household, as it should.
I don't see a problem with this at all. Cheap electricity is required to have any sort of industrial base in any country. Paying a proportionate amount of what it costs the grid to serve you seems about as fair a model as I can come up with.
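Rough back-of-envelope arithmetic on that proportionality point. The $0.02/kWh figure is from the quoted article; the usage numbers below are hypothetical, picked only to illustrate the scale difference.

```python
# Sketch of the "everyone pays proportionally" point.
# Only the $0.02/kWh increase comes from the article; the usage
# figures are assumptions for illustration.

rate_increase = 0.02  # $/kWh added uniformly to every bill

household_kwh_per_month = 900          # roughly a typical household (assumed)
datacenter_kwh_per_month = 50_000_000  # a mid-sized facility (~70 MW average load, assumed)

household_extra = household_kwh_per_month * rate_increase
datacenter_extra = datacenter_kwh_per_month * rate_increase

print(f"Household pays about ${household_extra:,.2f}/month more")   # ~$18
print(f"Datacenter pays about ${datacenter_extra:,.2f}/month more")  # ~$1,000,000
print(f"Ratio: ~{datacenter_extra / household_extra:,.0f}x")          # tens of thousands of times more
```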
If you need to subsidize some households, then having subsidized rates for usage under the average household consumption level for the area might make sense?
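A minimal sketch of what that tiered-subsidy idea could look like: usage up to the area's average household consumption is billed at a subsidized rate, and anything beyond it at the standard rate. All of the numbers here are made up for illustration.

```python
# Hypothetical tiered billing: subsidized rate below the area-average
# threshold, standard rate above it. Every number is an assumption.

def monthly_bill(usage_kwh: float,
                 avg_household_kwh: float = 900.0,  # area average consumption (assumed)
                 subsidized_rate: float = 0.08,     # $/kWh below the threshold (assumed)
                 standard_rate: float = 0.14        # $/kWh above the threshold (assumed)
                 ) -> float:
    """Bill usage below the area-average threshold at a subsidized rate."""
    subsidized_kwh = min(usage_kwh, avg_household_kwh)
    standard_kwh = max(usage_kwh - avg_household_kwh, 0.0)
    return subsidized_kwh * subsidized_rate + standard_kwh * standard_rate

print(monthly_bill(600))   # small household: entirely at the subsidized rate
print(monthly_bill(1500))  # heavier user: first 900 kWh subsidized, the rest at the standard rate
```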
I don't really blame the last watt added to the grid for the incremental uptick in costs. It was coming either way due to our severe lack of investment in dispatchable power generation and transmission capacity - datacenters simply brought the timeline forward a few years.
There are plenty of actually problematic things going into these datacenter deals. That they expose how fragile our grid is after 50 years of underinvestment is about the least interesting one to me. I'd start with local (and state) tax credits/abatements myself.
No, it's a lie. Consumers paying more because data centers raise demand could be true, but that's not equivalent to them paying for the data centers' usage. The data centers also have to pay the increased rate when prices go up.
Data centers get commercial or maybe even industrial rates depending on their grid hookup, and utilities love predictable loads. Those rates are lower than residential rates. If you're dishonest and don't understand the cost of operating a grid, you could say that's users paying for data centers. But then you'd need to apply it to every commercial/industrial user.
If regular users were paying for data centers' usage, why are so many of them going off-grid with turbines, or at least adding partial on-prem generation?
The solution is more and cheaper energy.