I love the idea of high-quality code. I do, however, dismiss the energy-efficiency issue.
> Data centers already consume 200 TWh annually.
That's 7.2 × 10^17 joules (200 × 10^12 Wh × 3600 J/Wh)
Some estimates suggest that's low. Let's multiply by 5. Annually, that's
3.6 × 10^18 joules
The total radiant power of the sun hitting the top of Earth's atmosphere is about 174 PW, and about 47% of that lands on the surface. Even under the most drastic global warming models, essentially all of it is radiated back out at the same rate. Evidence: we're not charred embers. So the Earth's power budget at the surface is about 81 PW. Annually, that's
2.6 × 10^24 joules
Global annual oil production is about 3 × 10^10 barrels; at roughly 6.1 GJ per barrel, that's
1.8 × 10^20 joules
So, the data centers consume about 2% of the global petroleum budget and 1.4 × 10^-6 of the planet's power budget.
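The whole back-of-envelope fits in a few lines of Python, if you want to check the arithmetic. The only input not already stated above is the ~6.1 GJ per barrel-of-oil-equivalent conversion:

```python
# Sanity check of the figures above.
SECONDS_PER_YEAR = 3.156e7               # ~365.25 days

# Data centers: 200 TWh/yr, times 5 for the pessimistic estimates
datacenter_j = 200e12 * 3600 * 5         # 3.6e18 J

# Solar budget at the surface: 174 PW * 47%, over a year
solar_j = 174e15 * 0.47 * SECONDS_PER_YEAR   # ~2.6e24 J

# Oil: ~3e10 barrels/yr at ~6.1 GJ per barrel (standard BOE)
oil_j = 3e10 * 6.1e9                     # ~1.8e20 J

print(f"data centers / oil:   {datacenter_j / oil_j:.1%}")    # ~2.0%
print(f"data centers / solar: {datacenter_j / solar_j:.1e}")  # ~1.4e-06
```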
The digital economy is about 15% of global GDP, so I submit that tech is thoroughly on the right side of its energy-conservation balance sheet normalized against other industries, and there are other verticals more deserving of your energy concerns (e.g., combustion heat engines: Otto, Brayton, Rankine, and Diesel cycles). Among serious people trying to solve serious problems, you would do your credibility a favor by not expending so many words, and thus so much reader time, on such a minuscule argument.
If you want to talk about energy related to code, take the total power consumption of the software engineer's life and find a way to big-O that. I'm willing to bet they contribute more to global warming by driving than with all their compute jobs combined.
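For what it's worth, here's that bet as a Fermi template. Every input is an invented placeholder (commute length, fuel burn, average compute draw), so treat the output as illustrative, not as a measurement:

```python
# Fermi estimate: one engineer's driving vs. their compute, per year.
# All inputs are assumptions -- substitute your own.
commute_km = 30 * 230            # 30 km round trip, ~230 working days
gasoline_j_per_km = 2.5e6        # ~8 L/100 km at ~32 MJ/L
driving_j = commute_km * gasoline_j_per_km   # ~1.7e10 J

avg_compute_w = 150              # workstation + monitors + CI share, averaged 24/7
compute_j = avg_compute_w * 3.156e7          # ~4.7e9 J

print(f"driving: {driving_j:.1e} J/yr, compute: {compute_j:.1e} J/yr")
```

With those placeholders, driving comes out roughly 3-4x the compute; crank the compute assumption up to a GPU rig running flat out and the bet gets closer.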
Yes, the moral panic about power consumption from data centers has always been a red herring at best.
Given how cheap renewables are in 2025, and how far we still are from an inverted duck curve worldwide, it's absolute nonsense.