>people mistakenly repeating the conclusion that AI consumes huge amounts of water comparable to that of entire cities

Does it not?

"We estimate that 1 MWh of energy consumption by a data center requires 7.1 m3 of water." If Microsoft, Amazon and Google are assumed to have ~8000 MW of data centers in the US, that is 1.4M m3 per day. The city of Philadelphia supplies 850K m3 per day.

https://iopscience.iop.org/article/10.1088/1748-9326/abfba1/...
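Spelling out that arithmetic (a quick sketch; the 7.1 m³/MWh is the paper's estimate, the 8000 MW figure is the assumption):

    # Back-of-envelope for the numbers above. The capacity figure is an
    # assumption; the water intensity is the linked paper's estimate.
    INTENSITY_M3_PER_MWH = 7.1   # from the linked paper
    CAPACITY_MW = 8000           # assumed MSFT + AMZN + GOOG US capacity

    energy_mwh_per_day = CAPACITY_MW * 24                   # 192,000 MWh/day
    water_m3_per_day = energy_mwh_per_day * INTENSITY_M3_PER_MWH
    print(f"{water_m3_per_day:,.0f} m³/day")                # ~1,363,200 ≈ 1.4M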

Why do we need to assume so many things when we can peg it to reality?

Worldwide, Google's data centers averaged 3.7 GW in 2024. Globally, they used 8.135e9 gallons of water that year, which is 30.8e6 m³ per year, or 84e3 m³ per day. Double that to match the assumed 8 GW of data center capacity: 168e3 m³/day. QED: the estimate of 1.4e6 m³/day is high by roughly a factor of 8. Or, in other words, the entire information industry consumes the same amount of water as one very small city.
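A quick sketch of that conversion (the gallons are Google's reported figure; the 2x scale-up to ~8 GW is my assumption for comparability):

    # Google's reported 2024 water use, converted and scaled. The 2x
    # scale-up to ~8 GW of capacity is an assumption for comparability.
    gallons_2024 = 8.135e9
    m3_per_year = gallons_2024 * 3.785 / 1000               # ~30.8e6 m³/year
    m3_per_day = m3_per_year / 365                          # ~84.4e3 m³/day
    scaled_to_8gw = 2 * m3_per_day                          # ~169e3 m³/day
    print(f"paper's estimate is {1.4e6 / scaled_to_8gw:.1f}x higher")  # ~8.3x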

I believe this is why Google states its water consumption as equivalent to 51 golf courses: it gives a useful benchmark for comparison. But any way you look at it, the water consumption of the information sector is basically nothing.

All the golf courses where I live use grey water - water that would otherwise be dumped into oceans/estuaries/rivers/etc.

That's really not comparable to data centers using potable water.

Even the golf course trade association only claims 10% grey water use.

Also, you're going to be shocked: data centers can cool with grey water as well. The now-cancelled Project Blue data center near Tucson was going to build and operate a wastewater pipeline and treatment plant and hand it over to the city, but the shouting NIMBYs prevailed anyway. The developer now intends to use air-to-air cooling, which consumes more energy.

Yeah, but that is for everything: YouTube, Amazon itself, AWS, Azure, GCP, ... not just the AI stuff. I mean, it is still a lot of water, but the numbers are not that easy to calculate, IMHO.

Many, if not most, data centers are pulling water out of the ground that will never be replenished. The problem is multidimensional, not just a matter of volume.

Resource consumption of AI is unclear on two axes:

1) As other commenters have noted: raw numbers. In general, people take the resource consumption of new datacenters and attribute 100% of it to AI, when the reality is that while AI is increasing spend on new infrastructure, data companies are always building new infrastructure for everything else they do.

2) Comparative cost. In general, image synthesis takes between 80 and 300 times fewer resources (mostly electricity) per image than human creation does. It turns out a modern digital artist leaving their CPU idling and screen on while they muse soaks up significant resources, compared to what an AI uses to simply synthesize an image. Granted, this is also not an apples-to-apples comparison, because the average AI workflow generates dozens of draft images to find the one that is used, but the net effect may still be less energy spent per produced image overall, on a skew of "more spent by computers" and "less by people" (see the sketch below).
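To make that concrete, here is the shape of the comparison; every number below is an assumed illustration, not a measurement:

    # Illustrative only: every figure here is an assumption, not a measurement.
    ARTIST_WORKSTATION_W = 200       # assumed desktop + monitor draw
    ARTIST_HOURS_PER_IMAGE = 4       # assumed time for one finished image
    GPU_W = 400                      # assumed inference GPU draw
    SECONDS_PER_IMAGE = 25           # assumed generation time per image
    DRAFTS_PER_KEPT_IMAGE = 30       # "dozens of drafts" per image kept

    human_kwh = ARTIST_WORKSTATION_W * ARTIST_HOURS_PER_IMAGE / 1000
    ai_kwh_single = GPU_W * SECONDS_PER_IMAGE / 3_600_000
    ai_kwh_kept = ai_kwh_single * DRAFTS_PER_KEPT_IMAGE

    print(f"per generated image: {human_kwh / ai_kwh_single:.0f}x fewer")  # ~288x
    print(f"per kept image, with drafts: {human_kwh / ai_kwh_kept:.0f}x")  # ~10x

Under these assumptions the per-image advantage sits in the claimed 80-300x range, and the draft overhead eats most (but not all) of it.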

Comparing humans with machines on resource use gives some seriously dystopian vibes.

I agree, but that's what people are implicitly doing every time they toss out one of those "the machine drinks a glass of water every time it ..." statistics. We are to assume a human doesn't.

How much is it in burgers and steaks? Serious question.

Don’t forget cotton.

234 m³ per tonne of clean water.

25M tonnes per year.

=> 16M m³ of clean water per day

Edit: convert to comparable units
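The same, as one line of runnable arithmetic (inputs as above):

    # Cotton's clean-water use, in the same units as the data-center numbers.
    m3_per_day = 234 * 25e6 / 365        # m³/tonne * tonnes/year / days/year
    print(f"{m3_per_day:,.0f} m³/day")   # ~16M m³/day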

Philadelphia? 80K m³ of water for 10K lbs of beef per day. But that's mostly not potable water, whereas potable water is a lot of what data centers are using.
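Those two figures imply a per-kg footprint roughly in line with commonly cited beef water-footprint estimates (~15,000 L/kg); a quick back-derivation, with the inputs taken from the comment above:

    # Back-derive the implied beef water footprint from the figures above.
    water_m3 = 80e3
    beef_kg = 10e3 * 0.4536              # 10K lbs in kg, ~4,536 kg
    footprint = water_m3 / beef_kg       # ~17.6 m³/kg
    print(f"~{footprint * 1000:,.0f} L of water per kg of beef")  # ~17,600 L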