Barnes Tech


Hank Green put together a really good breakdown of the AI water use discourse. Sam Altman claims the average ChatGPT query uses about a fifteenth of a teaspoon of water. Meanwhile, Morgan Stanley projects AI data center water use could hit a trillion liters by 2028. Both numbers can technically be true, and that’s the problem. The “per query” framing conveniently ignores the lifecycle analysis—training models for weeks or months on massive GPU clusters, cooling those data centers with evaporative systems, and the water used by power plants generating all that electricity. It’s very easy to mislead by choosing what to include or exclude.
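One way to see why both figures can hold at once is to check whether they could even be measuring the same thing. A quick back-of-envelope sketch (the teaspoon conversion and the trillion-liter projection come from the figures above; the query-volume conclusion is my own arithmetic, not a sourced estimate):

```python
# Sanity check: how many queries would it take for the per-query
# figure alone to account for the projected total?

TSP_LITERS = 0.00492892          # one US teaspoon in liters
per_query_l = TSP_LITERS / 15    # Altman's ~1/15 teaspoon per query
projected_total_l = 1e12         # Morgan Stanley: ~1 trillion liters by 2028

implied_queries = projected_total_l / per_query_l
print(f"Water per query: {per_query_l * 1000:.3f} mL")
print(f"Queries needed to hit 1 trillion liters: {implied_queries:.2e}")
```

That works out to roughly 3 × 10¹⁵ queries a year, which is an implausibly large number of chat requests. So if the trillion-liter projection is anywhere near right, most of that water is being counted somewhere other than per-query inference, which is exactly the lifecycle gap described above.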

The more interesting takeaway is that water might not even be the biggest concern. Power consumption is where the real impact lies—it’s an order of magnitude larger relative to existing infrastructure and will hit carbon budgets and wallets harder. And as Hank points out, context matters: American corn production alone uses nearly 80 times more water annually than all AI servers worldwide. Resource analysis is genuinely complex, and anyone packaging it into neat headlines is probably fudging something.

Links
* https://www.youtube.com/watch?v=H_c6MWk7PQc
AI Water Use
https://barnes.tech/blog/ai-water-use
Author: Barnes Tech Blog
Published December 27, 2025