Every time you ask ChatGPT a question, water evaporates somewhere in the world. It sounds dramatic, but it's backed by peer-reviewed research.
How AI Consumes Water
AI models run on powerful servers housed in massive data centers. These facilities generate enormous amounts of heat, and the most common cooling method is evaporative cooling: water absorbs the servers' heat and is evaporated into the atmosphere, which keeps the hardware from overheating but consumes the water in the process.
But that's only half the story. The electricity powering these data centers comes largely from thermal power plants, which themselves consume significant amounts of water to generate energy. Researchers call these two pathways Scope 1 (direct cooling water) and Scope 2 (indirect water from electricity generation).
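The two-scope accounting above can be sketched as a simple formula: on-site cooling water scales with server energy via the data center's Water Usage Effectiveness (WUE), while off-site water scales with total facility energy (server energy times PUE) via the grid's water intensity. The numbers below are illustrative placeholders, not measurements from any specific facility:

```python
def water_per_kwh(wue, pue, ewif):
    """Liters of water consumed per kWh of server energy.

    wue  -- Water Usage Effectiveness: on-site cooling water, L/kWh (Scope 1)
    pue  -- Power Usage Effectiveness: total facility energy / server energy
    ewif -- Energy Water Intensity Factor: grid water use, L/kWh (Scope 2)
    """
    scope1 = wue          # direct evaporative-cooling water
    scope2 = pue * ewif   # indirect water from electricity generation
    return scope1 + scope2

# Illustrative values (assumptions, not measured figures):
# WUE 1.8 L/kWh, PUE 1.2, EWIF 3.1 L/kWh
total = water_per_kwh(1.8, 1.2, 3.1)
print(f"{total:.2f} liters per kWh of server energy")
```

How much water a single query then consumes depends on the energy that query draws, which varies widely by model and hardware.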
The Numbers
According to recent studies by Li et al. (2025) and data from the EESI, a single conversational exchange with a large language model can consume anywhere from 0.5 to 3 liters of water, depending on the model, the data center location, and the energy grid powering it.
To put that in perspective: at those rates, even a single query can consume more water than a standard 250 mL glass of drinking water. Multiply that across billions of daily queries worldwide, and the numbers become staggering.
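A back-of-the-envelope calculation shows how quickly per-query figures add up. The query volume here is an illustrative assumption, not measured traffic data; the per-query figure is the low end of the range cited above:

```python
def daily_water_liters(queries_per_day, liters_per_query):
    """Total daily water footprint for a given query volume."""
    return queries_per_day * liters_per_query

# Assumption: 1 billion queries per day at 0.5 L each (low end of the range)
liters = daily_water_liters(1_000_000_000, 0.5)
print(f"{liters / 1e6:.0f} million liters per day")  # 500 million liters per day
```

Even at the conservative end, that is on the order of hundreds of Olympic swimming pools of water every day.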
Why This Matters
Water scarcity is already a reality for over 2 billion people globally. As AI adoption accelerates — with estimates suggesting a 10x increase in compute demand by 2030 — the water footprint of artificial intelligence is becoming a sustainability concern that can no longer be ignored.
What You Can Do
The most impactful action at the individual level is surprisingly simple: write better prompts. An optimized prompt gets the right answer on the first try, reducing the need for follow-up queries and retries. Research suggests that the average unoptimized prompt requires approximately 2.5 attempts, while an optimized one needs only about 1.1, cutting the associated water consumption by more than half.
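The claimed savings follow directly from the attempt counts. A quick check of the arithmetic, using the 2.5 and 1.1 figures from the text:

```python
unoptimized_attempts = 2.5
optimized_attempts = 1.1

# Water use scales with the number of attempts,
# so the fractional saving is one minus the ratio of attempts.
savings = 1 - optimized_attempts / unoptimized_attempts
print(f"{savings:.0%}")  # 56%
```

At 56%, the "over 50%" reduction holds up.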
Tools like IacuWise help you optimize your prompts before sending them, showing you exactly how much water, energy, and CO₂ you save with each optimization.
The Bottom Line
AI is transforming how we work and live, but it comes with a hidden environmental cost. Being aware of that cost — and taking steps to minimize it — is the first step toward truly sustainable AI usage.