Rows of warehouse-sized structures hum nonstop somewhere in the high desert outside of Phoenix, Arizona, drawing water and electricity from an area that has been struggling with drought conditions for years. There are no conspicuous smokestacks or obvious indications of industrial stress. Just the quiet loss of a resource that local communities and farmers are already fighting to preserve, and the constant hum of cooling fans.
From the outside, this is what artificial intelligence looks like. It is not what most people picture.
Discussion of AI's environmental impact tends to focus on carbon emissions, which are real and significant. But the water story is arguably stranger and more immediate. An estimated 700,000 liters of water evaporated during the training of GPT-3: one model, trained once. That is the kind of figure that concerns a water utility, not just an environmentalist. Combine it with the billions of daily queries that keep those same data centers running, and with the dozens of large models trained every year, and the numbers become genuinely alarming.
A brief conversation with an AI chatbot, somewhere between ten and fifty back-and-forth exchanges, is estimated to evaporate about half a liter of water. That's a bottle of water gone for a quick email edit or a discussion about dinner recipes. Most people who use these tools have probably never given it a second thought. Why would they? The interface is clean, the response is instant, and the costs are hidden in a data center hundreds of miles away, in a state that is already rationing water for its own residents.
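Those per-conversation figures reduce to a back-of-envelope calculation. A minimal sketch, assuming the estimate above of roughly 500 ml per 10 to 50 exchanges; the function name and the one-billion-exchanges scaling scenario are illustrative assumptions, not reported data:

```python
# Back-of-envelope estimate of per-exchange water cost, using the
# article's figure of ~500 ml evaporated per 10-50 chat exchanges.

def water_per_exchange_ml(total_ml: float = 500.0,
                          min_exchanges: int = 10,
                          max_exchanges: int = 50) -> tuple[float, float]:
    """Return (low, high) estimated milliliters evaporated per exchange."""
    return total_ml / max_exchanges, total_ml / min_exchanges

low, high = water_per_exchange_ml()
print(f"~{low:.0f}-{high:.0f} ml of water per chat exchange")

# Hypothetical scaling: a service handling one billion exchanges per day
# would evaporate this many liters daily at the low-end estimate.
daily_liters = 1_000_000_000 * low / 1000
print(f"~{daily_liters:,.0f} liters/day at 1B exchanges")
```

Even at the low end of 10 ml per exchange, the totals climb quickly once usage reaches internet scale.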
The mechanics behind this are less obvious than the carbon story, so they are worth understanding. Data centers produce enormous amounts of heat. The servers running these models need more than electricity; they need to stay cool or they fail. Many facilities use evaporative cooling systems, which pass air over water and shed heat through evaporation. That water is not recycled. It disperses into the atmosphere, unrecoverable. A single 100-megawatt data center can consume roughly two million liters of water per day, in principle enough to supply drinking water to thousands of households. There are hundreds of facilities on this scale, many concentrated in regions where water availability is already a major policy issue.
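The facility-level number can be sanity-checked with simple arithmetic. A rough sketch, assuming a water usage effectiveness (WUE) of 0.8 liters per kWh; that value is an assumption chosen here only because it approximately reproduces the ~2 million liters/day figure for a 100-megawatt facility, and real WUE varies widely by site and climate:

```python
# Sketch of the relationship between facility power draw and cooling
# water, via "water usage effectiveness" (WUE, liters per kWh).
# The 0.8 L/kWh value is an illustrative assumption, not a measured figure.

def daily_water_liters(power_mw: float, wue_l_per_kwh: float = 0.8) -> float:
    """Estimate liters of water evaporated per day by evaporative cooling."""
    kwh_per_day = power_mw * 1000 * 24  # MW -> kWh over 24 hours
    return kwh_per_day * wue_l_per_kwh

print(f"{daily_water_liters(100):,.0f} liters/day")  # prints 1,920,000
```

The point of the sketch is the proportionality: water consumption scales directly with power draw, so every new megawatt of AI compute in an evaporatively cooled facility carries a water bill with it.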
| Field | Details |
|---|---|
| Topic | Water consumption by AI data centers and large language models |
| Key Statistic | AI data centers estimated to consume 312.5–764.6 billion liters of water annually by 2026 |
| Per-Interaction Cost | ~500 ml of water evaporated per 10–50 AI chat interactions |
| Single Data Center Usage | A 100-megawatt facility can consume ~2 million liters of water per day |
| GPT-3 Training Water Use | Estimated ~700,000 liters evaporated during training |
| GPT-3 Training Carbon Output | ~300,000 kg CO₂ — equivalent to roughly 125 New York–Shanghai round trips |
| Problem Regions | Arizona, Texas (USA), Chile — all water-stressed |
| Data Centers in Stressed Regions | ~Two-thirds of data centers built since 2022 |
| Main Cooling Method | Evaporative cooling — water is lost, not recycled |
| Potential Solution | Closed-loop cooling systems can reduce water use by 90–95% |
| Industry Transparency | Most tech companies do not fully disclose water usage data |

About two-thirds of data centers built since 2022 sit in water-stressed areas. Parts of Arizona, Texas, and northern Chile are on the list; in these places, city planners and agricultural communities are already making hard decisions about how water gets allocated. In some of them, a tension is growing between the resource strain that tech infrastructure brings and the tax revenue and jobs it generates. How those trade-offs get resolved remains unclear, and the debate is made harder by the fact that most large technology companies do not fully disclose their water consumption figures. A number that isn't public can't be disputed.
The bigger picture offers little comfort. Global demand for AI tools is growing, and the models being trained today are larger and more computationally demanding than their predecessors. By 2026, AI-related data centers are projected to consume between 312 and 764 billion liters of water per year, comparable to the amount of bottled water produced worldwide. The width of that range is telling in itself: part of the uncertainty exists because the industry has never been pushed toward enough transparency to produce better figures.
It would be unfair to ignore the solutions under development. Closed-loop cooling systems recycle water instead of evaporating it and can cut consumption by 90 to 95 percent. Some facilities have begun using non-potable water for cooling, saving treated freshwater for other uses. Building data centers in colder climates reduces the thermal load and the need for aggressive cooling in the first place. And researchers are developing more computationally efficient model architectures that need less power to run and, consequently, less cooling.
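The closed-loop savings claim translates directly into liters. A quick illustration, applying the stated 90 to 95 percent reduction to the ~2 million liters/day evaporative baseline mentioned earlier; pairing those two figures is an assumption for illustration, not a reported comparison:

```python
# Illustration of the 90-95% savings claim for closed-loop cooling,
# applied to an assumed ~2 million liters/day evaporative baseline.

def remaining_liters(baseline: float, savings_fraction: float) -> float:
    """Liters per day still consumed after the stated savings."""
    return baseline * (1.0 - savings_fraction)

baseline = 2_000_000  # liters/day, evaporative baseline from the article
for savings in (0.90, 0.95):
    remaining = remaining_liters(baseline, savings)
    print(f"{savings:.0%} savings -> ~{remaining:,.0f} liters/day")
```

A 100-megawatt facility's daily water bill dropping from millions of liters to the low hundreds of thousands is the difference between a regional policy problem and a manageable utility cost.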
However, it's difficult to ignore the fact that efficiency gains have consistently lagged behind demand growth. The models keep getting bigger. The volume of queries keeps climbing. And the water keeps evaporating. Whether the industry moves quickly enough to close that gap before it becomes a true resource crisis deserves more urgency than it currently receives, especially in places where water stress is already a daily reality.
