
As artificial intelligence scales up rapidly, the water usage tied to its infrastructure, especially data centers, is becoming a significant environmental concern, according to IEEE Spectrum. AI systems consume water in two major ways: directly, for server cooling and humidity control, and indirectly, via electricity generated at water-intensive power plants.
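To make those two channels concrete, here is a minimal back-of-envelope sketch in Python. The structure follows common data-center water accounting (direct use scales with on-site water usage effectiveness, or WUE; indirect use scales with total facility energy and the water intensity of the grid); the function name and every default parameter value are illustrative assumptions, not figures from the article.

```python
# Back-of-envelope model of the two water channels described above.
# All default values are illustrative assumptions, not reported figures.

def water_footprint_liters(it_energy_kwh: float,
                           wue_onsite_l_per_kwh: float = 0.5,  # assumed on-site cooling water per kWh
                           pue: float = 1.2,                   # assumed power usage effectiveness
                           ewif_l_per_kwh: float = 3.0         # assumed water withdrawn per kWh generated
                           ) -> float:
    """Estimate water tied to a given amount of server (IT) energy.

    Direct: cooling/humidity water used at the facility, scaling with IT energy.
    Indirect: water withdrawn by power plants for the electricity consumed,
    scaling with total facility energy (IT energy x PUE).
    """
    direct = it_energy_kwh * wue_onsite_l_per_kwh
    indirect = it_energy_kwh * pue * ewif_l_per_kwh
    return direct + indirect

# Example: 1,000 kWh of server compute
print(f"{water_footprint_liters(1_000):,.0f} L")  # -> 4,100 L under these assumptions
```

Under these assumed values, the indirect (power-plant) share dominates, which is one reason siting and grid mix matter as much as cooling design.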
One study projects that by 2027, global AI operations could withdraw between 4.2 and 6.6 billion cubic meters of water annually, an amount larger than the yearly withdrawals of small countries. In concrete terms, training a large model like GPT-3 may require hundreds of thousands of liters of water just for cooling and related infrastructure needs.
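As a rough sanity check on that per-model figure, one commonly cited external estimate puts GPT-3-scale training at roughly 1,300 MWh of energy; that number and the cooling intensity below are assumptions, not values from the article, but they land in the same order of magnitude.

```python
# Magnitude check for the "hundreds of thousands of liters" claim.
# Both numbers below are assumed, illustrative values.
training_energy_kwh = 1_300 * 1_000           # ~1,300 MWh of training energy, in kWh
direct_cooling_l = training_energy_kwh * 0.5  # assumed 0.5 L of cooling water per kWh
print(f"{direct_cooling_l:,.0f} L")           # -> 650,000 L: hundreds of thousands of liters
```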
Besides sheer volume, location matters. Data centers sited in water-stressed regions compound pressure on local supplies, particularly where design or regulatory safeguards are lacking.
At the same time, AI itself offers tools to reduce water usage elsewhere, such as optimizing irrigation in agriculture, improving wastewater treatment, detecting leaks, and predicting demand.
Mitigation strategies exist. Data centers are exploring advanced cooling technologies (liquid cooling, reclaimed or recycled water, more efficient heat exchangers), better site planning, hardware efficiency gains, and model optimization to reduce compute overhead.
AI’s water footprint isn’t trivial, and it’s growing. If industry, policymakers, and researchers don’t address this early, thousands of AI deployments could compound the strain on fragile water systems. But there’s also opportunity: the same tech driving up water demand can help us use water in smarter ways.