AI water consumption: Generative AI’s unsustainable thirst

Generative AI has been all the rage since OpenAI’s ChatGPT gave average users the ability to interact with artificial intelligence like it was a friend down the street. Now, it turns out the craze to build such programs took an unforeseen toll on water resources across the nation.

In its annual sustainability report, Microsoft, a multibillion-dollar investor in OpenAI, divulged that its data centers in Iowa and other areas consumed nearly 1.7 billion gallons of water in 2022. That’s 34% more than it used in 2021, and enough to fill 2,500 Olympic-sized swimming pools.
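
As a rough sanity check on those figures, here is a back-of-the-envelope sketch in Python. The only assumption added here is that an Olympic-sized pool holds about 2,500 cubic meters, or roughly 660,000 US gallons:

    # Rough check of the reported Microsoft figures.
    # Assumption: one Olympic-sized pool holds ~2,500 cubic meters (~660,000 US gallons).
    GALLONS_2022 = 1.7e9          # reported 2022 consumption, US gallons
    YOY_INCREASE = 0.34           # reported year-over-year increase
    GALLONS_PER_POOL = 660_000    # assumed volume of one Olympic-sized pool

    pools = GALLONS_2022 / GALLONS_PER_POOL
    gallons_2021 = GALLONS_2022 / (1 + YOY_INCREASE)

    print(f"~{pools:,.0f} Olympic-sized pools")                   # ~2,576 pools
    print(f"~{gallons_2021 / 1e9:.2f} billion gallons in 2021")   # ~1.27 billion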

While Microsoft hasn’t specifically said what led to the unusual surge, experts say it’s no coincidence it occurred while the company’s data scientists were believed to be training the large language models (LLMs) that power ChatGPT’s intelligence.

That conclusion about AI water consumption would seem to make sense since Google also reportedly guzzled more than 5.6 billion gallons of water in 2022, or 20% more than the previous year, while training LLMs for its generative AI tool, Bard.

AI, water, and data centers

In the race to meet business and consumer demand for next-generation AI tools, companies have been ramping up data center activity like never before to train models and respond to inquiries from tool users.

All of that extra compute makes data center equipment run hotter, and the evaporative cooling systems that keep temperatures in check consume a lot of water to do their job.

In fact, a large data center can use between 1 million and 5 million gallons of water a day, or as much as a town of 10,000 to 50,000 people, according to The Washington Post.
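
A rough sense of where volumes like that come from: evaporating water absorbs about 2.26 megajoules per kilogram, so the water an evaporative cooling system consumes scales with the heat it has to reject. The sketch below assumes a hypothetical 100 MW facility whose waste heat is removed entirely by evaporation; real cooling towers also lose water to blowdown, so treat it as a lower bound:

    # Back-of-the-envelope evaporative-cooling estimate (all inputs are assumptions).
    HEAT_LOAD_W = 100e6             # hypothetical 100 MW hyperscale data center
    LATENT_HEAT_J_PER_KG = 2.26e6   # latent heat of vaporization of water
    SECONDS_PER_DAY = 86_400
    LITERS_PER_GALLON = 3.785

    water_kg_per_day = HEAT_LOAD_W / LATENT_HEAT_J_PER_KG * SECONDS_PER_DAY
    gallons_per_day = water_kg_per_day / LITERS_PER_GALLON  # 1 kg of water is ~1 liter

    print(f"~{gallons_per_day / 1e6:.1f} million gallons evaporated per day")  # ~1.0
    # At a typical U.S. domestic use of roughly 100 gallons per person per day,
    # that is on the order of a town of 10,000 people.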

What’s more, a paper from University of California at Riverside researchers finds that ChatGPT consumes about 500 ml of water – a standard bottle’s worth – for every 10 to 50 questions it’s asked, depending on where its servers are hosted.
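
That range works out to roughly 10 to 50 milliliters per question – small on its own, but substantial once queries number in the millions. A simple conversion using only the paper’s reported figures:

    # Convert the reported figure (a 500 ml bottle per 10 to 50 questions)
    # into per-question and per-million-question water use.
    BOTTLE_ML = 500
    QUESTIONS_LOW, QUESTIONS_HIGH = 10, 50

    ml_per_question = (BOTTLE_ML / QUESTIONS_HIGH, BOTTLE_ML / QUESTIONS_LOW)
    liters_per_million_questions = tuple(ml * 1_000 for ml in ml_per_question)

    print(f"{ml_per_question[0]:.0f}-{ml_per_question[1]:.0f} ml per question")  # 10-50 ml
    print(f"{liters_per_million_questions[0]:,.0f}-{liters_per_million_questions[1]:,.0f} "
          f"liters per million questions")  # 10,000-50,000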

“This is unsustainable from an environmental, cost, and performance perspective,” Joe Capes, CEO of cooling systems developer LiquidStack, told Information Week. “Rising energy costs make this [approach] increasingly expensive, and the powerful processors required for today’s data-intensive technologies … simply generate too much heat for air cooling to handle.”

Gen AI: Where’s the water going?

There are several reasons why AI-related data centers are so hot and thirsty, including:

  1. High power density: AI servers heat up fast when processing the huge volumes of data needed to feed AI knowledge.
  2. Continuous operation: Data centers used for LLM training often run 24/7, which requires constant cooling.
  3. Energy efficiency: Water-based cooling isn’t dependent on outside temperatures, so it tends to be more efficient than alternative air-cooling systems.
  4. Scalability: As data centers scale to accommodate larger AI models, their increased power requires even more cooling to maintain their performance and reliability.

Shaolei Ren, who co-authored UC-Riverside’s study, says this energy-driven water consumption isn’t a concern in the short term because generative AI is still in its early stages. Long term, though, he says reports about high tech’s increased water usage should spur public debate about future conservation.

Both Microsoft and Google have publicly committed to being water positive – meaning they will replenish more water than they take – by 2030.

How to reduce AI’s environmental impact

Industry experts say there are several steps companies can take to ensure that generative AI does not seriously drain future water reserves.

If AI-supporting infrastructure needs large volumes of water for cooling purposes, it makes sense to locate it near lakes, rivers, and ponds. But if those bodies of water happen to exist in drought-troubled areas like the western US, that setup could lead to significant operational and business issues whenever water supplies are suddenly restricted or even cut off.

For that reason, Ren recommends that companies start looking at software that can load-balance AI training across locations or schedule it for cooler times of the day or year, so that less water is lost to evaporation during cooling.
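
What that kind of water-aware scheduling could look like is sketched below. It is a hypothetical illustration, not an actual tool: a simple picker that, given candidate sites and their hourly water usage effectiveness (WUE, liters of water per kWh of IT energy), chooses the slot expected to evaporate the least water for a training job. The site names and WUE values are made up:

    # Hypothetical water-aware scheduler: pick the (site, hour) slot whose
    # water usage effectiveness (WUE, liters per kWh of IT energy) is lowest.
    from typing import Dict, List, Tuple

    def pick_slot(hourly_wue: Dict[str, List[float]], job_kwh: float) -> Tuple[str, int, float]:
        """Return (site, hour, estimated liters) for the least water-intensive slot."""
        best = None
        for site, wues in hourly_wue.items():
            for hour, wue in enumerate(wues):
                liters = wue * job_kwh
                if best is None or liters < best[2]:
                    best = (site, hour, liters)
        return best

    # Cooler hours and cooler sites tend to have lower WUE because less
    # evaporation is needed; scheduling training there saves water.
    hourly_wue = {
        "site-desert": [1.9, 1.7, 2.4, 2.8],   # illustrative hot, dry location
        "site-nordic": [0.3, 0.2, 0.4, 0.5],   # illustrative cool location
    }
    site, hour, liters = pick_slot(hourly_wue, job_kwh=50_000)
    print(f"Run the job at {site}, hour {hour}: ~{liters:,.0f} liters of water")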

He adds that as people learn about the environmental impact of AI, companies need to be sensitive to how residents might view their plans to build local data centers. Google’s effort to establish a data center that would reportedly use 7.6 million liters of water per day (enough to cover the daily domestic use of 55,000 people) sparked fierce local protests in drought-stricken Uruguay.

Where possible, experts say, facilities should use equipment that relies on outside air for cooling. But when temperatures rise above 85°F – as they often do in hotter locations such as Phoenix or parts of East Asia – that may not be an option. In those situations, companies need to research and develop new cooling technologies that use less water.

Microsoft has done some work in this area by using adiabatic cooling, where air handler units push air over evaporative media to add humidity to the air and lower temperatures with minimal energy use. In Gävle, Sweden, it’s also capturing rainwater to inject cooling humidity into its data center whenever the outside air falls below 5% humidity.

Greater use of cooling systems that run on recycled rather than fresh water is another tactic, experts say.

Protecting precious water supplies

Ren says the public must demand transparency about water use and conservation commitments. AI solutions from companies that prove they’re doing their best to save water will be more attractive to customers, he adds.

Ren says there’s still time to build water conservation into AI training and technology, but time will run out if this issue isn’t taken as seriously as possible.

“Generally speaking, we haven’t come to the point yet where AI has tangibly taken away one of our most essential natural resources,” he says. “If we’re more mindful of the use of AI, I think we can definitely make sure the overall benefits of AI are positive.”
