AI queries are processed inside data centres, massive buildings holding thousands of racks of high-powered processing chips – GPUs. — Pixabay
Yes, every question you ask AI uses up water – and many are worried.
A recent University of Chicago survey revealed that 4 in 10 US adults are “extremely” worried about artificial intelligence’s environmental impact – and the Associated Press reports that many US citizens oppose data centres being built in their communities.
Much of the environmental concern centres on AI’s water usage. But scoping out the scale of that use can be difficult.
According to a Morgan Stanley report, AI data centres’ global annual water consumption is set to reach 1,068 billion liters by 2028, an estimate 11 times higher than last year’s projection. For context, each individual American uses roughly 243,174 liters (64,240 gallons) a year.
Yet earlier this year, OpenAI CEO Sam Altman claimed each ChatGPT query uses only “about 0.000085 gallons of water; roughly one fifteenth of a teaspoon.”
And while each claim paints a vastly different picture, neither is technically wrong, as science YouTuber Hank Green explained in a recent video. The difference boils down to when we start running the meter on AI’s water use.
“The actual lie is that it’s only counting the water used during the part of the life cycle when you’re actually querying the chatbot, not all of the other parts of the process that are necessary for this system to exist,” Green explains.
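Neither figure comes with full workings attached, but a rough back-of-envelope calculation, using only the numbers cited above plus a standard gallons-to-liters conversion, shows just how far apart the two framings are:

```python
# Back-of-envelope comparison of the two figures cited above.
# All inputs come from the article; only the unit conversion is added.

LITERS_PER_GALLON = 3.785

altman_per_query_gal = 0.000085                                  # Altman's per-query claim
altman_per_query_l = altman_per_query_gal * LITERS_PER_GALLON    # ~0.00032 liters

morgan_stanley_2028_l = 1_068e9                                  # 1,068 billion liters a year
per_american_per_year_l = 243_174                                # annual per-capita US use

# The 2028 projection expressed as Americans' worth of annual water use
people_equivalent = morgan_stanley_2028_l / per_american_per_year_l
print(f"~{people_equivalent:,.0f} people")                       # ~4.4 million

# Queries per year needed for the per-query figure to reach the projection,
# which also counts chip manufacturing and power generation
queries_needed = morgan_stanley_2028_l / altman_per_query_l
print(f"~{queries_needed:.1e} queries per year")                 # ~3.3e15
```

By that reckoning, the Morgan Stanley projection amounts to the annual water use of roughly 4.4 million Americans, and Altman’s per-query figure would need more than three quadrillion queries a year to add up to it. The gap is largely the rest of the life cycle Green is talking about.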
Why does AI need water anyway?
AI queries are processed inside data centres, massive buildings holding thousands of racks of high-powered processing chips – GPUs. It is here that AI models run their calculations and return responses to users via the cloud. Pushing all that electricity through the chips generates a lot of heat. Without cooling, the GPUs would overheat and fail.
Cooling a data centre is typically done with either an evaporative or a closed-loop cooling system, both of which involve water.
“The GPUs and the AI servers, they generate orders of magnitude more heat than the traditional [data centres], so you cannot use traditional data centre air-based cooling, like air conditioning,” says Pankaj Sachdeva, a senior partner at McKinsey, adding that liquid is simply more effective than air at heat exchange.
“Getting heat off the chip and getting heat out of the building are two different engineering problems with two different supply chains that happen to meet in the middle,” says Brandon Daniels, CEO of Exiger, one of the largest AI supply chain technology providers in the US. “The problem today is that cooling these new servers requires intricate systems that need expensive fluids, tons of good quality water, heavy filtration, and a massive amount of power.”
Traditional evaporative cooling towers are extremely power-efficient, but they achieve that by evaporating very large volumes of water, Daniels explains. Dry coolers and air-cooled chillers, on the other hand, use almost no water, but they have to work harder, so the electrical footprint increases.
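The trade-off Daniels describes follows from basic thermodynamics: evaporating water absorbs a great deal of heat without any compressor doing extra work. A rough sketch, using the textbook latent heat of vaporization of water (about 2,260 kJ per kilogram, an approximation that varies with temperature):

```python
# Rough physics behind the evaporative-cooling trade-off described above.
# Constants are textbook approximations, not measurements from any facility.

LATENT_HEAT_KJ_PER_KG = 2260   # heat absorbed when 1 kg of water evaporates
KJ_PER_KWH = 3600              # 1 kWh = 3,600 kJ

# Heat carried away "for free" by evaporating one liter (about 1 kg) of water
heat_removed_kwh = LATENT_HEAT_KJ_PER_KG / KJ_PER_KWH
print(f"~{heat_removed_kwh:.2f} kWh of heat per liter evaporated")   # ~0.63

# Water needed to shed 1 kWh of server heat by evaporation alone
liters_per_kwh = KJ_PER_KWH / LATENT_HEAT_KJ_PER_KG
print(f"~{liters_per_kwh:.1f} liters per kWh")   # ~1.6, the same ballpark as
# the "two liters per kilowatt-hour" estimate cited later in the article
```

That is why cooling towers sip electricity but gulp water, while dry coolers do the reverse.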
For the servers themselves, closed-loop, direct-to-chip cooling solutions are vital to keeping GPUs at optimal temperatures.
“We’re moving toward rear-door heat exchangers, Direct-to-Chip (D2C) cold plates that bring coolant right to the processor package, and immersion systems where entire servers live in dielectric fluids,” says Daniels. “That’s not a luxury upgrade anymore. For leading edge AI clusters, liquid is the precondition for the densities everyone is trying to build.”
But as industry giants like Nvidia promise to deliver ever more powerful processing units, deploying them will require more advanced cooling solutions.
“Some of the largest, most technologically advanced companies in the world are involved in trying to solve this problem. So there’s a lot of horsepower behind it,” says Don Schuett, SVP of Data Center Business Development at TierPoint, which manages over 40 data centres across the country. “That gives me some optimism that they continue to evolve solutions that allow us to keep up with Nvidia.”
So how much water does AI use?
Some estimates say that two liters of water might be needed for every kilowatt-hour of energy a data centre consumes, MIT Computing and Climate fellow Noman Bashir told MIT News.
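To get a feel for what that rate means at scale, consider a hypothetical 100 MW AI data centre running around the clock; the facility size here is an illustrative assumption, not a figure from Bashir or the article:

```python
# The "two liters per kilowatt-hour" estimate applied at facility scale.
# The 100 MW size is a hypothetical assumption, not a figure from the article.

liters_per_kwh = 2.0
facility_power_mw = 100        # hypothetical AI data centre, running flat out
hours_per_day = 24

kwh_per_day = facility_power_mw * 1_000 * hours_per_day   # 2.4 million kWh
liters_per_day = kwh_per_day * liters_per_kwh             # 4.8 million liters
print(f"~{liters_per_day / 1e6:.1f} million liters per day")
print(f"~{liters_per_day * 365 / 1e9:.2f} billion liters per year")   # ~1.75
```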
Though Altman says a fraction of a teaspoon of water is used per AI query, Green suggests that a more honest accounting of water consumption would include the water used in the process of creating the AI itself, which is how Morgan Stanley arrived at its estimate of 1,068 billion liters consumed annually by 2028.
The report outlines the technology’s water footprint as coming not only from on-site data centre cooling, but also from the electricity generation and semiconductor manufacturing needed to create the GPUs. When water consumption is traced through every step needed to answer a query, the numbers quickly add up.
While the numbers can be contradictory and confusing, even the most liberal water-use estimates look minor in comparison with other industries.
“Even under the maximalist goals of AI companies the projected increase of water use is small compared to what cities and industries already use,” Green said.
The Morgan Stanley report agrees, noting that “while the increase in AI’s water consumption is expected to be substantial, the absolute amount remains modest compared with traditional global water withdrawals across major sectors.”
For instance, Green highlights the environmental and water impact of existing industries in the US, particularly resource-intensive ones like agriculture.
In one instance, the Arizona-based local outlet Red Canary Magazine estimated that upwards of 177 million gallons of water are used daily to cool data centres in Maricopa County. Yet that would account for only 30% of what the agricultural industry uses.
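The comparison rests on simple arithmetic over the two figures in that estimate:

```python
# Arithmetic behind the Maricopa County comparison above.
# Both inputs come from the Red Canary Magazine estimate cited in the article.

data_centre_gallons_per_day = 177e6    # daily water used to cool data centres
share_of_agriculture = 0.30            # that figure is ~30% of agricultural use

agriculture_gallons_per_day = data_centre_gallons_per_day / share_of_agriculture
print(f"~{agriculture_gallons_per_day / 1e6:.0f} million gallons per day")  # ~590
```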
Still, it is worth monitoring the growing technology’s water consumption and energy use, although AI companies like OpenAI are making it harder for others to do so.
“OpenAI doesn’t share this information, which is part of why it is so easy to get numbers that are both fairly correct and very different from each other,” Green said. “And part of why it’s so easy to lie about this from either direction.” – Inc./Tribune News Service
