Public water supplies in America will need billions of dollars in investment to meet the peak requirements of datacenters during the hottest periods of the year, even if the datacenters' overall annual consumption is relatively modest.
A study by researchers at the University of California, Riverside, acknowledges that water is an efficient means of cooling for server farms, which are looking to minimize their power usage.
But it warns that the growing water demand will lead to substantial peak withdrawals, which many communities in the US do not have the capacity to supply, particularly during the hottest days of the year.
Without new water efficiencies, datacenters across America may require 697 million to 1.45 billion gallons of extra peak water capacity per day by 2030, the study estimates. This compares with New York City's daily water supply of about a billion gallons.
Even with the most optimistic projected water use reductions, the new capacity required could amount to half of New York's supply for most of the year, says the report.
Water consumption by server campuses has become something of a hot topic, with some operators disputing that their usage represents a problem.
Datacenter cooling typically occurs in two stages, the study explains.
The first is server-level cooling, which transfers heat from the IT equipment to an intermediate facility-level heat exchanger through either air-based or closed-loop liquid cooling. This typically does not involve direct water consumption.
The second stage is facility-level cooling, which transfers heat from the facility to the outside environment. This may involve water consumption depending on the technology employed, such as cooling towers that rely on evaporation, or air-cooled systems supplemented by direct evaporation or adiabatic cooling to reduce the peak power demand during the hottest days of the year.
A large server farm that relies on evaporative cooling can suck up millions of gallons of water per day during the hottest periods of the year, the report says, significantly more than at other times.
Not all water withdrawn by datacenters is "consumed" by being evaporated or otherwise removed; some is discharged or returned. But any water that is taken in by a server farm is not available for other users, which is where the problem arises.
The report states that there are roughly 50,000 community water systems across the US, of which approximately 40,000 are small systems each serving no more than 3,300 people. About 9,000 are medium-sized, and only 708 are large systems serving upward of 100,000 people.
Nearly all hyperscale and colocation facilities across the country are supplied by community water systems (mostly from potable sources), with only a few drawing their H2O from private groundwater sources.
Such public systems are designed to safely and reliably meet maximum demand at all times, with additional margins to allow for extreme conditions such as prolonged heatwaves and droughts.
Depending on local climate conditions and cooling system design, the UC Riverside team estimates that a 100 MW IT load will require approximately 0.5 to 2.5 million gallons per day (MGD) of water. And a 100 MW load is modest compared with the gigawatt-scale AI facilities being planned for deployment across America.
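To put those figures in context, here is a minimal back-of-the-envelope sketch in Python. It takes the study's 0.5 to 2.5 MGD per 100 MW range as given and assumes, purely for illustration, that peak demand scales linearly with IT load; the function name and the example loads are ours, not the researchers'.

```python
# Illustrative scaling of the UC Riverside range (0.5-2.5 MGD per 100 MW).
# Linear scaling with IT load is an assumption made here for illustration.

MGD_PER_100MW_LOW = 0.5   # lower bound of the study's range
MGD_PER_100MW_HIGH = 2.5  # upper bound of the study's range

def peak_water_demand_mgd(it_load_mw: float) -> tuple[float, float]:
    """Rough peak water demand range (in MGD) for a given IT load in MW."""
    scale = it_load_mw / 100.0
    return (scale * MGD_PER_100MW_LOW, scale * MGD_PER_100MW_HIGH)

for load_mw in (100, 500, 1000):  # a gigawatt-scale campus is 1,000 MW of IT load
    low, high = peak_water_demand_mgd(load_mw)
    print(f"{load_mw:>5} MW IT load -> roughly {low:.1f} to {high:.1f} MGD at peak")
```

Under that simple scaling, a gigawatt-scale campus would land somewhere between 5 and 25 MGD at peak, which helps explain why the report doubts many public water systems could absorb such loads.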
After accounting for operational safety margins and reliability headroom, it may be difficult for many public water systems to support the needs of an evaporation-cooled 100 MW IT load, let alone giant facilities.
In fact, the researchers claim that many datacenter projects have required substantial upgrades to local water infrastructure, even when their peak water demand was as low as 0.1 MGD.
Overall, the report concludes that US server farms are projected to require 697 to 1,451 MGD of new water capacity, a volume comparable to New York City's average daily supply, at a cost of up to $58 billion.
The authors recommend that datacenter operators report peak water use, not just yearly averages, to aid in planning. They might also partner with local communities to fund water infrastructure upgrades, and work more closely with utilities by adjusting cooling methods. The latter would see them use water-based cooling when the power grid is stressed, but switch to dry cooling when the community water system is stressed. No telling what would happen on peak-heat days when both systems are stressed.®
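As a rough illustration of that switching idea, here is a minimal sketch assuming a simple binary notion of "stressed" for both the power grid and the water system. The function name and the tie-breaking choices are hypothetical and are not taken from the study.

```python
from enum import Enum, auto

class CoolingMode(Enum):
    EVAPORATIVE = auto()  # water-hungry, but eases the power grid
    DRY = auto()          # power-hungry, but eases the water system

def choose_cooling_mode(grid_stressed: bool, water_stressed: bool) -> CoolingMode:
    """Pick a facility cooling mode under the grid/water trade-off described in the report.

    Evaporative cooling relieves the power grid at the cost of water; dry cooling
    relieves the water system at the cost of power. The study leaves open what
    happens when both are stressed; defaulting to dry cooling in that case (and
    when neither is stressed) is an assumption made here for illustration.
    """
    if grid_stressed and not water_stressed:
        return CoolingMode.EVAPORATIVE
    if water_stressed and not grid_stressed:
        return CoolingMode.DRY
    return CoolingMode.DRY  # both stressed, or neither: illustrative default only
```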
Source: The Register