Researchers report US data centers may need 697–1,451 million gallons per day of new peak water capacity by 2030 in a baseline scenario, even if national totals stay small. Model local peak-day water constraints, not just annual averages, when planning new clusters.

The researchers model three scenarios through 2030 using 2024 operator reports and public utility data; the baseline case gives the headline number of 697–1,451 MGD of new peak capacity. The paper's abstract adds an optimistic case where industry-wide water-use intensity falls 10% per year, cutting the required new capacity to 227–604 MGD.
That framing matters for infrastructure planning because the paper is about capacity, not just annual consumption. As the thread notes, annual data-center water use in the baseline still reaches 60–110 billion gallons, but the operational bottleneck is whether local systems can supply cooling loads on the hottest days.
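A quick unit conversion makes the peak-versus-average gap concrete. The sketch below turns the baseline's 60–110 billion gallons per year into an average daily rate in MGD; the helper name is ours, and both input figures come from the summaries above:

```python
# Baseline annual use from the paper summary: 60-110 billion gallons/year.
# Converting to an average daily rate (MGD) shows why peak-day capacity,
# not annual consumption, is the binding constraint.
def annual_bgal_to_avg_mgd(billion_gal_per_year: float) -> float:
    """Billion gallons per year -> million gallons per day (average)."""
    return billion_gal_per_year * 1_000 / 365

for bgal in (60, 110):
    print(f"{bgal}B gal/yr ≈ {annual_bgal_to_avg_mgd(bgal):.0f} MGD average")
```

Even the top of the annual range averages roughly 300 MGD, well under the 697–1,451 MGD of new peak capacity the baseline calls for, so sizing to annual averages would understate the requirement severalfold.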
The paper also proposes a policy mechanism: the summary says operators should fund expansion of local water capacity before new server clusters connect. For engineers working on site selection, that turns water from a generic sustainability metric into a hard dependency alongside power interconnects and transmission queues.
The national numbers can look small while still breaking specific utility systems. The thread calls out Northern Virginia as the clearest hotspot, with similar pressure in Georgia, Indiana, Wisconsin, Iowa, and Oregon; single facilities can draw 1–8 MGD and take a large share of local public supply.
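To see how one facility can strain a small system, here is a toy share calculation. The 1–8 MGD facility draws and the ~1,000 MGD NYC-scale figure come from the thread; the 40 MGD municipal supply is a hypothetical number chosen only to illustrate scale:

```python
# Facility draws of 1-8 MGD are from the thread; the 40 MGD utility
# supply below is hypothetical, chosen to illustrate scale.
def supply_share(facility_mgd: float, utility_mgd: float) -> float:
    """Fraction of a utility's daily supply taken by one facility."""
    return facility_mgd / utility_mgd

# An 8 MGD facility on a hypothetical 40 MGD municipal system:
print(f"{supply_share(8, 40):.0%} of local supply")
# The same draw against NYC-scale supply (~1,000 MGD, per the thread):
print(f"{supply_share(8, 1000):.1%} of NYC-scale supply")
```

The same 8 MGD draw is a rounding error for a large city but a fifth of a small system's supply, which is exactly the local-versus-national split the paper emphasizes.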
That local-versus-national split is also the core rebuttal in the opposing blog. The author argues that 2023 direct onsite data-center use was about 50 MGD, or 0.04% of total US water use, and that AI itself was only a fraction of that. But the same post acknowledges that “individual data centers can impact local water systems,” which lines up with the paper's claim that planning errors show up first in host communities, not in national totals.
For operators, the engineering takeaway is simple: annual ESG reporting will miss the binding constraint if the deployment depends on evaporative cooling during summer peaks. In the paper summary, that is exactly where “aging local public water systems” run into the largest stress.
Epoch AI estimates that NVIDIA, Google, AMD, and Amazon consumed nearly all high-bandwidth memory and advanced packaging tied to frontier AI chips in 2025. Track this if you are planning compute, custom silicon, or open-weight infrastructure strategy.
Release: OpenClaw shipped version 2026.3.22 with ClawHub, OpenShell plus SSH sandboxes, side-question flows, and more search and model options, then followed with a 2026.3.23 patch. Teams get a broader plugin surface, but should patch quickly and review plugin trust boundaries as the ecosystem grows.
Release: Cursor shipped Instant Grep, a local regex index built from n-grams, inverted indexes, and Bloom filters that drops large-repo searches from seconds to milliseconds. Faster candidate retrieval shortens the coding-agent loop, especially when ripgrep-style scans become the bottleneck.
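The general n-gram-index idea behind this class of tools can be sketched in a few lines: a query's trigrams narrow the candidate file set via an inverted index, and only the candidates get a real scan. This is an illustrative toy, not Cursor's actual implementation, and all names here are invented:

```python
from collections import defaultdict

def trigrams(text: str) -> set[str]:
    """All 3-character substrings of text."""
    return {text[i:i + 3] for i in range(len(text) - 2)}

class TrigramIndex:
    """Toy trigram inverted index (not Cursor's implementation)."""

    def __init__(self):
        self.postings = defaultdict(set)  # trigram -> {file_id}
        self.files = {}                   # file_id -> text

    def add(self, file_id: str, text: str) -> None:
        self.files[file_id] = text
        for t in trigrams(text):
            self.postings[t].add(file_id)

    def candidates(self, literal: str) -> set[str]:
        """Files containing every trigram of the query literal."""
        grams = trigrams(literal)
        if not grams:                     # query too short to filter
            return set(self.files)
        return set.intersection(*(self.postings.get(t, set()) for t in grams))

    def search(self, literal: str) -> set[str]:
        # Verify candidates with a real scan to rule out false positives.
        return {f for f in self.candidates(literal) if literal in self.files[f]}
```

Real engines layer regex-to-trigram query planning and per-file Bloom filters on top, but the speedup comes from the same shape: cheap set intersection first, expensive matching only on survivors.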
Breaking: ChatGPT now saves uploaded and generated files into an account-level Library that can be reused across conversations from the web sidebar or recent-files picker. It removes repetitive re-uploading and makes past PDFs, spreadsheets, and images part of a persistent working context.
Breaking: Epoch AI says GPT-5.4 Pro elicited a publishable solution to one 2019 conjecture in its FrontierMath Open Problems set, with a formal writeup planned. Treat it as an early milestone worth reproducing, not blanket evidence that frontier models can already automate math research.
New paper on US data center water usage and future directions. - On a national level, data centers will continue to use a very small percentage of our total water supply by 2030 compared to farming and everyday public use. - However, companies currently use evaporative cooling […]
US data centers are projected to need 697–1,451 MGD of new peak water capacity by 2030 in the baseline case, roughly equal to New York City's entire daily supply of 1,000 MGD, against a national public supply total of 35,400 MGD. The analysis by UC Riverside, Rochester Institute […]
Paper on data center water use makes two points: 1) National data center water use in 2030 will remain “modest” compared to total public water supply (1.8%–3.7%) or agriculture (0.6%–1.2%). 2) For some localities, serving peak demand could be a big deal & require new infrastructure.