Tanya Rastogi, Stanford University
Artificial intelligence entails profound water and energy costs, much of which is shouldered by rural communities in the United States. Tax abatements and discounted utility rates have encouraged corporations to construct data centers in regions with inexpensive land and political compliance. This environmental burden, however, is not abstract. Research on current AI models suggests that a 100-word prompt can consume roughly the equivalent of a bottle of water, depending on model size, efficiency, and the cooling systems employed.
Data centers require vast quantities of electricity to power continuously operating servers, much of which is still generated from fossil fuels. Water also plays a critical role in preventing chip overheating, with evaporative cooling systems capable of consuming millions of gallons annually. While a single prompt seems insignificant, the aggregate of billions of interactions scales resource use dramatically. Most users are far removed from that consumption, but its implications for water supplies are immediate and concrete.
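The scaling argument above can be made concrete with a back-of-envelope calculation. The figures below are illustrative assumptions, not measurements: roughly half a liter per prompt (the "bottle of water" estimate cited earlier) and a hypothetical one billion prompts per day.

```python
# Back-of-envelope estimate of aggregate water use from per-prompt figures.
# Both inputs are illustrative assumptions, not measured industry values.

LITERS_PER_PROMPT = 0.5            # assumed: ~one small bottle per 100-word prompt
PROMPTS_PER_DAY = 1_000_000_000    # assumed: one billion prompts per day, industry-wide

daily_liters = LITERS_PER_PROMPT * PROMPTS_PER_DAY
annual_gallons = daily_liters * 365 / 3.785  # convert liters to US gallons

print(f"Daily water use: {daily_liters / 1e6:.0f} million liters")
print(f"Annual water use: {annual_gallons / 1e9:.1f} billion gallons")
```

Under these assumed inputs, the total comes to tens of billions of gallons per year; the point is not the exact number but how quickly a per-prompt quantity that seems trivial compounds at scale.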
AI competition with China has further accelerated US data center construction, with policymakers framing AI dominance as both an economic and national security imperative. Comparisons to China's DeepSeek AI model, objectively more efficient than domestic counterparts by many metrics including power and speed, have reinforced the urgency that drives the building of this infrastructure. Expedited permitting and public subsidies have made it easier for firms to expand without appropriate scrutiny of environmental trade-offs. Rural counties, many of which face economic stagnation after declines in agriculture or manufacturing, are drawn in by promises of employment and increased tax bases.
Unfortunately, research indicates that modern data centers are highly automated and generate relatively few long-term jobs compared to their environmental demands. xAI's Colossus data center in Memphis, Tennessee, for example, is predicted to generate millions of dollars in health-related costs in addition to significant pollution. In Bessemer, Alabama, the $14.5 million ‘Project Marvel’ is expected, at full operation, to consume over 90 times the energy of all the city's residences combined. Such a mismatch fosters uncertainty: communities gamble resources on facilities whose returns may never fully materialize.
Federal policy reflects this acceleration. The U.S. AI Action Plan, released in July 2025, outlines steps to reduce regulatory barriers for technology firms, promote “American values,” and integrate AI into federal operations, including the United States Department of Defense. At the same time, water-constrained regions are developing frameworks to evaluate this increase in resource use. In Loudoun County, home to one of the world’s densest concentrations of data centers, officials have begun assessing whether high-volume water users meet standards of immediacy and public necessity. Internationally, the United Kingdom has moved toward mandatory reporting of water use for data centers and is enforcing stricter standards for site selection. In response to mounting scrutiny, major firms have begun announcing sustainability commitments. Google and Microsoft have pledged to become “water positive” by 2030, meaning they intend to replenish more water than they consume. Critics note, however, that “water positive” accounting varies widely in methodology and measurement.
Engineering advancements do offer some promise. Closed-loop cooling systems recycle water rather than releasing it through evaporation, and free-cooling techniques limit reliance on water-based systems. A 2024 report from the U.S. Department of Energy highlights the need for further research to determine appropriate investments in grid modernization and nuclear energy to sustain AI growth without overwhelming fuel-dependent grids. Interestingly, federal briefs also suggest deploying AI itself to analyze fragmented infrastructure data and recommend optimized investment strategies.
Users of LLMs should remain conscious of the environmental footprint associated with their digital activity. Still, efficiency gains are evolving rapidly, and assumptions about water use per prompt may shift as technologies improve.
The distribution of AI’s costs remains fragmented. Rural communities often shoulder the burden of infrastructure expansion for minimal returns. As federal policy accelerates the integration of AI into a greater range of systems and structures, choices made now will determine whether the newest revolution has a positive effect on the US, the world, and our water supply.
