New research reveals how strategic data center placement could slash water usage by 90%

America’s data centers consumed a staggering 228 billion gallons of water in 2023—and that number is set to skyrocket as artificial intelligence drives unprecedented demand for computing power. But groundbreaking research from Cornell University offers a surprisingly simple solution: build these energy-hungry facilities where the wind blows and the sun shines.

The water crisis extends far beyond what most people realize. While data centers directly use about 17 billion gallons annually for cooling their heat-generating servers, the real culprit lies hidden in the power grid. Traditional power plants—whether coal, gas, or nuclear—require massive amounts of water to generate steam and produce electricity, accounting for over 70% of data centers’ total water footprint. Even hydroelectric dams contribute to water loss through reservoir evaporation.
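The breakdown above can be sanity-checked with some back-of-envelope arithmetic. The sketch below uses only the figures cited in this article (228 billion gallons total, 17 billion direct, grid generation at over 70% of the total); the 70% share is treated as a lower bound for illustration, not a precise datum.

```python
# Rough breakdown of the water footprint figures cited in the article.
TOTAL_GALLONS = 228e9    # total 2023 U.S. data center water footprint
DIRECT_GALLONS = 17e9    # direct on-site cooling use
GRID_SHARE = 0.70        # grid generation's share of the total (">70%")

direct_share = DIRECT_GALLONS / TOTAL_GALLONS
grid_gallons = GRID_SHARE * TOTAL_GALLONS

print(f"Direct cooling: {direct_share:.1%} of total")            # ~7.5%
print(f"Grid-related: at least {grid_gallons / 1e9:.0f}B gallons")
```

In other words, direct cooling accounts for well under a tenth of the footprint, which is why the article's siting argument centers on the power mix rather than on cooling technology.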

Cornell researchers discovered that strategic location choices could reduce environmental impact by up to 100-fold. Their top recommendation might surprise you: bone-dry West Texas. Despite its arid climate, the region’s abundant wind energy dramatically reduces grid-related water consumption, while sparse population and available groundwater make direct cooling feasible. Montana, Nebraska, and South Dakota also emerged as prime locations for similar reasons.

The findings challenge conventional wisdom about the water-rich Pacific Northwest, where heavy reliance on hydropower actually increases water footprints despite the region’s lower electricity prices. With tech companies pouring hundreds of billions into AI infrastructure, these location decisions could determine whether the digital revolution becomes an environmental disaster or a model of sustainable development. The choice is literally in their hands—and in where they choose to build.