As artificial intelligence sweeps through every corner of the global economy, America is racing to build the data centers to power it all at a breakneck pace. But a recent study out of Cornell University suggests that many of them are being built in the wrong places, offering the clearest national map yet of where the next wave of AI infrastructure should go.
The United States is in the middle of a data center buildout unlike anything in recent industrial history. The biggest names in tech are projected to pour hundreds of billions of dollars into AI infrastructure in 2025 alone — far more than they spent even a year ago — and the pace keeps accelerating. Since the launch of ChatGPT in 2022, the country’s AI spending has jumped from $13.8 billion to $41.2 billion per year. That’s a 200 percent increase in just three years.
But the computing behind modern AI models is extraordinarily energy-intensive. A single chatbot query burns about ten times the electricity of a standard web search. And answering those prompts requires a constant supply of potable water to keep the servers cool. Communities near major build sites are already reporting dwindling water supplies and rising utility bills as a result. If this trajectory continues, analysts say data centers could account for roughly one-eighth of all U.S. electricity consumption within a few years, tripling their current share to about 12 percent. That same pattern is unfolding globally as well.
Where Should America’s Data Centers Go?
Cornell University researchers found that AI data centers should be sited in regions with abundant water, clean-energy potential and stable grids. Their top picks fall across the Midwest and Plains, including Nebraska, South Dakota, Montana and parts of Texas.
Yet, even amid this warp-speed expansion, the country still doesn’t have a clear sense of what the AI boom is doing to U.S. resources. We don’t really know how much strain these data centers put on aquifers, power plants or local grids, or how much pollution nearby communities can reasonably absorb. Left unchecked, AI’s rapid growth could gulp massive amounts of water and drive carbon emissions higher and higher with each passing year.
Where we choose to construct these facilities is one of the strongest levers we have to blunt those impacts — but strategic and sustainable planning does not seem to be driving those decisions at the moment. Instead, data centers are continuing to cluster in the handful of states that offer tax incentives, dense fiber networks and established cloud computing hubs, rather than in regions better positioned to handle the long-term environmental costs these facilities pose. Meanwhile, the spending commitments keep climbing. Meta recently announced plans to invest $600 billion into U.S. infrastructure projects — including AI-focused data centers — by 2028. Not to be outdone, OpenAI has committed $1.4 trillion over the next eight years to build “the infrastructure for a future economy powered by AI.”
Mapping tools show more than four thousand data centers nationwide, with over six hundred in Virginia alone and hundreds more in drought-prone California. Without a strategic siting plan, the incoming wave of AI infrastructure could strain the very systems it relies on. The most prudent play, as the team at Cornell University reveals, is to leverage the nation’s natural geographic advantages — not fight against them.
In short: If the United States wants to curb the climate impacts of the AI boom, it must rethink where it builds.
Where Are America’s Data Centers Being Built?
The regions with the highest number of data centers — Northern Virginia, Silicon Valley and the Southwest — come with some notable environmental drawbacks.
Virginia
Virginia is the densest data-center hub in the country, with more than 500 facilities concentrated along the “Data Center Alley” corridor, which handles roughly 70 percent of all internet traffic globally. The state’s dense fiber networks, stable power grid and generous tax incentives have made it a magnet for hyperscale tech companies, with Virginia alone accounting for about 25 percent of the nation’s total data center load. But the rapid expansion is driving up electricity demand, threatening to exceed the state’s clean-energy capacity and put additional pressure on the grid.
California
California’s chronic water scarcity clashes with the heavy cooling needs of its many data centers. A single large-scale facility can use 300,000 to 550,000 gallons of drinking water per day. And because most centers use evaporation to cool everything down, a one-megawatt operation can lose millions of gallons a year. On top of that, electricity consumption and on-site water use in the state’s data centers nearly doubled between 2019 and 2023 as AI workloads surged. Despite this growing demand, transparency around data center water usage remains weak. In fact, Governor Gavin Newsom recently vetoed AB 93, a bill that would have required facilities to disclose their water usage.
Arizona and the Broader Southwest
Arizona, New Mexico, Nevada and parts of Utah sit in one of the most water-stressed regions in the country. Arizona alone draws more than 40 percent of its water from the rapidly declining Colorado River. Yet the region continues to attract massive data center projects, from Google’s complex in Mesa to Meta’s expanding footprint in New Mexico.
What draws these companies in isn’t environmental factors, but legacy advantages, like the region’s long-established fiber routes, aggressive tax incentives and deeply rooted cloud infrastructure. It’s a “convenience is king” dynamic, the same logic that produces overcrowded grocery stores and urban sprawl: familiar and accessible, but fundamentally unsustainable.
Where Should the Data Centers Go Instead?
The study identifies a cluster of states — Nebraska, South Dakota, Montana and parts of Texas — as the prime destinations for future data center development. These states sit in what the researchers called the “energy-water-climate nexus”: regions with lower water stress, cleaner electricity grids, loads of untapped renewable energy potential and enough physical space to absorb large-scale buildouts with far less public pushback.
But these locations barely appear on today’s data center map. South Dakota, for instance, remains one of the least-developed states in the country even though its strong water profile and renewable energy capacity make it a standout in the modeling. Texas is more complicated. The state already hosts a proliferating data center industry, and its vast wind and solar resources make it an attractive option, but its isolated, weather-dependent grid may be too unstable. Still, the Lone Star State keeps pulling companies in, even as other regions are maxing out.
This growing mismatch between where data centers are and where they should be is starting to trigger political backlash, too. In recent months, Democratic lawmakers called on the White House to investigate rising utility costs and local infrastructure strain tied to the AI-driven data center boom, arguing that federal policy hasn’t caught up with the environmental risks.
The study’s core takeaway is that America can significantly shrink AI’s environmental impact simply by changing where it chooses to build data centers. Siting new facilities in regions with abundant water and clean-energy potential — particularly in states with strong wind and solar capacity — would ease the pressure on communities.
The authors also call for greater transparency from AI companies, urging standardized reporting of energy, water and carbon emissions so the public can see the real environmental trade-offs. As the Federation of American Scientists puts it, “Without standardized metrics and reporting, policymakers and grid operators cannot accurately track or manage AI’s growing resource footprint,” which could be up to 662 percent higher than reported figures suggest. Providing “nutrition labels” that detail the energy, water and carbon footprint of AI-oriented data centers and services could give policymakers the clear, standardized information they need to craft effective policies that protect their communities from potential strains on local resources.
Without these changes, the United States could end up concentrating the next wave of AI infrastructure in regions where water and power grids are already stressed, locking in environmental pressures that will be much harder and more expensive to address down the road.
Even the Best Scenario Falls Short of ‘Net Zero’ by 2030
The researchers also tested whether AI leaders like Google, Microsoft and Amazon can realistically meet their various net-zero pledges by 2030. Their conclusion: they can’t — at least not without leaning heavily on carbon offsets and water-restoration programs, tools that many experts say are too uncertain to anchor long-term climate planning.
In the study’s best-case scenario, AI servers would consume around 731 million cubic meters of water annually between 2024 and 2030, and produce roughly 24 million tons of CO₂-equivalent each year. In less forgiving circumstances, water consumption could reach 1,125 million cubic meters while CO₂-equivalent emissions hit 44 million tons per year. The gap between those scenarios is significant: the extra water roughly equals the annual consumption of a medium-sized city, while the higher emissions figure rivals the annual output of entire countries like Hungary, Portugal or New Zealand.
Efficiency improvements could help. Better chips, modular cooling systems and cleaner grids all have the potential to meaningfully reduce impacts. But even the most optimistic reductions depend on energy infrastructure upgrades that don’t exist yet. Several utilities, including in Tennessee and Arizona, are already struggling to connect new facilities without leaning on fossil-fuel buildouts.
AI’s Future Energy Demand Is Almost Impossible to Predict
Both the study’s authors and outside experts warn that AI’s energy consumption is shifting so quickly that any long-term forecasts come with major uncertainties. Several developments could meaningfully reduce demand, such as on-site solar arrays, battery storage and microgrids at data center campuses. There is also talk of a possible nuclear power revival, as well as breakthroughs in liquid or immersion cooling. Some analysts also note that energy needs could level off if investment in large-scale AI models and hyperscale deployments started to slow.
Then again, the opposite is just as (if not more) likely. Rapid adoption of agentic AI systems, enterprise-scale automation and ever-larger multimodal models could push demand well beyond the study’s “worst-case” scenario, forcing utilities to rethink their timelines for adding new generation and infrastructure and likely overwhelming the planning assumptions they use today. At the same time, if next-generation hardware platforms like Nvidia’s Blackwell deliver on their promised Moore’s-Law-like efficiency gains, the added compute demand could prove far less burdensome than it looks at this stage.
Frequently Asked Questions
Where are data centers being built?
The key data center hubs in the United States include Virginia, California and Texas, followed by Arizona, Ohio, Wisconsin, Iowa and parts of Nebraska.
Why does the location of AI data centers matter?
Where data centers are built shapes a region’s water use, carbon emissions and grid strain. These factors affect not only a region’s environmental health but also its long-term infrastructure planning.
