Thirsty Silicon: The Hidden Costs of Big Tech’s AI Boom (Part 1)
What you'll learn:
- How the explosive growth of large data centers, especially those supporting AI services, is pushing electricity demand up to, and in some cases beyond, the generating capacity of many regional power grids.
- How many regions’ water resources are being pushed beyond sustainable limits by the centers’ seemingly insatiable thirst for cooling water.
- How electric utilities are quietly brokering deals with AI providers to build new generating capacity, mostly at the expense of their existing ratepayers.
By now, it’s no secret that utility companies are struggling to meet an unprecedented surge in demand on North America’s aging power grids, driven largely by the AI-based services hosted in new data centers popping up across the country. The cost of new power plants, new transmission lines, and aging coal plants kept running past their planned retirements is showing up as higher utility bills for ratepayers.
In addition, the AI boom is creating a second, less well-known crisis: the data centers’ thirst for cooling water is straining water supplies and water-related infrastructure in many areas (Fig. 1).
It turns out that data centers’ steadily growing appetites for energy and water are closely related, both driven by the ever-greater computational power they’re being asked to deliver. In a single day, one large data center could use as much energy as 80,000 households. And because it takes almost as much energy to cool the servers’ electronics as it does to run them, that same data center will also use as much water as 50,000 people to keep its servers from overheating.
Consequently, the AI industry’s frenzied “arms race” to build ever larger and more powerful server farms is creating large, unanticipated adverse impacts on the environment and the communities that host them.
While most of the AI industry prefers to treat these issues as a small price to pay for the economic benefits its services produce, companies rarely factor the true economic and environmental costs of their operations into their own spreadsheets. Some companies are becoming more transparent, but data centers aren’t legally required to disclose how much water and energy they use, making their true impact difficult to measure. Adding to the problem, some operators won’t even disclose where their data centers are located.
A Gigawatt-Scale Challenge
To get a better sense of the scale of the problem, consider that, according to Element Six, a typical server used in AI applications contains eight GPUs drawing about 1,500 to 3,000 W each, roughly a 10X increase over the 100 to 200 W drawn by the RISC- or CISC-based CPUs in conventional servers. Compounding the problem, server power consumption is expected to keep climbing as both GPUs and CPUs become increasingly complex, powerful, and power-hungry (Fig. 2).
Experts at Flex, which develops platforms for AI infrastructure, say that while not every CPU will be replaced by a GPU (typically only around 20% will be), the sharply rising power demands of AI applications are ushering in an era of gigawatt-class data center campuses. Flex estimates that a typical such facility would house roughly 20,000 densely packed 50-kW racks.
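To see why “gigawatt-class” is the right word, here’s a quick back-of-the-envelope sketch (in Python) that translates those GPU and rack figures into grid-scale loads. It simply multiplies through the numbers cited above, treating them as assumptions rather than measured values:

```python
# Back-of-the-envelope campus power, using the figures cited above as assumptions:
# eight GPUs per AI server at roughly 1,500 to 3,000 W each, and a campus of
# 20,000 racks at roughly 50 kW apiece.
gpu_power_w = (1_500, 3_000)                                  # assumed per-GPU draw (W)
server_power_kw = tuple(8 * p / 1_000 for p in gpu_power_w)   # ~12 to 24 kW per server

racks = 20_000
rack_power_kw = 50
campus_it_load_mw = racks * rack_power_kw / 1_000             # 1,000 MW of IT load

print(f"Per-server draw: {server_power_kw[0]:.0f} to {server_power_kw[1]:.0f} kW")
print(f"Campus IT load: {campus_it_load_mw:,.0f} MW (about 1 GW, before cooling overhead)")
```

A full gigawatt of IT load is comparable to the output of a large power-plant unit, and that’s before any cooling or power-conversion overhead is counted.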
Even today, data centers are straining the generating capacity of many regions. For example, Texas Royalty Brokers examined AI energy demands across the United States to determine which states dedicate the largest share of their electricity to artificial intelligence. Among their findings:
- Indiana dedicates more electricity to AI than any other state, with data centers consuming nearly half of the local energy output.
- Texas hosts the most AI facilities with 17 clusters, but its massive power grid keeps AI’s share at 15% of the state’s power generation.
- Wisconsin's single AI facility uses a quarter of the state's electricity, consuming 15.8 million MWh yearly from only one location.
You can access the complete research findings at this link.
Cooling thousands of racks creates a daunting set of thermal-management challenges. As chips become more powerful, they generate more heat. As mentioned earlier, a typical data center consumes nearly as much power for cooling as it does to operate the servers, making efficient cooling methods vital for reducing both operating costs and environmental impact.
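Data-center operators summarize this overhead with a metric called power usage effectiveness (PUE): total facility power divided by the power delivered to the IT equipment itself. The short sketch below assumes the roughly one-for-one cooling overhead described above; the specific megawatt figures are illustrative, not from any cited facility:

```python
# Power usage effectiveness (PUE) = total facility power / IT equipment power.
# A PUE of 1.0 would mean every watt goes to the servers; real facilities run higher.
def pue(it_power_mw: float, cooling_power_mw: float, other_overhead_mw: float = 0.0) -> float:
    total_power_mw = it_power_mw + cooling_power_mw + other_overhead_mw
    return total_power_mw / it_power_mw

# If cooling draws nearly as much power as the servers themselves, PUE approaches 2.0.
print(round(pue(100, 95), 2))    # 1.95
# A free-cooled facility in a cold climate might spend far less on cooling.
print(round(pue(100, 15), 2))    # 1.15
```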
Cooling Methods
Until recently, most data centers used forced-air cooling to push chilled air through each server box. In colder climates, free cooling can replace the building’s heated air with cooler air drawn from outside, without running it through an active chiller system. According to Green Grid, a consortium that encourages the development of energy- and water-efficient data centers, this type of semi-passive cooling is only feasible for centers located in cool climates.
In most areas, however, forced-air cooling requires a heat exchanger that uses chilled water to pull excess heat from the facility. Most of today’s heat exchangers employ a water chiller that captures heat in a cooling loop and moves it to one or more evaporative cooling towers, where it’s dissipated.
Such an evaporative process typically turns 80% of the water fed through it into water vapor, with the remainder sent to municipal wastewater processing plants. In addition to potentially overstressing an area’s water resources, the resulting volumes of unplanned wastewater can easily overwhelm the capacity of existing processing facilities.
Some server manufacturers have begun to use immersion cooling, a more efficient but also more complex and expensive approach in which the entire processor board is submerged in tanks of dielectric fluid that absorbs the heat and carries it away.
An Appetite for Water and Power
In the water cycle of data centers, potable or non-potable water can be drawn from nearby wells or streams, from municipal utilities, or from reclaimed water collected from industrial processes. The cooling process in many systems evaporates a significant portion of that water, making it unavailable for reuse. In many areas, especially drier regions, this can put unanticipated strain on municipal water systems and draw down local aquifers faster than they can be replenished.
To get a sense of the scale of the problem, it’s estimated that a single 100-word AI prompt typically “drinks” approximately one 500-mL bottle of water to cool the servers processing it.
While a single bottle of water might not sound significant, the bottles add up. Across the U.S., 5,426 data centers consume billions of gallons of water annually. A large data center can consume up to 5 million gallons per day, equivalent to the water use of a town of 10,000 to 50,000 people.
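That town-sized comparison is easy to sanity-check. The sketch below assumes typical U.S. residential use of roughly 80 to 150 gallons per person per day; that per-capita range is an assumption for illustration, not a figure from the studies cited above:

```python
# Translate a large data center's daily water draw into an equivalent town population.
center_gallons_per_day = 5_000_000            # upper-end figure cited above
per_capita_gallons_per_day = (80, 150)        # assumed U.S. residential range

low = center_gallons_per_day / per_capita_gallons_per_day[1]   # heavier per-capita use
high = center_gallons_per_day / per_capita_gallons_per_day[0]  # lighter per-capita use
print(f"Equivalent town size: roughly {low:,.0f} to {high:,.0f} people")
# ~33,000 to ~62,500 people, in the same ballpark as the 10,000-to-50,000 figure above.
```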
Training AI takes enormous amounts of water, too — Microsoft's data centers directly evaporated more than 700,000 liters of clean freshwater just to train the GPT-3 language model.
But cooling the servers’ CPU, GPU, and memory chips is only part of the overall impact. The fossil-fuel-fired power plants that deliver more than half of the electricity used by U.S. data centers also require significant amounts of water, both to generate the steam that turns their turbines and to cool the generating machinery.
In 2023, data centers accounted for about 4% of electricity consumption in the U.S. However, that figure could triple in the next few years, reaching anywhere from 325 to 580 TWh of electricity consumption in 2028.
But how much of the electricity used by data centers goes to AI functions? By 2028, it could be about half, according to projections from the Lawrence Berkeley National Laboratory.
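Those projections hang together arithmetically. The small sketch below assumes total U.S. electricity consumption on the order of 4,000 TWh per year; that total is an assumption used for illustration, not a figure from the Berkeley Lab projections:

```python
# Reconcile the 2023 data-center share with the 2028 projections cited above.
us_total_twh = 4_000                 # assumed annual U.S. electricity consumption
dc_share_2023 = 0.04                 # data centers' 2023 share, per the article

dc_twh_2023 = us_total_twh * dc_share_2023           # ~160 TWh in 2023
dc_twh_2028 = (325, 580)                             # projected 2028 range (TWh)
ai_twh_2028 = tuple(0.5 * t for t in dc_twh_2028)    # roughly half attributed to AI

print(f"2023 data-center consumption: ~{dc_twh_2023:.0f} TWh")
print(f"2028 projection: {dc_twh_2028[0]} to {dc_twh_2028[1]} TWh, "
      f"of which ~{ai_twh_2028[0]:.0f} to {ai_twh_2028[1]:.0f} TWh could serve AI workloads")
# Tripling ~160 TWh lands near 480 TWh, squarely inside the projected range.
```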
And, unlike power, there’s no way to generate more water. According to the United Nations Environmental Report, nearly two-thirds of the world's population already experience severe water shortages for at least one month a year. And by 2030, almost half of the world's population could face severe water stress.
Communities and the Environment Struggle Under Demands
Many of these potential impacts are being ignored as local economies race to provide AI data centers with the resources and infrastructure they need to avoid being “left behind.” These decisions often impose hidden costs on the public — both economic and environmental.
If your electric bill looks higher lately, it could be because you’re helping foot the bill for data center development. Utility rates have risen in 41 states as utility companies rush to meet demand for new data infrastructure. Many of the new data centers now under development are expected to use more energy than large cities.
As utility companies scramble to build new transmission lines and power plants to keep up, the costs of many of these projects are being quietly passed on to consumers rather than to the data centers themselves. This is often the result of confidential deals that data centers strike with utility providers, allowing them to avoid paying the standard rates for power normally set by governing commissions.
In many parts of the country, data centers are also drinking the land they stand on dry. This is especially evident in Virginia, home to the largest concentration of data centers in the world. A large share of the nation’s data center capacity is packed into Loudoun County alone, and its combined demand is close to becoming unsustainable for local municipal water systems and power grids.
According to the Virginia Conservation Network, Loudoun County’s data center potable water consumption increased by 250% over the last four years, totaling 899 million gallons in 2023. Concerns over this concentration of data centers resurfaced after a recent Amazon Web Services outage, traced to one of Amazon’s Northern Virginia data centers, caused major disruptions.
A similar crisis is brewing in San Antonio, Texas, where, despite a long-term drought, two data centers used a combined 463 million gallons of water in 2023 and 2024. The San Antonio Water System imposed water restrictions on residents, yet data centers face no limits on how much water they can use, even in periods of scarcity.
Similar problems are beginning to arise in other countries where economic pressure, lower costs, and less stringent regulation are encouraging deployment of data centers. For instance, activists in Chile are raising concerns about the impact of AI development on the Atacama Desert.
Part 2 of this two-part series looks at some of the innovative solutions to this developing water crisis, from closed-loop cooling to ultra-efficient chip technology to soil health restoration. Stay tuned.
About the Author

Bea Karron
Contributing Editor
Bea Karron is a recent graduate of Marist University, where she earned a B.A. in Journalism and Public Relations. While at school in the Hudson Valley, she developed a passion for storytelling focused on science and sustainability, and how climate issues connect to our everyday lives.
Although she doesn’t have an engineering background, she’s picked up a fair amount of technical knowledge via osmosis from her father, an electrical engineer. Bea is excited to contribute to Electronic Design by exploring the complex intersection between technology and the environment.



