A new study reveals that the rapid growth of data centers could significantly increase electricity costs and strain power grids, posing environmental challenges.
A recent study by the Union of Concerned Scientists warns that the rapid construction of data centers, and the surge in electricity demand that comes with it, could lead to soaring energy costs and environmental harm.
Published on Monday, the report indicates that the pace at which data centers are being built is outstripping the ability of utilities to supply adequate electricity. Mike Jacobs, a senior manager of energy at the organization, emphasized the challenge: “They’re increasing the demand faster than you can increase the supply. How’re you going to do that?”
The report, titled "Data Center Power Play," models a range of electricity demand scenarios over the next 25 years, along with different energy policy approaches to meeting that demand. It estimates the resulting costs in electricity prices, climate impact, and public health, which could reach trillions of dollars.
Jacobs noted that implementing clean energy policies could mitigate these costs while reducing air pollution and health impacts. He pointed out that the construction of an electric grid capable of meeting the rising demand for power will take significantly longer than building new data centers.
“This is a collision between the people whose philosophy is ‘move fast and break things,’ with the utility industry that has nobody that says move fast and break things,” Jacobs remarked, referring to the rapid expansion of data center facilities. He also mentioned that predicting future demand for data centers is challenging due to limited information from utilities and major tech companies. How this demand is addressed will be crucial for both public health and environmental sustainability.
Jacobs further stated, “This is really a great moment for regulators to do what’s within their authority and sort out and assign the costs to those who cause them, which is an essential principle of utility ratemaking.”
In recent years, tech companies have aggressively expanded their data center operations, driven by the booming demand for artificial intelligence. Major firms such as OpenAI, Google, Meta, and Amazon have made substantial investments in data centers, with projects like Stargate serving as critical infrastructure for AI development.
While the growth of data centers brings job opportunities and digital advancements, it also raises significant concerns regarding their substantial energy and water consumption. Data centers typically rely on water-intensive cooling systems, which can exacerbate existing water scarcity issues.
For instance, a single 100 megawatt (MW) data center can consume over two million liters of water daily, an amount comparable to the daily usage of approximately 6,500 households. This demand is particularly concerning in regions already facing water shortages, such as parts of Georgia, Texas, Arizona, and Oregon, where it places additional stress on aquifers and municipal water supplies.
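The household comparison above can be sanity-checked with simple arithmetic. The per-household consumption figure used here is an assumed average for illustration only; it does not come from the report.

```python
# Rough check of the article's comparison: a 100 MW data center's daily
# water use versus typical household consumption.
DATA_CENTER_LITERS_PER_DAY = 2_000_000   # "over two million liters" (article)
LITERS_PER_HOUSEHOLD_PER_DAY = 300       # assumed average household use

households_equivalent = DATA_CENTER_LITERS_PER_DAY / LITERS_PER_HOUSEHOLD_PER_DAY
print(round(households_equivalent))  # ~6,700, close to the article's ~6,500 figure
```

With a slightly higher assumed household figure (around 308 liters per day), the ratio lands almost exactly on the 6,500 households the article cites.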
The study's findings underscore the urgent need for a balanced approach to energy policy and infrastructure development, the Union of Concerned Scientists says, so that the growing demands of data centers do not come at the expense of environmental sustainability and public health.

