Paige Gross, stateline.org
Imagine this the next time you join a Zoom meeting or ask ChatGPT a question: your request instantly zips to a room full of hot, humming servers, perhaps hundreds or even thousands of miles away, then returns to you within a second or two.
While it can be difficult to wrap your mind around, large data centers are where almost all artificial intelligence systems and computing live today, said Vijay Gadepally, a senior scientist at Lincoln Laboratory at the Massachusetts Institute of Technology.
“Each of these AI models tends to be very big because they have to sit on a server somewhere,” he said. “So when millions of users are talking to the system at the same time, a computing system really needs to grow, grow and grow.”
As the U.S. strives to be a global AI superpower, it is becoming home to hundreds of data centers, buildings that store and maintain the physical equipment needed to compute information.
For users of increasingly popular AI tools, it may seem as if the changes are all happening online, without a physical footprint. But the rise of AI has concrete effects: data centers and the physical infrastructure needed to run them consume large amounts of energy, water and other resources, experts say.
“We’re definitely trying to think about the climate with a critical eye,” said Jennifer Brandon, a science and sustainability consultant. “All of a sudden, some of these places are putting a huge strain on the grid.”
The rise of data centers
As society traded bulky desktop computers for sleek laptops, and as internet infrastructure grew to support AI models and other emerging software tools, the U.S. built out the physical infrastructure needed to support its growing computing power.
David Acosta, co-founder and chief artificial intelligence officer of Arboai, said that while the large language model (LLM) and machine learning (ML) technologies that underpin most modern AI tools have been used by engineers for decades, they have only recently been commercialized for public use.
To train and process information, these fast-learning AI models require graphics processing units (GPUs), servers, storage, cabling and other networking equipment, all housed in data centers across the country. Computers have been storing and processing data offsite in dedicated centers for decades, but the dot-com bubble of the early 2000s and the migration to cloud storage have demanded far more storage capacity over the past decade.
As more of daily life has moved online, and as computing hardware and chip technology have enabled faster processing, AI models have become attainable for the general public, Acosta said. Current AI models run on thousands of GPUs, and training a single chatbot like ChatGPT uses roughly the same amount of energy as 100 homes do over the course of a year.
“And then you multiply that by the thousands of models that are being trained,” Acosta said. “It’s pretty intense.”
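For a rough sense of scale behind that 100-homes comparison, here is a back-of-the-envelope check in Python. Both inputs are outside estimates rather than figures from the article: roughly 1.3 gigawatt-hours is a commonly cited estimate for training a GPT-3-class model, and about 10,800 kilowatt-hours is the approximate annual electricity use of an average U.S. household.

    # Back-of-the-envelope check of the "energy of about 100 homes for a year" comparison.
    # Both inputs are outside estimates, not figures from the article.
    training_energy_kwh = 1_300_000    # ~1.3 GWh, a commonly cited training estimate for a GPT-3-class model
    avg_us_home_kwh_per_year = 10_800  # approximate annual electricity use of an average U.S. household

    homes_equivalent = training_energy_kwh / avg_us_home_kwh_per_year
    print(f"Roughly {homes_equivalent:.0f} homes' worth of annual electricity")
    # Prints roughly 120, in the same ballpark as the 100-home comparison.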
There are currently more than 3,600 data centers in the United States, though about 80% are concentrated in just 15 states, data center maps show. The market has doubled since 2020, Forbes reported, growing 21% over the previous year alone. For years, the vast majority of the country’s data centers have been housed in Virginia, and the state is considered a global hub, with nearly 70% of the world’s internet traffic flowing through its nearly 600 centers. Texas and California follow, with 336 and 307 centers, respectively.
Tech companies that need large amounts of computing power, the private equity firms and banks that invest in them, and other real estate and specialty companies are the leading funders of data centers. In September, BlackRock, Global Infrastructure Partners, Microsoft and the AI investment fund MGX announced plans to invest $30 billion in new and expanded data centers, primarily in the U.S., with a goal of mobilizing up to $100 billion in total investment, including debt financing.
Investment in American data center infrastructure is being encouraged by the global “AI arms race,” Acosta said.
“If you own data, you have the power,” Acosta said. “I think we just need to make sure we take the lead as ethically as possible.”
Energy and environmental impact
Current estimates show that data centers are responsible for about 2% of U.S. energy demand, and that share is projected to reach about 10% by 2027, said Anthony Deorsey, a research manager at sustainable energy research firm Cleantech Group.
As data centers are developed in new communities across the country, residents and their state legislators are weighing a mix of energy and environmental challenges against the economic benefits.
Data center development brings some infrastructure jobs to a region, and in busy data center communities such as Loudoun and Prince William counties in Virginia, the centers can generate millions of dollars in tax revenue, the Virginia Mercury reported.
Local governments may be eager to strike deals with the tech or private equity companies looking to build, but the availability and cost of power are a primary concern. A new large data center needs the electricity equivalent of about 750,000 homes, according to a February report from sustainability consultancy BSI and real estate services firm CBRE.
Under the utility structures of many states, local residents can see their electricity rates rise to help meet the large power needs of data centers. Some lawmakers, such as Georgia state Sen. Chuck Hufstetler, are seeking to protect residential and commercial customers from being hit with higher utility bills.
Granville Martin, an Eastern Shore, Connecticut-based attorney with expertise in financial and environmental regulation, said the same issues have come up in his own community.
“The pushback was that the locals didn’t want this data center coming in and, whether their view was right or wrong, sucking up a bunch of the available power, because it was only going to increase their rates,” Martin said.
Some states are exploring alternative energy sources. In Pennsylvania, Constellation Energy has signed a deal to restart its nuclear plant at Three Mile Island to provide carbon-free electricity offsetting Microsoft’s power use at nearby data centers.
But climate experts have concerns about data centers that go beyond their electricity demand.
“The public has very little awareness that cooling an industrial facility like this is actually a really important aspect,” Martin said.
Data center equipment runs 24/7, 365 days a year, and it produces a lot of heat. To regulate the temperature, most data centers pump water through tubing surrounding the IT equipment and use air conditioning systems to keep the structures cool. About 40% of a data center’s energy consumption goes to cooling, the Cleantech Group found.
Some centers have closed-loop systems that recycle gray water through the same system, but many use fresh drinking water. The amount of water and energy that goes into cooling worries sustainability consultant Brandon.
“The current volume of AI data centers uses six times the amount of water that the entire country of Denmark uses,” she said. “And we are currently using the same amount of energy for our data centers as Japan, the fifth-largest energy user in the world.”
Is there a sustainable future for data centers?
Energy is now a core concern for companies running AI, Deorsey said, because unchecked, rapidly evolving AI models are extremely expensive to train and operate. Deorsey pointed to Chinese AI company DeepSeek, which announced in January its attempt at a more cost-conscious and energy-efficient large language model, R1.
The company claims it trained the model with just 2,000 chips, far fewer than competitors such as OpenAI, ChatGPT’s parent company, and Google. It is not yet clear whether the model lives up to its energy-efficiency claims in actual use, but it is a sign that companies are feeling pressure to be more efficient, Deorsey said.
“I think companies like DeepSeek are an example of companies that optimize under constraints,” he said. “They assume they can’t get all the power they need and they can’t get all the chips they need, so they make the most of what they have.”
For Gadepally, who is also chief technology officer of AI company Radium Cloud, this kind of selective optimization is a tool he hopes more companies start using. His recent research at the MIT Lincoln Laboratory Supercomputing Center focused on the lab’s own data center consumption. When the researchers realized how hot their equipment was running, they conducted an audit.
Gadepally said simple switches, such as using cheaper, less robust AI models, cut energy use, as did running AI models during off-peak hours and capping the amount of power supplied to the computer processors. The difference for users was nominal; you might, for example, wait an extra one to two seconds for an answer to come back from a chatbot.
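Power capping of this kind is usually applied at the hardware level. As a rough illustration only, and not the Lincoln Laboratory team’s actual setup, the sketch below assumes a machine with NVIDIA GPUs and uses the standard nvidia-smi tool to read power draw and set a lower power limit; the 250-watt cap is an arbitrary example value, and changing the limit typically requires administrator privileges.

    import subprocess

    # Read each GPU's current power draw and configured power limit (read-only).
    query = subprocess.run(
        ["nvidia-smi", "--query-gpu=index,power.draw,power.limit", "--format=csv"],
        capture_output=True, text=True, check=True,
    )
    print(query.stdout)

    # Cap GPU 0 at 250 watts; the figure is illustrative, not a recommendation,
    # and this call generally needs elevated privileges to succeed.
    subprocess.run(["nvidia-smi", "-i", "0", "-pl", "250"], check=True)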
With Northeastern University, MIT also built software called Clover, which watches carbon intensity for peak periods and makes adjustments, such as automatically switching to lower-quality AI models that need less computing power when energy demand is high.
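The article does not describe Clover’s internals, but the general idea of carbon-aware model switching can be sketched roughly as follows. This is not Clover’s actual code or API; the get_grid_carbon_intensity hook, the 400 gCO2/kWh threshold and the model names are hypothetical placeholders.

    # Illustrative sketch of carbon-aware model selection; not Clover's actual code.
    HIGH_CARBON_THRESHOLD = 400.0  # grams of CO2 per kWh; example value only


    def get_grid_carbon_intensity() -> float:
        """Hypothetical hook: return the local grid's current carbon intensity.
        In practice this would poll a grid-data feed or an on-site meter."""
        return 450.0  # stubbed reading so the example runs on its own


    def choose_model(carbon_intensity: float) -> str:
        """Route requests to a smaller, lower-power model when the grid is strained."""
        if carbon_intensity > HIGH_CARBON_THRESHOLD:
            return "small-low-power-model"
        return "large-high-quality-model"


    if __name__ == "__main__":
        intensity = get_grid_carbon_intensity()
        print(f"Grid at {intensity:.0f} gCO2/kWh; routing requests to {choose_model(intensity)}")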
“We’ve been pushing back on people for a long time, asking, is it really worth it?” Gadepally said. “You might get better knock-knock jokes from this chatbot, but it now uses 10 times more power than before. Is that worth it?”
Both Gadepally and Acosta pointed to localizing AI tools as another energy- and cost-saving strategy for companies and data centers. In practice, that means building tools that do exactly what you need them to do, and nothing more.
Health care and farming settings are great examples, Acosta said; tools can be built to serve those specialized settings rather than processing the data in the large data centers he described as “bloated” and oversized.
Neither AI developer foresees a slowdown in the demand for AI or for data center processing power. But Gadepally said environmental and energy concerns will come to the forefront for tech companies as they realize that saving energy also saves money. It remains to be seen whether DeepSeek will find the same success as some of its American competitors, Gadepally said, but its approach will likely push them to question their own practices.
“At the very least, people are going to ask questions before someone says, ‘We need $1 billion to buy new infrastructure,’ or ‘We need to spend $1 billion on computing next month,’” Gadepally said. “Now they might say, ‘Did you try to optimize it?’”
©2025 States Newsroom. Visit stateline.org. Distributed by Tribune Content Agency, LLC.
Originally published: April 17, 2025, 1:20 p.m.