U.S. electricity grid stretches thin as data centers rush to turn on onsite generators — Meta, xAI, and other tech giants race to solve AI's insatiable power appetite
Source: Tom's Hardware | By Jowi Morales
The AI infrastructure boom is pushing the U.S. electricity grid to its limit, but artificial intelligence is too lucrative for companies to wait for a grid connection. So they're taking their power supply into their own hands.
(Image credit: GE Vernova)
Facebook founder Mark Zuckerberg said in May 2024 that power would be one of the biggest constraints on artificial intelligence, and, true enough, tech giants and hyperscalers have begun to hit that wall. According to SemiAnalysis, electricity loads totaling tens of gigawatts have been requested in the state of Texas alone, but only a little over a gigawatt has been approved, a sign that the power grid is stretched thin.
This limitation has tech and power companies investing in small modular nuclear reactors (SMRs), which can potentially deliver large amounts of power in a relatively compact package. Microsoft has even struck a deal to bring the old Three Mile Island nuclear power plant back online, which is expected to deliver 819 MW of power for AI and cloud data center use.
These initiatives will take years to come online, though. The Three Mile Island plant isn't expected to be operational until 2028, and the earliest SMRs won't enter service until the 2030s. There has even been a proposal to repurpose retired U.S. Navy reactors for data centers, but its proponent hasn't offered a timeline for how quickly that could get up and running.
Nuclear timelines aside, grid connections themselves are falling short. Elon Musk's xAI Colossus site in Memphis could initially draw nowhere near the 155 MW that its 100,000 H100 GPUs require, and the 150 MW substation under construction on the site needed additional setup before it could deliver power. Waiting for these sources to come online would have negated the head start xAI was chasing, so Musk turned to VoltaGrid to supply the power needed to run Colossus.
Just a few months later, OpenAI followed the billionaire's lead and ordered 29 gas turbines capable of producing 34 MW each for its Stargate data center in Abilene, Texas. Together, the turbines would output a total of 986 MW, which should be enough to run up to half a million GB200 GPUs in NVL72 racks. So even if the company fails to secure power from the grid, it can generate the electricity it needs on-site.
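As a quick sanity check of those figures, here is a minimal Python sketch that multiplies the reported turbine count and per-unit rating, then divides the result across an assumed per-rack power draw. The roughly 120 kW per GB200 NVL72 rack is an illustrative assumption, not a number from the article.

```python
# Sanity check of the Stargate turbine figures quoted above.
# The ~120 kW per GB200 NVL72 rack draw is an illustrative assumption,
# not a number from the article.

TURBINES = 29
MW_PER_TURBINE = 34
total_mw = TURBINES * MW_PER_TURBINE            # 986 MW, matching the article

RACK_KW = 120                                   # assumed per-rack draw
GPUS_PER_RACK = 72                              # GPUs in one GB200 NVL72 rack

racks_supported = total_mw * 1_000 / RACK_KW    # convert MW to kW
gpus_supported = racks_supported * GPUS_PER_RACK

print(f"Total turbine output: {total_mw} MW")
print(f"Roughly {gpus_supported:,.0f} GB200 GPUs at {RACK_KW} kW per rack")
# ~590,000 GPUs before cooling and other overhead, in the same ballpark
# as the half-million figure cited above.
```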
Aside from these two, several other projects are going off-grid: 62% of data centers are considering on-site power generation, according to Data Center Knowledge, and Natural Gas Intel estimates that data centers will use 35 GW of behind-the-meter power by 2030.
While gas generators may look like the silver bullet for the electricity supply puzzle many data centers face, they come with their own set of problems. Reliability is the principal issue among them, since data centers require near-constant uptime. To achieve it, operators cannot simply buy generators sized to the load they need; they have to build in redundancy.
AI data centers typically run N+1 or N+1+1 redundancy to ensure continued operation even if a generator fails. N+1 means installing one backup unit that takes over should a unit fail during normal operation, while N+1+1 adds a second spare on site to cover a unit that is down for maintenance. On top of that, maintenance, spare parts, personnel, and fuel all remain additional considerations.
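As a rough illustration of how that redundancy math works out, the sketch below sizes a generator fleet under N, N+1, and N+1+1 schemes. The 150 MW load and 34 MW unit rating are placeholder figures borrowed from elsewhere in the article, not a real site design.

```python
import math

def generators_needed(load_mw: float, unit_mw: float, scheme: str = "N+1") -> int:
    """Number of generator units to install for a given load.

    N     : just enough units to carry the load
    N+1   : one spare that can take over if a running unit fails
    N+1+1 : a failure spare plus a second spare covering a unit in maintenance
    """
    base = math.ceil(load_mw / unit_mw)          # units needed to carry the load
    spares = {"N": 0, "N+1": 1, "N+1+1": 2}[scheme]
    return base + spares

# Placeholder figures: a 150 MW site served by 34 MW-class turbines.
for scheme in ("N", "N+1", "N+1+1"):
    print(scheme, generators_needed(150, 34, scheme))
# N -> 5 units, N+1 -> 6 units, N+1+1 -> 7 units
```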
Despite these challenges, it's estimated that AI data centers could generate $10 to $12 billion per gigawatt annually. Musk fired up the Colossus data center in July 2024, while the 150 MW substation only delivered power to the site in November 2024. That means he could have made somewhere between $3 billion and $4 billion in revenue during that window, which likely would have offset the cost of running the entire site on natural gas turbines.
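The back-of-the-envelope math behind that range is simple; the sketch below prorates the $10 to $12 billion per gigawatt-year estimate over the roughly four-month gap between the July launch and the November grid connection.

```python
# Prorate the article's revenue estimate over the bridge period.
# These are the figures quoted above, not xAI financial data.
revenue_per_gw_year = (10e9, 12e9)     # $10 to $12 billion per gigawatt per year
bridge_months = 4                      # July 2024 launch to November 2024 grid power

low, high = (r * bridge_months / 12 for r in revenue_per_gw_year)
print(f"${low / 1e9:.1f}B to ${high / 1e9:.1f}B per gigawatt over {bridge_months} months")
# -> $3.3B to $4.0B, the range cited above
```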
(Image credit: Microsoft)
Another issue data centers face is permitting, which can take a year or more. The OpenAI/Oracle site in Abilene, Texas, is reportedly facing delays for this reason. Musk's Colossus 2 site mitigated the risk by building near the Tennessee-Mississippi border, letting xAI hedge its bets by applying for permits in both states and securing supply from both.
On top of that, communities around these data centers may push back against the power plants being built to supply them. xAI faced exactly this issue, with residents complaining about pollution from the gas turbines deployed around its Memphis site.
Even if investors could throw unlimited money at the problem, there's another factor they cannot escape: long equipment lead times. Gas turbine manufacturers quote delivery times of roughly 12 to 36 months, since these are intricate machines built from highly specialized materials using equally specialized processes.
These lead times could stretch further as more AI data centers compete for the same manufacturing capacity.
A temporary solution or a permanent fixture?
xAI contracted VoltaGrid to deliver power to its Memphis data center while waiting for a connection from the TVA. But even after the utility finally delivered the 150 MW the site needed, some turbine generators remained on site as backup. That raises the question: should AI data centers rely entirely on their own power? VoltaGrid and other firms seem to think so, going as far as offering “energy-as-a-service,” or EaaS, to AI companies.
This entails an extended power purchase agreement between the provider and the data center, under which the provider delivers everything the operator needs: power capacity, day-to-day operations, maintenance, and uptime guarantees. While this is attractive in terms of deployment time, it can get prohibitively expensive in the long run.
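To make that trade-off concrete, here is a purely illustrative sketch comparing long-run energy spend under an EaaS-style contract for on-site gas power against a conventional grid connection. Every rate in it is a placeholder assumption, not a quoted price from VoltaGrid, any utility, or any operator.

```python
# Purely illustrative; all rates below are placeholder assumptions,
# not quotes from VoltaGrid, any utility, or any data center operator.

HOURS_PER_YEAR = 8760

def annual_energy_cost(load_mw: float, price_per_mwh: float, utilization: float = 0.9) -> float:
    """Approximate yearly energy spend for a site running at the given load."""
    return load_mw * HOURS_PER_YEAR * utilization * price_per_mwh

load_mw = 150          # roughly the Colossus-scale load discussed above
eaas_price = 170       # assumed $/MWh for contracted on-site gas power
grid_price = 70        # assumed $/MWh industrial grid rate

print(f"EaaS-style on-site power: ~${annual_energy_cost(load_mw, eaas_price) / 1e6:,.0f}M per year")
print(f"Grid power:               ~${annual_energy_cost(load_mw, grid_price) / 1e6:,.0f}M per year")
# With these placeholder rates, on-site power costs more than twice as much,
# which is why bridge generators tend to end up as backup once the grid arrives.
```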
For now, AI companies rent gas generators for bridge power, getting data center operations up and running as quickly as possible while they wait for approval from the local utility. Once the site is finally connected to the grid, it is usually more economical to keep the mobile generators as backup units in case the primary supply fails, rather than as the sole power source.
That said, EaaS could become a permanent fixture if the grid truly maxes out. Once there is no spare capacity left for power-hungry AI infrastructure, AI companies may have no choice but to rely on services like this just to get their projects up and running.
Jowi Morales is a tech enthusiast with years of experience working in the industry. He's been writing for several tech publications since 2021, covering tech hardware and consumer electronics.