Billion-Dollar Data Centers Are Taking Over the World
The battle for AI dominance has left a large footprint—and it’s only getting bigger and more expensive.
When Sam Altman said one year ago that OpenAI’s Roman Empire is the actual Roman Empire, he wasn’t kidding. In the same way that the Romans gradually amassed an empire of land spanning three continents and one-ninth of the Earth’s circumference, the CEO and his cohort are now dotting the planet with their own latifundia—not agricultural estates, but AI data centers.
Tech executives like Altman, Nvidia CEO Jensen Huang, Microsoft CEO Satya Nadella, and Oracle cofounder Larry Ellison are fully bought into the idea that the future of the American (and possibly global) economy is these new warehouses stocked with IT infrastructure. But data centers, of course, aren’t actually new. In the earliest days of computing there were giant power-sucking mainframes in climate-controlled rooms, with coaxial cables moving information from the mainframe to a terminal. Then the consumer internet boom of the late 1990s spawned a new era of infrastructure. Massive buildings began popping up in the backyard of Washington, DC, with racks and racks of computers that stored and processed data for tech companies.
A decade later, “the cloud” became the squishy infrastructure of the internet. Storage got cheaper. Some companies, like Amazon, capitalized on this. Giant data centers continued to proliferate, but instead of a tech company using some combination of on-premise servers and rented data center racks, they offloaded their computing needs to a bunch of virtualized environments. (“What is the cloud?” a perfectly intelligent family member asked me in the mid-2010s, “and why am I paying for 17 different subscriptions to it?”)
All the while tech companies were hoovering up petabytes of data, data that people willingly shared online, in enterprise workspaces, and through mobile apps. Firms began finding new ways to mine and structure this “Big Data,” and promised that it would change lives. In many ways, it did. You had to know where this was going.
Now the tech industry is in the fever-dream days of generative AI, which requires new levels of computing resources. Big Data is tired; big data centers are here, and wired—for AI. Faster, more efficient chips are needed to power AI data centers, and chipmakers like Nvidia and AMD have been jumping up and down on the proverbial couch, proclaiming their love for AI. The industry has entered an unprecedented era of capital investments in AI infrastructure, tilting the US into positive GDP territory. These are massive, swirling deals that might as well be cocktail party handshakes, greased with gigawatts and exuberance, while the rest of us try to track real contracts and dollars.
OpenAI, Microsoft, Nvidia, Oracle, and SoftBank have struck some of the biggest deals. This year an earlier supercomputing project between OpenAI and Microsoft, called Stargate, became the vehicle for a massive AI infrastructure project in the US. (President Donald Trump called it the largest AI infrastructure project in history, because of course he did, but that may not have been hyperbolic.) Altman, Ellison, and SoftBank CEO Masayoshi Son were all in on the deal, pledging $100 billion to start, with plans to invest up to $500 billion into Stargate in the coming years. Nvidia GPUs would be deployed. Later, in July, OpenAI and Oracle announced an additional Stargate partnership—SoftBank curiously absent—measured in gigawatts of capacity (4.5) and expected job creation (around 100,000).
Microsoft, Amazon, and Meta have also shared plans for multibillion-dollar data projects. Microsoft said at the start of 2025 that it was on track to invest “approximately $80 billion to build out AI-enabled data centers to train AI models and deploy AI and cloud-based applications around the world.”
Then, in September, Nvidia said it would invest up to $100 billion in OpenAI, provided that OpenAI made good on a deal to use up to 10 gigawatts of Nvidia’s systems for OpenAI’s infrastructure plans, which means essentially that OpenAI has to pay Nvidia in order to get paid by Nvidia. The following month AMD said it would give OpenAI as much as 10 percent of the chip company if OpenAI purchased and deployed up to 6 gigawatts of AMD GPUs between now and 2030.
It’s the circular nature of these investments that has the general public, and bearish analysts, wondering if we’re headed for an AI bubble burst.
What’s clear is that the near-term downstream effects of these data center build-outs are real. The energy, resource, and labor demands of AI infrastructure are enormous. By some estimates, worldwide AI energy demand is set to surpass demand from bitcoin mining by the end of this year, WIRED has reported. The processors in data centers run hot and need to be cooled, so big tech companies are pulling from municipal water supplies to make that happen—and aren’t always disclosing how much water they’re using. Local wells are running dry or seem unsafe to drink from. Residents who live near data center construction sites are reporting mounting traffic delays and, in some cases, more car crashes. One corner of Richland Parish, Louisiana, home of Meta’s $27 billion Hyperion data center, has seen a 600 percent spike in vehicle crashes this year.
Major proponents of AI seem to suggest that all of this will be worth it. Few top tech executives will publicly entertain the notion that this might be an overshoot, either ecologically or economically. “Emphatically … no,” Lisa Su, the chief executive of AMD, said earlier this month when asked if the AI froth runneth over. Su, like other execs, cited overwhelming demand for AI as justification for these enormous capital expenditures.
Demand from whom? Harder to pin down. In their mind, it’s everyone. All of us. The 800 million people who use ChatGPT on a weekly basis. The evolution from those 1990s data centers to the 2000s era of cloud computing to new AI data centers wasn’t just one continuum. The world has concurrently moved from the tiny internet to the big internet to the AI internet, and realistically speaking, there’s no going back. Generative AI is out of the bottle. The Sams and Jensens and Larrys and Lisas of the world aren’t wrong about this.
It doesn’t mean they aren’t wrong about the math, though. About their economic predictions. Or their ideas about AI-powered productivity and the labor market. Or the availability of natural and material resources for these data centers. Or who will come once they build them. Or the timing of it all. Even Rome eventually collapsed.