Sunday, March 8, 2026

Billion-dollar data centers are taking over the world


When Sam Altman said a year ago that OpenAI’s Roman Empire is the actual Roman Empire, he wasn’t joking. In the same way that the Romans gradually amassed an empire spanning three continents and one-ninth of the Earth’s circumference, the CEO and his cohorts are now dotting the planet with their own latifundia – not agricultural estates, but AI data centers.

Technology executives like Altman, Nvidia CEO Jensen Huang, Microsoft CEO Satya Nadella, and Oracle co-founder Larry Ellison fully buy into the notion that the future of the U.S. (and perhaps global) economy rests on these IT-filled warehouses. But data centers are, of course, not new. In the early days of computing, air-conditioned rooms housed giant, power-hungry mainframes, with coaxial cables carrying information from the mainframe to the terminals. Then the consumer Internet boom of the late 1990s ushered in a new era of infrastructure. Huge buildings with racks of computers storing and processing data for technology companies began to appear in Washington’s backyard.

Ten years later, the “cloud” had become the Internet’s essential infrastructure. Storage got cheaper, and companies such as Amazon took advantage of it. Giant data centers continued to grow, but instead of relying on a mix of on-premises servers and rented data center racks, technology companies moved their computing needs into virtualized environments. (“What is this cloud?” a perfectly intelligent family member asked me in the mid-2010s, “and why am I paying for 17 different subscriptions?”)

All the while, tech companies were hoovering up petabytes of data that people willingly shared online, in corporate workspaces, and through mobile apps. Companies began looking for new ways to mine and structure “big data,” promising it would change lives. In many ways it did. You could see where this was going.

Now the technology industry is in a period of feverish dreaming about generative artificial intelligence, which demands computing resources on a new scale. Big data is out; big data centers – and the cabling, power, and cooling to match – are in, all in the name of AI. Faster, more efficient chips are needed to power AI data centers, and chipmakers like Nvidia and AMD have been jumping on the proverbial couch proclaiming their love for AI. The industry has entered an unprecedented era of capital investment in AI infrastructure, enough to help keep U.S. GDP growth in positive territory. These are huge, swirling deals that might as well be handshakes at a cocktail party, soaked in gigawatts and enthusiasm, while the rest of us try to keep track of real contracts and dollars.

OpenAI, Microsoft, Nvidia, Oracle, and SoftBank have struck some of the largest deals. This year, an earlier supercomputing project between OpenAI and Microsoft, called Stargate, became the vehicle for a massive artificial intelligence infrastructure buildout in the US. (President Donald Trump called it the largest AI infrastructure project in history – of course he did – but maybe that wasn’t an exaggeration.) Altman, Ellison, and SoftBank CEO Masayoshi Son committed an initial $100 billion to the venture, with plans to invest up to $500 billion in Stargate in the coming years. Nvidia will supply the GPUs. In July, OpenAI and Oracle announced an additional Stargate partnership – interestingly absent SoftBank – measured in gigawatts of capacity (4.5) and expected job creation (around 100,000).
