Multi-billion dollar data centers are taking over the world


When Sam Altman suggested a year ago that OpenAI is the current Roman Empire, he wasn't joking. Just as the Romans gradually built a land empire spanning three continents and one-ninth of the Earth's circumference, the CEO and his cronies now dot the planet with their own latifundia – not agricultural estates, but AI data centers.

Tech leaders like Altman, Nvidia CEO Jensen Huang, Microsoft CEO Satya Nadella, and Oracle co-founder Larry Ellison are fully convinced that the future of the American (and perhaps global) economy lies in these new warehouses packed with IT infrastructure. But data centers, of course, aren't really new. In the early days of computing, giant, power-hungry mainframes sat in air-conditioned rooms, with coaxial cables ferrying information from the mainframe to a terminal. Then the consumer Internet boom of the late 1990s ushered in a new era of infrastructure. Massive buildings began popping up in Washington, D.C.'s backyard, with racks and racks of computers storing and processing data for tech companies.

A decade later, "the cloud" had become the fragile infrastructure of the Internet. Storage got cheaper, and some companies, like Amazon, capitalized on it. Giant data centers continued to proliferate, but instead of technology companies using a combination of on-premises servers and rented data center racks, they offloaded their computing needs to virtualized environments. ("What is the cloud?" a perfectly intelligent family member asked me in the mid-2010s, "and why am I paying for 17 different subscriptions?")

Meanwhile, tech companies were hoovering up petabytes of data – data that people happily shared online, in corporate workspaces, and through mobile apps. Companies found new ways to structure and leverage this "Big Data" and promised it would change lives. In many ways, it has. We should have known where this was going.

Today, the technology industry is living through the fever-dream era of generative AI, which demands new levels of computing resources. Big Data is tired; big data centers, wired for AI, are here. Faster, more efficient chips are needed to power AI data centers, and chipmakers like Nvidia and AMD are jumping up and down on the proverbial couch, proclaiming their love for AI. The industry has entered an unprecedented era of capital investment in AI infrastructure, tilting the United States toward positive GDP growth. These are massive, swirling deals that might as well be handshakes at a cocktail party, greased with gigawatts and exuberance, while the rest of us try to keep up with the real contracts and real dollars.

OpenAI, Microsoft, Nvidia, Oracle, and SoftBank have made some of the biggest deals. This year, a previous supercomputing project between OpenAI and Microsoft, called Stargate, became the vehicle for a larger AI infrastructure push in the United States. (President Donald Trump called it the largest AI infrastructure project in history – because of course he did – but maybe that wasn't hyperbole.) Altman, Ellison, and SoftBank CEO Masayoshi Son stood together, pledging $100 billion to start, with plans to invest up to $500 billion in Stargate in the coming years. Nvidia GPUs would be deployed. Later, in July, OpenAI and Oracle announced an additional Stargate partnership – SoftBank curiously absent – measured in gigawatts of capacity (4.5) and expected job creation (around 100,000).


