⬤ AI data center construction is hitting unprecedented levels. Cumulative spending could blow past $300 billion by the end of the year, roughly 1% of U.S. GDP, surpassing Apollo's 0.8% and the Manhattan Project's 0.4%. The standout example is OpenAI's Stargate Abilene: a 1-gigawatt facility with over 250× the compute power of GPT-4's cluster, $32 billion in capital costs, land covering about 450 soccer fields, a workforce in the thousands, and an estimated two-year build. The main bottleneck is power. Generating a model response takes over 100× longer than sending data across continents, so proximity to users barely matters; siting is constrained by electricity, not latency.
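The GDP comparison above can be sanity-checked with round numbers. This is a minimal sketch, assuming an illustrative U.S. GDP of roughly $30 trillion (not stated in the text); the spend figure and the Apollo/Manhattan shares come from the text itself.

```python
# Back-of-envelope: AI buildout spending as a share of U.S. GDP.
US_GDP = 30e12      # dollars; assumed round figure for illustration
ai_capex = 300e9    # cumulative AI data center spend cited in the text

share = ai_capex / US_GDP
print(f"AI buildout: {share:.1%} of GDP")  # → AI buildout: 1.0% of GDP

# Historical shares cited in the text, for comparison:
apollo_share = 0.008
manhattan_share = 0.004
print(f"vs Apollo {apollo_share:.1%}, Manhattan Project {manhattan_share:.1%}")
```

Under that assumed GDP figure, the 1% claim checks out exactly; a different GDP baseline shifts the share proportionally.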
⬤ Only a handful of countries can realistically host multiple 1+ gigawatt facilities. For perspective, 30 gigawatts equals about 5% of U.S. power capacity, 2.5% of China's, and nearly 90% of the U.K.'s. Most projects start with on-site natural gas for reliable generation, then connect to the grid to pull in wind and solar when available. Inside these buildings, each half-square-meter server rack can draw as much power as 100 homes, forcing a shift from air to liquid cooling. At this pace, the sector could sustain roughly 5× annual growth in frontier training compute for the next two years without needing to decentralize, though operators might spread out to tap cheaper or stranded power. Right now, AI uses about 1% of U.S. electricity (compared with 8% for lighting and 12% for air conditioning), and the sheer size and cooling infrastructure make these facilities impossible to hide.
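The rack figures above imply a striking power density, which is why air cooling gives out. A minimal sketch, assuming an average U.S. home draws about 1.2 kW (an illustrative figure not in the text); the 100-homes-per-rack and half-square-meter numbers come from the text.

```python
# Back-of-envelope rack power density.
avg_home_kw = 1.2     # assumed average draw per U.S. home (illustration)
homes_per_rack = 100  # from the text
rack_area_m2 = 0.5    # from the text

rack_kw = avg_home_kw * homes_per_rack   # ≈ 120 kW per rack
density_kw_m2 = rack_kw / rack_area_m2   # ≈ 240 kW per square meter
print(f"{rack_kw:.0f} kW per rack, {density_kw_m2:.0f} kW/m²")
```

Hundreds of kilowatts per square meter is far beyond what forced air can remove, consistent with the text's shift to liquid cooling.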
⬤ Despite their Manhattan-Project-level scale, the public still has limited visibility into these ultra-large projects. Researchers have been combing through legal permits, satellite images, and news reports to piece together what's being built and where.
⬤ For investors and policymakers, the takeaway is clear: AI infrastructure is becoming a major macro story. The buildout strategy—pairing on-site generation with large grid connections, optimizing cooling, and chasing the cheapest clean power—means capital demands and power-market pressures will stay front and center. The scale alone guarantees ongoing scrutiny of supply chains, local grids, and permitting timelines as AI data centers evolve from big to ultra-mega.
Saad Ullah