⬤ AI data centers are now bumping up against energy limits, not just chip shortages. xAI had to bring roughly 1 gigawatt of power online for a single facility after Tennessee permitting issues pushed construction into Mississippi, requiring new high-voltage transmission lines and multiple turbines just to keep the lights on.
⬤ You can't estimate a data center's power draw by counting GPUs alone. Beyond the accelerators themselves, these facilities need electricity for networking gear, CPUs, storage arrays, and industrial-scale cooling systems designed to handle peak heat loads. Cooling infrastructure alone can tack on about 40% to the baseline compute energy bill.
⬤ Keeping everything running smoothly adds even more overhead. To handle power fluctuations and maintain stable operations, data centers build in an extra 20-25% capacity cushion. Put it all together, and a cluster of roughly 110,000 GB300-class units, counting networking, storage, cooling, and maintenance systems, pulls around 300 megawatts of continuous power. The sketch below walks through that arithmetic.
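As a sanity check, here's a minimal back-of-envelope version of that estimate in Python. The per-GPU power draw and the host overhead for CPUs, networking, and storage are illustrative assumptions (the article doesn't give per-unit figures); the ~40% cooling factor and the 20-25% stability cushion come straight from the points above.

```python
# Back-of-envelope power estimate for a GB300-class AI data center.
# GPU_TDP_KW and HOST_OVERHEAD are illustrative assumptions, not
# vendor specs; the cooling and cushion factors come from the text.

NUM_GPUS = 110_000          # GB300-class accelerators
GPU_TDP_KW = 1.4            # assumed per-GPU board power (illustrative)
HOST_OVERHEAD = 0.15        # assumed share for CPUs, networking, storage
COOLING_FACTOR = 1.40       # cooling adds ~40% to the compute energy bill
STABILITY_CUSHION = 1.225   # midpoint of the 20-25% capacity cushion

compute_kw = NUM_GPUS * GPU_TDP_KW * (1 + HOST_OVERHEAD)
with_cooling_kw = compute_kw * COOLING_FACTOR
total_kw = with_cooling_kw * STABILITY_CUSHION

print(f"IT load:      {compute_kw / 1e3:,.0f} MW")
print(f"With cooling: {with_cooling_kw / 1e3:,.0f} MW")
print(f"Total draw:   {total_kw / 1e3:,.0f} MW")
# -> roughly 300 MW of continuous power, consistent with the figure above
```

Under these assumptions the total lands at about 304 MW, which is why the headline number is quoted as "around 300 megawatts" rather than a precise figure: the overhead factors dominate the uncertainty more than the GPU count does.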
⬤ Frontier AI facilities now operate at the same energy scale as small cities. Building them means securing grid access, power generation capacity, and major infrastructure buildout, making electricity supply a core constraint on deploying cutting-edge AI systems.
Eseandre Mordi