⬤ Artificial-intelligence infrastructure is expanding faster than any other part of the tech sector - yet it has run into a barrier that money and engineering cannot remove. The obstacle is electricity. Training and running large models demands power on a scale the grid was never designed to supply. Each new data hall adds more load, and the total draw doubles every few years. Construction schedules now depend less on steel and silicon than on whether local utilities can deliver megawatts.
⬤ Spending on AI hardware continues to rise, but the energy bill to keep that hardware running rises even faster. A modern AI facility differs from a conventional server room in the same way a steel plant differs from a workshop - it operates around the clock and pulls tens of megawatts at a single site. Planners who once treated power as a footnote now treat it as the deciding factor.
⬤ The difficulty is not solved simply by building more power plants. Electricity must travel through lines that already carry heavy traffic, and the flow must stay stable to the millisecond. AI sites cannot endure flickers - they require redundant feeds, on-site storage or fast-switch backups. At the same time, electric vehicles, factories converting from gas and new homes equipped with heat pumps all claim shares of the same constrained supply. Progress in AI therefore hinges on regional energy planning rather than on code alone.
⬤ Growth forecasts now start with maps of available substations and transmission corridors. Firms that possess capital and algorithms still stall if the local utility cannot commit megawatts for the next decade. The places that will host the next wave of AI clusters are those where grid operators, governments and financiers move first to add lines, turbines and reactors. In short, the route to smarter machines runs through power plants, switchyards and copper wire.
Usman Salis