⬤ Elon Musk announced that his AI company xAI is aiming to control more AI computing power than all competitors combined in less than five years. He pointed to a massive data center project called Colossus 2, which has been playfully branded "Macrohard" online and is being positioned as a serious challenger to major AI infrastructure players. The expansion is directly tied to training xAI's Grok AI platform.
⬤ Colossus 2 is currently under development across sites in Tennessee and Mississippi, with over 400 megawatts already installed or planned. The project's ultimate goal is to hit around 2 gigawatts at a single location, using dedicated power generation to speed up deployment far beyond what traditional utilities can deliver. If completed as planned, it would become one of the world's most power-hungry AI facilities.
⬤ Musk revealed that roughly 230,000 GPUs are already training Grok, but the stated ambition is to reach 50 million "H100-equivalent" GPUs within five years. At Nvidia's H100 SXM rating of up to 700 watts per unit, that fleet would draw approximately 35 gigawatts for the GPUs alone, an energy requirement that underscores how quickly hyperscale AI demand is growing.
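A minimal back-of-envelope sketch of the arithmetic behind that 35-gigawatt figure, using only the numbers quoted above (230,000 GPUs today, a 50 million GPU target, and the H100 SXM's 700-watt rating); it deliberately ignores cooling, networking, and other datacenter overhead, so real-world demand would be higher.

```python
# Back-of-envelope power estimate using the figures quoted in the article.
H100_SXM_TDP_WATTS = 700          # Nvidia H100 SXM maximum power per GPU
TARGET_GPUS = 50_000_000          # Musk's stated five-year goal, in H100-equivalents
CURRENT_GPUS = 230_000            # GPUs reportedly training Grok today

def gpu_power_gw(gpu_count: int, watts_per_gpu: float = H100_SXM_TDP_WATTS) -> float:
    """Total GPU draw in gigawatts, excluding cooling, networking and other overhead."""
    return gpu_count * watts_per_gpu / 1e9

print(f"Current fleet: {gpu_power_gw(CURRENT_GPUS):.2f} GW")   # ~0.16 GW
print(f"50M-GPU goal:  {gpu_power_gw(TARGET_GPUS):.0f} GW")    # ~35 GW
```

For comparison, the ~2-gigawatt target for the Colossus 2 site itself covers only a small fraction of that 35-gigawatt GPU-only figure, which is why the five-year goal implies many more sites or far denser power generation.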
⬤ The announcement highlights how quickly AI infrastructure ambitions are escalating, with xAI positioning itself as a major player in large-scale processing capacity. It also reflects the intense capital demands, chip supply competition, and energy planning challenges that come with building increasingly complex and compute-intensive AI models.
Victoria Bazir