Elon Musk says xAI will have more AI compute than everyone else combined within five years — Macrohard-branded Colossus 2 data center a nod to Musk's AI project to challenge Microsoft

(Image credit: Shutterstock)

Elon Musk took to social media to claim that xAI, his AI venture, will have more computing power than everyone else in the world combined in less than five years. The billionaire said this in response to an X post calling xAI the “best team to join if you want to innovate,” specifically citing efficiency (intelligence per watt/mass) and scale (total harnessed energy/matter). The post Musk was responding to also shared an assessment from semiconductor industry research group SemiAnalysis, which said that xAI painting “Macrohard” on the roof of its Colossus 2 data center shows how serious it is about challenging Microsoft’s dominance.

Macrohard is Musk’s tongue-in-cheek name for his project to build a software company from the ground up using only AI. Even though Musk ostensibly coined the name as a joke, SemiAnalysis says the progress behind it is very real: the Tennessee site is on track for more than 400 MW of power capacity, and the billionaire has ordered a complete power plant from overseas to help hit his target of 2 GW at a single location. On top of that, xAI is raising up to $20 billion to purchase more Nvidia GPUs to further expand Colossus 2.

xAI has been pushing hard to expand its AI computing capabilities, with its first supercluster of 100,000 Nvidia H100 GPUs set up in just 19 days, a process that Nvidia founder and chief executive Jensen Huang says usually takes four years. The company has also been growing rapidly: Musk says xAI aims for the equivalent of 50 million H100 GPUs within five years, with 230,000 GPUs already operational and training Grok. That could be a plausible target, especially as the billionaire seems willing and able to burn through more than a billion dollars a month to see the project through.

However, we’re unsure whether Musk’s claim that xAI will have more computing power than everyone else combined will prove accurate. After all, even though he might have an enormous amount of resources behind him, other companies and institutions are also spending boatloads of money to stay ahead of the competition. For example, OpenAI has a large data center in Texas with a 300MW capacity — and it’s projected to reach gigawatt-scale by mid-2026. In addition, China is investing in its AI industry, with the government considering allocating $70 billion to domestic chip fabrication alone.

It’s conceivable for xAI to become the largest AI company in the world within the next five years. Still, it’s unlikely to top the combined AI computing power of Amazon, Microsoft, Google, OpenAI, Oracle, and the other tech giants that are also spending billions of dollars on the industry. Even if Musk throws everything he has into xAI, his resources, labor force, and industrial capacity are unlikely to match those of the entire industry combined.


Jowi Morales
Contributing Writer
  • LordVile
    Honestly, all of these deals could backfire and bankrupt Nvidia. They’re giving hardware to companies based on speculation about their value and loans. If the bubble pops, the companies will all fold and Nvidia will have sunk tens of billions into nothing.
    Reply
  • ezst036
    Admin said:

    Elon Musk says xAI will have more AI compute than everyone else combined within five years — Macrohard-branded Colossus 2 data center a nod to Musk's AI project to challenge Microsoft

    Elon Musk says xAI will suck more electricity out of the grid than everyone else combined within five years - Macrohard-branded Colossus 2 data center a nod to Musk's AI project to elevate your monthly energy bill into the stratosphere.

    There, I fixed it.
    Reply
  • Robert more
    The assessment considers two main elements -

    One would be mass production of AI-specific chips by Tesla

    The other would be SpaceX launching DoS (datacenter-on-a-satellite) payloads on Starship en masse

    Those two points, assuming success, would skyrocket the endeavour, unchaining the project from earthbound electric/cooling resources.
    Reply
  • snemarch
    Robert more said:
    The assessment considers two main elements -

    One would be mass production of AI-specific chips by Tesla

    The other would be SpaceX launching DoS (datacenter-on-a-satellite) payloads on Starship en masse

    Those two points, assuming success, would skyrocket the endeavour, unchaining the project from earthbound electric/cooling resources.
    Lol.

    Do you have **any** idea how difficult it is to do cooling in space?
    Reply
  • sftwn
    Robert more said:
    The assessment considers two main elements -

    One would be mass production of AI-specific chips by Tesla

    The other would be SpaceX launching DoS (datacenter-on-a-satellite) payloads on Starship en masse

    Those two points, assuming success, would skyrocket the endeavour, unchaining the project from earthbound electric/cooling resources.
    On the first point, this seems highly unlikely given their lack of relationships with existing fabs and the immense complexity of semiconductor manufacturing.

    On the second, deploying data centers in space actually makes thermal management harder, not easier. There's no atmosphere to conduct heat away, so you're limited to radiative cooling. And generating the necessary power would require massive solar arrays or nuclear reactors, neither of which unbundles you from significant physical infrastructure constraints.
    Reply
  • Tanakoi
    snemarch said:
    Lol.

    Do you have **any** idea how difficult it is to do cooling in space?
    If you're attempting radiative cooling at ambient operating temperature, it is. Of course it wouldn't be done in that manner. I'd imagine a heat pump driving a closed ammonia or PDMS loop, with the radiators themselves operating at ~400K. That would allow exhausting several MW of waste heat using panels not much larger than those currently on the ISS.
    Reply
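The radiator figures debated above can be sanity-checked with the Stefan–Boltzmann law. The sketch below is illustrative only: the emissivity, panel temperature, and heat load are assumptions for the sake of the estimate, not numbers from the comments, and it ignores absorbed sunlight and the view factor to Earth.

```python
# Back-of-the-envelope radiator sizing for a spacecraft heat load.
# All parameter values here are illustrative assumptions.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(heat_watts, temp_k, emissivity=0.9, two_sided=True):
    """Panel area needed to reject `heat_watts` purely by radiation,
    ignoring absorbed sunlight and radiative exchange with Earth."""
    sides = 2 if two_sided else 1
    flux = sides * emissivity * SIGMA * temp_k ** 4  # W per m^2 of panel
    return heat_watts / flux

# A two-sided 400 K panel radiates roughly 2.6 kW per square metre,
# so a 5 MW heat load needs on the order of 2,000 m^2 of panel area.
area = radiator_area_m2(5e6, 400)
```

The fourth-power dependence on temperature is why running the radiators hot (via a heat pump, as suggested above) shrinks the required area so dramatically: a 400 K panel sheds about ten times the flux of a 220 K panel of the same size.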
  • alan.campbell99
    Elon says lots of things, most of the time it's either overly optimistic or just plain lies. FSD and getting to Mars are two off the top of my head.
    Reply
  • COLGeek
    Please drop the politics folks. Thank you.
    Reply