Musk to expand xAI's training capacity to a monstrous 2 gigawatts with third building at Memphis site — announcement comes days after Musk vows to have 'more AI compute than everyone else'
xAI makes its next move in the AI race.
Elon Musk has revealed that xAI has purchased a third building at its Memphis, Tennessee, site, near the Colossus 2 data center, to expand its training capacity. The billionaire said on X that the structure will be named 'MACROHARDRR', an extension of his 'Macrohard' project, through which Musk intends to build software entirely from the ground up using only AI agents. The additional purchase will supposedly push xAI's overall training compute capacity to a staggering 2 gigawatts.
"XAI has bought a third building called MACROHARDRR. Will take @xAI training compute to almost 2GW. Try @Grok. Download latest app." (December 30, 2025)
The new building is just the beginning of the expected expenses, though. After all, xAI still must acquire the GPUs, power sources, and more for the site to be productive. And even though Musk is one of the richest people on Earth, he's still working to raise tens of billions of dollars to help fund it: xAI is burning through over a billion dollars a month as it races to build the most advanced AI on the planet.
Even though Musk is still working on funding for the project, Nvidia has already reportedly signed a deal to deliver the needed GPUs for the site, helping him toward his goal of acquiring 50 million H100-equivalent GPUs in the next five years. Musk's ultimate goal, though, is to have more AI compute than everyone else combined, challenging Microsoft and other AI titans.

Aside from acquiring the chips for the AI data center, the billionaire also needs to find a way to power it. It has already been confirmed that Musk bought an overseas power plant and is shipping it to the U.S. to power Colossus 2. xAI has also set its sights on installing a gas turbine facility, set to supply 460 MW from natural gas, helping the firm achieve its lofty compute capacity goals.
Although xAI is a relatively new entrant to the AI race, it quickly caught up with more established players like OpenAI thanks to the significant resources Elon Musk is pouring into the project. Nvidia CEO Jensen Huang even called the first Colossus project a "superhuman" effort, especially after the facility began operation after just 19 days, a feat that usually takes four years. But with other players in the AI game also spending billions of dollars on their own projects, it will be interesting (or terrifying, depending on your perspective) to see where all this expenditure ultimately leads.

Zaranthos: Love it or hate it, this stuff is exciting and will benefit us all in one way or another, hopefully.
Stevemeister: How much fossil fuel and emissions are being created from the energy generation needed to keep these places running? Just think about that when your car fails its smog test for being slightly over the limit.
blppt (replying to Stevemeister): Don't worry about fossil fuel... they're restarting Three Mile Island, lol.
jp7189 (replying to Stevemeister): Welp... I asked AI because I'm too lazy to figure out the numbers myself, and it says it's equivalent to about half a million miles driven (average car) per hour of turbine operation.
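That half-million-mile figure is roughly consistent with a back-of-envelope estimate, assuming the 460 MW turbine capacity mentioned in the article and typical published emission factors (the per-kWh and per-mile CO2 figures below are assumptions for illustration, not values sourced from this thread):

```python
# Back-of-envelope check: CO2 from one hour of a 460 MW natural gas
# turbine facility, expressed as equivalent average-car miles driven.
# The 460 MW figure comes from the article; the two emission factors
# are assumed typical values, not sourced from this thread.

PLANT_MW = 460                   # gas turbine capacity (from the article)
GAS_KG_CO2_PER_KWH = 0.4         # assumed: typical natural-gas generation
CAR_KG_CO2_PER_MILE = 0.4        # assumed: ~400 g CO2/mile, average car

kwh_per_hour = PLANT_MW * 1000                        # MW -> kWh each hour
plant_kg_co2_per_hour = kwh_per_hour * GAS_KG_CO2_PER_KWH
equivalent_miles = plant_kg_co2_per_hour / CAR_KG_CO2_PER_MILE

print(f"{equivalent_miles:,.0f} car-miles per hour of operation")
# With these assumed factors: 460,000 miles, i.e. roughly half a million.
```

The result scales linearly with both assumed factors, so even with different published values the estimate stays in the hundreds of thousands of miles per hour of operation.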
usertests (replying to Zaranthos): I think the main benefit is going to be from so much money sloshing around (>$1 trillion) that it could result in disruptive technologies being developed and brought to market faster. Obviously in the area of AI accelerators, but also memory/storage technologies, optical interconnects, and possibly even fabrication. Even if you take AI off the table, some of these could remain extremely useful.
Amdlova (replying to Stevemeister): The right question is how much water they need to cool the data center. Flying rivers are everywhere... In my country, the cities with floods are increasing.
Zaranthos (replying to usertests): That's a good point; with that kind of capital expenditure, the technological expansion will overflow the bucket and spill out into other things. Historically, many revolutionary inventions spawned growth in many other areas, like the automobile, the railroad, electricity, petroleum, etc. The more humans are liberated from spending time on lesser tasks, the more time they can focus on loftier goals. Technological advancement that freed up time has generally resulted in more advancement. For most of our time on Earth, we spent most of our time hunting and gathering just to survive.
timsSOFTWARE: It's not going to help that much; scale is no longer the problem, so adding more of it will yield only diminishing returns. It's like resolution: if all you had before was VGA and you move to 4K, that's a huge upgrade. But once you have 4K, upgrading to 16K would take a whole lot more computing power to drive it, and the difference would be relatively minuscule.
Zaranthos (replying to timsSOFTWARE): True to a point, but there is still vast room for improvement. Quite a few things are still hardware-limited: some of the complex, multi-depth reasoning, for one, along with multifaceted problem solving and broad-branch research. If all you're asking for is data it already knows, it can spit that out to a lot of people. But if a lot of people start asking it to look beyond the data to the methods, and to adjust for things like bias or demographics, you add many layers of complexity to many questions. Most of the AI models right now are too limited: garbage in, garbage out.