News Elon Musk says xAI is targeting 50 million 'H100 equivalent' AI GPUs in five years — 230k GPUs, including 30k GB200s already reportedly operational...

Thanks for the analysis, very well done.
Sam with the power infra and Musk with the compute infra may meet at each other's ends in 2029.
 
100 million, 200 million, 700 million!

And we will still have no intelligence in AI. In my opinion they are glorified data search engines.
The only accomplishment I can see them making is dethroning Google as a search engine.
 
Dollars to donuts, in that timeframe we will have tech (hardware/software, if not completely hardware) running at least tenfold more efficiently than "H100 equivalents".
 
No wonder my electricity, water, and gaming costs keep increasing. All these hardware improvements should help Grok hallucinate in a fraction of the time it takes now! I'm so excited to have it unreliably perform all the same tasks Siri can do, with 1,000,000x the environmental impact!
 
16-bit performance hasn't changed in quite a while. The only things changing are the (mostly marketing) tricks used to report the numbers. Apples to apples, it shakes out like this (source: TechPowerUp):

H100 SXM = 267.6 TFLOPS
H200 SXM = 267.6 TFLOPS
B200 SXM = 248.3 TFLOPS*

*These are single-die numbers. The B200 has two dies per package, so while the package is a good bit faster, the underlying point is the same: Nvidia doesn't have any fundamentally groundbreaking new performance now or anywhere in sight.
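To make the per-die vs per-package point concrete, here is a quick sketch of the arithmetic using only the TechPowerUp figures quoted above (the two-dies-per-package count is the assumption from the footnote, not a new measurement):

# Per-die vs per-package FP16 throughput, using the figures quoted above.
h100_sxm = 267.6          # TFLOPS, single die
b200_die = 248.3          # TFLOPS, single die
b200_pkg = 2 * b200_die   # B200 packages two dies

print(f"B200 package: {b200_pkg:.1f} TFLOPS ({b200_pkg / h100_sxm:.2f}x H100)")
print(f"B200 per die: {b200_die / h100_sxm:.2f}x H100")
# -> the package lands around 1.86x an H100, but each die is about 0.93x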
 
Dollars to donuts, in that timeframe we will have tech (hardware/software, if not completely hardware) running at least tenfold more efficiently than "H100 equivalents".
We (or they) can already build today's chatbots about 1000x faster than five years ago.
By next year they'll add another 10x at least when all the B200/B300s are up and running.
There are several different hardware and software improvements kicking around now that claim another 10x or better, each.
Basically by Christmas the world will have all the GPUs they'll need for the next ten years.

By 2035 high school kids will build their own chatbots with components from Hobby Lobby and their own smartphones.
 
16-bit performance hasn't changed in quite a while. The only things changing are the (mostly marketing) tricks used to report the numbers. Apples to apples, it shakes out like this (source: TechPowerUp):

H100 SXM = 267.6 TFLOPS
H200 SXM = 267.6 TFLOPS
B200 SXM = 248.3 TFLOPS*

*These are single-die numbers. The B200 has two dies per package, so while the package is a good bit faster, the underlying point is the same: Nvidia doesn't have any fundamentally groundbreaking new performance now or anywhere in sight.
The addition of more HBM helps overall throughput.
So does faster networking.
So do FP8 and FP4.
There are several other improvements kicking around. FP16 may not change, but there is much else that can improve, and by a lot.
 
I mean, Musk talks a lot, and no doubt retains the option of changing his mind.
Altman's megalomaniac dreams and fears are apparently fully present in Musk as well.
HOWEVER, the idea that AI means scale and scale means AI has pretty much already failed.
Musk can spend every penny he can lay hands on buying GPUs and get himself in the history books as the craziest dude in history.
He may already be up for that with Starship and his Mars project, which is already on the edge of flaming out. There's no purpose in it; Mars is a poor destination, and it will cost at least 3x what Musk has imagined to complete. Is it worth that?
I salute Musk's madness for trying, with so little analysis. Full employment for thousands of engineers and craftsmen. Better than just building giant yachts and stuff.
Musk's SpaceX is a great success, though they had to hose him down a few times so they could put in the discipline to make it work.
Musk's Tesla gave us the modern EV ten years before it would otherwise have arrived. Whether that is a good thing or not, I can't say.
Musk bought Twitter and cleaned it up like Hercules and the Augean stables, a story for all time, even if he blew an extra $20b on it simply because he ran his mouth - that's a story for all time, too.
And he got Trump elected, or we'd now have President Kamala; you can rate that as you like.
Crazy old world, ain't it.
But the whole world doesn't need 50 million H100s now or ever.
Better he build a ten-gigawatt Magic 8 Ball.
 
The addition of more HBM helps overall throughput.
So does faster networking.
So do FP8 and FP4.
There are several other improvements kicking around. FP16 may not change, but there is much else that can improve, and by a lot.
Agreed. There are improvements for workloads that can use them. I'm just refuting the 2x+ performance gain per generation presented in this article. Edit: (specifically in 16-bit workloads)
 
French Guiana has far less than 9 GW of installed power generation. That would be like New Zealand, or almost Ireland.
 
I mean, Musk talks a lot, and no doubt retains the option of changing his mind.
Altman's megalomaniac dreams and fears are apparently fully present in Musk as well.
HOWEVER, the idea that AI means scale and scale means AI has pretty much already failed.
Musk can spend every penny he can lay hands on buying GPUs and get himself in the history books as the craziest dude in history.
He may already be up for that with Starship and his Mars project, which is already on the edge of flaming out. There's no purpose in it; Mars is a poor destination, and it will cost at least 3x what Musk has imagined to complete. Is it worth that?
I salute Musk's madness for trying, with so little analysis. Full employment for thousands of engineers and craftsmen. Better than just building giant yachts and stuff.
Musk's SpaceX is a great success, though they had to hose him down a few times so they could put in the discipline to make it work.
Musk's Tesla gave us the modern EV ten years before it would otherwise have arrived. Whether that is a good thing or not, I can't say.
Musk bought Twitter and cleaned it up like Hercules and the Augean stables, a story for all time, even if he blew an extra $20b on it simply because he ran his mouth - that's a story for all time, too.
And he got Trump elected, or we'd now have President Kamala; you can rate that as you like.
Crazy old world, ain't it.
But the whole world doesn't need 50 million H100s now or ever.
Better he build a ten-gigawatt Magic 8 Ball.
I'm not a fanboy, but credit where credit is due. He recently sold X (Twitter) at a total valuation of $45 billion. SpaceX launches 80+% of every pound that goes into space for all of humankind, at 1/10 the cost of a decade ago, and through massive success pushed the car industry to electrify (I don't want an electric car). He has had definite flops like Hyperloop, but I now wonder if that is actually geared toward Mars too, because tunnels protect you from a lot of hazardous stuff on Mars.

This AI thing: he was one of the OGs of OpenAI. It was supposed to be open source to help humans. It wasn't, so he left (Altman gives me the creeps).

We are going to need factory-producible, scalable nuclear to power this stuff. Under current regulations nuclear power sites cost $500,000 to $2,000,000 per cubic meter to construct, and each plant is different from the next. We need economies of scale. It is as safe as solar if you look at the statistics. The AI power needs will not scale in a simple line.

It is happening whether we want it or not. We should not hand over control to the machines. If we do, we will perish as a species. 75% chance of non-alignment with AI, and it chooses to eliminate our population.
 
Constantly hearing these glorified electronic parrots, which rely on rote learning, described as intelligence is getting tedious. They aren't intelligent. They are just parroting what they've been fed, that is, when they aren't completely making things up. In many instances they have failed tests requiring only rudimentary logic, and at least one has been beaten at chess by an Atari 2600.
 
It seems like there is a factor-of-1000 error somewhere in the article. Either Musk's 50 million figure is off, or the conversion from TFLOPS to PetaFLOPS to ExaFLOPS needs checking. 50 million would mean 50,000 ExaFLOPS, which can't be right.
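As a rough sanity check of that conversion (my own back-of-the-envelope sketch, assuming roughly 1 PFLOPS of dense 16-bit tensor throughput per H100, a round number rather than an exact spec):

# Back-of-the-envelope check of the "50 million H100 equivalents" figure.
H100_FLOPS = 1e15      # assumed ~1 PFLOPS of 16-bit tensor compute per H100
N_GPUS = 50_000_000    # Musk's 50 million "H100 equivalents"

total_flops = H100_FLOPS * N_GPUS       # 5e22 FLOPS
total_exaflops = total_flops / 1e18     # 1 ExaFLOPS = 1e18 FLOPS

print(f"{total_exaflops:,.0f} ExaFLOPS")  # -> 50,000 ExaFLOPS, not 50

Under that assumption, either the per-GPU figure, the 50 million count, or the article's 50 ExaFLOPS number is off by roughly a factor of 1,000.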
 
Elon Musk's xAI aims to achieve 50 exaFLOPS of AI training compute — equivalent to 50 million H100 GPUs — within five years. Thanks to Nvidia's rapid performance scaling, it is more than doable even with less than a million GPUs, but that will likely require an immense amount of power.

Elon Musk says xAI is targeting 50 million 'H100 equivalent' AI GPUs in five years — 230k GPUs, including 30k GB200s already reportedly operational... : Read more
All I can see is Nvidia gonna be worth a quadrillion dollars someday.
 
