News "Everyone and Their Dog is Buying GPUs," Musk Says as AI Startup Details Emerge

If everybody and their dog were buying GPUs, then both AMD and Nvidia wouldn't be slashing production orders with TSMC.
Elon likely didn't mean the consumer-type GPUs you'd game on (that's peanuts to Nvidia overall).
The AI-focused and data center ones are selling.
Datacenter/auto revenue is still high.

 
Elon likely didn't mean the consumer-type GPUs you'd game on (that's peanuts to Nvidia overall).
The AI-focused and data center ones are selling.
Datacenter/auto revenue is still high.

This may not be obvious, but a business can make more money selling less product. Keeping prices high means really fat margins. Would you rather make $300/GPU on average and sell a million or make $100/GPU on average and sell two million? The executives will happily sacrifice volume for margin.
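Just to put that margin-vs-volume trade-off into numbers (a throwaway sketch using only the hypothetical figures above):

```python
# Rough margin-vs-volume comparison using the hypothetical numbers above.
scenarios = {
    "high price, low volume": {"margin_per_gpu": 300, "units": 1_000_000},
    "low price, high volume": {"margin_per_gpu": 100, "units": 2_000_000},
}

for name, s in scenarios.items():
    profit = s["margin_per_gpu"] * s["units"]
    print(f"{name}: ${profit:,} gross margin")
# high price, low volume: $300,000,000 gross margin
# low price, high volume: $200,000,000 gross margin
```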
 
This may not be obvious, but a business can make more money selling less product. Keeping prices high means really fat margins. Would you rather make $300/GPU on average and sell a million or make $100/GPU on average and sell two million? The executives will happily sacrifice volume for margin.
It all depends on how price-sensitive demand is. When you talk about something like oil, demand is relatively inelastic. A lot of individuals and businesses simply need gasoline, jet fuel, heating oil, etc. so OPEC can get away with cutting production by a little bit and watch prices skyrocket.

For AI, we're talking about businesses which hope to turn compute power into $ (not unlike crypto), so you have a RoI associated with spending on compute. If Nvidia's compute is too expensive, people will either look elsewhere or just put their AI plans on hold.

The way it works out is that it's in Nvidia's best interest to produce as many GPUs as needed to keep up with demand, especially while they're offering the market-leading option. Artificially limiting supply might not boost prices enough to offset the drop in volume, and will ultimately benefit alternatives, like AMD, Intel, Cerebras, Graphcore, Tenstorrent, and dozens of other companies trying to play in this space.
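A toy way to see the elasticity argument (the demand curve and elasticity values here are made up purely for illustration, not real market data): with inelastic demand a price hike from restricted supply raises revenue, with elastic demand it doesn't.

```python
# Toy constant-elasticity demand model: quantity = k * price**(-elasticity).
# All numbers are illustrative, not real market data.
def revenue(price, elasticity, k=1_000_000):
    quantity = k * price ** (-elasticity)
    return price * quantity

for elasticity in (0.3, 2.0):           # 0.3 ~ oil-like, 2.0 ~ price-sensitive AI buyers
    r_base = revenue(100, elasticity)   # baseline price
    r_hike = revenue(150, elasticity)   # 50% price hike (i.e., restricted supply)
    change = (r_hike / r_base - 1) * 100
    print(f"elasticity {elasticity}: revenue change {change:+.0f}% after a 50% price hike")
# elasticity 0.3: revenue change +33% after a 50% price hike
# elasticity 2.0: revenue change -33% after a 50% price hike
```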
 
Elon the grifter. Trying to use his influence to artificially raise the prices of GPUs. Pretty sad attempt.
Huh? What does he stand to gain by doing that?

For me, what's surprising is that he's not using Tesla's Dojo supercomputer. I wonder if that's because it's not as good at Transformer networks, or just because they can't build it up fast enough to accommodate the additional demand.
 
Indeed, the NVIDIA H100 80GB HBM2e is simply unobtainium for individual developers and SOHO.

On NewEgg you can get one for $42,000 (https://www.newegg.com/p/1VK-0066-00022).

On eBay the cheapest is $41,500.

Stratospheric pricing in the sky over Fantasy Land.

At least it suggests that graphics GPUs for us low-lifes are way too cheap compared to compute GPUs :)

Stop complaining!

/s
 
to artificially raise the prices of GPUs
Jensen is already doing this without Elon.

what's surprising is that he's not using Tesla's Dojo supercomputer.
This was built for FP16 data sets.

My limited understanding is that it's good to use once you've already developed something, but not the best when you're trying to build something new (it trades accuracy for speed), whereas Nvidia's hardware is less specialized and better suited to his current needs.
 
Indeed, the NVIDIA H100 80GB HBM2e is simply unobtainium for individual developers and SOHO.

On NewEgg you can get one for $42,000 (https://www.newegg.com/p/1VK-0066-00022).

On eBay the cheapest is $41,500.

Stratospheric pricing in the sky over Fantasy Land.
In the last thread on Elon's big GPU purchase, I looked at how much Dell wanted for adding it to their standard 2U server platform (PowerEdge R750xa):
"They want an absolutely astounding $86,250 per H100 PCIe card, and they make you add a minimum of 2 GPUs to the chassis!!!"​
Update: today, I see they've dropped the price to only $54,329.13 each! That puts the minimum configured price at an unthinkable $121,242.99, and they have the absolute gall to claim that's after $71,235.83 in savings!
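For anyone curious, the quoted figures imply a base chassis price and a pre-discount list price you can back out with simple subtraction (numbers from this post, not Dell's actual line items):

```python
# Back-of-the-envelope check on the Dell quote above (figures from the post, not Dell's site).
gpu_price       = 54_329.13   # current per-H100 add-on price
min_gpus        = 2           # minimum GPUs Dell makes you configure
total_quoted    = 121_242.99  # minimum configured price from the post
claimed_savings = 71_235.83

implied_base = total_quoted - min_gpus * gpu_price
implied_list = total_quoted + claimed_savings

print(f"implied base chassis/config price: ${implied_base:,.2f}")
print(f"implied pre-'savings' list price:  ${implied_list:,.2f}")
# implied base chassis/config price: $12,584.73
# implied pre-'savings' list price:  $192,478.82
```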

Having a decent amount of experience with Dell servers at my job, I know they like big markups for add-ons, but I'm still pretty stunned by that one.

The takeaway is: never complain about price-gouging, until you see the Dell price!
 
Now, THAT would be fun! Can't wait.
I wanted to touch on this earlier, but was busy. I'm sure I'm not the only one who thinks educating an AI via Twitter, in its current state of unlimited trolls and hate speech, is a TERRIBLE idea. "Hey, I'll use Twitter to teach an AI about humanity. What could go wrong?" Apparently Musk has never seen The Terminator.
 
Huh? What does he stand to gain by doing that?

For me, what's surprising is that he's not using Tesla's Dojo supercomputer. I wonder if that's because it's not as good at Transformer networks, or just because they can't build it up fast enough to accommodate the additional demand.
Why mention it at all? Wouldn’t you keep quiet to avoid any run on GPUs?

Look at the idiotic comments he made about SF being unsafe and Bob Lee’s murder.

The guy is a full on clown. Who cares what he thinks.
 
Look at the idiotic comments he made about SF being unsafe and Bob Lee’s murder.
Not sure quite what you're getting at, but strong evidence suggests Bob Lee was killed by a colleague. That's not to say SF doesn't have some really tough street crime problems...

The guy is a full on clown. Who cares what he thinks.
Agreed. It's hard to know when to take Elon seriously. He's certainly gotten very reckless, not to mention irresponsible.
 
Clown? The guy whose Starship rocket is going to have an orbital test within 24 hours? The guy who owns the highest-valued auto company on Earth? Richest person in existence?
Ever heard of Howard Hughes?

The line between greatness and insanity is not always so clear and bright. Someone could also start on one side of it and veer onto the other. Potentially adding mind-altering drugs to that mix could make for a very volatile combination.

Yeah, what a clown. That bandwagon is looking awfully full from where I'm standing.
I don't know how closely you've followed the Twitter drama, but Elon has yet to demonstrate much in the way of genius or even competence, at the helm of it. A lot of his decisions seem fairly rash and often have to be walked back. And he's put some rather fundamental policy decisions to a twitter poll, as if on a whim.
 
His new AI venture will be a separate company, but could use Twitter data.

"Everyone and Their Dog is Buying GPUs," Musk Says as AI Startup Details Emerge : Read more
This is why Nvidia's consumer GPUs are so highly priced right now: if they can't make a decent profit, then the time at TSMC would be better spent making chips for the A100 and H100. 10,000 H100 = 80,000 AD100 chips (like the 4090 but fully enabled), and really, all the 4090 was is a place to use the chips while they were ironing out the manufacturing process and tweaking it to get higher yields, and currently a place to put the chips that don't quite cut it for the H100. If Nvidia does release a 4090 Ti, they would likely have to sell it in excess of $3,000, up to $3,500, or they would be leaving money on the table by not putting those chips into the H100.

I just get a big laugh out of all the clueless predicting Nvidia is struggling and going out of business because they are pricing their consumer GPUs so high, when the reality is that Nvidia is cranking out high-end chips as fast as they possibly can.
 
If everybody and their dog were buying GPUs, then both AMD and Nvidia wouldn't be slashing production orders with TSMC.

Elon is just distracted by the new toy because his last new toy (Twitter) isn't very much fun anymore.
Nvidia is NOT slashing production orders at TSMC; they are simply taking the time they booked over a year ago for consumer parts and moving it over to make their professional lines. However, AMD is likely slashing orders (and that comes with a penalty), since they still have a ton of 6000-series devices they need to get rid of before they can start production of the 7800 and 7700 lines, and Nvidia is likely to go in and buy that time up too, to make A100 and H100 chips.

To give you an idea of the high demand for the H100: the normally $35,000 H100 units are selling on eBay for $40,000 to $46,000.

Do you realize how many 4070s or 4080s they would have to sell just to make the same profit they will make off the 10,000 H100 units they are selling to Elon, who isn't even close to being one of their biggest customers? (Microsoft, AWS, and Oracle are, by far.)
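Just to put rough numbers on that question (the per-unit profit figures below are pure guesses for the sake of illustration; Nvidia doesn't disclose them):

```python
# Hypothetical per-unit gross profits -- illustrative assumptions only.
h100_profit_per_unit = 20_000   # assumed gross profit per H100
rtx4080_profit_per_unit = 300   # assumed gross profit per RTX 4080
h100_units = 10_000             # the reported size of the order

h100_total = h100_profit_per_unit * h100_units
equivalent_4080s = h100_total / rtx4080_profit_per_unit
print(f"H100 order gross profit (assumed): ${h100_total:,}")
print(f"RTX 4080s needed for the same profit (assumed): {equivalent_4080s:,.0f}")
# H100 order gross profit (assumed): $200,000,000
# RTX 4080s needed for the same profit (assumed): 666,667
```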
 
On the day the 4090 released, if you had taken $1,600 and invested it in Nvidia stock, you could sell that stock today, get your $1,600 investment back, buy a $1,600 4090, and still have over $300 left over to put towards a new CPU or monitor.
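The only arithmetic that claim actually depends on is how much the stock needed to appreciate; working backwards from the post's own figures (no real share prices involved):

```python
# Implied stock appreciation from the post's own numbers.
initial_investment = 1_600   # hypothetical amount put into the stock on 4090 launch day
gpu_price_today    = 1_600   # a 4090 today
leftover           = 300     # "over $300 left over"

required_value = initial_investment + gpu_price_today + leftover
required_gain = required_value / initial_investment
print(f"stock would need to be worth at least ${required_value:,}")
print(f"that's a gain of roughly {required_gain:.2f}x since launch day")
# stock would need to be worth at least $3,500
# that's a gain of roughly 2.19x since launch day
```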
 
10,000 H100 = 80,000 AD100 chips (like the 4090 but fully enabled), and really, all the 4090 was is a place to use the chips while they were ironing out the manufacturing process and tweaking it to get higher yields, and currently a place to put the chips that don't quite cut it for the H100.
You're badly misinformed about this stuff.

I'm not quite sure I follow what you're saying about 10k H100 = 80k AD100. As far as I can tell, there exists no such chip as the AD100. There's an A100, which is made on TSMC N7. However, the AD102 (RTX 4090) and H100 are both made on TSMC 4N. And the GA102 used Samsung 8 nm.

As for the AD102 vs. H100, the former die is 608.5 mm^2 in size, while the latter is 814 mm^2. And no, the AD102 isn't merely shaved down from the H100 - they're fundamentally different designs. The H100 lacks a display controller and (virtually?) any raster or ray-tracing hardware. The H100 also features vector fp64 support, which the AD102 lacks, and replaces GDDR6 memory controllers with HBM2e or HBM3.

We really haven't seen Nvidia repurpose a 100-series GPU for "gaming" cards since the Titan V launched in 2017.

Here's a secret decoder ring. Study it. Learn it. Know it.

If Nvidia does release a 4090 Ti, they would likely have to sell it in excess of $3,000, up to $3,500, or they would be leaving money on the table by not putting those chips into the H100
The conflict exists at the wafer level. Nvidia ordered a certain number of 4N wafers from TSMC. Until those wafers go into production, they can choose which mask to use on them. So, it's diverting wafers instead of chips. But, the tension is real.
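To make the wafer-level tension concrete, here's a crude dies-per-wafer estimate from the die areas mentioned above (simple area division that ignores edge loss, scribe lines, and yield, so treat it as a loose upper bound):

```python
import math

# Crude dies-per-wafer estimate: usable wafer area / die area.
# Ignores edge dies, scribe lines, and defects, so real counts are lower.
WAFER_DIAMETER_MM = 300
wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2   # ~70,686 mm^2

for name, die_area in {"AD102 (RTX 4090)": 608.5, "H100": 814.0}.items():
    print(f"{name}: ~{wafer_area / die_area:.0f} dies per wafer (upper bound)")
# AD102 (RTX 4090): ~116 dies per wafer (upper bound)
# H100: ~87 dies per wafer (upper bound)
```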

I just get a big laugh out of all the clueless predicting Nvidia is struggling and going out of business
Whoever is saying that isn't looking at their quarterly reports. Their revenues are still down from their peak (as of the latest quarter to end), but they haven't been unprofitable for a long time.

AMD is likely slashing orders (and that comes with a penalty), since they still have a ton of 6000-series devices they need to get rid of before they can start production of the 7800 and 7700 lines, and Nvidia is likely to go in and buy that time up too, to make A100 and H100 chips
If it's truly RX 6000 GPU orders that AMD is slashing, then no. Most of those are made on TSMC N7, and the 4N process Nvidia uses for AD GPUs requires entirely different fab production lines. I don't even know if the N5 and 4N production can share much/any of the same equipment.

To give you an idea of the high demand for the H100: the normally $35,000 H100 units are selling on eBay for $40,000 to $46,000
That's basically what the article said, but I think $35k isn't even the "normal" price for H100's. If you bought back in November or so, they'd probably have cost much less. I seem to recall seeing a price of $18k, but I'm not 100% sure that wasn't for an A100.
 
Huh? What does he stand to gain by doing that?

For me, what's surprising is that he's not using Tesla's Dojo supercomputer. I wonder if that's because it's not as good at Transformer networks, or just because they can't build it up fast enough to accommodate the additional demand.
Because it simply doesn't have the scalability of the H100-based systems, which is the real story here when talking about why the H100 is in such high demand; there is simply nothing else like it anywhere on the market.

For instance, you can take 8 H100 units and form a DGX server, then you can take 9 DGX servers and form a DGX Pod (72 H100s in parallel), then you can take 32 DGX Pods and form a SuperPod (2,304 H100s working in parallel), which makes it one of the most powerful supercomputers. But the key here is that they are fairly easy to build, and Nvidia has a massive software and development ecosystem to make it all work; no one else has anything close to that. Moore's Law for chips may be dead, but now Moore's Law applied to supercomputers is going to take over.
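Multiplying out the per-level counts described above (the poster's figures, not necessarily Nvidia's current SuperPOD spec):

```python
# GPU counts implied by the scaling described above.
gpus_per_dgx   = 8
dgx_per_pod    = 9
pods_per_super = 32

gpus_per_pod   = gpus_per_dgx * dgx_per_pod      # 72
gpus_per_super = gpus_per_pod * pods_per_super   # 2,304
print(f"GPUs per DGX Pod: {gpus_per_pod}")
print(f"GPUs per SuperPod: {gpus_per_super:,}")
```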
 
Because it simply doesn't have the scalability of the H100-based systems
According to whom?

For instance, you can take 8 H100 units and form a DGX server, then you can take 9 DGX servers and form a DGX Pod (72 H100s in parallel)
A Dojo tile has 9 petaflops of compute power, 11 GB of SRAM, and 10 TB/s of bandwidth to 160 GB of HBM. Each training matrix has 6 of these tiles. A full ExaPod allegedly has 120 tiles, providing roughly 1.1 exaflops.
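Multiplying out those per-tile figures (taking the post's numbers at face value):

```python
# Aggregate Dojo figures implied by the per-tile numbers above.
pflops_per_tile  = 9
sram_gb_per_tile = 11
tiles_per_exapod = 120

total_pflops  = pflops_per_tile * tiles_per_exapod        # 1,080 PFLOPS
total_sram_tb = sram_gb_per_tile * tiles_per_exapod / 1000
print(f"ExaPod compute: ~{total_pflops / 1000:.1f} exaflops")
print(f"ExaPod on-die SRAM: ~{total_sram_tb:.1f} TB")
# ExaPod compute: ~1.1 exaflops
# ExaPod on-die SRAM: ~1.3 TB
```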

I'm not trying to argue which is better, but it seems clear that Dojo can scale pretty well. So, I wonder whether you're referring to an actual source, when you say they aren't using Dojo because it lacks the scalability of H100, or if you're merely guessing at the reason.
 
Training an AI with Twitter's data? That would be the perfect recipe for a sociopathic Skynet, lol.
Twitter is a cesspool of horrible, nasty people. Even worse since EM reversed the ban on a lot of offensive and fake-news accounts.
Elon, please use 4chan's data as well to complete the model, while you're at it.
 
This may not be obvious, but a business can make more money selling less product. Keeping prices high means really fat margins. Would you rather make $300/GPU on average and sell a million or make $100/GPU on average and sell two million? The executives will happily sacrifice volume for margin.
The same executives won't be so happy about sacrificing sales for margins once sales dry up because people who refuse to pay $400 for entry-level GPUs give up on PC gaming altogether or only play what works on IGPs. Who's going to buy 80-tier-and-up GPUs if all PC games have to be optimized for IGPs because affordable entry-to-mid-level GPUs no longer exist? Quite possibly not enough people for high-end consumer GPUs to remain economically viable.

Sacrificing sales for margins only works while the GPU space is an effective monopoly for Nvidia. If Intel decides to make a play for market share by launching Battlemage at the same price points as Alchemist with approximately double the performance per dollar, Nvidia's shareholders may get nervous about how quickly Nvidia is hemorrhaging gaming market share. Intel may also become the budget king for GPGPU scientists, which could then become a springboard for Intel's DC GPGPUs and threaten Nvidia's future A100/H100/etc. sales.

The consumer market built Nvidia. Neglecting it for too long could also bring it down.
 
Clown? The guy whose Starship rocket is going to have an orbital test within 24 hours? The guy who owns the highest-valued auto company on Earth? Richest person in existence? Maybe you take umbrage at his network of satellites that provides broadband speeds even in the middle of the Pacific Ocean and kept Ukraine from falling to the Russians?

Yeah, what a clown. That bandwagon is looking awfully full from where I'm standing. Take that trash back to Ars Technica with the rest of the pretender "intellectuals".
Being stinking rich doesn't really make you intelligent, though. Likewise, being intelligent does not mean you are a scientist. He is a businessman. He doesn't know a hoot about rockets or satellites, so please don't act as if he did anything other than spend money on their development, because he didn't. He didn't develop jack. And with Twitter, he also proves that you can still be a clown even when you are a successful businessman.