Nvidia shipped over 900 tons of H100 compute GPUs in Q2 2023, says Omdia.
Nvidia Sold 900 Tons of H100 GPUs Last Quarter: Analyst : Read more
Or a third of Elon Musk's net worth converted into $100 bills and weighed. Also, you have a rounding error in your Golden Retriever count. It's 32,727 Retrievers and 3 Chihuahuas, to be precise.
All to yield a 25% anomaly rate. They will milk the plateau to get to 99.99999% accuracy. All to run a stupid chatbot, and evil-ish projects to replace me and you, the author, so his or her company or business can offload more employees, increasing their profits and leaving a crowd of jobless people.
The hype is real ¯\_(ツ)_/¯
To add some more context to the story, here are some other things that weigh 900 tons:
- 4.5 Boeing 747s
- 11 Space Shuttle Orbiters
- 215,827 gallons of water
- 299 Ford F-150 Lightnings
- 181,818 PlayStation 5s
- 32,727 Golden Retrievers
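The listed counts line up with 900 short tons, i.e. 1,800,000 lb (215,827 US gallons x 8.34 lb/gal is almost exactly that). Here's a rough back-of-the-envelope sketch in Python; the per-item weights are my own round-number assumptions, not figures from the article, so the results land near the counts above rather than matching them exactly.

# Rough check of the 900-ton comparisons, assuming 900 short tons = 1,800,000 lb
# and approximate per-item weights in pounds (assumed values, not Omdia's figures).
TOTAL_LB = 900 * 2000

approx_weight_lb = {
    "Boeing 747 (empty)": 400_000,
    "Space Shuttle Orbiter": 165_000,
    "US gallon of water": 8.34,
    "Ford F-150 Lightning": 6_000,
    "PlayStation 5": 10,
    "Golden Retriever": 55,
}

for item, weight in approx_weight_lb.items():
    print(f"{TOTAL_LB / weight:>12,.1f}  x  {item}")

That division is also where the Chihuahua correction above comes from: 1,800,000 / 55 is about 32,727.3 Retrievers, and the leftover 0.3 of a Retriever (roughly 15 lb) is about three small Chihuahuas.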
Americans will use anything other than the metric system.
That's around 180,000 cats!!!!
No way, nowadays people overfeed their cats quite a bit. I'm sure everyone knows somebody who has a way overweight cat.
I'd say only around 100k cats!
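For what it's worth, the implied average cat weight is easy to back out under the same assumed 1,800,000 lb total as the sketch above:

# Implied average cat weight for each headcount guess (assuming 1,800,000 lb total).
TOTAL_LB = 1_800_000
for cats in (180_000, 100_000):
    print(f"{cats:,} cats -> {TOTAL_LB / cats:.0f} lb per cat")

So 180,000 cats works out to a trim 10 lb each, while 100,000 cats assumes an 18 lb chonker across the board.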
I would guess the vast majority of AI accelerators Nvidia sells use the SXM form factor, which isn't usable in regular PCs. For the PCIe form factor, adding video outs would only block a portion of the back plate and hinder cooling, with no benefit to the purchaser of the product. As shown in the picture below, an unobstructed back plate is critical for cooling when there is no space between the cards.
All I'm reading is
These H100s will be insanely cheap in a few years.
Too bad nVidia caught on to AI-accelerator re-use as actual GPUs.
(Seriously, why aren't they catching flak for artificially making more e-waste?)
Why are people still buying the H100s now that the H200s are available and double the computing power?
And for that I am thankful.
Americans will use anything other than the metric system.
Well, here's a P100 for a mere $180:
They continue to offer PCIe AIC (add-in-card) form factor versions, even of the H100. You're probably right that the volume is disproportionately biased towards SXM.
I'm a little unclear on the naming convention, but my understanding is that they have three different options which fall in that category:
Why are people still buying the H100s now that the H200s are available and double the computing power?
I shall take all 100k cats and they shall be mine, and I will give them all scratchums and pets.
That's a lot of litter boxes to clean!
I wonder if they'd still appeal to you for more food, if you'd never directly fed them, and they never saw you refilling the feeder or smelled cat food on you. I'm going to guess not.
Side note, I use an auto feeder that I program the amount and timings of. My two cats kinda hate it because they want more. But once that beast releases, they go ape for it, then they come to me begging for more. And I give them skritches and loves, but not more kibbles.