News Meta is using more than 100,000 Nvidia H100 AI GPUs to train Llama-4 — Mark Zuckerberg says that Llama 4 is being trained on a cluster “bigger than...

Please double-check the math on power consumption. I assume it is meant to say 34 thousand homes.
Blackwell needs 140 kW per rack of 72 GPUs.
At least in the EU, a 3-4 person household draws about 400-450 watts on average.
That would be
140 kW/72 = 1.94 kW per GPU.
100,000 * 1.94 kW = 194,000 kW
194,000 kW / 0.45 kW/household ≈ 430,000 households.
Yes, this stuff consumes insane amounts of power.
US households probably consume a lot more power, so the count would be somewhat lower there.
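Here's that arithmetic as a quick Python sketch, using the figures assumed above (140 kW per 72-GPU Blackwell rack, ~0.45 kW average continuous draw per EU household); the article itself is about H100s, so treat the numbers as illustrative:

```
# Power-based estimate using the assumed figures from this post.
rack_power_kw = 140.0      # assumed: one 72-GPU Blackwell rack
gpus_per_rack = 72
gpu_count = 100_000
eu_household_kw = 0.45     # assumed: average draw of a 3-4 person EU household

power_per_gpu_kw = rack_power_kw / gpus_per_rack   # ~1.94 kW per GPU
cluster_power_kw = gpu_count * power_per_gpu_kw    # ~194,000 kW (194 MW)
households = cluster_power_kw / eu_household_kw    # ~430,000 households

print(f"{power_per_gpu_kw:.2f} kW per GPU")
print(f"{cluster_power_kw:,.0f} kW for {gpu_count:,} GPUs")
print(f"~{households:,.0f} EU households' worth of power")
```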
 
Yes, but they were talking about H100s. Anyway, 430,000 is still roughly 100 times less than the quoted 34 million households. It always makes me lose trust when the numbers are so obviously wrong. Someone who knows what they are talking about usually notices when something is off by more than a factor of 2 or so, and even someone like me who knows nothing about these things will still notice when it's off by a factor of 10. I understand that mistakes can happen when doing calculations, but when they are huge factors like this, anyone should notice. As a car guy, if someone told me a new sports car runs the quarter mile in 0.01 s, I would at least double-check.
 
Well, until AI becomes something other than an imitation of a human, it will never progress past this kind of flawed training. Why does a bot say "no, we can't breathe in space" when a program has no breath to begin with? Until such fallacies are removed from training, LLM responses will keep going wrong, and whoever designs the training needs to take full responsibility for what goes into it instead of training negligently.
 
I wonder how they will recycle all this stuff when new upgrades arrive?
Trash it, donate it, or sell it, though nobody would want heavily used old GPUs unless they went for 1/100 of the original price. I see servers from large companies end up on pallets at auctions, or get disassembled for either the logic boards or the GPUs, where 4-8 Nvidia K24s take up the same space as 12 4 TB drives. And you know that as soon as training was done, they would have bought 250,000 newer units to proceed. As the Google guy said in a Stanford meeting, they only think about money and need more power each consecutive year. So think about it: if it's 24 million homes' worth of power, it's a global event, and people don't think that way, only about their own use cases.
 
The energy use in question from the article is 370 GWh. The average US household uses about 10 MWh per year:
https://www.eia.gov/tools/faqs/faq.php?id=97&t=3

So it'd be roughly 30-something thousand homes (370 GWh / 10 MWh ≈ 37,000, as @Hoid said), not 30 million as this article states.
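For a quick sanity check, here's the energy-based version in Python (assuming the 370 GWh figure from the article and ~10 MWh/year per US household from the EIA page above):

```
# Energy-based estimate: how many US household-years does 370 GWh cover?
training_energy_mwh = 370 * 1_000     # 370 GWh = 370,000 MWh
household_mwh_per_year = 10           # assumed: ~10 MWh/year per US household (EIA)

households = training_energy_mwh / household_mwh_per_year
print(f"~{households:,.0f} US households for a year")  # ~37,000, nowhere near 30 million
```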

Edit: Also, do you have a source for your EU electricity figure? Stating it in terms of power (watts) instead of energy (joules or kWh) seems unusual.
 