News Nvidia RTX 4080/4090's Power-Hungry PCB Design Allegedly Detailed

I do wonder how they will deal with a future where GPUs are too hot to cool with modern cooling.

Do they just stop making new GPUs at that point?
Do they have to move to massive quad-slot coolers?

I seem to recall there was a problem in one US state with OEM systems using over a certain amount of power. I also know that several years back the EU was looking into possibly putting power limits on domestic PCs. In a world where energy usage and the associated emissions are coming under heavier scrutiny, I do wonder whether, if power requirements go up for next-gen cards, the home PC industry will see new regulations imposed in some countries across the globe.
 
I seem to recall there was a problem in one US state with OEM systems using over a certain amount of power. I also know that several years back the EU was looking into possibly putting power limits on domestic PCs. In a world where energy usage and the associated emissions are coming under heavier scrutiny, I do wonder whether, if power requirements go up for next-gen cards, the home PC industry will see new regulations imposed in some countries across the globe.
The limits were on how much power the systems consume at idle. However, I'm not convinced this would be a problem even if the card consumes 500 W under load. For instance, I can get my RTX 2070 Super to idle at 20 W, and I thought my GT 1030 would be significantly better. From a percentage point of view it is, at ~13 W, but in absolute terms the difference is a pittance.

On a tangent, I think the only reason this is a thing is that people leave their computers on all the time even when they're not using them. Some of the articles mentioned that a typical gaming PC consumes 63 kWh a year idling.
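As a rough sanity check on that figure (a sketch: the 63 kWh/year number is the one quoted above; the always-on assumption and the electricity rate are mine):

```python
# Rough sanity check: what average idle draw does 63 kWh/year imply?
ANNUAL_IDLE_KWH = 63          # figure quoted from the articles above
HOURS_PER_YEAR = 24 * 365     # assume the machine is left on around the clock

avg_idle_watts = ANNUAL_IDLE_KWH * 1000 / HOURS_PER_YEAR
print(f"Implied average idle draw: {avg_idle_watts:.1f} W")  # ~7.2 W

# At an assumed residential rate of $0.15/kWh, the yearly cost:
cost = ANNUAL_IDLE_KWH * 0.15
print(f"Yearly idle cost: ${cost:.2f}")  # ~$9.45
```

If the PC is only left on half the day, the implied draw doubles to ~14 W, which is in the same ballpark as the 13-20 W idle figures mentioned above.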
 

King_V

Ok, this is weird... is Nvidia getting lazy? Are they going to have THAT much more performance that more than justifies the increased power consumption, or are they just cranking up the wick and damn the torpedoes?
 

peachpuff

They'll still hit a point where more power won't be an improvement in performance, and it'll still require more cooling than will fit in most rigs.
But it's the inefficiency of higher clocks that causes excessive power drain. That's why AMD has low-clocked 64-core CPUs that use only 280 W, while higher-clocked consumer 8-core CPUs use a third of that. Sure, single-threaded workloads suffer, but GPUs don't have that issue.
 
Ok, this is weird... is Nvidia getting lazy? Are they going to have THAT much more performance that more than justifies the increased power consumption, or are they just cranking up the wick and damn the torpedoes?
To be fair, desktop parts simply ship with a "performance at all costs" mindset. With a bit of undervolting and tweaking, you can get these parts to consume around 80% of their marketed TDP with minimal performance loss.
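A back-of-the-envelope model of why undervolting pays off so well (a sketch: dynamic power scales roughly with frequency times voltage squared; the stock and undervolted clock/voltage points below are illustrative assumptions, not measured values for any specific card):

```python
# Simple dynamic-power model: P ~ f * V^2 (the capacitance term cancels in the ratio)
def relative_power(freq_mhz, volts, base_freq_mhz, base_volts):
    """Power relative to the stock operating point."""
    return (freq_mhz * volts**2) / (base_freq_mhz * base_volts**2)

# Illustrative stock vs. undervolted operating points (assumed numbers)
stock = (2700, 1.05)        # MHz, V
undervolted = (2600, 0.95)  # ~4% lower clock, ~10% lower voltage

ratio = relative_power(*undervolted, *stock)
print(f"Estimated power vs. stock: {ratio:.0%}")  # ~79%
```

Because voltage enters squared, a ~10% voltage drop alone cuts power by ~19%, while the ~4% clock reduction is the only part that costs performance, which lines up with the "~80% of TDP for minimal loss" observation above.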