[SOLVED] Why is peak power consumption never considered for GPUs?

Dec 21, 2020
I'm about to upgrade my PC, ready for either an RTX 3070 or 3080, but am unsure what PSU wattage I require. I'm curious what kind of headroom to aim for, but I notice that nobody really seems to mention the peak power consumption of a GPU (which can be substantially higher than its rated value, theoretically double). If a GPU exceeds the rated continuous wattage of a PSU, I would expect one of two things to happen:

1) Minor overshoot - The PSU would handle it, depending on the amplitude and duration of the spike (a statistic I can't seem to find for my current PSU, a Corsair TX550M).

2) Large overshoot - The PSU hard-limits its total output, which would either cause a momentary blip in the GPU's performance or, less likely, a system shutdown.

As a blip in the GPU's performance seems more likely, I'd like to think that this is the reason NVIDIA states a system power requirement of 650 W for the RTX 3070 (nominal 220 W), which people often describe as a safety barrier for "idiots" trying to cheap out on their PSU.
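
To put rough numbers on my concern, here's the back-of-envelope check I've been doing (every figure below is my own assumption, not an official spec):

# Back-of-envelope peak-draw check (all figures are my own assumptions)
GPU_NOMINAL_W = 220         # NVIDIA's stated power for the RTX 3070
GPU_TRANSIENT_FACTOR = 2.0  # worst case: peaks "theoretically double" the rating
REST_OF_SYSTEM_W = 230      # assumed CPU + motherboard + drives + fans under load
PSU_RATED_W = 550           # my current TX550M

peak_draw_w = GPU_NOMINAL_W * GPU_TRANSIENT_FACTOR + REST_OF_SYSTEM_W
print(f"Worst-case peak: {peak_draw_w:.0f} W vs. PSU rating: {PSU_RATED_W} W")
# -> 670 W vs. 550 W, hence the two overshoot scenarios above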
 

Eximo
Bit of both. They also have to consider the typical power requirements of the average PC, which go up once you start looking at expensive GPUs like these. So the GPU needs 220 W, but the rest of the system might need 250 W, and that puts you very close to the limit of a 550 W unit, so they recommend 650 W for some margin. So it is a safety barrier, but not necessarily for idiots, just common sense. Where the idiot-proofing really comes in is with the lower-power GPUs and their minimum power supply recommendations; there the manufacturers are dealing with cheap OEM PSUs.
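
Roughly the mental math, sketched out (the figures are illustrative, not NVIDIA's actual methodology):

# Illustrative PSU sizing (assumed figures, not NVIDIA's actual method)
gpu_w = 220             # rated GPU power
rest_of_system_w = 250  # a typical high-end build around such a GPU
continuous_w = gpu_w + rest_of_system_w  # 470 W, uncomfortably close to 550 W

standard_sizes_w = [450, 550, 650, 750, 850]
# pick the first standard size that leaves ~25% margin for boost and spikes
recommended_w = next(s for s in standard_sizes_w if s >= continuous_w * 1.25)
print(recommended_w)  # 650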

Certainly some tests I have seen with the RTX 3080 required going up a size or two in power supply, but I have also seen people cram them into ITX boxes with SFX power supplies (the designers knew very well what people would do with them, so they left a lot of overhead for power spikes). Nominal power is one thing, but GPU Boost is another, and then there is overclocking beyond that, so your 220 W can easily turn into 300 W before you know it.

You could probably run an RTX 3070 off a 550 W PSU by setting a lower power limit and forgoing overclocking; it really depends on what else you are running.
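
For what it's worth, on NVIDIA cards you can set a lower power limit from the command line with nvidia-smi (needs admin rights; 180 W here is just an example value, check your card's allowed range first):

# Cap the GPU's power limit via nvidia-smi (180 W is an assumed example value;
# query the valid range first with: nvidia-smi -q -d POWER)
import subprocess
subprocess.run(["nvidia-smi", "-pl", "180"], check=True)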
 
[Image: full-system power draw in FurMark; top graph is peak, bottom graph is average]
I saw Aris link this in a discussion elsewhere recently. The top graph is peak, the bottom is average. They do spike a lot, and I wouldn't say nobody ever covers it. This is full-system power draw with a 9900K at 5 GHz, though the CPU isn't stressed in FurMark. The real power draw was probably about 20% lower, because this was measured at the wall and so includes the power the PSU wastes.
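
If you want to convert a wall reading to what the components actually pull, a quick sketch (0.8 matches the ~20% figure above; a good 80 PLUS Gold unit is often closer to 0.87-0.90 at these loads, so treat the factor as an assumption):

# Convert an at-the-wall (AC) reading to approximate component-side (DC) draw.
# The efficiency factor is an assumption; 0.8 matches the ~20% loss cited above.
def wall_to_dc(wall_watts: float, efficiency: float = 0.8) -> float:
    return wall_watts * efficiency

print(wall_to_dc(600))  # a 600 W wall reading is ~480 W of real DC draw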

For the most part, QUALITY PSUs are designed to tolerate spikes like this. I would bet you could run a 3080 on most good 650 W PSUs without any issues, even if you are loading your CPU at the same time. The continuous power draw is probably not going to be anywhere near 650 W, though it may spike above that, which a good PSU should be able to take.

However, PSUs with older designs, as well as several Seasonic models with modern designs, have had issues with spikes: first when Vega came out and spiked a lot, and now again with Ampere.

One thing that drives Nvidia's steep PSU recommendations is these spikes, but another is modern (cough, mostly Intel) CPUs drawing massive amounts of power: 250-300 W for a stock 10900K, and 300+ W if you overclock it.
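
To make that concrete, a rough worst-case tally (every number here is assumed, for illustration only):

# Rough worst-case transient tally (all numbers assumed for illustration)
gpu_spike_w = 490  # an Ampere card spiking well above its ~320 W rating
cpu_w = 300        # a stock-ish 10900K under heavy load, per the figures above
rest_w = 50        # motherboard, drives, fans
print(gpu_spike_w + cpu_w + rest_w)  # ~840 W of momentary draw on paper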
 
Solution
Dec 21, 2020
Thanks for the swift responses. From how you've answered, I'd say my assumptions about NVIDIA's power requirements were correct and that I might as well "level up" my PSU. Something I'd like to point out, though, and one of the main reasons I asked my original question, is that I've used the following power calculator:

https://outervision.com/power-supply-calculator

As well as others (Newegg's, for example), and their recommended PSU wattages seemed suspiciously low (418 W and 422 W). Surely that's rather misleading for "noobs"?
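
My guess at why those numbers come out so low: the calculators appear to just sum average draws, something like this (purely my assumption about how they work; the wattages are ballpark figures):

# My guess at a naive calculator: sum of average draws, no transient margin
components_w = {
    "RTX 3070 (average)": 220,
    "Ryzen 5 5600X": 76,   # ~76 W package power limit
    "motherboard/RAM/SSD/fans": 60,
}
average_total_w = sum(components_w.values())  # 356 W
naive_estimate_w = average_total_w * 1.2      # ~427 W, close to the numbers above
spike_aware_w = average_total_w + 220         # let the GPU briefly double: 576 W
print(naive_estimate_w, spike_aware_w)        # 576 W pushes you to the next standard size, 650 W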

Anyway, for context, I plan to use a 5600X (when I can get my hands on one). I'm not particularly interested in overclocking at the moment, and I have an ASRock B450M Pro4, for which a BIOS update supporting the 5000-series CPUs has been released.
 
