[SOLVED] Maximum Power Draw Question

box o rocks

Distinguished
Apr 9, 2012
Looking at the specs for both the GTX 1070 (reference) and the RX 570 (reference), I see they are both rated as 150W cards. Yet AMD specs the minimum PSU for the RX 570 at 450W, while Nvidia specs 500W as the minimum for its GTX 1070. How can that be? Shouldn't they both require the same minimum size PSU?

Btw, I already know the 1070 is faster, and I'm aware that overclocking increases power draw. I'm just curious about the way the specs are presented.
 

Tom_nerd

Reputable
Jul 15, 2019
This is because Nvidia allows the card to boost, which increases power consumption and means a higher-rated power supply is advised. It also comes down to how much safety margin they want. AMD's GPU division is well known for targeting budget PCs, so they'd rather you save the extra $10 and get a cheaper power supply. Another possible reason is that they are accounting for the CPU, RAM, etc., and Nvidia assumed a higher-powered system than AMD did. Although if you are buying a system, I would recommend going for a 500W power supply.
 

I mean it still only used 150W in all tests since that's the power limit set.

but those recommendations are always to be taken with a grain of salt. they basically just pick a number to cover themselves legally.

a 450W PSU will do fine for either card.
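
if you want to sanity-check that, here's a rough back-of-the-envelope sketch of the kind of math behind those recommendations (the 50W platform overhead and 30% headroom are assumed figures for illustration, not anything AMD or Nvidia publish):

```python
# rough PSU sizing sketch; the 50W platform overhead and 30% headroom
# below are assumed figures for illustration, not AMD or Nvidia numbers

def recommended_psu_watts(gpu_tdp, cpu_tdp, platform_overhead=50, headroom=0.30):
    """Estimate a PSU recommendation from component TDPs (all in watts)."""
    load = gpu_tdp + cpu_tdp + platform_overhead
    return load * (1 + headroom)

# 150W card plus a mainstream 84W CPU (an i7-4770, say)
print(recommended_psu_watts(150, 84))   # ~369W -> a 450W unit is comfortable

# the same 150W card plus a 180W high-end desktop CPU
print(recommended_psu_watts(150, 180))  # ~494W -> right at Nvidia's 500W figure
```

swap in different CPU TDPs and the same 150W card lands on either side of the 450W/500W line, which is pretty much the point.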
 

box o rocks

Distinguished
Apr 9, 2012
Both explanations sound plausible. I happen to have both cards: a Gigabyte GTX 1070 Mini and an XFX RX 570. Both have a small factory OC. I also have an open-air rig I was going to test them on, but it only has a 450W EVGA PSU. I wonder if it would handle a CPU + GPU stress test?
If I watched system peak watts at the wall with each card, would that be a relatively accurate test?
 
the "recommended specs" are usually way over the tops since they don't know whether you pair these cards with an entry level cpu or an enthusiast platform.

so unless you're using some heavily overclocked threadripper CPU, 450W should be fine, given that the unit is quality.
 
Solution

box o rocks

Distinguished
Apr 9, 2012
OK, here are my results as measured with a Kill-A-Watt meter at the wall:

IBT and Furmark running together (CPU/GPU both @ 100%)
---------------------------------------------------------------------------
RX 570 & i7-4770: 274W peak
GTX 1070 & i7-4770: 300W peak

Neither GPU looks to be too much of a strain on my little EVGA 450W PSU with a +12V rail rated for 420W.
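
One caveat on my own numbers: the Kill-A-Watt reads AC draw at the wall, which includes the PSU's conversion losses, so the actual DC load on the unit is a bit lower still. A quick sketch (the 87% efficiency is an assumed figure for a typical 80 Plus unit around this load, not a measured value for my EVGA):

```python
# convert the wall (AC) readings to an estimated DC load on the PSU;
# the 0.87 efficiency is an assumed figure for a typical 80 Plus unit
# at moderate load, not a measured value for this particular EVGA

PSU_EFFICIENCY = 0.87   # assumption
PSU_RATING_W = 450

for card, wall_watts in [("RX 570", 274), ("GTX 1070", 300)]:
    dc_load = wall_watts * PSU_EFFICIENCY
    pct = 100 * dc_load / PSU_RATING_W
    print(f"{card}: ~{dc_load:.0f}W DC load, ~{pct:.0f}% of the 450W rating")
```

Even the GTX 1070 run works out to roughly 260W DC, well under both the 450W rating and the 420W +12V limit.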