mr91 :
Tom's Hardware states that the maximum wattage used by a reference GTX 980 when gaming is 185 watts under extreme gaming conditions.
Do you really think a reference GTX 970 uses just 3 watts less than a reference 980?
This exchange is pointless; the OP's question has been answered...
Why do you keep trying to steer the conversation back to reference cards? a) There is no such thing, really, as a reference 970, and b) the OP does not own a reference card. The actual power draw, as measured with a meter for each of the cards, has already been published on a peer-reviewed site and been quoted.
1. Are you suggesting that TPU faked the published results?
2. The link you provided does not include testing the consumption of the 970.
3. Are you suggesting that "typical stress test applications" (quote from your link) are more representative of what one experiences in gaming than actually using a game, as TPU did?
4. Let's look at the "How They Test" page linked from the article you cited:
http://www.tomshardware.com/reviews/graphics-card-power-supply-balance,3979-4.html
"But let’s get back to the water-cooled Radeon R9 290X in our diagrams. The card's TDP is set at 250W, which it doesn’t quite reach, even during stress testing."
VGA Card Total:
40W (min)
433W (max)
243W (average)
It doesn't quite reach the TDP of 250 watts? Huh? Isn't 433 > 250?
There's your error. You keep confusing TDP (an average draw) with peak current draw; they are not the same thing. TDP by definition is the average amount of cooling necessary to cool the GPU ... the TDP of 250 watts is not in fact exceeded by the **average** load of 243 watts. The TDP is, however, most definitely exceeded by the max power draw.
You install an elevator in your shopping mall after collecting data showing that the average weight of people is 250 pounds. But the people actually getting into your elevator range in weight from 40 pounds to 433 pounds ... do you need to worry that your 250-pound-rated elevator won't cut it?
Due to the varying current draw associated with the varying load, the 250-watt cooling system need only address 250 watts, because the thermal mass is able to absorb and equalize the load. But the maximum load pulled by the card is in fact 433 watts according to your reference.
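To make the distinction concrete, here's a minimal Python sketch. The sample numbers are made up for illustration (they only reuse the 40/433/243 figures quoted above as endpoints) and are not TPU's or Tom's Hardware's actual measurement log:

```python
# Hypothetical per-second power samples (watts) for a card with a 250 W TDP.
# Illustrative numbers only -- not anyone's published test data.
samples = [40, 180, 433, 260, 243, 210, 310, 120, 250, 284]

peak = max(samples)                    # what the PSU must survive momentarily
average = sum(samples) / len(samples)  # what the cooler (TDP) must dissipate

print(f"peak:    {peak} W")         # 433 W -- well above the 250 W TDP
print(f"average: {average:.0f} W")  # 233 W -- under the 250 W TDP
```

The cooler only has to handle the average, because thermal mass smooths out the spikes; the power delivery path has to handle the peak.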
While the article later argues that with good caps you don't need to worry about this ... I agree, to an extent: if a) you do in fact have a PSU with great caps, and b) you are not using a non-reference card that drastically changes the behavior of the power delivery system as compared with the reference card upon which the conclusions are drawn. The Gigabyte G1 is such a card. The reason it outperforms the other cards is that it **is** able to deliver more power to the GPU than the competition. The referenced testing method makes no accounting for this, and no accounting for overclocking. I have no issue with that approach ... anyone selecting a G1 or overclocking should be well aware of the impact. The question was "What is the biggest card?" and the G1 is the biggest 970, both dimensionally and power-wise, among the more popular choices that can be put in a PC.
As an analogy, the stock 4790K with the stock cooler works just fine running almost any application. But run a synthetic stress test and you present a load the stock cooler simply cannot handle: within a minute the OCCT CPU test will shut down, the test's maximum allowed temperature having been exceeded. Run high-usage apps like Handbrake or H.264 encoding and temps can easily slide into the "uncomfy" zone. Same deal. But the OP isn't talking about a stock reference card; he is already overclocking and currently has a non-reference card. This needs to be accounted for, especially when the word "biggest" is used.
Also remember that PSUs hit their maximum efficiency point at about 50% load. Voltage stability and ripple suffer the closer you get to full load ... so for maintaining higher overclocks, we want to keep our average loads down near that 50%-70% load mark. Basing PSU size on average TDP loads plus 50 watts or so hurts us here too. All things being equal, a 500-watt PSU is at a disadvantage against a 620-watt unit simply because the larger unit will operate closer to its best efficiency point ... it will also create less heat and draw less power from the wall.
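That 50%-70% rule of thumb can be turned into a quick sizing calculation. A sketch, where the 350 W average system load is a hypothetical example, not a measured figure:

```python
def recommended_psu_range(avg_system_load_w, low=0.50, high=0.70):
    """Return the (min, max) PSU wattage that keeps the given average
    system load inside the 50%-70% band of the PSU's rating.
    Rule of thumb from the discussion above, not a manufacturer spec."""
    return avg_system_load_w / high, avg_system_load_w / low

# Hypothetical 350 W average gaming load:
lo, hi = recommended_psu_range(350)
print(f"{lo:.0f}-{hi:.0f} W")  # 500-700 W
```

Note the range is driven by the *average* load, while the earlier point about peaks means the chosen PSU must also tolerate momentary draws well above that average.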