Why do graphics cards use more power than they're rated for?

lemonsauce

Jan 25, 2015
R9 280s are rated at a 200 W TDP, yet when they're benchmarked they draw as much as 360 watts. This is making it very hard to decide which power supply is sufficient.
 
Solution
They also say they test at the wall, meaning some of that figure is lost to PSU efficiency before it ever reaches the components. And it's a gaming load, so not the absolute max. I like TechPowerUp's per-card power breakdown, but it doesn't look like they have any reviews for the 280.
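Here's a minimal sketch of why an at-the-wall reading overstates the card's own draw; the 85% efficiency and 100 W "rest of system" figures are illustrative assumptions, not measured values:

```python
# Rough sketch of how a wall reading relates to the card's actual draw.
# The efficiency and rest-of-system numbers are assumptions for illustration.
wall_watts = 360       # total system power measured at the wall outlet
psu_efficiency = 0.85  # assumed, roughly 80 Plus Bronze territory under load

dc_watts = wall_watts * psu_efficiency    # DC actually delivered: ~306 W
rest_of_system = 100                      # assumed CPU/drives/fans share
gpu_estimate = dc_watts - rest_of_system  # ~206 W, close to the 200 W TDP

print(f"DC to system: {dc_watts:.0f} W, rough card estimate: {gpu_estimate:.0f} W")
```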


That's exactly the problem.

Some say 500, others say 600, and some even say 700. What I want to know is how much the card ITSELF uses.
 

That is because it depends on the card and manufacturer. Different manufacturers have different power requirements.
 


THIS!^

I hear people treat TDP as how much power a component uses so often that sometimes I almost believe they're right, and then I remember one of the first things I ever learned: TDP is a thermal rating, not a power-draw figure. Thank you, sir, have a +1
 


Pretty much.



It isn't due to the manufacturers having different power requirements. The stock-PCB R9 280Xs call for a 600 W PSU, just like the HD 7970s did. It is the factory-overclocked ones that require more power, such as the R9 280X Vapor-X.

All stock GPUs have the same power requirements because they are designed by Sapphire for AMD and everyone follows the same power spec on those; they might use different RAM chips, which can make a slight but negligible difference.
 
The problem is I already have a 500 W power supply with 40 A on the 12 V rail. I've just now seen that the R9 280 uses 360 watts at max load, which is 30 A at 12 V. The question for me is: can the rest of my system run off the remaining 10 A? My CPU uses 60 watts on its own, which leaves 60 watts for my one HDD, one DDR3 RAM stick, one DVD drive, and a few USB and case fans.
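For what it's worth, here's that 12 V budget worked through in a short sketch. All figures come from the post above, and it assumes everything draws from the 12 V rail, which isn't strictly true for drives and fans:

```python
# 12 V rail budget using the figures from the post above.
VOLTS = 12.0

rail_amps = 40    # rated 12 V capacity of the 500 W PSU
gpu_watts = 360   # worst-case figure quoted for the R9 280

gpu_amps = gpu_watts / VOLTS                     # 360 / 12 = 30 A
leftover_watts = (rail_amps - gpu_amps) * VOLTS  # 10 A left -> 120 W

cpu_watts = 60                                   # from the post
everything_else = leftover_watts - cpu_watts     # 60 W for HDD, RAM, DVD, fans

print(f"GPU: {gpu_amps:.0f} A, headroom: {leftover_watts:.0f} W, "
      f"left after CPU: {everything_else:.0f} W")
```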
 


Well, I just purchased this power supply over a month ago thinking it would be sufficient, so I'm outside the refund window. I was under the impression that the R9 280 used 250 watts maximum. I guess it's just a no-go then.
 


Oops. Still, TDP is a measure of heat output, not electrical consumption.
 


OK then. How do you explain the fact that MSI's own website lists their R9 280 at 250 watts under "power consumption," yet when tested it used 100 watts more than that?
 
Was the testing done with stock settings or overclocked? And where are you seeing the power consumption on MSI's site? I checked both the 3GB and 6GB models they currently show, and both list Power consumption as N/A.

Please post links.
 
OK, thank you. It could be that the 250 W is meant as an average or typical power; agreed, it would be better to list the maximum for proper power budgeting. Then again, most PSUs are most efficient at about 50-60% of their maximum rated load.

The 360 W figure you quoted was the card's power consumption while overclocked.
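As a rough illustration of that 50-60% sweet spot, here's a hypothetical helper; the function name and the 0.55 target are assumptions for the sketch, not any standard rule:

```python
# Hypothetical helper: size a PSU so the sustained load sits near the
# 50-60% efficiency sweet spot mentioned above.
def suggested_psu_rating(sustained_load_watts: float,
                         target_fraction: float = 0.55) -> float:
    """Return the PSU rating that puts the load at target_fraction of capacity."""
    return sustained_load_watts / target_fraction

# A 360 W worst-case reading lands around a 650-700 W unit:
print(f"{suggested_psu_rating(360):.0f} W")  # ~655 W
```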
 


Thanks for the info.

In any case, it appears that R9 280s use up to 350 watts at maximum load. The lowest I saw was a non-overclocked XFX R9 280 that drew 330 watts at full load. So either way it looks as if my 500 W power supply is still not enough.
 





You guys are right, I totally missed that.

I've also noticed that the only time the R9 280 (or 280X) gets into or near 300 W is in Futuremark testing. Whenever they test with actual games, it's about 50 watts less. Any idea why?

http://www.anandtech.com/show/7400/the-radeon-r9-280x-review-feat-asus-xfx/20


EDIT: Forgot that gaming doesn't run the GPU at its absolute max.