Question: Interesting question about PSU efficiency ratings

otringal

Hi, there!

So, I know quite a lot about the White vs. Bronze vs. Gold vs. Platinum vs. Titanium ratings, how they are implemented, etc., and I also know very well that the rating only indicates the efficiency at certain loads and doesn't really say much about the components' quality. My question is not about that, so with that out of the way, here's what I don't understand, with a concrete example:

Let's say we have a 500W PSU rated 80 Plus (to keep the example simple, assume a "regular" White, not Bronze/Gold/Platinum/Titanium). The manufacturer thus states that the PSU will have 80% efficiency at the standard load points of 20%, 50% and 100%. BUT here's where it gets tricky: does this mean it draws 500W from the mains socket and delivers only 400W (0.8 * 500) of real power to the PC, or does it actually draw 625W from the mains in order to deliver 500W (0.8 * 625) of real power to the PC?

As you can see, I'm simply wondering how the power rating and the efficiency ratings are interpreted and calculated (in what direction does the equation go):

  • Power rating of 500W (which is equal to what is drawn from the mains) * Efficiency rating of 80% = True delivered power of 400W (which is what the PC gets)
??? or ???
  • Power of 625W (what gets drawn from the mains without you realizing) * Efficiency rating of 80% = True delivered power rating of 500W (which is what the PC gets)
TLDR version: for a given 80% efficiency, are those 500W the Input power, or the Output power?

The reason I'm asking is simple: I always thought the first version was the true one (what actually happens from an electrical point of view), but today I was browsing some reviews and ended up reading multiple articles and forum posts where people argued for both versions with almost 50-50 opinions, and that's what got me confused...

Plus, it makes a huge difference: it's one thing to know that your electricity bill will not exceed the PSU's rating but your PC components will get 20% less power, and another thing entirely to know your PC will get the PSU's stated power but the electricity bill will be 25% higher. It's an apples vs. oranges tradeoff, where we're comparing an incorrect PSU headroom for the components vs. an incorrect estimate of the electricity bill. Completely different problems with completely different drawbacks...
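To put the two readings side by side numerically, here's a quick Python sketch (the 500W and 80% figures are just the example numbers above; the variable names are mine):

    # Reading 1: the 500W label is the INPUT (wall) power
    rating = 500.0   # W, the number on the label
    eff = 0.80       # 80 Plus White level, assumed flat for simplicity

    delivered_if_label_is_input = rating * eff   # 400.0 W reaches the PC

    # Reading 2: the 500W label is the OUTPUT (DC) power
    drawn_if_label_is_output = rating / eff      # 625.0 W pulled from the wall

    print(delivered_if_label_is_input)  # 400.0
    print(drawn_if_label_is_output)     # 625.0

Both readings satisfy efficiency = output / input; the whole question is which side of the ratio the label sits on.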
 
Further reading: https://forums.tomshardware.com/thr...-will-draw-from-outlet.2296317/#post-15244585
 
#2: (power output / power input) x 100 = efficiency %. This is taught in 2nd-year college electronics (ET or EE).
 
@dev_cyberpunk I know that as well as anybody else, yet if you re-read my question more thoroughly, you'll realize that the answer (formula) you gave me is precisely the root of my question. You said:

(Pout / Pin) * 100 = Eff

However, both of my versions are correct from a numerical standpoint, yet quite different from a phenomenological perspective:

(400 / 500) * 100 = 80
(500 / 625) * 100 = 80

In other words, for an Efficiency of 80, the ratio can go both ways depending on whether those 500W represent the Input power or the Output power. Now do you see why it's confusing as to which version is actually true?
 
The biggest concern should be build quality. If you run a 500W PSU at 500W all of the time, or at least often, and the PSU is not good quality, it will heat up and burn out; depending on how good (or "good") the build quality is, it may or may not take your mobo with it.

If this is just about your bill, or about safety as explained above, then you can limit CPU and GPU power either from the BIOS or using software options.

But yeah, a 500W PSU will output 500W and draw more than that to achieve it.
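Put as a formula, the label is the output side, so the wall draw is the output divided by the efficiency. A minimal sketch (Python; it assumes the flat 80% from the example, whereas a real unit's efficiency varies with load):

    def wall_draw(dc_output_w, efficiency):
        """Watts pulled from the mains to deliver dc_output_w to the PC."""
        return dc_output_w / efficiency

    print(wall_draw(500, 0.80))  # 625.0 W from the outlet at full load
    print(wall_draw(250, 0.80))  # 312.5 W at half load (same 80% assumed)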
 
A PSU only draws the wattage demanded of it, regardless of its max capability.
The efficiency rating determines how much more than that is drawn from the wall to net the desired wattage.
There are advantages to installing a much stronger PSU than the nominal requirement:

1. It will operate more in the most efficient middle third of its range (see the sketch below).
2. A strong PSU allows for a stronger GPU replacement, one of the most common upgrades.
3. Some graphics cards have occasional high power spikes, well above the nominal draw. A stronger PSU can handle those spikes better.
4. A strong PSU will be quieter; the fan may not normally need to run.

Downside... it may cost a bit more.
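To illustrate point 1 with numbers, a rough Python sketch (the efficiency curve below is invented for the example, not measured data; real curves come from reviews or the 80 Plus test reports):

    # Invented example curve: PSUs tend to be least efficient near the
    # extremes of their range and best somewhere in the middle.
    def efficiency_at(load_fraction):
        if load_fraction < 0.2:
            return 0.75
        elif load_fraction < 0.7:
            return 0.85   # the "middle third" sweet spot
        else:
            return 0.80

    pc_draw = 400  # W the PC actually demands

    for rating in (500, 850):
        load = pc_draw / rating
        eff = efficiency_at(load)
        print(f"{rating}W PSU: {load:.0%} load, ~{eff:.0%} eff, "
              f"{pc_draw / eff:.0f}W from the wall")

With these made-up numbers, the 850W unit runs the same PC at ~47% load in the sweet spot and pulls about 471W from the wall, while the 500W unit working at 80% load pulls about 500W.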
 