JohnnyLucky :

cjl - Welcome to the party!

You are correct. I did oversimplify. I restricted my comments about active and passive power factor correction to the original ATX version and deliberately left out the variations to avoid confusion. Energy efficiency did not apply to the OP's comments and issues.

In one ATX revision a goal was set for 70% energy efficiency. A subsequent ATX revision increased the goal to 80% energy efficiency. I have not seen the final version of ATX 3.0, so I don't know whether the goal has been increased again.

The power factor you describe in your first paragraph is related to the ATX energy efficiency goal. We now have some power supplies that can actually exceed 95% energy efficiency over a large operating range. I don't think we'll ever reach a power factor of 1, since that would mean 100% energy efficiency. If Jack popped in, I'm sure he would post a lengthy comment about the effect of electrical resistance, one of the electrical phenomena that reduces energy efficiency and the related power factor. That is just one example of why a power supply draws more power than it delivers.

We often have individuals ask if a 500 watt power supply will draw 500 watts at the wall outlet (mains) when the PC only requires 300 watts during a gaming session. The typical response is that the power supply only draws what is needed. Although correct, the response is incomplete. As you correctly pointed out, the power supply will actually draw more power than it delivers.

No, a power factor of 1 would NOT signify 100% efficiency. Power factor and efficiency are two separate things (and, interestingly enough, passive PFC is often more efficient than active PFC, but less effective at correcting the power factor). Power factor is, quite simply, the ratio of real power to apparent power in an AC circuit. That is all. It has nothing whatsoever to do with the efficiency of the power supply; it only has to do with the number of watts it draws (input) compared to the number of volt-amperes (VA) it draws (also on the input). For an ideal resistive load, this ratio is exactly 1. For a reactive load, however, the ratio is less than 1, which means the current draw is higher for a given real power (watts input), and that puts a higher load on upstream components such as wiring, surge protectors, and the overall power grid.

Efficiency is a separate matter. Efficiency is the ratio of real power out (watts) to real power in (watts). Most modern power supplies have an efficiency between 80 and 92%, so their output power is 80-92% of the input power. They also often have active power factor correction, which allows them to run at a power factor above 0.98 most of the time. This means the input watts are 98% or more of the input volt-amperes. This is excellent, and with modern electronics, a power factor of essentially 1 is not that difficult to achieve.
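To make the two definitions concrete, here is a small sketch in Python. The specific numbers (300 W delivered, 353 W and 360 VA drawn at the wall) are illustrative values I picked to roughly match the figures discussed here, not measurements of any particular unit:

```python
# Illustrative numbers only: a PSU delivering 300 W of DC output while
# drawing 353 W of real power and 360 VA of apparent power at the wall.
real_power_out = 300.0     # watts delivered to the PC (DC side)
real_power_in = 353.0      # watts drawn from the wall (AC side)
apparent_power_in = 360.0  # volt-amperes drawn from the wall

# Efficiency compares output watts to input watts.
efficiency = real_power_out / real_power_in
# Power factor compares input watts to input volt-amperes.
power_factor = real_power_in / apparent_power_in

print(f"Efficiency:   {efficiency:.2%}")
print(f"Power factor: {power_factor:.2f}")
```

Note that the two ratios use completely different pairs of quantities, which is exactly why one can be near 1 while the other is not.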

To give an example, let's say you have an old power supply without modern PFC. Let's say it has a power factor of 0.6 at a typical load and an efficiency of 70%. I'll compare it to a modern 80+ PSU with 85% efficiency and a power factor of 0.98. I'll assume each is powering the same system with a power draw of 300W.

First, using the efficiency, you can find the power each supply must draw from the wall to deliver 300W. The first power supply will require 300/70% = 429 watts from the wall. This is the real power required. The second power supply will require 300/85% = 353 watts. So the 80+ PSU saves over 75 watts in this case compared to the old PSU. This doesn't tell the whole story, however. If you want to find out how close you are to overloading your circuit, you need to know how many amps the power supply is drawing, and to figure that out you need the power factor. The first power supply, with a power factor of 0.6, has a real power to apparent power ratio of 0.6. As a result, it will draw 429/0.6 = 714 volt-amps of apparent power. This is equivalent to just under 6 amps on a standard US 120V outlet, which is quite significant. The modern PSU, with active PFC, will draw 353/0.98 = 360 volt-amps of apparent power (about 3 amps). This is a huge difference: the modern power supply requires about half the current that the older one does to drive the same load, even though the efficiency difference is only 15 percentage points.

This is the reason a high power factor is desired. It's true that your power meter only cares about the real power (not the apparent power), and only the real power matters as far as heat generated or efficiency is concerned, but the apparent power determines the current flow, and the current is what determines whether you will blow a breaker or melt a wire (or some other such bad thing) when trying to run a system. Even though the maximum power available from a standard US outlet is 1800W, a 200W system with a power factor of 0.1 would draw 200/0.1 = 2000 VA (about 16.7 amps) and overload the circuit, purely because of the low power factor.
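That closing scenario can be checked numerically. The 15 A / 120 V circuit figures are the usual US residential values and are assumptions for this sketch, not taken from any spec:

```python
# Extreme illustration: a 200 W load at power factor 0.1 on a
# standard US 15 A / 120 V branch circuit (assumed values).
mains_v = 120.0
breaker_a = 15.0

real_w = 200.0
power_factor = 0.1

apparent_va = real_w / power_factor  # 2000 VA despite only 200 W real
current_a = apparent_va / mains_v    # ~16.7 A of line current

overloaded = current_a > breaker_a
print(f"{current_a:.1f} A on a {breaker_a:.0f} A circuit "
      f"-> overloaded: {overloaded}")
```

Only 200 W is registered by the meter, yet the current alone is enough to trip the breaker, which is exactly why utilities and circuit designers care about power factor.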