PSU power for standard systems with a regular CPU and a single graphics card

dor_13

Oct 26, 2011
Nowadays, hardware requires less energy.

CPU TDPs are around 65W (e.g. a 7th-gen i5, NOT the 'K' version),
while high-end graphics cards draw ~170W or less (e.g. an EVGA GTX 1070).
SSDs/HDDs/fans require only a few watts, so I'll neglect them.

But I still see systems using high-power PSUs (600+W),
while IMHO a 400W PSU would be enough - and would better utilize the 80+ efficiency ratings, which suffer at low load (as seen in load regulation tests).

I don't understand this phenomenon. Maybe I overlooked some important details - if so, please clarify.
 


Future upgrade?
CPU sockets change - meaning an upgrade to better technology will require a new platform anyway.
Technology improves - meaning power requirements get lower over time.
A PSU may start failing after 3 years; IMHO it's better to replace it after 4 years anyway.

Regarding the link:
Given a 400W PSU, rated 80+.
Assume it runs at 80% efficiency at all times.
That means 20% of 400W = 80W is lost, so it may only supply 320W - still more than enough considering the wattages in my first post's example.

Also, PSUs supply the rated power at the DC output - see the specs (just an example) here.
Meaning that if the system requires 400W, the PSU will draw more power from the outlet in order to supply that 400W.
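That relationship can be sketched in a few lines of Python (the 80% efficiency figure here is just the assumption from the posts above; real PSUs vary with load):

```python
def wall_draw(dc_load_w, efficiency=0.80):
    """Power drawn from the outlet to deliver dc_load_w to the system.

    Efficiency applies to the conversion loss, not the rated DC output:
    a PSU rated 400W can deliver 400W DC, but pulls more from the wall.
    """
    return dc_load_w / efficiency

print(wall_draw(400))  # 500.0 W from the wall for a 400 W DC load
```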
 
We're talking about excess heat, I guess... Otherwise the discussion is pointless, in my opinion.
I'd rather have a power supply that allows adding a second graphics card, several storage drives and so on, than one selected based only on the calculated hardware wattage.
Also take into consideration the quality of the power supply's components. Not all PSUs are "Platinum" quality.
 
Does excess heat differ greatly between PSUs of different power ratings?
For example, let's take 400W and 550W, which differ by only 150W.
I'm not sure about that; it's mainly a design choice. It should be checked individually for each PSU.

HDD "black" takes up to 7W, that's few.
In standard systems there are 2 SSDs and 2 HDDs I suppose. But you can still add more HDDs w/o problem.
SSDs are quite power-efficient relative to HDDs.

Adding a second graphics card makes it a non-standard system in my view.
(Standard system = "a regular CPU (not overclocked) and a single graphics card".)
Normally a second graphics card requires about the same wattage or even less (due to advancements in technology), so continuing the first example:
Let's add a 170W requirement to the system, for a total of:
Code:
(2*170 + 65)=405W
Let's add the memory and other peripherals (3xHDDs/3xSSDs/4xRAM/8xUSBs) and a safety margin of 50W, and we get a total requirement of:
Code:
2*170+65 + (7*3 + 4*3 + 4*4 + 0.5*8) + 50 = 508 W
You can still take a 520W PSU, or a 550W PSU if you want to be extra safe,
but not a 600W PSU, because:
1. The unit itself costs more.
2. A high-wattage PSU discourages energy savings - which translates to a lot of wasted energy over a long period of usage.
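The arithmetic above can be wrapped into a quick estimator. The per-component figures (7W per HDD, 4W per SSD, 4W per RAM stick, 0.5W per USB port) are the rough assumptions implied by the calculation in the post, not measured values:

```python
# Rough system-wattage estimate using the per-component figures
# assumed in the post (real draws vary by model).
GPU_W = 170      # high-end card, e.g. GTX 1070 class
CPU_W = 65       # non-K i5 TDP
HDD_W, SSD_W, RAM_W, USB_W = 7, 4, 4, 0.5
SAFETY_MARGIN_W = 50

def estimate_load(gpus=1, hdds=3, ssds=3, ram_sticks=4, usb_ports=8):
    """Sum the assumed component draws plus a fixed safety margin."""
    return (gpus * GPU_W + CPU_W
            + hdds * HDD_W + ssds * SSD_W
            + ram_sticks * RAM_W + usb_ports * USB_W
            + SAFETY_MARGIN_W)

print(estimate_load(gpus=2))  # 508.0 W, matching the post's total
```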

Take for example a "Bronze"-rated PSU like this one.
The total power from its DC rails is 520W (as written on the label), while the +12V rail allows for 480W.
That doesn't mean the DC output is affected by its efficiency.
 


I see people make this mistake occasionally. The rated wattage of a PSU is what it can supply to the computer. A 400W supply can supply 400W to the computer. Efficiency only comes into figuring how much it draws from the outlet to supply that power. An 80% efficient supply at 400W is drawing 400W/.8 = 500W.

Most single-card systems get away with anything around 550W. The problem comes in when you buy a crappy PSU that may not be able to provide its rating. I would agree that the trend is toward lower draws than in previous years.
 


Hmmm, did I say that? :??:
 


You didn't, Dor 13 did. I deleted the wrong message quote. There, fixed it.
 

Yep, I know this.
I wrote it because of a minor misunderstanding, sorry.