Question: 550W or 1000W and above? Efficiency question

You have to look at actual power used. Buy a Kill A Watt meter; it can also measure your kWh over time, so you can put a $ amount on PC use. Take your power bill (not incl. gas) / kWh used to get your rate; $0.12/kWh is about average in the USA.

The real savings comes from having 2-3 yr old components. Ryzen and Skylake+ have cut idle power use down quite a bit, and newer GPUs idle very low as well.
I have a 9400F + 1660 Ti + 450W Seasonic Focus Gold and get 40W idle (monitor off) and 60W browsing the web. My old 4th-gen Intel setup was 90/140. Stuff a few gens older than that pulled near max power almost all the time. Boost clock tech is a big reason it's so good now: using less power keeps the core cooler, allowing it to boost higher.

If you sleep your PC, the CPU/GPU are completely off and only the mobo stays powered; you should pull only 1-3W. Powered off is about the same. If you flip off the power strip it will be zero, but your PC's clock (CMOS) battery may die if you do this for weeks at a time.
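The cost arithmetic above can be sketched in a few lines of Python; the function names are just for illustration, and the 60W / 4h / $0.12 figures are the example numbers from this thread:

```python
def electricity_rate(bill_dollars, kwh_used):
    """Your $/kWh rate: electricity portion of the bill / kWh consumed."""
    return bill_dollars / kwh_used

def annual_cost(watts, hours_per_day, rate_per_kwh=0.12):
    """Yearly cost of a device drawing `watts` at the wall."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

# 60 W of web browsing, 4 h/day, at the US-average $0.12/kWh
print(round(annual_cost(60, 4), 2))  # -> 10.51
```

So the browsing-load difference between a new and an old build works out to a few dollars a year, not a few dollars a month.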
 

Ninjawithagun

If you buy a PSU with much higher wattage than what your system draws, the PSU will not operate at its peak efficiency. It won't consume a ton of extra power, but it's definitely noticeable. Extra wattage will also allow for future upgrades.

Higher 80+ certification will help to lower overall power draw, but you won't see that efficiency if you have excess wattage.

If you upgrade your GPU you will need to upgrade your power supply. While Delta is a decent brand Dell uses, the amperage and wattage ratings are not what I'd run a 580 on, and definitely not a 2070. Don't worry much about efficiency, as higher 80+ certification doesn't always correlate to higher quality.
Actually, this is not true anymore. All of the new digital PSUs prevent additional power consumption below peak power thresholds by having the capability to switch between regulator and capacitor mode states ;-)
 

fry178

@WreckerALeX
Around 50-60% load is just an estimate you want to try to reach; with modern computers drawing less overall because of better efficiency, it's not something I would put max priority on.
Efficiency also changes with temperature, so max load at 20°C will be different than if the unit is at 40°C.

Same for the rating: getting a Gold (vs. Bronze etc.) usually means better quality as well, but there are many exceptions to the rule,
e.g. some Bronze unit from company X may perform better than company Y's Gold unit.

I love Seasonic, but so far the EVGA G3/P3 still seem to be a tick better (parts/performance) and would be my preference.
I do recommend going with 600-650W to have some power left for upgrades (you will have the PSU for a while), and to avoid having to swap it for a bigger one in 2 or 3 years because of hardware changes.
 
Efficiency ratings apply at different loads. If you're idle 75% of the time at 40W actual (at the wall) on a 90% efficient unit, your DC load is around 36W. Going to 92% would only bring the wall draw down to about 39W: roughly $1/yr in savings. Even going from 75% to 92% efficiency at that load only saves about 9W, or roughly $7/yr.
The ratings don't cover very low loads either, so you can't predict your savings at idle. Spending quite a bit on a new PSU to save a few watts probably isn't worth it.
Assume it could save you 20W while gaming: (20/1000) kW × 2190 hours per year × $0.12/kWh = about $5/yr while playing.

If you put your PC to sleep or turn it off, you won't have any idle savings. I'd recommend testing your PC's idle watt usage with a Kill A Watt and comparing it to someone with a similar build to see whether you're way over. A great deal of idle power comes from the mobo, unless you have turned off power-saving features. You would be better off trying to lower your idle load than increasing PSU efficiency to get better idle power savings.
https://en.wikipedia.org/wiki/80_Plus
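As a rough check on the numbers above, here is the same efficiency math in Python. This is a sketch: the 36W DC load and 6570 idle hours are the example figures from this post, and the helper names are mine:

```python
def wall_draw(dc_load_watts, efficiency):
    """Watts pulled from the wall for a given DC load and PSU efficiency."""
    return dc_load_watts / efficiency

def yearly_savings(dc_load_watts, eff_old, eff_new, hours_per_year, rate=0.12):
    """Dollars/year saved by moving from eff_old to eff_new at this load."""
    delta_w = wall_draw(dc_load_watts, eff_old) - wall_draw(dc_load_watts, eff_new)
    return delta_w / 1000 * hours_per_year * rate

# ~36 W DC idle load, 75% of the year (6570 h), 90% -> 92% efficiency
print(round(yearly_savings(36, 0.90, 0.92, 6570), 2))  # -> 0.69
```

Even a large efficiency jump saves very little at loads this small, which is the point being made above.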
 

fry178

Sorry, but outside of copying data back and forth (memory), using dual vs. single channel will have a marginal impact on overall PC performance (5, maybe 10% max),
and won't hurt fps unless the amount is low (less than 8GB), or the VRAM amount is too low for the game/settings (and the GPU has to swap back and forth).
More RAM (8GB vs. 4) in single channel would have a bigger impact than 4GB in dual channel.

Similar to the fact that under most (normal) game use, most mid to high end cards will run "faster" (e.g. higher overall fps) on an x8 slot
than if plugged in at x16, because of less overhead.

But unless the RAM was changed (from prebuilt), this is running 2x8GB.
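For context on the memory-copy point: the peak theoretical bandwidth doubles with a second channel, which is why the difference shows up mainly in memory-bound work. A sketch using the standard DDR bandwidth formula (DDR4-2400 is just an example speed; real-world gains in games are far smaller, as noted above):

```python
def ddr_bandwidth_gbs(mt_per_s, channels=1, bus_width_bits=64):
    """Peak theoretical DDR bandwidth in GB/s:
    transfers/s * bytes per transfer * number of channels."""
    return mt_per_s * 1e6 * (bus_width_bits / 8) * channels / 1e9

# DDR4-2400: 19.2 GB/s single channel vs. 38.4 GB/s dual channel
print(ddr_bandwidth_gbs(2400, channels=1), ddr_bandwidth_gbs(2400, channels=2))
```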
 
Maybe not in every game, but it's a really bad move for building either way, considering the costs are the same. I wouldn't consider a 5-10% drop good either; 10% can be the difference between a midrange card and a $100 jump.

This person is dropping 20%:
View: https://www.youtube.com/watch?v=D8AdbIfTwDs

View: https://www.youtube.com/watch?v=-k5wA7EFwpo
 

fry178

Overall PC performance, not game performance.
Something all gaming/hardware sites have tested, and it's USUALLY not a big impact OUTSIDE the normal stuff.
Even then, I don't understand how people would have such low fps in the first place (underpowered for the game you want to play) that 10% would make an impact;
taking 60Hz as reference, 10% is just a 6 fps difference, not a $100/next-bigger-GPU kind of level.
Easily mitigated by switching to Fast Sync and running an fps limiter (RTSS/NV profiler).

Not saying it's smart, but for someone with limited slots/funds, it's an option to install one stick now and add another one later,
which isn't a problem with name brands, especially when getting OC'd stuff where the chips don't change.
 
2) I might keep this PC really long, and I game a lot, 4-6 hr a day, sometimes 8. Is a Titanium recommended over a Bronze if I decide to keep this PC like 5-7 yr? Or maybe I can reuse it if I get one with a 10-12 yr warranty.

Not important, but if you want my full story:
  1. Got a new monitor (Asus ROG Swift PG27UQ) to experience 4K & HDR with PS4 Pro
  2. Decided I want 4K & HDR on PC, so got a new GPU (Asus RTX 2070 Strix Gaming OC)
PC: Dell Inspiron 5675 (everything prebuilt)
MOBO: Dell 07PR60
CPU: Ryzen 7 1700 (8 cores, 16 threads) 3-3.7GHz @ 65W
GPU: RX 580 Polaris, 8GB GDDR5 1266MHz (currently; RTX 2070 on the way)
The RTX 2070 requires about 175W at max power, a puny 25W more than the RX 580 while delivering vastly superior performance. This means that for any given game, the 580 will be burning the full 150W to produce (x) fps, whereas the 2070 would require much less than 150W to produce the same (x) fps.

For example, at full power, the 2070 gets 116.2 fps in Witcher 3, but the 580 is only capable of 66.7 fps at max power, according to gamedebate.
It is obvious that at any level of performance, the 580 will always be using more power to achieve the same or inferior results.

The monitor does not draw power from the PSU, so its specs are irrelevant.

At idle, your PSU may be pulling up to 100W from the wall. At full bore, it may go as high as 350W. If so, you have no overhead worth mentioning, so the Dell PSU must go. A 550W PSU would provide a little overhead, but I would go with 650W-800W. The higher wattage will come in handy in case you change your mind about adding another GPU or a beefier CPU to the PC. Silver or Gold is fine; Platinum on sale is better.
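The 50-60% load target mentioned earlier in the thread can be checked mechanically. A sketch (the function name and the 350W example draw are mine, not from the thread):

```python
def pick_psu(system_draw_watts, candidate_ratings, band=(0.5, 0.6)):
    """Return PSU ratings that put peak system draw inside the target load band."""
    lo, hi = band
    return [w for w in candidate_ratings if lo <= system_draw_watts / w <= hi]

# ~350 W peak draw against common PSU sizes
print(pick_psu(350, [450, 550, 650, 750, 850]))  # -> [650]
```

A 650W unit lands at ~54% load here, which also leaves headroom for a GPU or CPU upgrade later.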
 

fry178

Sorry, but the efficiency rating does not equal quality.
There are lots of decent Bronze units with better design/build quality than others with Gold/Plat ratings.
It's an indicator, but not anywhere close to "guaranteed".
 
I did not use the word "guaranteed." ...
... and I too would rather have a Bronze-rated Cadillac than a Gold-rated VW.
Asus X99a 3.1
Intel E5-2690-3
64GB DDR4
970Evo M.2 1TB
Fusion-io Scale PCIe SSD 1.3TB
Several 2.5" SSDs totaling ~4TB
4TB Seagate HDD
Gigabyte GTX 1080
Corsair HX1000
 

hftvhftv

Was reading through this thread waiting for someone to finally say that the 80 Plus rating means nothing as far as quality goes. Manufacturers only need to send one power supply to have it certified by Ecos Consulting, which lets them cherry-pick an especially good power supply and base the rating for their entire product line on it.
 

jonnyguru

the CSM isn't the best unit and doesn't use an up to date design.
I know this is an older post... but what the heck are you talking about? Are you confusing model names?

CS-M isn't an up to date design? It uses an LLC resonant front end with DC-to-DC for the +3.3V and +5V rails. How do you want it to be more up to date? DSP? You want it to have I2C communication or something?

Seriously?
 
