power for gaming rig

mhmd shoumar

Distinguished
Feb 11, 2015
Hi, I'm going to buy this gaming rig:
GTX 970
8GB RAM
FX 8350
GIGABYTE GA-970A-D3P

I want to know the maximum power it will consume without overclocking.
THANKS
 


500W PSU

Assumed maximum demand due to losses: 625W.

In one calendar month this would consume 451.50 kWh.

At a cost of $0.06 per kWh, the maximum cost is $27.09.
http://www.dslreports.com/faq/2404
This website explains that if you were to run a PC 24/7 with a 500W power supply, it would cost you around $27 per month. I don't think you will be running your PC 24/7, though, so on average it will be more like $5-$10 depending on where you live.
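A minimal sketch of that arithmetic, assuming 625 W of continuous wall draw, a ~30.1-day month, and the $0.06/kWh rate quoted above:

```python
# Rough worst-case cost of running a PC 24/7.
# Assumptions: 625 W continuous wall draw (500 W PSU plus conversion losses),
# a ~30.1-day month, and $0.06 per kWh, as in the post above.

wall_draw_w = 625            # watts pulled from the wall
hours_per_month = 24 * 30.1
rate_per_kwh = 0.06          # dollars per kWh

energy_kwh = wall_draw_w * hours_per_month / 1000
cost = energy_kwh * rate_per_kwh

print(f"{energy_kwh:.2f} kWh/month -> ${cost:.2f}")  # ~451.50 kWh -> ~$27.09
```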
 


Power consumption will depend on how much power is needed to achieve results suited to your settings.

In game, most of the power will be consumed by the GPU. It may peak at up to 314 watts.

Check this out; it will help you understand better.

http://aphnetworks.com/reviews/gigabyte-g1-gaming-geforce-gtx-970-4gb/12
 
The bill is not going to increase simply because you have a higher-wattage power supply; it increases because the components in the computer (CPU, GPU, etc.) consume more electricity.

What matters here is the quality of the PSU that distributes the power.

At a higher power requirement, you want a PSU with high efficiency. Of the power the PSU pulls from the wall, a little is lost as heat in the conversion process inside the PSU. How efficient the PSU is at converting the power it pulls from the wall determines what it costs you to run.

If you have a good PSU with 85% efficiency and the system needs, say, around 100 watts of power, the PSU will pull roughly 118 watts from the wall to compensate for the loss during the AC-to-DC conversion.

That is why we suggest a higher-wattage PSU: if it is of good quality, it will be more efficient both under load and at idle, which are the two most common states for any PC.
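A minimal sketch of that relationship, assuming the single 85% efficiency figure and 100 W load used in the example above (real PSUs have an efficiency curve that varies with load):

```python
# Wall draw for a given DC load and PSU efficiency.
# Assumption: efficiency is modeled as one constant, which is a simplification;
# real PSU efficiency changes with load.

def wall_draw_watts(dc_load_w: float, efficiency: float) -> float:
    """Power pulled from the wall to deliver dc_load_w at the given efficiency."""
    return dc_load_w / efficiency

print(wall_draw_watts(100, 0.85))  # ~117.6 W from the wall for a 100 W load
```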
 


If you are concerned about power usage, you probably want to avoid the AMD processor. They are notoriously heavy energy users and put out a lot more heat than equivalently performing Intel processors.

If I had to guess based on the items you listed, I would put your machine at consuming about 375-400 watts at peak usage.

Also, keep in mind that the more efficient your power supply is, the less energy it uses to supply power to your machine, but at some point it becomes cost-prohibitive to get more efficient. For instance, if your machine actually consumes 350 watts while gaming and you have an 80% efficient power supply, the PSU actually pulls about 438 watts from the wall. If you had a 90% efficient supply instead, it would pull about 389 watts, which saves 49 watts. That means if you pay 15 cents per kilowatt-hour and run your machine like that for 5 hours a day, it would save about 268 kWh over a 3-year life... which is about 40 bucks.
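A small sketch of that comparison, assuming the 350 W gaming load, 5 hours/day, $0.15/kWh, and 3-year lifespan stated above (the post rounds the 49 W difference, so its total comes out slightly higher):

```python
# Compare lifetime energy use for two PSU efficiencies at the same DC load.
# Assumptions (from the post above): 350 W DC load while gaming,
# 5 hours of gaming per day, $0.15 per kWh, 3-year lifespan.

dc_load_w = 350
hours_per_day = 5
years = 3
rate_per_kwh = 0.15

def lifetime_kwh(efficiency: float) -> float:
    wall_w = dc_load_w / efficiency
    return wall_w * hours_per_day * 365 * years / 1000

saved_kwh = lifetime_kwh(0.80) - lifetime_kwh(0.90)
print(f"saved ~{saved_kwh:.0f} kWh, ~${saved_kwh * rate_per_kwh:.2f}")
# -> saved ~266 kWh, ~$39.92
```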
 
TDP is a worst-case design figure, not a typical draw, so it never has that much real-world impact; games are extremely GPU-intensive. I am using an FX CPU and there is virtually no difference between the bill I get now and the one from before, when I used an i5 4690. I use my system 24/7, by the way. When I am not in front of it, it is downloading; when I am on it, I am either using 3ds Max or gaming.
 


Actually, measured from the wall, my AMD 965BE with a 7850 uses about 50 more watts during gaming than my 3570K with a 670. About half of that is due to a less efficient power supply. Since they both game about 4 hours a day and run 24/7 (the AMD also idles about 20 watts higher), it probably would have been cheaper overall to get a better Intel processor and take the savings in power.
 


It's not noticeable because of all the other fluctuations on the bill, but your provider does charge you for it. For instance, 19 hours at 20 extra watts plus 5 hours at 50 extra watts is 0.63 kWh per day, or about 230 kWh per year... at $0.15 per kWh that works out to $34.50 per year (a rough sketch of the arithmetic is below). Assuming I run them like that for 3-4 years, that's $103-$138 over the life of the machine, which could have been the difference between the 965BE and an entry-level i5 or possibly another 3570K. It would have been the same overall cost for a vastly superior machine.

If the OP switched to a 4690K now, they would see similar results: a better machine with lower power draw and a total cost that's about the same.
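A minimal sketch of that per-year figure, assuming the 20 W idle delta, 50 W gaming delta, hours, and rate quoted above:

```python
# Yearly cost of the extra power one box draws versus the other.
# Assumptions (from the post above): 20 extra watts for 19 idle hours/day,
# 50 extra watts for 5 gaming hours/day, $0.15 per kWh.

idle_extra_w, idle_hours = 20, 19
gaming_extra_w, gaming_hours = 50, 5
rate_per_kwh = 0.15

daily_kwh = (idle_extra_w * idle_hours + gaming_extra_w * gaming_hours) / 1000
yearly_kwh = daily_kwh * 365

print(f"{daily_kwh:.2f} kWh/day, {yearly_kwh:.0f} kWh/year, ${yearly_kwh * rate_per_kwh:.2f}/year")
# -> 0.63 kWh/day, 230 kWh/year, $34.49/year
```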

 


Which are terrible for gaming rigs these days. Unless he's building a headless POV-Ray renderer or something, the 4690K will be a far better machine.

 
Well, that's what I mentioned earlier: games are GPU-intensive. :) Few games lean heavily on the CPU, and this CPU will take care of those as well. If you are looking for power efficiency in a gaming rig, you should look at the GPU rather than the CPU. Better PSU efficiency will also make a substantial difference.
 
Okay. The maximum amount of power the system would need while gaming is generally stated by the GPU manufacturer. That figure assumes every component in your computer is working at 100% capacity, which will never happen unless something is really wrong with the OS or the computer. At absolute most, under normal circumstances, your PC will consume around 350-450 watts while gaming, depending on the load on the GPU.
 


For a 4690K it would hover somewhere around 325 watts, for better performance.

For the AMD it would hover somewhere near 375 watts, for worse performance.
 
Okay, thank you very much, but another question: all the benchmarks I see for the FX 8350 and GTX 970 are at 1080p, but the max resolution I can get is 1280x1024 because of my 17-inch screen. The question is, how much higher (in %) is the FPS between those two resolutions? For example, if a game runs on ultra at 40 FPS at 1080p, how fast should it run at 1280x1024?
AND AGAIN... THANKS
 


It doesn't scale perfectly like this, but the general idea for getting an estimate is that a 1280x1024 screen has about 1.3 million pixels and 1920x1080 has just over 2 million, which means the frame rate at 1280x1024 will be roughly 45-50% higher... so getting 40 FPS at 1080p would translate to roughly 60 FPS at 1280x1024 (a rough sketch of the estimate is below).

Of course, that assumes the things slowing your frame rates down are completely GPU-related, which is why you can only guess at the performance, but at least it will give you a rough idea when you are looking at benchmarks.
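A rough sketch of that pixel-count estimate, assuming a purely GPU-bound game and a practical gain somewhat below the raw pixel ratio, as described above:

```python
# Rough FPS scaling estimate based on pixel counts.
# Assumption: the game is fully GPU-bound, so frame rate scales roughly
# (but not perfectly) with the inverse of the pixel count.

pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_1280 = 1280 * 1024    # 1,310,720 pixels

fps_at_1080p = 40
raw_ratio = pixels_1080p / pixels_1280   # ~1.58 in theory
practical_gain = 0.45                    # ~45-50% gain in practice, per the post

print(f"raw pixel ratio: {raw_ratio:.2f}")
print(f"estimated FPS at 1280x1024: ~{fps_at_1080p * (1 + practical_gain):.0f}")
# -> ~58 FPS, which the post above rounds to roughly 60 FPS
```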
 
Traciatim is right. The factors that affect the FPS are the in-game components: the 3D environment the game is played in, the meshes that make up each character, and atmospheric parameters like fog, lighting, and how far into the scene the view has to be rendered. These need a large amount of graphics memory and processing power. If the graphics card is doing its best and the game still needs more resources, you will see a drop in frame rate or stuttering. This can be mitigated to an extent by adding more physical RAM to the system, but it will not be as effective as a higher-rated GPU.