Watts per hour help.

LilBloOd

Reputable
Jul 7, 2014
27
0
4,530
Alright, I've got a system built that's about 1000 watts. My question is: I have an i7-4790K. How many watts will my CPU use per hour at full load? Let's say it's overclocked to 4.4 GHz. And let's say it was overclocked to about 4.8-5 GHz.

Question 2! If I were to use my computer 24/7 with my CPU running at full load (95-100% CPU usage),
how long can I expect it to last before it starts to lose performance or stops working, at stock speed, no overclock whatsoever? P.S. it is being water cooled, if that counts for anything.
 

InvalidError

Titan
Moderator
Watts are watts and are already a unit of energy per unit of time: joules per second. "Watts per hour" is redundant and incorrect; just watts will do. If you want to know how this influences your power bill, you need watt-hours (power multiplied by time), and you multiply that by your utility rate.
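To make the arithmetic concrete, here's a quick sketch of the watts vs. watt-hours distinction (the 250 W draw and $0.12/kWh rate are made-up example numbers, not anyone's actual figures):

```python
# Energy (Wh) = power (W) * time (h); utilities bill per kWh.
power_w = 250          # hypothetical average system draw, in watts
hours = 24             # hours of runtime
rate_per_kwh = 0.12    # hypothetical utility rate, $/kWh

energy_wh = power_w * hours        # watt-hours consumed
energy_kwh = energy_wh / 1000      # 6.0 kWh for the day
cost = energy_kwh * rate_per_kwh   # roughly $0.72 for the day
```

The CPU's rating in watts never changes with time; only the accumulated watt-hours do.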

As for how much power an overclocked Intel CPU uses, 50W extra sounds like a reasonable high-mark.

At stock clock, the CPU itself will likely last 10+ years unless something else (ex.: the water pump) fails first and kills it.
 

LilBloOd

I guess I'm asking: how many watts does the CPU use? About 30 watts at stock? About 50 watts overclocked? At full load?

Btw the CPU will be running 24/7. :) I know it will because the software I'm using uses all cores, and I've got the CPU gadget showing 100% CPU usage ;).

I've been using the software non-stop on my old desktop. Don't care if it breaks or burns out because it's like a 5-year-old $500 comp from Best Buy. But so far it has been running non-stop 16-20 hrs a day for like 3 days... The other 4-8 hrs I've been using it for the net or playing games ;).
 
And the PSU will be pulling 1000W out of the wall no matter what, basically.

Are you bitcoin mining? It kind of sounds like it. If so, you might want to check out these threads:
http://www.tomshardware.com/forum/id-2041369/cryptomining-hardware-thread.html
http://www.tomshardware.com/forum/id-2041557/crypto-currency-mining-graphics-cards.html

1000W is also INSANE overkill for a system with just an i7 (and a $500 Best Buy one at that?) and a single card.
 
The PSU will be pulling a lot less than 1000W in most cases...
It will just pull what is necessary for the system (divided by its efficiency), so it should be closer to 500W at most...

But in this case, it seems that the system consumption is 1000W, so it should pull about 1000W, as you say. If you use it for 1 hour, you've used 1000 Wh, so you have 1000 Wh per hour of use (1000 W = 1000 Wh/h).

That would mean 1000 Wh per hour of use, to make it simple. (1 kWh per hour, so 24 kWh a day, so about 720 kWh a month.)

You should probably add some kind of factor to account for it not being used completely all of the time. For example, if it's used 20 hours a day (83%), just multiply that, and you would get about 600kWh a month, for example.
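The monthly estimate above, with the duty-cycle factor applied, can be sketched like this (the 1000 W and 20 h/day figures come from this thread; any real system would need its own measured numbers):

```python
power_w = 1000         # assumed whole-system draw from the discussion above
hours_per_day = 20     # ~83% duty cycle, as in the example above
days = 30              # billing month

# kWh = kW * hours; scale by duty cycle and days in the month
kwh_per_month = (power_w / 1000) * hours_per_day * days   # 600 kWh
```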
 

Slobodan-888

Reputable
Jul 17, 2014
417
0
4,860
Lol. I've read so much bollocks in this thread from the various "experts".

You can't measure the CPU power consumption, only the power consumption of the whole computer (or computer + monitor + UPS,...).

In order to know the actual power consumption of your computer, you need to buy (or borrow) a power meter. I have this one: http://www.peaktech.de/productdetail/kategorie/engergiemessgeraet/produkt/p-9035.html

With a power meter plugged in, you start using the computer. After a while, it will give you accurate min, max and average power consumption, and also the power factor, plus various other data. You then take the average power consumption (in W) and divide it by the average power factor (between 0 and 1) to get the apparent power figure (in VA). It is apparent power that electric companies are charging you for. Then all you have to do is multiply the apparent power figure by the number of hours that the computer will be on every month, and you will get the amount of energy consumed.

While measuring it, it is best to plug the input of the UPS into the power meter rather than just the input of the computer PSU, because that way you will be measuring the power consumption of the monitor and the UPS (and everything that is on and plugged in to the UPS) as well.

Example:

Average power consumption is 250 W, power factor is 0.75, and the computer is on 24 hours a day for 30 days a month.

250 W / 0.75 = 333.33 VA
333.33 VA * 24 h * 30 days = 240 kVAh of energy consumption for the month
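The same worked example, as a small sketch (using the 250 W / 0.75 numbers given above; a real meter would supply its own readings):

```python
avg_power_w = 250      # average real power from the meter, watts
power_factor = 0.75    # average power factor from the meter
hours = 24 * 30        # on 24 hours a day for a 30-day month

apparent_va = avg_power_w / power_factor     # ~333.33 VA apparent power
energy_kvah = apparent_va * hours / 1000     # ~240 kVAh for the month
```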
 

InvalidError


OP asked specifically about CPU power, not whole-system power. CPU power is relatively easy to calculate since the CPU's VRMs are all fed by the ATX12V/EPS12V cable.

CPU-only power consumption may not be a completely reliable indicator of its impact on whole-system power draw, but CPU-only power figures are what OP initially asked for.
 

Slobodan-888

I don't know what OP stands for.

OK, then it is like this. Go into the BIOS and set a fixed value for the CPU voltage. Take a multimeter, set it to DC current measurement, unplug the CPU power connector, put one multimeter probe on the +12V of the PSU connector (short the two +12V wires in the connector) and the other probe on the +12V of the CPU power connector on the motherboard. Connect the GND of the PSU connector to the GND of the CPU power connector on the motherboard.
Power on the computer, load the CPU, and measure the current. After a while, take the average current measured and multiply it by the CPU voltage set in the BIOS (if possible, measure the voltage with another multimeter), and you will get the average power consumption of the CPU.

Note that many multimeters are only capable of measuring up to 10 A of current, and the CPU will draw over 80 A at full load (at 1.25 V, if the CPU uses 100 W), which will destroy your multimeter.

Unless you have a current probe for your multimeter (which I doubt). If you do, you don't even have to remove the connector...
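The procedure above boils down to P = V * I. A minimal sketch, using the 1.25 V / 80 A figures from this post as hypothetical meter readings:

```python
# Hypothetical readings from the measurement procedure described above
v_core = 1.25   # CPU voltage fixed in the BIOS, volts
i_avg = 80.0    # average current measured over the load period, amps

# Average CPU power is simply voltage times average current
p_cpu = v_core * i_avg   # 100 W
```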
 

InvalidError


The voltage on the ATX12V/EPS12V cable is 12 V, and to provide ~100 W at 1.25 V, the VRM will be drawing only ~10 A from those 12 V wires. The 80 A within the buck converters stays within their input filters, inductors, output caps and the load; the upstream PSU only sees the average ~10 A current.
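The point about the two current levels can be checked with simple power conservation (the 85% VRM efficiency here is an assumed round number, not a measured one):

```python
p_cpu = 100.0     # watts delivered to the core, from the example above
v_core = 1.25     # core voltage, volts
v_rail = 12.0     # ATX12V/EPS12V rail voltage
vrm_eff = 0.85    # assumed VRM conversion efficiency

# High current only exists on the low-voltage side of the buck converter
i_core = p_cpu / v_core                 # 80 A at the core
# The PSU supplies the same power (plus conversion losses) at 12 V
i_rail = (p_cpu / vrm_eff) / v_rail     # ~9.8 A on the 12 V wires
```

Same power, very different currents, which is why the 12 V cable only needs to carry about a tenth of the core current.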
 

LilBloOd

I think it's too late for bitcoin / altcoin mining ATM. Well, the main reason is I'm trying to get solar power set up ;). I was under the assumption that if the CPU is the main thing running, it will be the main source of the watts, and I was just gonna add like 300-400 watts on top for the idle components. So pretty much it doesn't matter which component is using how much power? If I have a 1000W power supply, will it draw 1000 watts no matter if the CPU / components are idle or not? So will I be using 1000 watts even if the PC is just sitting there idle for 24 hrs?
 

Slobodan-888

1000 W is the max power that the PSU can supply. Unless you have high-end GPU(s), your entire computer will draw about 200-250 W at full load, and around 120 W at idle. Divide that by the power factor, since the inverter will be rated by its max apparent power output.

For example, my computer, built around an AMD A10-7850K with no discrete GPU, draws around 150 W during gaming (World of Tanks), and that includes a 23" LCD monitor and the UPS.
 




I don't think the CPU is effective at all for any type of coin mining.

I tried coin mining, and I found it would have taken me like 900 days to earn 0.1 bitcoin. (Of course, I only had a single graphics card to do it with, but I wanted to try it during my off-times.)
 

InvalidError


Bitcoin is a dead end for anything but ASIC miners.

Scrypt, on the other hand, is almost entirely dependent on memory access latency and as such does not scale anywhere near as well to GPU/FPGA/ASIC, since they are all stuck waiting for RAM most of the time... and the way Scrypt reads, modifies and writes memory randomly makes it pretty much impossible to optimize.

GPU, FPGA and ASIC Scrypt mining will be more power-efficient but not much faster for a given amount of memory chips and memory channels.
 

LilBloOd

So just a quick touch-up and back to the basics... A 1000W PSU uses 1000 watts no matter what the workload? Or do I only get charged depending on the total power being used by the components? I.e. the CPU / memory are being used to the max and the GPU is only at 50% workload... Sorry if I don't make sense, no sleep in 30 hrs.
 

InvalidError


Go sleep and re-read the thread. Your question has been answered half a dozen times if not more already.