FX-6300 using 45 W max

Don't confuse TDP with electricity consumption; it's the most misunderstood technical specification on CPUs. The 95 W TDP typical for this CPU is the heat that needs to be dissipated by the cooler for it to function without burning up; it does not represent the electrical wattage used by the CPU.
 


um, what?

No, that's not true at all.

TDP stands for Thermal Design Power. It's the rate of heat output (energy per unit time) that the CPU should not exceed under standard operating conditions. That energy has to come from somewhere, and that somewhere is electricity. It's possible to exceed the TDP by applying abnormal synthetic loads such as Intel Burn Test, but Prime95 will get it damn close. By its very definition, it closely reflects typical peak power consumption.
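
If it helps to see the arithmetic, here's a quick sketch of what "95 W" means as energy over time. Only the 95 W rating comes from the spec sheet; the one-hour window is arbitrary.

```python
# Quick arithmetic on "watts are joules per second".
tdp_watts = 95.0       # the rated TDP for this chip
seconds = 3600         # one hour of sustained full load (arbitrary window)
heat_joules = tdp_watts * seconds
print(f"{heat_joules / 1000:.0f} kJ of heat per hour")  # 342 kJ, all of it supplied as electricity
```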
 


So why is my CPU only peaking at less than 60 watts, then?
 


It's most likely a sensor problem. The mechanisms used to measure power delivery are often horribly inaccurate. The only reliable method is to attach a line conditioner and RMS current meter between the wall socket and the PC.
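
For what it's worth, here's a rough sketch of how you'd turn those meter readings into a real power figure. The function name and the example numbers are mine for illustration, not any particular meter's output.

```python
# Real power from RMS readings: P = V_rms * I_rms * power factor.
def wall_power_watts(v_rms: float, i_rms: float, power_factor: float = 1.0) -> float:
    """Apparent power (V_rms * I_rms) scaled by the power factor gives real power."""
    return v_rms * i_rms * power_factor

# Example: 120 V RMS, 1.1 A RMS, power factor around 0.95 for an active-PFC supply
print(f"{wall_power_watts(120.0, 1.1, 0.95):.0f} W drawn from the socket")  # ~125 W
```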
 


If it gave you 83.7 watts at all times, it was definitely giving you inaccurate readings. That's extremely common, though; I've only ever seen a reasonably accurate software power reading on one system, and that's the one I'm on right now.
 
HWiNFO64 gave the same wattage readings, but there were two different readings for CPU temperature. One of them maxed out at 67 °C (the "Fentik" reading) while running Prime95; the other reading never went above 55 degrees or so. I am using the stock cooler, by the way.
 


There are typically two different temperature sensors on each system.

First, there's a thermal diode on the CPU itself (often one per core). This is usually pretty accurate. Since it's closest to the heat-generating components, it will read the hottest.

Second, there's a thermocouple / thermistor on the motherboard near the socket. This is less accurate, but gives you a better indication of the environmental temperature. It will always be lower than the ones on the CPU itself.
 
Ok... THERMAL wattage is NOT the "same" as ELECTRICAL wattage. If it were, your home's heating bills would be far lower. Different forms of energy, different methods of transfer. Thermal wattage will always be lower than electrical wattage because of conversion losses.

TDP is a scalar, not a vector. It's the heat output that has to be overcome at any given point, not a time rate of change. But I guess that can be argued depending on how you're looking at it, I suppose.




Directly @ OP -> it's well known that HWMonitor is glitchy, and I'm pretty sure that's one of its glitches. The only "true" way to tell what you're using is to get a wall power meter and subtract knowns (fans, GPUs, etc.) to get your actual usage.

As long as everything is running properly at proper temperatures, 60 W seems normal, as I've seen people running "30 W" with Intels.
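
For the OP, "subtract knowns" amounts to something like the sketch below. Every component estimate and the PSU efficiency are assumptions I've plugged in for illustration, not measured values.

```python
# Rough "subtract knowns" estimate of CPU power from a wall reading.
wall_watts = 180.0           # read from a wall power meter under CPU load (example)
psu_efficiency = 0.87        # assumed for an 80 Plus-class PSU at this load

# Power delivered inside the case; the rest is lost as heat in the PSU.
dc_watts = wall_watts * psu_efficiency

known_loads = {              # rough per-component estimates, all assumed
    "motherboard/chipset": 25.0,
    "RAM": 6.0,
    "idle GPU": 15.0,
    "fans and drives": 12.0,
}

cpu_estimate = dc_watts - sum(known_loads.values())
print(f"Estimated CPU draw: {cpu_estimate:.0f} W")  # ~99 W with these made-up numbers
```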
 


Energy is energy no matter where it comes from. It is physically impossible to pull energy out of nowhere; the laws of thermodynamics don't allow it. If a microprocessor is dissipating 90 watts of heat (pretty typical for a heavy load), it must be drawing at least 90 watts from the socket. There is no way around this. The TDP is the amount of power that a CPU is designed not to exceed under non-synthetic operating conditions, and power is defined as the time rate of change of energy. Energy is conserved; you should know this. There will be some variation due to chip quality, and it will be lower under lighter use.

The reason heating bills differ is that equivalent amounts of energy are cheaper when purchased as natural gas than as electricity, but few homes can turn natural gas into anything other than heat.
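
To put rough numbers on that (the prices here are purely illustrative assumptions; they vary a lot by region):

```python
# Back-of-the-envelope cost comparison for the same amount of heat energy.
electricity_per_kwh = 0.15      # assumed $/kWh for residential electricity
gas_per_therm = 1.20            # assumed $/therm for natural gas
kwh_per_therm = 29.3            # 1 therm is about 29.3 kWh of energy

gas_per_kwh = gas_per_therm / kwh_per_therm
print(f"Electric heat: ${electricity_per_kwh:.2f}/kWh, gas heat: ${gas_per_kwh:.2f}/kWh")
# With these numbers, the same kWh of heat costs roughly 3-4x more as electricity.
```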