Question: effect of lower ambient temperature on power consumption/losses (very large difference?)

Paulie walnuts1888

Commendable
I am noticing a nearly 50% drop in power consumption while performing the exact same tasks when I have my window open and the room drops to 20°F. This is a huge difference and I am very surprised; the effect of losses at higher temperatures is very apparent. Is there a chart showing the power consumption that results at a given temperature with the same workload?
 

CompuTronix

Intel Master
Moderator
Higher temperature results in higher resistance, which increases current.
Ohm's Law is 7th-9th grade school material lol
https://www.bbc.co.uk/bitesize/guides/znh7382/revision/8
kerberos_20,

Sorry, but that's not quite right ... higher resistance instead decreases current.

DC (Direct Current) Ohm's Law: E = Voltage, I = Current (Amps), R = Resistance (Ohms)

E = I x R

... OR ...

R = E ÷ I

... OR ...

E ÷ R = I

Example A: 1.35 Volts ÷ 0.009 Ohms = 150.0 Amps
Example B: 1.35 Volts ÷ 0.011 Ohms = 122.7 Amps

So, at a given Voltage, if Resistance is increased, then Current is decreased.
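
If you want to sanity-check those numbers yourself, here's a minimal Python sketch using just the example values from this post (not measured CPU data):

```python
# Ohm's Law: I = E / R
def current(volts, ohms):
    return volts / ohms

print(current(1.35, 0.009))  # Example A: 150.0 Amps
print(current(1.35, 0.011))  # Example B: ~122.7 Amps
```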

Watt's Law: W = Watts (Power), E = Voltage, I = Current (Amps)

W = E x I

... OR ...

W ÷ E = I

Example A: 1.35 Volts x 150.0 Amps = 202.5 Watts
Example B: 1.35 Volts x 122.7 Amps = 165.6 Watts

So, at a given Voltage, if Current is decreased, then Watts (Power) is decreased.
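
Chaining Watt's Law onto the currents from the Ohm's Law sketch above reproduces both power figures (again, purely the illustrative numbers from this post):

```python
# Watt's Law: W = E x I
def watts(volts, amps):
    return volts * amps

print(watts(1.35, 150.0))  # Example A: 202.5 Watts
print(watts(1.35, 122.7))  # Example B: ~165.6 Watts
```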

Although ambient temperature affects resistance, which in turn affects Power consumption (Watts) and thus Core temperatures, the effects are actually much smaller than basic DC Ohm's Law or Watt's Law would suggest. A CPU is NOT a purely DC device, but is instead an extremely complex semiconductor that's more affected by a multitude of other factors in AC Ohm's Law beyond the realm of simple DC resistance, such as frequency, capacitance, inductance, reactance, impedance and transconductance, all of which are present in analog and digital circuit designs.
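
As one concrete illustration of why resistance alone doesn't tell the story: switching (dynamic) power in CMOS logic is commonly approximated to first order as P ≈ α × C × V² × f, where frequency and capacitance matter at least as much as resistance. A toy sketch with made-up illustrative values (none of these are Intel figures):

```python
# First-order CMOS dynamic power model: P = alpha * C * V^2 * f
# alpha = activity factor, C = switched capacitance (Farads),
# V = core voltage (Volts), f = clock frequency (Hz).
# All values below are illustrative assumptions, not measured data.
def dynamic_power(alpha, c, v, f):
    return alpha * c * v**2 * f

print(dynamic_power(0.2, 2e-8, 1.35, 5.0e9))  # ~36.5 Watts for this toy case
```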

Paulie walnuts1888,

The reason for the differences we see in Power consumption (Watts) relative to ambient temperature is primarily another factor known as Leakage Current, which rises and falls with ambient temperature and directly affects Power consumption (Watts), which in turn affects Core temperatures. Lower ambient temperature means lower Leakage Current, which decreases Power consumption and Core temperatures.
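
To make that concrete, here's a toy Python model. It assumes leakage power roughly doubles for every 10°C rise, which is a common rule of thumb for subthreshold leakage, NOT a published Intel figure, and the 20 Watt reference value is purely hypothetical:

```python
# Toy leakage model: leakage power assumed to roughly double per +10 C.
# P_LEAK_REF (20 W at 25 C) and the doubling interval are hypothetical
# illustration values, not datasheet numbers for any specific CPU.
P_LEAK_REF = 20.0  # assumed leakage Watts at the reference temperature
T_REF = 25.0       # reference temperature in C

def leakage_watts(temp_c, doubling_c=10.0):
    return P_LEAK_REF * 2 ** ((temp_c - T_REF) / doubling_c)

for t in (-7, 10, 25, 40):  # roughly 20 F up to 104 F
    print(f"{t:>4} C: {leakage_watts(t):5.1f} W modeled leakage")
```

Plotting your own measured package power for a fixed workload against ambient temperature would let you compare your real numbers to a curve like this.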

CT :sol:
 

Paulie walnuts1888

Commendable
Is there a chart illustrating Leakage Current values at a given temperature?
 

CompuTronix

Intel Master
Moderator
I am not aware of such charts in Intel's Datasheets specific to the 13900K, which instead most likely exist in proprietary Intel engineering documents. However, Leakage Current as well as voltage tolerance varies among different Generations of processors and their respective Microarchitectures, just as you've observed through the testing you've conducted on various processors from 75°F (23.9°C) to 20°F (-6.7°C), which is a HUGE change in ambient temperature. Suffice it to say that a certain amount of Leakage Current is normal and expected, which you should be able to plot for yourself during a controlled experiment, with respect to Power consumption (Watts) per degrees ambient temperature, while running a steady-state workload such as Prime95 Small FFTs.