Sorry, but that's not quite right ... higher resistance instead
decreases current.
DC (Direct Current) Ohm's Law (E = Voltage, I = Current in Amps, R = Resistance in Ohms):
E = I x R
... OR ...
E ÷ I = R
... OR ...
E ÷ R = I
Example A: 1.35 Volts ÷ 0.009 Ohms = 150.0 Amps
Example B: 1.35 Volts ÷ 0.011 Ohms = 122.7 Amps
So, at a given Voltage, if Resistance is increased, then
Current is decreased.
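To make the arithmetic concrete, here's a minimal Python sketch using the voltage and resistance values from Examples A and B above:

```python
# Ohm's Law: I = E / R
E = 1.35                   # Volts

for R in (0.009, 0.011):   # Ohms (Examples A and B)
    I = E / R              # Amps
    print(f"{E} V / {R} Ohms = {I:.1f} A")
```

Same Voltage, higher Resistance, lower Current, exactly as in the examples.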
Watts Law (W = Watts or Power, E = Voltage, I = Current):
W = E x I
... OR ...
W ÷ I = E
... OR ...
W ÷ E = I
Example A: 1.35 Volts x 150.0 Amps = 202.5 Watts
Example B: 1.35 Volts x 122.7 Amps = 165.6 Watts
So, at a given Voltage, if
Current is decreased, then Watts (Power) is decreased.
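And the same in Python, feeding the two Currents from the Ohm's Law examples into Watt's Law:

```python
# Watt's Law: W = E x I
E = 1.35                    # Volts

for I in (150.0, 122.7):    # Amps (from the Ohm's Law examples)
    W = E * I               # Watts
    print(f"{E} V x {I} A = {W:.1f} W")
```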
Although ambient temperature affects resistance, which in turn affects Power consumption (Watts) and thus Core temperatures, the effects are actually much less than basic DC Ohm's or Watt's Laws suggest. A CPU is an extremely complex semiconductor which is NOT purely a DC device, but is instead affected by a multitude of other factors beyond the realm of simple DC resistance, such as frequency, capacitance, inductance, impedance and transconductance, all of which are present in analog and digital circuit designs.
Paulie walnuts1888,
The reason for the differences we see in Power consumption (Watts) relative to ambient temperature is primarily due to another factor known as
Leakage Current, which increases and decreases with ambient temperature and directly affects Power consumption (Watts), which in turn affects Core temperatures.
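As a loose illustration only: leakage current rises roughly exponentially with temperature, and a common rule of thumb is that it doubles for every ~10 °C rise. The sketch below uses that rule of thumb with made-up placeholder values, NOT measured CPU figures:

```python
# Illustrative sketch only: leakage current grows roughly
# exponentially with temperature; the doubling-per-10-degC factor
# is a common rule of thumb, and the base leakage value is a
# hypothetical placeholder, not a measured CPU figure.
E = 1.35               # Volts (core voltage)
I_leak_25 = 2.0        # Amps of leakage at 25 degC (hypothetical)

for T in (25, 35, 45):                       # temperature in degC
    I_leak = I_leak_25 * 2 ** ((T - 25) / 10)
    W_leak = E * I_leak                      # leakage share of Watts
    print(f"{T} degC: leakage ~ {I_leak:.1f} A, ~ {W_leak:.2f} W")
```

So even at a fixed Voltage, warmer ambient means more Leakage Current, more Watts, and higher Core temperatures.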
CT