I'm using an EVGA 1080 Ti SC Hybrid with EVGA Precision, but the general question should apply to most cards.
Is the heat output of the GPU solely, or at least primarily, determined by voltage rather than by GPU usage or absolute clock speed? I realize usage can bump up clock speed, and that in turn can bump up voltage, but is usage's effect on heat output solely a function of its effect on voltage?
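(For framing: the first-order model I've seen quoted for CMOS power is the one below, where α is the activity factor, C the switched capacitance, V the core voltage, and f the clock. If it applies here, voltage enters squared while clock and activity enter only linearly, plus a leakage term that also grows with voltage. I'm not certain how well this maps to a 1080 Ti in practice.)

```latex
P_{\text{total}} \approx \alpha\, C\, V^{2} f + P_{\text{leak}}(V)
```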
Note I'm talking about heat output, not GPU temperature itself. The hybrid cooling system and custom fan curves keep the GPU between 45 C and 50 C under load regardless of voltage, clock speed, etc., but of course that heat has to go somewhere, namely into my room, which becomes uncomfortably hot.
I'm just playing a 4X game (Endless Legend), so I don't care much about frame rate. I haven't messed with undervolting, just set the Power Target down to 50%, which has no significant effect on frame rate but keeps the frequency generally in the mid-1500s (MHz) instead of the mid-1900s it wants to use otherwise (even though power management is set to Adaptive and GPU usage is around 50%). The average voltage drops from 1.0 V to 0.8 V. The side effect is that GPU usage goes up significantly, from around 50% to 70%.
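As a rough sanity check, here is what those numbers predict under the switching-power model above, with the big caveat that treating reported GPU usage as the activity factor α is a crude proxy at best:

```python
# Back-of-the-envelope estimate under P ~ alpha * C * V^2 * f,
# using the observed numbers; C is assumed constant and cancels out.
v_ratio = (0.8 / 1.0) ** 2   # voltage enters squared
f_ratio = 1550 / 1950        # clock enters linearly (mid-1500s vs mid-1900s)
a_ratio = 0.70 / 0.50        # usage as a stand-in for activity factor
print(f"estimated dynamic-power ratio: {v_ratio * f_ratio * a_ratio:.2f}")
# -> ~0.71, i.e. roughly 30% less heat despite the higher usage,
#    before counting leakage, which also drops at lower voltage.
```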
So my basic question is: does that GPU usage itself increase heat output if frequency and voltage remain constant? I suppose I could test it by setting the fan speed to a constant and seeing whether the temperature rises significantly more in one case than the other, but that would be a pain; I'd have to wait for the coolant to stabilize temperatures between and during tests, etc., so I'm hoping someone just knows the answer.
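A less painful test occurred to me, if anyone can confirm the approach is sound: at steady state, the board's power draw is its heat output, and the driver reports power draw directly, so logging it in each configuration should answer the question without touching the fan curve. A minimal sketch, assuming nvidia-smi is on the PATH and exposes the power.draw field for this card:

```python
# Sample board power (watts) once per second via nvidia-smi and
# report the mean; run once per configuration and compare.
import statistics
import subprocess
import time

def read_power_watts():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        text=True)
    return float(out.strip())

samples = []
for _ in range(60):            # one minute of 1 Hz samples
    samples.append(read_power_watts())
    time.sleep(1)
print(f"mean board power: {statistics.mean(samples):.1f} W")
```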
If voltage is the primary driver of heat output, does anyone know of a way to automatically limit the voltage or power target based on the running application, rather than manually switching Precision profiles depending on the game I'm playing? Or to set up a custom clock-frequency-versus-usage curve so it doesn't boost to the 1900s at only 50% usage?
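Failing a built-in option, I could imagine scripting it. The sketch below is purely illustrative: it assumes NVIDIA Inspector's command-line switch -setPowerTarget:&lt;gpu&gt;,&lt;percent&gt; (verify against your Inspector version; Precision itself has no CLI that I know of), a hypothetical install path, and the game's process name.

```python
# Hypothetical per-game power-target watcher (illustrative sketch only).
import subprocess
import time

import psutil  # pip install psutil

INSPECTOR = r"C:\Tools\nvidiaInspector.exe"  # hypothetical install path
GAME_EXE = "EndlessLegend.exe"               # process name to watch for
LOW, STOCK = 50, 100                         # power targets in percent

def set_power_target(percent):
    # Assumed Inspector CLI syntax; confirm before relying on it.
    subprocess.run([INSPECTOR, f"-setPowerTarget:0,{percent}"], check=False)

game_running = False
while True:
    running = any((p.info["name"] or "") == GAME_EXE
                  for p in psutil.process_iter(["name"]))
    if running != game_running:
        set_power_target(LOW if running else STOCK)
        game_running = running
    time.sleep(5)
```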
As a side note unrelated to my question, just something strange I've noticed: if I set both the power target to 50% and the clock offset to -200 MHz, the system becomes unstable, with the graphics driver crashing and restarting.
Thanks.