Iver Hicarte

Distinguished
May 7, 2016
Hey guys,

All of a sudden I just got curious, so as the title says, can a monitor's refresh rate affect a GPU's temperature? Does it affect it by a large margin if it's maxed out at 144 Hz or 240 Hz, for example? And how much of a temperature difference is there at each refresh rate setting? I know it's a very minuscule detail that nobody really pays attention to, since it barely makes a difference in a system's performance and metrics. If the answer turns out to be yes, can you elaborate in detail? However minute the effect is, I don't mind. I just want my curiosity quenched.
 
Solution
Let's put some numbers to this. I have a 1440p 240Hz monitor on an RTX 4070 Ti. I'll leave the computer idling on the desktop and use HWiNFO to measure the power consumption (for the sake of argument, let's assume it's accurate):
  • 60Hz: ~4W
  • 120Hz: ~5.3W
  • 144Hz: ~5.4W
  • 240Hz: ~6.1W
So as a baseline figure, it takes only about 2W more on this particular video card to send four times as many frames to the monitor. Given that, I'd say yes, it will affect the GPU's temperature, but not by a whole lot: 2W is less than 1% of the card's total board power (TBP).

Now if you're looking at games, unless you're running VSync it's irrelevant: the card will render as many frames as the CPU will give it.
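
For anyone who wants to reproduce this kind of idle measurement, here's a minimal sketch of my own (not the HWiNFO setup used above), assuming an NVIDIA card with nvidia-smi on the PATH. Run it once per refresh-rate setting and compare the averages.

import statistics
import subprocess
import time

SAMPLES = 30        # number of readings to take
INTERVAL_S = 2      # seconds between readings

readings = []
for _ in range(SAMPLES):
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    # take the first line only, in case the system has more than one GPU
    readings.append(float(out.stdout.strip().splitlines()[0]))
    time.sleep(INTERVAL_S)

print(f"average idle board power: {statistics.mean(readings):.1f} W "
      f"(min {min(readings):.1f}, max {max(readings):.1f})")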

Deleted member 2947362

Guest
Yes.

For example, if you run a game and your GPU can max it out at, say, 100 FPS, it will run hotter because it's working harder.

With the same game and settings, if you enable a 60 FPS cap instead of the 100 FPS it could do, the GPU will run cooler, as it's only working at 60% of its full potential.

So I would imagine it's the same when the GPU only has to render the desktop at 60 Hz/FPS: it's not working as hard as when rendering the desktop at 120 Hz/FPS.
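
To make the frame-cap point concrete, here's an illustrative sketch (not taken from any real game engine) of what a 60 FPS limiter does: after each frame, the loop sleeps until the next 1/60 s slot, and that idle time is work and power the GPU doesn't spend.

import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS

def render_frame():
    # stand-in for real rendering work; pretend the uncapped GPU finishes a frame in ~4 ms
    time.sleep(0.004)

next_deadline = time.perf_counter()
for _ in range(180):                       # roughly three seconds' worth of capped frames
    render_frame()
    next_deadline += FRAME_TIME
    sleep_for = next_deadline - time.perf_counter()
    if sleep_for > 0:
        time.sleep(sleep_for)              # the GPU sits idle here instead of starting another frame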
 

Karadjgne

Titan
Ambassador
The GPU takes the data packet from the CPU and renders a wire-frame model. After that render is done, it adds the other instructions and pre-render effects like colors, background, shadows and shading, etc. When that's done, it renders the picture again in full, according to resolution. Then the GPU adds in the post-processing effects like RT, DLSS, etc., after which the complete frame is sent to the buffer. That's the entire process of the GPU creating a frame.

After that, it's out of receiving and into shipping. This is where your refresh rate comes into play: V-Sync, adaptive sync, G-Sync, etc. That's why there's such a low wattage difference, because the only thing moving is whatever frames are already completed.

Because games are not static, temps will not be static either, and you wouldn't see any variation unless you ran multiple passes of the exact same scenario at different refresh rates and compared them at the exact same time points.
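
A rough sketch of what such a comparison could look like in practice (my assumption: an NVIDIA card with nvidia-smi available, and the file name is just a placeholder): log temperature against elapsed time for one pass per refresh rate, then line the logs up at the same time points.

import csv
import subprocess
import time

DURATION_S = 300                 # length of one pass
INTERVAL_S = 5                   # sampling interval
LOGFILE = "gpu_temp_240hz.csv"   # placeholder name; change it for each refresh-rate run

start = time.perf_counter()
with open(LOGFILE, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "temperature_c"])
    while (elapsed := time.perf_counter() - start) < DURATION_S:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        )
        temp_c = int(out.stdout.strip().splitlines()[0])
        writer.writerow([round(elapsed, 1), temp_c])
        time.sleep(INTERVAL_S)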
 

Iver Hicarte

Distinguished
May 7, 2016
Many thanks for taking the time to run a small test. So indeed it does affect temperature, but not by a whole lot, as you've mentioned.