
[SOLVED] FPS way lower than refresh rate - noob question.

thechickenone

Prominent
Mar 20, 2019
Hey all,

I recently bought a 240 Hz gaming monitor (an Asus VG279QM), but it's a bit overkill for my system.

I bought it as it was on sale at a local store, and was a 1-day return - so I got about $150 off the RRP.

I'm running a Ryzen 5 3600, an RTX 2060, 16 GB of DDR4 RAM, and so on.

My question: if I'm getting nowhere near 240 fps, does it make a difference what refresh rate I have my monitor set to?

I.e. if I'm playing a game and getting about 90 fps, should I have my monitor set to a lower refresh rate? Or does the refresh rate simply not matter, as long as my fps isn't above it? I'm deciding whether I should return it and get a 144 Hz monitor, or whether 144 vs 240 doesn't actually change anything for my system.

I'm still trying to wrap my head around refresh rates, so I'm a bit clueless.

Thanks everyone.
 
Solution
I'm not an expert here, and what I know comes from home theatre rather than gaming, but I'll try to explain what I can. If something is wrong and someone knows better, please correct me.

FPS is frames per second: the output you are feeding to the monitor, namely how many times per second the image changes (to create motion).

Refresh rate, measured in Hz, is the number of times per second the monitor redraws the image it shows.

Thus, if you output 24 fps to a monitor with a 30 Hz refresh rate, the monitor correctly shows everything you are feeding it.
If you output 60 fps to a 30 Hz monitor, you still get motion, but it could be twice as smooth: the monitor only refreshes once for every TWO frames you feed it, so half of your frames are never shown.
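If it helps to see the arithmetic, here's a minimal Python sketch of that idea. It's an idealized model (no sync technology, no tearing, a fixed-rate monitor showing at most one new frame per refresh), and the function name is just mine for illustration:

```python
def frames_displayed_per_second(render_fps: float, refresh_hz: float):
    """Idealized model: a fixed-rate monitor can show at most one new
    frame per refresh, so frames rendered beyond the refresh rate are
    never displayed."""
    shown = min(render_fps, refresh_hz)
    dropped = render_fps - shown
    return shown, dropped

# The two cases from above:
print(frames_displayed_per_second(24, 30))  # (24, 0)  -> every frame is shown
print(frames_displayed_per_second(60, 30))  # (30, 30) -> half the frames never appear
```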

So yes, it's kind of useless to get a 500 Hz monitor or something like that if you can't feed it that many frames. As long as your refresh rate is above your fps, you are fine. If the opposite happens, you might still be fine, although you're not exploiting the full potential of your graphics card.
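Applied to the numbers in the original post (roughly 90 fps on a 240 Hz panel), a quick back-of-the-envelope check, again ignoring sync technology:

```python
# Numbers from the original post: ~90 fps in-game on a 240 Hz monitor.
refresh_hz = 240
render_fps = 90

refresh_interval_ms = 1000 / refresh_hz  # ~4.17 ms between screen updates
frame_time_ms = 1000 / render_fps        # ~11.11 ms between new frames

# Each rendered frame stays on screen for roughly 2-3 refreshes;
# no frames are skipped, the monitor just redraws the same one.
print(f"refresh interval:    {refresh_interval_ms:.2f} ms")
print(f"frame time:          {frame_time_ms:.2f} ms")
print(f"refreshes per frame: {frame_time_ms / refresh_interval_ms:.1f}")
```

So at 90 fps the 240 Hz panel isn't hurting anything; it just refreshes more often than it strictly needs to.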

The human eye would only be able to tell a difference up to somewhere around 150 fps, so being able to push 200 fps wouldn't really matter... Of course, I am talking about an average here.
 