[SOLVED] FPS for 144Hz monitor?

Status
Not open for further replies.

Gintama69

Reputable
Aug 23, 2019
149
5
4,595
Hello, what would be the optimal fps for a 144Hz monitor?
An average of 144 fps, or a minimum of 144 fps?

I have a 9900KS, an RTX 2080 Ti and a 27" 1440p 144Hz monitor, and I get an average of 146 fps in CoD MP at 1440p max details (without ray tracing); the average drops to ~130 fps if I enable ray tracing.
I checked my average fps using MSI Afterburner & RivaTuner Statistics Server, but was unable to get an accurate minimum fps.
Should I reduce the quality so I get an average of ~160-170 fps? (And then hopefully my minimum fps won't drop below ~130-135.)
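For reference, a 1%-low figure computed from logged frame times is a common stand-in for "minimum fps". A minimal sketch, assuming a plain text log with one frame time in milliseconds per line (the file name and format here are assumptions, not the actual RTSS/Afterburner log layout):

```python
# Minimal sketch: average fps and a "1% low" figure from a list of frame
# times in milliseconds. The file name and one-value-per-line format are
# assumptions; adapt the parsing to whatever your logging tool writes.

def fps_stats(frame_times_ms):
    """Return (average fps, 1%-low fps) from frame times in milliseconds."""
    times = sorted(frame_times_ms)
    avg_fps = 1000 * len(times) / sum(times)
    worst = times[int(len(times) * 0.99):]      # slowest 1% of frames
    low_1pct_fps = 1000 * len(worst) / sum(worst)
    return avg_fps, low_1pct_fps

# Hypothetical log file, one frame time (ms) per line:
with open("frametimes.txt") as f:
    samples = [float(line) for line in f if line.strip()]

avg, low = fps_stats(samples)
print(f"average: {avg:.1f} fps, 1% low: {low:.1f} fps")
```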
 
Last edited:

Mrgr74

Reputable
BANNED
Hello, what would be the optimal fps for a 144Hz monitor?
An average of 144 fps, or a minimum of 144 fps?

I have a 9900KS, an RTX 2080 Ti and a 27" 1440p 144Hz monitor, and I get an average of 146 fps in CoD MP at 1440p max details (without ray tracing); the average drops to ~130 fps if I enable ray tracing.
I checked my average fps using MSI Afterburner & RivaTuner Statistics Server, but was unable to get an accurate minimum fps.
Should I reduce the quality so I get an average of ~160-170 fps? (And then hopefully my minimum fps won't drop below ~130-135.)

That's entirely up to you. What's more important to you: having high FPS, or visuals? Drop down to 1080p and test then. Your FPS will get a nice boost if you care more about having high FPS vs visuals...

Forgetting FPS for a moment, how's the game look/run? That's more important than anything, really. Maybe it's because I don't have hardware that can crank out the raw frames, but I'm failing to see what the issue is here. You obviously have a buff arse PC pushing out well above 100 FPS at 1440p with maxed-out settings. Many here, myself included, would be willing to resort to being a shady kinda fella on the corner late at night to have that kind of HP under our tower's cover! :)
 

Karadjgne

Titan
Ambassador
Neither. The optimum fps is either lower or higher than 144 fps; the bigger the margin, the better. Most stutter issues happen right at the refresh rate: that's where the GPU has a momentary brain-fart and can't get a frame fully ready in time for the refresh, so it shoves the last frame back in, and you see the pause as a stutter. With a lower-than-refresh fps the timing is different, nowhere near 1:1, so the frame lands on a different refresh. Higher fps (especially minimums) means the frames are always ready beforehand, buffered, so they should apply seamlessly.

As to which is better, minimums or average, that's a personal choice. Ray tracing can 'fix' many incorrect lighting effects, shadows, etc., but honestly that's usually the last thing I've ever bothered noticing; I'm more worried about the sniper trying to pot-shot me, not whether the shadow on the lamp post is coming from the right angle for the light source.
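To put rough numbers on the timing argument above: a 144Hz panel refreshes every ~6.94 ms, so any frame that takes longer than that misses a refresh and the previous frame is shown again. A small illustrative sketch (the example frame rates are assumed, not from the thread):

```python
# Illustrative sketch: frame time vs the 144 Hz refresh interval. A frame
# that takes longer than one refresh interval means the previous frame is
# shown again (seen as a stutter without variable refresh).

REFRESH_HZ = 144
refresh_interval_ms = 1000 / REFRESH_HZ          # ~6.94 ms per refresh

for fps in (100, 130, 150, 200):                 # assumed example frame rates
    frame_time_ms = 1000 / fps
    in_time = frame_time_ms <= refresh_interval_ms
    verdict = "ready before each refresh" if in_time else "misses refreshes, frames repeat"
    print(f"{fps:>3} fps -> {frame_time_ms:5.2f} ms/frame: {verdict}")
```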
 

Gintama69

Reputable
Aug 23, 2019
149
5
4,595
That's entirely up to you. What's more important to you: having high FPS, or visuals? Drop down to 1080p and test then. Your FPS will get a nice boost if you care more about having high FPS vs visuals...

Forgetting FPS for a moment, how's the game look/run? That's more important than anything, really. Maybe it's because I don't have hardware that can crank out the raw frames, but I'm failing to see what the issue is here. You obviously have a buff arse PC pushing out well above 100 FPS at 1440p with maxed-out settings. Many here, myself included, would be willing to resort to being a shady kinda fella on the corner late at night to have that kind of HP under our tower's cover! :)
Smoother FPS/better gaming performance is more important to me. I'm willing to play at the lowest quality/fastest fps if that's going to help my performance.
I've already tested 1080p max details (without ray tracing) and I got 176 fps, but that's not going to be good enough if I end up getting a 1080p 240Hz monitor, which I'm considering atm; that's the main reason I made this post. https://forums.tomshardware.com/threads/240hz-monitor-worth-it.3554855/


Neither. The optimum fps is either lower or higher than 144 fps; the bigger the margin, the better. Most stutter issues happen right at the refresh rate: that's where the GPU has a momentary brain-fart and can't get a frame fully ready in time for the refresh, so it shoves the last frame back in, and you see the pause as a stutter. With a lower-than-refresh fps the timing is different, nowhere near 1:1, so the frame lands on a different refresh. Higher fps (especially minimums) means the frames are always ready beforehand, buffered, so they should apply seamlessly.

As to which is better, minimums or average, that's a personal choice. Ray tracing can 'fix' many incorrect lighting effects, shadows, etc., but honestly that's usually the last thing I've ever bothered noticing; I'm more worried about the sniper trying to pot-shot me, not whether the shadow on the lamp post is coming from the right angle for the light source.
Yes, I guess higher fps is always better, but I wanted to know the minimum/optimal fps for a monitor's refresh rate, since it's not possible to get a 240+ minimum fps in all games (e.g. Control) even with the fastest CPU/GPU.
I was able to get ~216 average fps at 1440p lowest details in CoD MP, so I can probably get ~240 average fps at 1080p IF I run at lowest details.
 

Karadjgne

Titan
Ambassador
And yet you miss the point entirely. You can't see the difference. Nobody can. Above roughly 80 fps most people can't distinguish changes. It's not hard to look at one monitor showing 30 fps and see the difference from another monitor showing the same thing at 60 fps, or to notice when 60 fps drops to 30 fps, etc. But above roughly 80 fps that changes, and nobody can distinguish a drop from 150 fps to 100 fps.

Your brain is too slow. By the time the picture is registered by your eyes, converted to a signal, sent to your brain and registered as a picture, the frame has already refreshed. You see nothing but a constant signal. A 240Hz monitor might make details a little sharper compared to a 144Hz monitor, but only at 1080p, and at those levels 1440p is already plenty sharper than 1080p can ever be, simply due to pixel sizing and saturation.

Moving up to a faster refresh would be taking a step back in picture quality. The better move would be 4K at 120Hz for scenery and visuals, but 1440p at 144Hz is about as good as it gets for fast action.

You say you want the best picture, yet you're dumping graphics settings to the lowest levels; that makes no sense. Fps isn't everything. There does come a point of diminishing returns where fps becomes moot, and that's somewhere over 100.
 
Solution

Gintama69

Reputable
Aug 23, 2019
149
5
4,595
And yet you miss the point entirely. You can't see the difference. Nobody can. Above roughly 80 fps most people can't distinguish changes. It's not hard to look at one monitor showing 30 fps and see the difference from another monitor showing the same thing at 60 fps, or to notice when 60 fps drops to 30 fps, etc. But above roughly 80 fps that changes, and nobody can distinguish a drop from 150 fps to 100 fps.

Your brain is too slow. By the time the picture is registered by your eyes, converted to a signal, sent to your brain and registered as a picture, the frame has already refreshed. You see nothing but a constant signal. A 240Hz monitor might make details a little sharper compared to a 144Hz monitor, but only at 1080p, and at those levels 1440p is already plenty sharper than 1080p can ever be, simply due to pixel sizing and saturation.

Moving up to a faster refresh would be taking a step back in picture quality. The better move would be 4K at 120Hz for scenery and visuals, but 1440p at 144Hz is about as good as it gets for fast action.

You say you want the best picture, yet you're dumping graphics settings to the lowest levels; that makes no sense. Fps isn't everything. There does come a point of diminishing returns where fps becomes moot, and that's somewhere over 100.
I understand what you are saying, but I'm not sure your info is 100% correct.
If nobody can distinguish between 150 fps and 100 fps like you claim, then there's really no reason for monitors with refresh rates higher than 100Hz to even exist in the first place.
I think most people would agree that the difference between a 60Hz and a 120Hz monitor is very noticeable (myself included, from my own experience), especially in fast-paced games (shooters, racing games, etc.).

But the difference between, say, 120Hz and 240Hz isn't as noticeable as 60Hz to 120Hz (probably due to human brain/eye limitations, like you said).
However, the difference between 120Hz and 240Hz still exists (it's probably more noticeable to those with sharper/more active(?) brains/eyes, e.g. pro gamers, and might not be noticeable to slower, older people, etc.).
I think Linus Tech Tips has done two separate tests on 120Hz vs 240Hz monitors (one about a year ago, iirc, and one recently with Nvidia), and in both tests average people performed better in games at 240Hz.

_

If I can only use one monitor then I'd be happy with the one I have atm (27" Samsung 1440p 144Hz 1ms QLED). I wouldn't want 4K because most new games can't hit a fast enough fps even with the fastest CPU/GPU, and I'd prefer 1440p over 1080p for non-gaming.

But I'm thinking of getting a 24" (24.5" or 23.8") 1080p 240Hz as my 2nd monitor for CoD MP, because I might be able to play much better on a 24" 240Hz.
I'm 100% sure it's going to be better than the 27" 144Hz even for me (obviously I'm not a pro gamer), but the question is: would the difference between a 24" 1080p 240Hz and a 27" 1440p 144Hz be noticeable for me?
Guess I won't know for sure until I actually try it.

Lastly, most people know CoD MW 2019 is not a very demanding game, but I'll most likely have to run 1080p lowest settings to get an average of ~240 fps, since I only get an average of ~176 fps at 1080p max (without ray tracing) with a 9900KS + 2080 Ti.
(I'm not too worried about low-quality graphics settings; in the past I ran Crysis 2 MP at 1080p lowest settings with a GTX 580 as well, so I could get 100+ fps all the time on a 120Hz monitor.)
 
Last edited:

Gintama69

Reputable
Aug 23, 2019
149
5
4,595
Size has nothing to do with the screen resolution. A 24" 1080p is the same as a 60" 1080p. With that said, a larger screen looks sharper with a higher resolution.
Yes, I think everyone knows that.

But since you mentioned size, I think size can affect gameplay.
A 24" 1080p is probably the optimal size for competitive fast-paced games, because 24" is small enough that your eyes can see the whole screen at all times, giving you the fastest possible reaction time.

But if you are on a 27"-32" 1080p screen, your eyes will need to move left and right to see the whole screen (unless you are sitting pretty far from the screen), so your reaction time will obviously be slower than on a 24". The larger the screen (60"-80"), the slower the reaction time, unless you are sitting far enough back to see the whole screen.
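To put numbers on the eye-movement point, here is a small geometry sketch (the ~40cm viewing distance comes up later in the thread; the 16:9 aspect ratio and the example sizes are assumptions) of how much horizontal angle a screen covers:

```python
# Illustrative sketch: horizontal angle covered by a 16:9 screen at a fixed
# viewing distance (~40 cm, the distance mentioned later in the thread).
# A wider angle means more eye movement to take in the whole screen.
import math

def horizontal_angle_deg(diagonal_in: float, distance_cm: float) -> float:
    """Horizontal angle (degrees) subtended by a 16:9 screen of the given
    diagonal (inches) viewed from the given distance (cm)."""
    width_cm = diagonal_in * 16 / math.hypot(16, 9) * 2.54
    return math.degrees(2 * math.atan(width_cm / 2 / distance_cm))

for size in (23.8, 24.5, 27.0, 32.0):            # assumed example sizes
    print(f'{size}" 16:9 at 40 cm: ~{horizontal_angle_deg(size, 40):.0f} degrees wide')
```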
 

Karadjgne

Titan
Ambassador
I have 2 monitors. I've not yet been able to switch them; the game always plays on the primary. So, afaik, even with a 1080p/240Hz sitting next to the primary 1440p, CoD is going to play on the 1440p unless you switch it to windowed mode instead of full screen and move the window to the secondary screen. And I don't know how permanent that'd be, or whether, on restarting the game, you'd be back on the primary screen again.

For me, the secondary screen is for other stuff, be it maps, Google directions, even real-time temp readings or Task Manager, etc. Many times it's nothing more than Discord open so I don't need to pop the overlay all the time.

I don't see people performing better on a faster-refresh screen, but I can see people preferring the performance, it being a visual thing; still, 1440p is going to look better than 1080p, regardless of refresh.

As far as size goes, for 1080p it's a distance vs pixel size thing. At the distance most people sit from a monitor, at 23.6/24" you can't distinguish the pixels, yet the screen is still large overall. Move up to 27"+ and, while the resolution doesn't change (it's still 1080p), the physical size of the pixels does, considerably, and the picture looks terrible until you back away a little. Sit at arm's length from a 24" 1080p monitor, then sit at arm's length from a 1080p 50"+ TV; you'll understand immediately. This is why there's a move to 1440p at 27"+ sizes: you can remain at a regular distance, but the pixel size is small enough that, even with a larger screen, there's no pixelation.

Some people do enjoy a 27/28" 1080p monitor, but if you look, their screen sits a little farther back on the desk; as little as 12" makes a huge visible difference.
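A quick pixel-density sketch (the size/resolution combinations are common examples, not figures from the post) that quantifies the pixel-size argument above:

```python
# Illustrative sketch: pixels per inch (PPI) for a few common size/resolution
# combinations, to quantify the pixel-size argument above.
import math

def ppi(diagonal_in: float, width_px: int, height_px: int) -> float:
    """Pixels per inch for a screen of the given diagonal and resolution."""
    return math.hypot(width_px, height_px) / diagonal_in

for size, w, h in [(24, 1920, 1080), (27, 1920, 1080),
                   (27, 2560, 1440), (50, 1920, 1080)]:
    print(f'{size}" {w}x{h}: ~{ppi(size, w, h):.0f} PPI')
```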
 
Dec 12, 2019
8
4
15
What you want is a stable (preferably in-game locked) framerate at least a few frames below your monitor's refresh rate, or below your average framerate if it doesn't reach the maximum refresh rate.
First, to eliminate the tearing that happens above your monitor's refresh rate (if your monitor is FreeSync/G-Sync compatible and you take advantage of that).
Second, to get gameplay that's as consistent as possible.

Battlenonsense has great videos on the topic of input lag and other useful stuff; check out his YT channel.
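As a rough illustration of that advice, a tiny sketch that picks a cap a few frames below the refresh rate, or below the average fps when the game can't reach the refresh rate (the 3-frame margin is an assumed example, not a value from the post):

```python
# Illustrative sketch: pick an fps cap a few frames below the refresh rate
# (or below the average fps if the game can't reach the refresh rate) so a
# FreeSync/G-Sync monitor stays inside its variable-refresh range.
# The 3-frame margin is an assumed example, not a value from the post.

def suggested_fps_cap(refresh_hz: int, avg_fps: float, margin: int = 3) -> int:
    return min(refresh_hz, int(avg_fps)) - margin

print(suggested_fps_cap(144, avg_fps=216))   # fast-running game: cap at 141
print(suggested_fps_cap(144, avg_fps=130))   # heavier game: cap at 127
```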
 

Gintama69

Reputable
Aug 23, 2019
149
5
4,595
What you want is a stable (preferably in-game locked) framerate at least a few frames below your monitor's refresh rate, or below your average framerate if it doesn't reach the maximum refresh rate.
First, to eliminate the tearing that happens above your monitor's refresh rate (if your monitor is FreeSync/G-Sync compatible and you take advantage of that).
Second, to get gameplay that's as consistent as possible.

Battlenonsense has great videos on the topic of input lag and other useful stuff; check out his YT channel.
Thanks, I'll check those videos out.
Your reply is pretty straightforward, but it's still a bit confusing since you said "if it doesn't reach the maximum refresh rate". So does that (fps at least a few frames below the refresh rate) still apply if the fps exceeds the maximum refresh rate?
 

Gintama69

Reputable
Aug 23, 2019
149
5
4,595
I have 2 monitors. I've not yet been able to switch them; the game always plays on the primary. So, afaik, even with a 1080p/240Hz sitting next to the primary 1440p, CoD is going to play on the 1440p unless you switch it to windowed mode instead of full screen and move the window to the secondary screen. And I don't know how permanent that'd be, or whether, on restarting the game, you'd be back on the primary screen again.

For me, the secondary screen is for other stuff, be it maps, Google directions, even real-time temp readings or Task Manager, etc. Many times it's nothing more than Discord open so I don't need to pop the overlay all the time.

I don't see people performing better on a faster-refresh screen, but I can see people preferring the performance, it being a visual thing; still, 1440p is going to look better than 1080p, regardless of refresh.

As far as size goes, for 1080p it's a distance vs pixel size thing. At the distance most people sit from a monitor, at 23.6/24" you can't distinguish the pixels, yet the screen is still large overall. Move up to 27"+ and, while the resolution doesn't change (it's still 1080p), the physical size of the pixels does, considerably, and the picture looks terrible until you back away a little. Sit at arm's length from a 24" 1080p monitor, then sit at arm's length from a 1080p 50"+ TV; you'll understand immediately. This is why there's a move to 1440p at 27"+ sizes: you can remain at a regular distance, but the pixel size is small enough that, even with a larger screen, there's no pixelation.

Some people do enjoy a 27/28" 1080p monitor, but if you look, their screen sits a little farther back on the desk; as little as 12" makes a huge visible difference.
Bit confused... firstly, why would you buy an expensive 24" 240Hz monitor just as a secondary screen, when you can get a 34" UW 3440x1440 for the same price?

Also, in my case, IF I end up buying a 24" 240Hz, it's only going to be for CoD MP (or other fast-paced shooters), so why wouldn't you use the 24" 240Hz for CoD?
CoD has an option to select the monitor in the game menu, and even if that didn't work properly, you could just turn off the 27" 1440p monitor and only use the 24" 240Hz for CoD, etc.
 

Karadjgne

Titan
Ambassador
I use 2 monitors. My desktop is stretched over both and I do things other than just play a game, so both are in constant use. I'd not buy a 240Hz monitor at all just to play a game, not when I've got a perfectly good 1440p@144Hz monitor as primary.

Your gameplay isn't going to improve; you won't perform better having open-ended fps instead of minimums above 144. You'll not see the difference in fps, and lowering details to try and gain higher fps just makes the picture worse, for no gain.

You've got the money, buy whatever; the only real bonus of a 240Hz monitor over a standard 60Hz/120Hz/144Hz one is that you're unlikely to suffer screen tearing from the GPU slapping out frames faster than the monitor can refresh. At 1080p it's going to be a worse picture than 1440p, no matter the refresh.

Just be careful to get a true 240Hz monitor, like the new Asus, because 99% of the rest are nothing more than 120Hz monitors that use doublers and refresh the same screen twice, regardless of input.
 

Gintama69

Reputable
Aug 23, 2019
149
5
4,595
I use 2 monitors. My desktop is stretched over both and I do things other than just play a game, so both are in constant use. I'd not buy a 240Hz monitor at all just to play a game, not when I've got a perfectly good 1440p@144Hz monitor as primary.

Your gameplay isn't going to improve; you won't perform better having open-ended fps instead of minimums above 144. You'll not see the difference in fps, and lowering details to try and gain higher fps just makes the picture worse, for no gain.

You've got the money, buy whatever; the only real bonus of a 240Hz monitor over a standard 60Hz/120Hz/144Hz one is that you're unlikely to suffer screen tearing from the GPU slapping out frames faster than the monitor can refresh. At 1080p it's going to be a worse picture than 1440p, no matter the refresh.

Just be careful to get a true 240Hz monitor, like the new Asus, because 99% of the rest are nothing more than 120Hz monitors that use doublers and refresh the same screen twice, regardless of input.
Hey Karadjgne, thanks for the reply.

I was thinking of getting a 24" 1080p 240Hz as a second monitor (for CoD MP), but the more I think about it, the less it seems worth it:
  • I already have a great gaming monitor (27" 1440p 144Hz) and the difference might not be that noticeable.
  • I'd only use the 24" 240Hz for CoD MP, and it'll cost $570 for the cheapest new one.
  • They say CoD MW isn't that demanding, but it looks like I'll still have to run 1080p lowest settings to get the most out of 240Hz, according to my average fps tests.
I spent almost 6 hours yesterday/today testing two different resolutions (1440p/1080p) and three different quality settings, because I was able to get a more accurate average fps by setting the menu/out-of-focus frame rate to 250 fps.
Test spec: 9900KS, RTX 2080 Ti, 32GB 4000MHz RAM, Nvidia 441.66 driver, everything at stock speeds.
Results (average FPS using MSI Afterburner & RivaTuner Statistics Server after 5 consecutive MP matches at the selected resolution/settings):
133 fps - 1440p highest settings (RTX on) - https://imgur.com/a/FHwndHY
155 fps - 1440p highest settings (RTX off) - https://imgur.com/a/ZSRQaO7
218 fps - 1440p lowest settings - https://imgur.com/a/go7854B
163 fps - 1080p highest settings (RTX on) - https://imgur.com/a/ixGtr0C
185 fps - 1080p highest settings (RTX off) - https://imgur.com/a/IPDy3JF
238 fps - 1080p lowest settings - https://imgur.com/a/tmdJxES


However, I'd still love to try out the 24" 1080p 240Hz if possible, because I think it'd definitely offer two advantages over the 27" 1440p 144Hz:
  1. Your eyes should be able to see everything on the 24" (24.5" or 23.8") screen, giving you the fastest possible reaction time. I sit ~40cm away from my 27", but my eyes need to move left and right a bit to see the whole screen (unless I sit further back).
  2. The difference between 144Hz and 240Hz probably isn't very noticeable to most people (unlike 60Hz vs 120Hz/144Hz), but it still exists. I'm guessing maybe 5-25%, depending on the person's eyes/brain(?).
So the combination of #1 & #2 should provide at least some advantage (maybe 10-15%? on average) to most people, imo.
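For context, a short sketch comparing the averages measured above against the per-frame budget of a 144Hz and a 240Hz monitor (this only looks at averages, so it says nothing about minimums):

```python
# Illustrative sketch: compare the measured averages above with the per-frame
# budget of a 144 Hz and a 240 Hz monitor. Averages only; minimums will dip
# below these figures.
results = {                                   # average fps from the tests above
    "1440p highest, RTX on": 133, "1440p highest, RTX off": 155,
    "1440p lowest": 218, "1080p highest, RTX on": 163,
    "1080p highest, RTX off": 185, "1080p lowest": 238,
}

for refresh_hz in (144, 240):
    budget_ms = 1000 / refresh_hz
    print(f"--- {refresh_hz} Hz (budget {budget_ms:.2f} ms/frame) ---")
    for name, fps in results.items():
        avg_ms = 1000 / fps
        verdict = "within budget" if avg_ms <= budget_ms else "over budget"
        print(f"{name}: {avg_ms:.2f} ms average -> {verdict}")
```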
 
Last edited:
Dec 12, 2019
8
4
15
Thanks, I'll check those videos out.
Your reply is pretty straightforward, but it's still a bit confusing since you said "if it doesn't reach the maximum refresh rate". So does that (fps at least a few frames below the refresh rate) still apply if the fps exceeds the maximum refresh rate?
If you have a G-Sync compatible monitor and you have G-Sync enabled, it eliminates tearing below the maximum refresh rate, but not above it.

So if you have a 144Hz monitor with G-Sync enabled and your fps is 100-140, you won't see any tearing.
However, if you have, for example, 144-160 fps, G-Sync can't adjust the monitor's refresh rate to match your fps, and you get tearing.
You can eliminate that tearing by enabling V-Sync, but that adds noticeable input lag, while G-Sync eliminates tearing without any additional input lag.

If you don't care about tearing, you can limit your fps wherever you want, just to get consistent frame times.

I hope I made it less confusing, but either way you should check out Battlenonsense's videos.
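A small decision sketch of the behaviour described above (a simplification, not a model of any particular driver): whether tearing is expected for a given fps, refresh rate, and sync setup.

```python
# Illustrative sketch of the behaviour described above (a simplification, not
# a model of any particular driver): with VRR (G-Sync/FreeSync) active,
# tearing is avoided while fps stays at or below the refresh rate; above it
# you need V-Sync (which adds input lag) or an fps cap.

def tearing_expected(fps: float, refresh_hz: int, vrr: bool, vsync: bool) -> bool:
    if vsync:
        return False                 # no tearing, but extra input lag
    if vrr and fps <= refresh_hz:
        return False                 # the panel follows the frame rate
    return True                      # fps above refresh (or no sync at all)

print(tearing_expected(130, 144, vrr=True, vsync=False))   # False
print(tearing_expected(160, 144, vrr=True, vsync=False))   # True
print(tearing_expected(160, 144, vrr=True, vsync=True))    # False (more lag)
```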
 