Question: 60 Hz display = limited to 60 fps, right or wrong?

Jun 30, 2018
I remember playing Doom last year very smoothly and very fast at around 90 fps, same as Shadow of Mordor at around 80 fps, which I measured with Fraps on my R9 390. Nowadays I'm stuck at around 45-50 fps in most games (on high settings), and that is the max.
So I thought about buying either the RTX 2060 (the cheaper one, because I'm low on budget) or the GTX 1660 Ti.
I play at 2560x1080.
So my questions are:
1. How did I manage to play Doom so smoothly if I'm limited to a 60 Hz monitor? I'm sure that was a hell of a lot more than 60 fps.
2. If I buy either of those newer cards, will my games be "locked" to only 60 fps?
I know I can disable V-sync in games, but what will happen then?
If that is the case, then it's not worth spending money on a new card without also buying a 144 Hz monitor.
 

Math Geek

Glorious
Herald
you're mixing up two different things. your gpu is capable of producing any number of frames per second based on its strength, settings and so on.

your screen however, is only capable of SHOWING you 60 of those frames every second. so anything your gpu produces above the monitor's refresh rate is basically wasted, since the system simply throws those frames out; it can't actually show them to you. if you want to SEE more of the frames, then you need a monitor that is capable of showing them to you.

v-sync and g-sync are ways to tell the gpu to only produce the 60 fps that the monitor is capable of showing. by not producing the extra frames, the ones that are created are better quality, with reduced tearing and the other issues that arise when the gpu tries to pump out 1,000 frames every second. less work for the gpu = better quality work possible, so to speak.

this is why we all giggle when we see "i only get 500 fps in CS:GO and i NNNNNEEEEDDDDD 1000." all those frames are never seen and are simply wasted effort, yet folks will SWEAR they can see a difference.
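to put rough numbers on that, here's a quick toy model in python (my own simplification: the monitor samples whatever frame the gpu most recently finished, once per refresh, and everything in between is thrown away):

```python
# Toy model: a monitor at refresh_hz samples the most recently
# finished GPU frame once per refresh; frames that finish between
# two refreshes are never shown at all.

def frames_shown(gpu_fps, refresh_hz=60, seconds=1.0):
    """Count distinct rendered frames the monitor actually displays."""
    shown = set()
    for k in range(int(seconds * refresh_hz)):
        t = k / refresh_hz               # time of the k-th refresh
        shown.add(int(t * gpu_fps))      # index of newest finished frame
    return len(shown)

print(frames_shown(500))  # GPU pumps out 500 fps, you see 60 of them
print(frames_shown(45))   # GPU makes only 45 fps, you see all 45
```

so to a 60 Hz panel, a card rendering 500 fps and a card rendering 60 fps look identical (ignoring the input-latency angle).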
 
So theoretically it's not worth buying either of those cards if I own a 60 Hz monitor?
It's not worth spending 400 euros just to get a 10-15 fps increase, if I understood you right?
Basically I will play games at 60 fps no matter how much I get from the graphics card, right?

But I'm sure I remember a huge difference when playing Doom back then, as an FPS, compared to the games I play now, and I am positive I am not talking about only a 10-15 frame increase.
Can I somehow overclock this monitor to at least 75 Hz?
It's an LG 25UM58-P.
Let's say I turn off V-sync; will I still play at 60 fps no matter how much Fraps or MSI AB shows?
 

pivodrums

Prominent
exactly, it's not worth upgrading the gpu unless you also plan to upgrade your monitor to 144 Hz (or even more).
300 fps can't help you at all if your monitor can only show you 60 Hz/fps, so you can only see 60.

I would suggest upgrading to something like you suggested (RTX 2060), then saving some money and upgrading the monitor to a 27" QHD (2560x1440) at 144/155 Hz.
Then the difference will be obvious ;)
 

Math Geek

Glorious
Herald
right, the newer games take more power to create those 60 fps than the older Doom and other older games did. so the newer card may create more frames than you can see right now, but it will also future-proof you for a while as next-gen games require more and more gpu power. for now, g-sync/v-sync will help those 60 fps be the best they can be and not waste resources creating more frames than needed.

so for now, with a better gpu than needed, you can turn up the eye candy, and over the next few years you'll slowly lower settings as games use up that power. this tends to be why we still say "buy as much gpu as you can afford" even though your monitor/games may not need it right now.
 
Reactions: davew1860
I'm pretty sure frame rates higher than your monitor's refresh rate help keep input latency low. You can feel it in the responsiveness, which is why people continue to want more frames in competitive first-person shooters such as CS:GO.
 
So basically a waste of money; from what I understood, it would require buying a 144 Hz monitor too. Newer card --> 144 Hz monitor, otherwise a waste of money, right?
I play newer titles at around 45-50 fps; is that enough for a game to be playable?
I haven't checked Metro Exodus though, or Shadow of the Tomb Raider and Shadow of War, which seem the most demanding for this card, where the frame rate drops very low. But other games ran at between 40-50 fps from what I saw on YouTube.
Maybe in the future, when this card struggles with more demanding games, I will need a more powerful card, huh?
What is the average playable frame rate?
 
I have an Asus Strix 2060 on a FreeSync monitor @ 72 Hz.
I can produce 100+ fps in games, but the monitor will show only 72 of them per second.
It doesn't mean 100 fps is a waste, but it means the display will update without loss or struggle. When you play at less than 60 it will look a bit jerky.

If I can find the video online which is really good at explaining this, I will post it.
 
This comes down to the point that I should buy a 144 Hz monitor and card together, right?
No point in buying a card right now, when I am low on budget for a monitor.
I could get newer games to run at 60 fps, but I still play older titles, so this is not a concern for me. Besides, spending money on a card is not worth an extra 10-15 fps without a 144 Hz monitor, right?
Those two cards give a lot more than 60 fps anyhow, which I can't use if I could buy just the card.
 
If you play games like Metro Exodus on the R9 390, I'd be very surprised if you reached 45-50 fps. Driving a 144 Hz monitor in triple-A titles like Metro Exodus would need an RTX 2070 and the high IPC of Intel to match fps to a refresh rate that high, on ultra settings.
 
The 1660 Ti should be good for 60 fps in most games at 2560x1080.

If you got a faster card and turned V-sync off, then you'd get what's called "tearing": subtle splits in the screen where the monitor is only partway through drawing a frame when the video card pushes the next one, so part of the screen shows one frame and part shows the next.
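A rough sketch of where that split lands, under my own simplifying assumptions (the monitor scans out top to bottom at a constant rate, V-sync off, no render queue):

```python
# If the GPU swaps in a new frame partway through the monitor's
# scanout, everything above the current scanline is the old frame
# and everything below is the new one: a visible tear.

def tear_row(swap_time_ms, refresh_hz=60, height=1080):
    scanout_ms = 1000 / refresh_hz                 # one full top-to-bottom pass
    phase = (swap_time_ms % scanout_ms) / scanout_ms
    return int(phase * height)                     # row where the tear appears

print(tear_row(8.33))  # swap mid-scanout -> tear near row 540 (mid-screen)
```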
 

Math Geek

Glorious
Herald
i would not go so far as to say it's not worth it without the faster monitor. as i noted, the better card will ensure you get that 60 fps now and in the future. for instance, i bought a 280 when they first came out and am only now looking at upgrading. it let me do what i needed back then, and only after these few years has it started showing its age. i could have bought a cheaper gpu, but then it would have been too slow much sooner, meaning i would have had to upgrade earlier.

so no, it is not a waste to buy the better gpu without the better monitor. it will just be more than needed now but will be just good enough in a couple of years, saving you from upgrading sooner. and as a bonus, if you happen to run across a great deal on a 144 hz monitor down the line, you can grab it and know your gpu can already keep it busy and happy :)
 
So I'm a little confused now: to buy or not, with a 60 Hz monitor?
Maybe I should wait for more demanding games, when my R9 390 can't handle at least 40 fps, and then buy one of those cards.
I mean, the plan to buy a gpu now without a 144 Hz monitor is obviously not a solution, first because I don't have money for one, and second, when I do plan to buy a 144 Hz monitor, then I'd buy either of those cards, and maybe they will be cheaper in the future.
I really don't know how well the R9 390 can handle Metro Exodus, but I don't care, since I have many older titles installed to play.
One thing which comes to mind is to first buy the monitor, because monitor prices don't drop that much, but here the pricing for 2560x1440 144 Hz is a lot, like 500 euros.
But then again, the 1660 Ti is not a good solution for that resolution at ultra/high settings.
If I went for 1920x1080 144 Hz, those are cheaper, but then I'm back at full HD like before.
Is there a noticeable difference between playing a game at 1920x1080 vs 2560x1440?
 
I played at 15 fps as a kid; I remember Thief and Condemned. I had an old AMD HD graphics card maybe 15 years ago, so why not...
40 fps is playable... not perfect, but playable.
 
I have an Asus Strix 2060 on a FreeSync monitor @ 72 Hz.
I can produce 100+ fps in games, but the monitor will show only 72 of them per second.
It's not quite that simple. A 72 Hz refresh rate doesn't just mean the screen shows 72 images per second. It also means those images are equally spaced in time. So there's a fixed interval of 1000/72 = 13.9 ms between each image.

100 fps just means the card is averaging 100 images per second. It says nothing about how those images are distributed in time. If (extreme example) 99 of those frames were rendered in 13 ms, and the last frame rendered in the remaining 987 ms, then the monitor would actually only display two different images per second despite your GPU generating 100 fps and the monitor having a 72 Hz refresh rate.

So an average fps higher than the monitor's refresh rate can actually be useful if it means your minimum fps stays at or above the monitor's refresh rate. And the answer to the question in this thread's title, "60 Hz display = limit 60 fps", is that capping is the wrong move: it will just make your monitor miss a frame (show fewer than 60 unique frames per second) every time the GPU's instantaneous fps dips below 60.

In other words, if your monitor is 72 Hz and you want to make it display 72 images every second, the setting you want (if your game has it) is for minimum fps to be 72 fps. That is, if the game detects that a frame will take longer than 13.9 ms to render, it will dynamically sacrifice some image quality settings in order to finish that frame in less than 13.9 ms. Thus guaranteeing that new frame finishes drawing before the monitor's next refresh, and thereby maintaining fps at at least 72 fps all the time. So the monitor always has a new image to display every time it refreshes.

This is why G-sync and Freesync are such a big deal. Instead of the GPU trying to match the monitor's refresh rate, the monitor adjusts its refresh rate to match the GPU's output.
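The extreme example above can be checked in a few lines of Python (my numbers are adjusted slightly so the one slow frame finishes inside the same second; the 13.9 ms refresh interval is the 1000/72 from the post above):

```python
# 99 frames finish within the first 13 ms; the 100th finishes at the
# 985 ms mark. A 72 Hz monitor refreshes every 1000/72 ~ 13.9 ms and
# shows whichever frame finished most recently.

finish_ms = [13 * (i + 1) / 99 for i in range(99)] + [985.0]

dt = 1000 / 72                                    # refresh interval in ms
displayed = set()
for k in range(72):                               # one second of refreshes
    done = sum(1 for f in finish_ms if f <= k * dt)
    if done:
        displayed.add(done - 1)                   # index of newest frame

print(len(displayed))  # "100 fps", yet only 2 distinct images shown
```

Average fps says nothing about pacing: 100 rendered frames collapse to two displayed images when they all arrive in bursts.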
 
I'm pretty sure frame rates higher than your monitor's refresh rate help keep input latency low. You can feel it in the responsiveness, which is why people continue to want more frames in competitive first-person shooters such as CS:GO.
Yes, this is correct. For example, if you're rendering 120 fps on a 60 Hz monitor and your GPU finishes a frame while the monitor is halfway through refreshing, the bottom half of the monitor will come from the new frame. You'll have screen tearing, but you'll have a more up-to-date image on at least part of the screen compared to running at only 60 fps.
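Some back-of-the-envelope numbers for that (a big simplification: this ignores render queueing and display processing, and only looks at how stale the newest finished frame can be):

```python
# With vsync off, the freshest content on screen is at most one GPU
# frame time old, so higher fps = lower worst-case staleness.

def worst_case_frame_age_ms(gpu_fps):
    return 1000 / gpu_fps

print(round(worst_case_frame_age_ms(60), 1))    # 16.7 ms at 60 fps
print(round(worst_case_frame_age_ms(120), 1))   # 8.3 ms at 120 fps
print(round(worst_case_frame_age_ms(500), 1))   # 2.0 ms at 500 fps
```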
 
Solandri, what are you trying to tell me? I don't understand, mate.
Gameplay would be smoother/faster with the new card even on a 60 Hz monitor, is that it?
 
