Question - 60 Hz display = limited to 60 fps, right or wrong?

Well, I have two choices: buy either a card or a monitor, since I can't afford both right now.
Let's say I want to buy a decent 2560x1440 144Hz monitor first.
I am currently playing games that are a few months old at 2560x1080 with an average fps of 50 on high/ultra.
If I buy a new monitor and keep playing on the same old R9 390, how much fps would I lose switching from my current 2560x1080 to 2560x1440?
Games I play are Far Cry 5, Batman: Arkham Knight, Dishonored 2, Dying Light, Far Cry Primal, Rise of the Tomb Raider, The Witcher 3, Rage 2... I downloaded newer titles as well, but I'm waiting to finish these games first.

I mean, that is not much of a higher resolution, right? Just 1080p --> 1440p.
Buying a card would of course mean enabling V-sync.
This upgrade shouldn't be determined only by your monitor and your plans for keeping or replacing your graphics card. To answer this question correctly and make an informed decision, I'll need a complete list of your system's hardware specifications. That will give me a bird's-eye view of what hardware you have and the platform's future upgrade capabilities.
 
Jun 30, 2018
I'd just buy the monitor. How much fps would I lose playing those games at QHD?
My specs: Core i7-6700K, Z170 chipset motherboard, 16GB DDR4 RAM at 3200MHz,
Graphics card: Asus Strix R9 390 8GB GDDR5.
 
Based on the list of games above, you'd probably get around 60-80 fps on low/medium settings at 1440p with the R9 390. You can double-check each title by searching for YouTube gameplay of your current hardware in that game.

If I were in your shoes, I would keep that system at 1080p, because right now the system is overkill for gaming (with the exception of the graphics card) and would make an excellent high-refresh-rate 1080p machine. Modern triple-A titles are now optimizing for 8-core/16-thread processors, so if you expect longevity out of your current system, I'd say the i7, 16GB of memory, and an upgrade to a reasonable mid-range graphics card would make a great 144Hz 1080p system now, and it would ultimately age into a solid 60+ fps 1080p/60Hz system. I have another system, an i7-4790 mini-ITX build with a GTX 1070, and it's starting to show its age in games like Black Ops 4.

When you purchase a platform upgrade of any significance in the future, I would definitely look for a monitor with a matching resolution.
 
There is no human who can notice more than 30 fps; there is no such perfect "eye".
I think 144Hz monitors are made for fools who want to spend more money.
So anyone who tells me he can notice a difference between 60 fps and anything above that is a liar, I think.
Just tell me one good reason why someone needs 100 fps in a game; there is no such perfect "eye". We humans perceive at 30 fps. I googled it and found that out.
The notion that the human eye can't perceive anything beyond 30 fps is a myth. In general, any blanket claim that the human eye can only perceive up to some specific FPS is typically BS. The real answer to what exactly the human eye is capable of is more complicated than that.
 

Math Geek

up to 100 fps or so there is a difference that even my bad eyes can see. above that i don't see any increase in quality/smoothness/whatever else is claimed to exist there.

never seen any formal testing but informal testing i have seen done locally put people in front of the same game on the same system with the same settings. only change was locking it to a specific fps. above 100 fps, no one could guess even close to what the actual fps was.

i'm sure if you put a counter on the screen then all of a sudden the lower fps machines would be dubbed "unplayable", but with only the gameplay to go by, no one could guess the difference. below 100 fps though, it was clear even to me what going from 60 to 100 fps brought, so i'll give it that much. i guess if you spent your entire life in front of your monitor (as i suppose some do) then maybe you could pick out 144 fps over 100 fps, but i'd not believe it until some form of valid testing could be done.

i have only seen it matter to people when they can see a counter on screen; then they all of a sudden JUST KNOW the difference.
 
Personally, I can see LED PWM flickering up to about 600 Hz. Mind you, it stops bothering me after about 200 Hz, but I can notice it much higher (in motion). Yes, those old fluorescent ceiling lamps (the cheaper ones flicker at 120 Hz) used to drive me nuts.

As for video, I find 60 Hz adequate, and 30 Hz tolerable. I can certainly see the difference when going up to 120 Hz or 144 Hz. If I were into FPS gaming with lots of fast motion and dodging/aiming, I'd probably want the higher refresh rates. But for general computing and the RPGs I like to play, 60 Hz seems to be good enough.

This is one of those things which is unique to every individual, so I don't make blanket recommendations on how high a refresh rate to get. I encourage people to visit a computer store and play some games on monitors with the higher refresh rates. That way they can determine for themselves how high a refresh rate they need, and how low is good enough.
 
I want to play those games at high/ultra settings.
So how much fps would I lose playing them at 2560x1440, if I currently get around 50 fps at 2560x1080?
 
Hmmm, the R9 390 can't handle Metro Exodus. In the open world, fps drops to as low as 30; it averages around 47 fps in closed spaces, but it varies a lot.
What does that mean?
The maximum is 55, then in some scenes it drops to as low as 33, and it varies.
The game is hardly playable. I lowered the settings to High and got 10 more fps, but hell, I want to play on ultra.
 
This was my initial suspicion, if only because I've built systems with the 390, and in computer years that was a LONG time ago. It sounds like you aren't ready for a platform upgrade, nor should you be, because you still have an excellent system. I say this with 100% certainty because if you were at the enthusiast level, 144Hz+ would be no question. Given that you'd like to play at ultra detail on that system for 3-5 years, and that triple-A titles are optimizing for systems at the higher end of the enthusiast spectrum, I'd say it's safe to keep your system geared towards 1080p/60Hz (based on your previous posts). A used GTX 1080 or a new RTX 2070 would be overkill for that resolution now, but would give you the longevity you desire to play 1080p on ultra on a 60Hz panel.
 
I must decide between buying a monitor or a graphics card.
Hmm, if I buy the 1660 Ti Gaming X from MSI without buying a monitor, are you sure I could be more than happy for the next 2-3 years playing future games at 2560x1080 at ultra settings at 60 fps?
I see this card is not meant for 2560x1440 at ultra settings, giving a little above 40 fps on average, which I think is not enough for smooth gameplay.
In that case, buying a 144Hz monitor would be a waste of money.
 
I want to play those games at high/ultra settings.
So how much fps would I lose playing them at 2560x1440, if I currently get around 50 fps at 2560x1080?
2560x1440 has 4/3 the number of pixels of 2560x1080 (3,686,400 vs. 2,764,800). A very rough rule of thumb is that your performance change is the reciprocal of the pixel-count change.

Ergo, if the new monitor has 4/3 the pixels of the old one, the frame rate will be about 3/4 of what it was on the old monitor: roughly 37-38 fps instead of 50.

Also, again, my own personal preferences showing here: For gaming, I would never give up the 21:9 aspect ratio of the 2560x1080 UNLESS I needed the extra vertical resolution for some reason (say, work purposes) that the 2560x1440 has, as I don't want to go back to 16:9 aspect ratio if I can avoid it.
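That rule of thumb can be sketched as a quick calculation (the function name and the purely GPU-bound assumption are mine; treat the result as a ballpark estimate, not a benchmark):

```python
# Rough rule of thumb: when GPU-bound, fps scales with the
# reciprocal of the change in pixel count.
def estimate_fps(current_fps, old_res, new_res):
    """Estimate fps after a resolution change, assuming purely GPU-bound performance."""
    old_pixels = old_res[0] * old_res[1]
    new_pixels = new_res[0] * new_res[1]
    return current_fps * old_pixels / new_pixels

# 50 fps at 2560x1080, moving to 2560x1440:
print(estimate_fps(50, (2560, 1080), (2560, 1440)))  # -> 37.5
```

Real games won't track this exactly — CPU-bound scenes lose less than the estimate suggests — but it's a reasonable first approximation.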
 

spencer.cleaves2

Jan 5, 2019
So theoretically it's not worth buying any of those cards if I own a 60 Hz monitor?
It's not worth spending 400 euros just to get a 10-15 fps increase, if I understood you right?
Basically I will play games at 60 fps no matter how much I get out of the graphics card, right?

But I'm sure I remember a huge difference when I played Doom at a high frame rate compared to the games I play now, and I am positive I am not talking about only a 10-15 frame increase.
Can I somehow overclock this monitor to at least 75 Hz?
It's an LG 25UM58-P.
Let's say I turn off V-sync; will I still effectively be playing at 60 fps no matter what Fraps or MSI AB shows?
When building a PC, at the end of the day you should be building it around your display. If you have a 1920x1080 60Hz monitor and you drop a 2080 Ti into your build, you've basically just made a beast computer that is bottlenecked by the monitor. But if that monitor were 4K at 60Hz, the GPU would be matched to the monitor a bit better.

If you're someone who plays a lot of FPS games like PUBG and CS:GO, then I recommend focusing on higher-refresh-rate monitors (144Hz+) and worrying less about resolution. If you're someone who likes RPGs or more story-based games and your eyes don't bleed at 60 fps, then you could get a 4K 60Hz monitor. At the end of the day, it is always a waste of money to build a crazy PC and use it with a 60Hz 1920x1080 monitor.
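The "build around your display" point boils down to the fact that a fixed-refresh panel with V-sync enabled caps how many frames you actually see. A simplified sketch (the function name is mine, and it ignores double-buffered V-sync quantization and adaptive-sync panels):

```python
def displayed_fps(rendered_fps, refresh_hz, vsync=True):
    """Frames you actually see per second on a fixed-refresh panel.

    Simplification: with V-sync on, anything the GPU renders beyond the
    refresh rate is wasted work; with V-sync off you get every rendered
    frame, but with tearing.
    """
    return min(rendered_fps, refresh_hz) if vsync else rendered_fps

print(displayed_fps(140, 60))   # beast GPU on a 60Hz panel -> 60
print(displayed_fps(140, 144))  # same GPU on a 144Hz panel -> 140
```

This is why pairing a top-end card with a 60Hz 1080p monitor leaves performance on the table: the extra frames are rendered but never shown.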
 
For future gaming?
That's a harder call to make, just because it's hard to say how crazy developers might start going with more graphical detail, higher res textures, more advanced 3D engines, and so forth. This gets counterbalanced by the developers still wanting to get the largest target audience for their games, which won't happen if you require them to upgrade to extreme-level hardware.

There was a time when 4GB of VRAM was always enough for games at 1920x1080, but eventually, that changed, and now, on occasion, you find a game where the 4GB limit might start to be an issue.
 
Just switching from Ultra to High settings has a huge impact on fps and performance, while the difference in visual detail is negligible. That is why I was recommending a GTX 1080 / RTX 2070 to hold ultra at 1080p for the next 3-5 years; no one can guarantee that, but choosing one of those higher-end cards would surely stack the odds in your favor. While the GTX 1660 Ti on ultra might not deliver those results for years to come (it can in many titles right now), it is a very reasonable card for your system, monitor, and performance expectations, and I think you'll be pleased making the jump from your R9 390.

https://www.youtube.com/watch?v=d2E1FY6TlPk&t=807s
 
I switched to high settings. I'm impressed by the R9 390 in Metro Exodus, with an average fps of 52.
Although the spec sheet says a GTX 1070 is required for high settings, I'm pleased with this frame rate.
Sometimes fps drops to around 32 in open-world scenes with a lot of detail (like the outside areas), but that only lasts a short time.
The RTX 2070 costs a lot here. I think I will stick with my current card for now.
I wish I could play on ultra, but on high the game looks okay and it's playable at this frame rate.
 
Well, I have two choices: buy either a card or a monitor, since I can't afford both right now.
Let's say I want to buy a decent 2560x1440 144Hz monitor first.
I am currently playing games that are a few months old at 2560x1080 with an average fps of 50 on high/ultra.
If I buy a new monitor and keep playing on the same old R9 390, how much fps would I lose switching from my current 2560x1080 to 2560x1440?
Games I play are Far Cry 5, Batman: Arkham Knight, Dishonored 2, Dying Light, Far Cry Primal, Rise of the Tomb Raider, The Witcher 3, Rage 2... I downloaded newer titles as well, but I'm waiting to finish these games first.

I mean, that is not much of a higher resolution, right? Just 1080p --> 1440p.
Buying a card would of course mean enabling V-sync.
None of those are esports titles, which are primarily what high-refresh displays were made for in the first place. You'd just be playing catch-up, trying to buy a GPU powerful enough to justify the monitor, all for titles that don't come close to needing it.

In short, you need to be practical and think more about balance. An R9 390 paired with a 144Hz display is just pointless.
 
The pointless thing, my friend, is that an average 2560x1440 144Hz gaming monitor here in Eastern Europe also costs around 500 euros.
Though thinking about your post: why would I even need a 144Hz refresh rate with a GTX 1660 Ti or RTX 2060?
 
