Question: 1440p monitor, refresh rate, etc.

TheMemeMachine
May 27, 2017
Hello everyone, I have some questions regarding 1440p monitors.

I am going to upgrade to 1440p because the 1080 Ti I have is wasted on the 1080p monitor I currently use, and I am a bit confused about which refresh rate to choose. Most people recommend 144Hz. Now, the 1080 Ti will probably not manage 144fps on ultra settings in all games, but since a 144Hz monitor would be more future-proof than, say, a 60Hz one, I was wondering if it is possible to lower the monitor's refresh rate to avoid screen tearing. Should I lower it to something like 100Hz, or maybe lower, depending on how the 1080 Ti performs? It is my understanding that getting a lower fps than the monitor's refresh rate causes the screen tearing, but if I am wrong, please correct me. I don't understand why people pick high refresh rates when their GPU can't produce enough fps; am I missing something?

Also, the monitor selection where I live is pretty limited, so it will probably not have G-Sync, but FreeSync instead. I heard that Nvidia GPUs now support FreeSync, so I guess this shouldn't be a problem, but again, correct me if I'm wrong.
I would like some opinions from you guys on this, and maybe someone could share some personal experience with this.

Oh yeah, I was also wondering if my CPU would be a bottleneck here. It's an 8600K. I hear some people say it will and others say it won't, so I'm curious.
Thanks in advance for the replies! :)

Rest of my rig:
i5-8600K at stock speed, cooled by the be quiet! Shadow Rock Slim
Trident Z RGB 16GB at 3200MHz (XMP profile)
Gigabyte Z390-UD motherboard
Kingston 480GB SSD
WD Black 2TB HDD
Cooler Master MWE 650W PSU
 
It's very doubtful that your CPU will be a bottleneck in any meaningful way. Yes, there are probably SOME circumstances where an i7 with more cores/threads will offer a bit better performance, especially if you tend to play titles that clearly benefit from multithreading, but probably not enough to warrant the expense unless you have money burning a hole in your pocket.

As for the refresh rate: anytime you deviate from the native refresh rate or resolution of a monitor, it can work, but it can also present some issues with performance and image quality. I think you'd be fine dropping the refresh rate as you outlined, in situations where you need to, but it somewhat defeats the purpose of HAVING a high refresh rate monitor. And if you simply go with a monitor that has adaptive sync of some kind, you wouldn't need to worry much about it anyway; it would MOSTLY handle that automatically, given the proper settings and configuration.

Since FreeSync monitors now work with Nvidia graphics cards, your options for an adaptive-sync-capable display are a lot more open, and probably a lot less expensive, than they used to be. I'd recommend looking into that if you decide on a monitor whose refresh rate is beyond the FPS you think you can maintain, and probably even if you DIDN'T think that would be a factor, because TOO MANY FPS can cause tearing just the same as too few.
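To put some rough numbers on that last point, here is a minimal sketch (the refresh and frame rates below are just illustrative assumptions, not measurements) comparing frame times to a monitor's refresh interval. Whenever the two don't line up and no sync is active, a buffer swap can land mid-scanout, so the display shows parts of two different frames in one refresh; that's the tear:

```python
# Minimal sketch: compare frame delivery times to a monitor's refresh
# interval. All values are illustrative assumptions, not measurements.

REFRESH_HZ = 144                            # assumed monitor refresh rate
refresh_interval_ms = 1000 / REFRESH_HZ     # ~6.94 ms per scanout

for fps in (60, 100, 144, 200):             # assumed GPU output rates
    frame_time_ms = 1000 / fps
    # Without vsync/adaptive sync, the GPU swaps buffers whenever a frame
    # is ready. If frame time != refresh interval, the swap can land
    # mid-scanout, so one refresh shows parts of two frames (tearing).
    frames_per_refresh = refresh_interval_ms / frame_time_ms
    print(f"{fps:>3} fps: {frame_time_ms:5.2f} ms/frame, "
          f"{frames_per_refresh:.2f} frames per {refresh_interval_ms:.2f} ms refresh")
```

Both the 60fps and 200fps cases miss the refresh boundary, just from opposite directions, which is why tearing can show up whether you're under or over the refresh rate. Adaptive sync fixes it by moving the refresh to match the frame instead of the other way around.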
 
Thank you very much for your answer. What would some of the issues be if I dropped the refresh rate to, say, 100Hz? You mentioned there could be performance and image quality issues, but I don't understand what you mean by that.
 
Lower refresh rates can sometimes cause flicker, an unstable image, or blurring. Usually that only happens at refresh rates below 60Hz, but as I said, anytime you deviate from the native "best" or "recommended" settings, for most hardware and specifically for displays in this case, there CAN be unexpected results. I think you'd be fine dropping to 100Hz or even 75Hz if necessary, and if you plan to upgrade the other hardware down the road, then it's likely worth taking the risk, minimal as it is. I doubt you'd have any problems at all, but as I said, a display with some kind of adaptive sync would be the better option, so you don't have to worry about any of that.
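If you do go the adaptive sync route, the practical thing to check is the monitor's variable refresh range. As a rough sketch (the 48-144Hz window below is a typical FreeSync range but just an assumption; check your specific panel's spec sheet), you want your usual frame rate to sit inside that window so the sync stays engaged:

```python
# Rough sketch: check whether a frame rate keeps you inside a monitor's
# variable-refresh (FreeSync/G-Sync) window. The 48-144 Hz range is an
# assumed, typical FreeSync spec, not any specific monitor's.

VRR_MIN_HZ = 48    # assumed lower bound of the FreeSync range
VRR_MAX_HZ = 144   # assumed upper bound (the panel's max refresh)

def vrr_engaged(fps: float) -> bool:
    """True if the display can match its refresh to this frame rate."""
    return VRR_MIN_HZ <= fps <= VRR_MAX_HZ

for fps in (40, 75, 100, 160):   # illustrative frame rates
    state = ("inside VRR window (tear-free)" if vrr_engaged(fps)
             else "outside window (falls back to vsync/tearing behaviour)")
    print(f"{fps:>3} fps: {state}")
```

A commonly recommended trick is to cap frames a few fps below the panel's maximum (e.g. around 141 on a 144Hz display) so you never spill out the top of the window.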
 