
gaming in 1440p

jacobo2u

Oct 2, 2015
Hello everyone!
I have a question regarding monitors at 1440p.
I am building a new rig which will be approximately:
6700K
MSI Gaming M7
Corsair RM750i
yet-to-be-determined 980 Ti (MSI, EVGA, or Gigabyte)
16 GB Ripjaws 4 2400 DDR4
Corsair Hydro Series....

My question: the monitor for this rig at 1440p seems to be a very important and expensive purchase. At this resolution and high settings, I surmise I can get around 60 FPS in most games. At these frame rates, would G-Sync even help me? V-Sync? I have been looking at the Asus ROG SWIFT PG278Q 144 Hz 27.0" Monitor and the Acer XB270HU bprz 144 Hz 27.0" Monitor, but both of these are more than I really want to spend. Can I get away with a 60 Hz monitor, or should I go with one of these and bump my settings down a little to achieve higher frame rates?

I am still not certain how monitors adjust to dips in frame rates. Does V-Sync drop to multiples of the maximum? I know what G-Sync does, but what happens when neither is there? Also, would it ever make sense to buy a 4K monitor and then run games at 1440p? Would that cause some weird distortions?
I look forward to your input!
 
Solution
I can answer a few of these for you.

You could get away with a 60Hz monitor, but the most you'll ever "see" is 60 frames per second.

Vsync tries to make the GPU send frames only in time with the monitor's refresh rate. If the GPU "finishes" early, it holds the frame until the monitor is ready; if the GPU takes too long, the monitor merely displays the same frame again. This causes input lag in games, but prevents tearing, which is what G-Sync aims to cure.

When neither is there, you have no input lag, but you *can* get tearing. Tearing happens because the monitor merely spits out frames as fast as it can, with no regard to the sequence in which they were rendered. E.g., the GPU renders Frame A, and the monitor displays Frame A. On a 60 Hz monitor, it takes 16.67 ms before the next frame can be displayed. So the GPU renders Frame B, and the monitor holds it in a buffer until it's done displaying Frame A. However, the GPU doesn't wait on the monitor: it finishes Frame C before the monitor even gets to display Frame B, so the monitor dumps Frame B from the buffer, puts Frame C in, and displays that on the next refresh. Now there's a "skip" between what happened in Frame A and Frame C, and this can be seen as "tearing".
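The frame-skipping described above can be sketched in a few lines of code. This is a loose single-buffer illustration (the numbers and names are assumptions, not from the thread; real tearing also involves the buffer changing mid-scanout):

```python
# Loose sketch (not a real display pipeline): the monitor scans out once per
# refresh interval, while the GPU overwrites the single buffer every time it
# finishes a frame, without waiting on the monitor.
refresh_ms = 1000.0 / 60   # ~16.67 ms per scanout at 60 Hz
gpu_frame_ms = 7.0         # assumed: GPU finishes a frame every 7 ms (faster than refresh)

buffer = None
displayed = []
gpu_time, frame_id = 0.0, 0
for scanout in range(1, 4):                      # three monitor refreshes
    deadline = scanout * refresh_ms
    while gpu_time + gpu_frame_ms <= deadline:   # GPU keeps overwriting the buffer
        gpu_time += gpu_frame_ms
        frame_id += 1
        buffer = f"Frame {frame_id}"
    displayed.append(buffer)                     # monitor shows whatever is in the buffer

print(displayed)  # ['Frame 2', 'Frame 4', 'Frame 7'] - intermediate frames are skipped
```

Because the GPU outruns the refresh, the monitor never shows Frames 1, 3, 5, or 6; those "skips" between displayed frames are what show up as visible discontinuities.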

At high refresh rates w/o G-Sync, this is harder to see. 144 Hz monitors refresh frames every 6.94 ms.
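As a quick sanity check on those numbers, the refresh interval is simply 1000 ms divided by the refresh rate (a small illustrative snippet; the helper name is my own, not from the thread):

```python
# Refresh interval in milliseconds for a given refresh rate.
def frame_interval_ms(refresh_hz):
    return 1000.0 / refresh_hz

print(round(frame_interval_ms(60), 2))   # 16.67 ms at 60 Hz
print(round(frame_interval_ms(144), 2))  # 6.94 ms at 144 Hz
```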

It would not make sense to buy a 4K monitor and just play at 1440p. Most monitors are optimized for their native resolution, and running non-native resolutions tends to look worse than on a monitor running them natively; e.g., 1080p on a 1440p monitor will sometimes look worse than 1080p on a 1080p monitor of the same size.