75Hz better than 120Hz?

The hertz (Hz) rating of a monitor is how many times it updates per second.

120 Hz needs 120 FPS to fully utilize the refresh rate. If you do not get up to 120 FPS, a higher Hz monitor is still beneficial due to less screen tearing.

More Hz is always better.
 


Is screen tearing not an issue if there is also G-Sync?
 
Is a 75 Hz screen better than a 120 Hz screen? No. But all we're talking about is the maximum rate at which the screen can display new images. The quality of those displayed images can still vary, even among screens of the same refresh rate in Hz.

I would say the applicability of a high refresh rate screen is going to depend on whether the equipment outputting to the screen can even make good use of that high refresh rate. If you won't ever have frame rates consistently at or above 75 FPS, paying extra for a 75 Hz screen over a 60 Hz screen won't yield you a significant return. If you won't ever have frame rates consistently at or above 120 FPS, paying extra to get a 120 Hz screen is also not going to yield a good return, but that's just my opinion.

I want to qualify my statement: each individual user will have a different opinion about the few extra frames between 60 and 75 Hz, or between 75 and 120 Hz, or even higher, and how important it is that they all be displayed. So where a person wants to draw the line on getting good utility out of the extra money spent on the higher refresh rate screen will vary.

The laptop being linked to lists G-SYNC. Screen tearing shouldn't be an issue.
 


With a GTX 1080 running a 1080p screen, it should be making good use of it. I am choosing this configuration over the "Ultra HD (3840 x 2160)" model to get maximum FPS (and hopefully five good years) out of it. That 4K screen will look better but will be much harder to run.

But any way we slice it, it's not getting 120 FPS... so I was thinking a 75 Hz screen would be good. At least it's not 60 Hz.
 


A GeForce GTX 1080 Ti with a comparable i5/i7 should be able to get close to 120 FPS in most games at 1080p.

https://www.gamespot.com/articles/nvidia-geforce-gtx-1080-ti-review/1100-6448559/#1080p_Benchmarks

111.5 fps in Unigine Valley
175 fps in Tomb Raider
154.5 fps in Bioshock Infinite
167 fps in Shadow of Mordor

Tom's doesn't benchmark high-end cards at 1080p because they end up FPS-limited by even the fastest CPU.


Edit: the GeForce GTX 1080 is about 10 fps lower in each benchmark, but the above conclusion is still valid.


Do consider the GeForce GTX 1070 Ti for its much lower price and similar performance to the GeForce GTX 1080 Ti.
 




For easier numbers, let's consider a hypothetical 10 Hz monitor and a GPU that is only able to output 5 FPS, a beautifully rendered slideshow.

The monitor will show 10 frames every second regardless of what the video card says, commonly or rather uncommonly known as 10 hertz.

After the first frame from the video card is sent, the monitor has nothing to show for the 2nd frame of its 10 Hz cycle, so it just doubles the last frame sent, making a pattern like:

00112233445566778899
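
If it helps to see that pattern generated, here is a rough Python sketch (purely illustrative, and it assumes the card simply resends the last frame it finished) of which GPU frame each monitor refresh ends up showing:

# Which GPU frame does each monitor refresh show, assuming the card just
# resends the last frame it completed? Integer math keeps the pattern exact.
def displayed_frames(refresh_hz, gpu_fps, refreshes):
    """GPU frame index shown on each of the monitor's refreshes."""
    return [(r * gpu_fps) // refresh_hz for r in range(refreshes)]

# 10 Hz monitor fed by a 5 FPS GPU: every frame is shown twice.
print("".join(str(f) for f in displayed_frames(10, 5, 20)))
# -> 00112233445566778899

The same function gives the identical doubling pattern for 120 Hz at 60 FPS or 60 Hz at 30 FPS; only the time between refreshes changes.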

With that 2:1 ratio you shouldn't technically have any tearing, but your input response will be delayed: you literally see the same picture twice, so when you click it can take a few extra dozen milliseconds to register on your screen.

This extra time it takes to register is what makes a higher-hertz monitor better even with tearing.

When the doubled frames come every 1/120 of a second (a 120 Hz monitor), you effectively get a 60 Hz picture, which is much more fluid than the same doubling on a 60 Hz monitor, where the picture only changes every 1/30 of a second.

The screen tearing comes in when it isn't a simple ratio like 2 Hz : 1 FPS.

When you get a ratio like "60 Hz : 42 FPS", or roughly 1.43 : 1, crazy things start to happen.

Instead of complete pictures with a doubling of delay, you start getting partial frames: the graphics card swaps in a new frame partway through sending the current one to the monitor, so part of the screen shows one frame and the rest shows the next.
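
Running the same idea with that awkward ratio shows the uneven cadence; this is just a sketch, and it only shows which frame each refresh starts with, not where on the screen the tear line lands:

# 60 Hz monitor fed 42 FPS: some frames are held for two refreshes, some for
# only one, and with v-sync off the buffer swap can land mid-scanout (the tear).
refresh_hz, gpu_fps = 60, 42
print([(r * gpu_fps) // refresh_hz for r in range(20)])
# -> [0, 0, 1, 2, 2, 3, 4, 4, 5, 6, 7, 7, 8, 9, 9, 10, 11, 11, 12, 13]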



 
Solution
Screen tearing is not an issue when using either G-SYNC or FreeSync; preventing tearing is exactly what those two technologies are and do. Regardless of whether the screen can run at up to 120 Hz, 75 Hz, or even 60 Hz, if you have anti-tearing technology, you don't get tearing. When using G-SYNC or FreeSync, the screen refreshes at the frame rate of the graphics card, so no matter how fast the screen is, it will never refresh faster than the actual frame rate. If your game is outputting 68 FPS, both screens run at only 68 Hz when using G-SYNC or FreeSync.

The only benefit a faster refresh rate screen has over a slower one, when both have anti-tearing technology, is for the frames generated above the rate the slower of the two screens can show. If your game is running between 76 and 120 FPS, the 120 Hz screen can show those extra 1 - 44 frames per second.
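
To put the same point as a tiny Python sketch (a simplification; it ignores what happens when the frame rate goes above the panel's supported range):

# Simplified model of G-SYNC / FreeSync: the panel refreshes when the GPU
# delivers a frame, so the effective rate is capped by whichever is lower.
def effective_refresh(gpu_fps, panel_max_hz):
    return min(gpu_fps, panel_max_hz)

for fps in (68, 90, 144):
    print(fps, "FPS ->", effective_refresh(fps, 75), "Hz on a 75 Hz panel /",
          effective_refresh(fps, 120), "Hz on a 120 Hz panel")
# 68 FPS -> 68 Hz on both; 90 FPS -> 75 Hz vs 90 Hz; 144 FPS -> 75 Hz vs 120 Hz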

The input lag of a screen has nothing to do with the frame rate being sent to the screen. It has to do with the signal processing time of the screen. Input lag for a screen is the measure of time taken for the screen to process the frame after having received it, and then finally to display that processed frame.

Also, in your above example, the display is not responsible for frame doubling. The 10 Hz screen in your example will only show the same frame twice if it is sent the same frame a second time by the graphics device. A 10 Hz display will happily show a blank screen, or anything else being sent to it, for its 10 unique updates every second.

If you are experiencing some sort of lag due to low frame rates, and it's not because of input lag in the screen, then it sounds like the game is tying its simulation to the frame rate, which is common because it's an easier way to code games. A 120 Hz screen isn't going to cause a game like this to suddenly double its input rate because it's doubling its frames; it doesn't work that way. The input rate is tied to the actual frame rate, not the rate at which frames may be doubled and output, which is not necessarily going to be consistent. It doesn't work that way for games that decouple the world simulation from the frame rate either. The world simulation will run at a much higher speed, and should therefore be able to handle your input at whatever rate your underlying hardware can process it, or at the game engine's internal maximum rate.
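
For reference, the decoupled case usually looks something like the fixed-timestep loop sketched below. This is a generic illustration, not any particular engine's code, and the 240 Hz simulation step and 60 FPS render time are just assumed numbers.

import time

SIM_STEP = 1.0 / 240.0               # fixed world-simulation timestep (assumed)

def game_loop(frames=10):
    sim_steps = 0
    accumulator = 0.0
    previous = time.perf_counter()
    for _ in range(frames):
        time.sleep(1 / 60)               # stand-in for rendering one frame
        now = time.perf_counter()
        accumulator += now - previous    # real time that passed since last frame
        previous = now
        # input would be polled here, at whatever rate this loop achieves
        while accumulator >= SIM_STEP:   # catch the world up in fixed steps
            sim_steps += 1               # i.e. update_world(SIM_STEP)
            accumulator -= SIM_STEP
        # the finished frame would be presented here
    return sim_steps

print(game_loop(), "simulation steps across 10 rendered frames")

The point is that input and the world update at the loop's own rate, independent of how many times the monitor repeats a frame.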
 


Agreed.

My example was with v-sync off.

G-Sync and FreeSync by design do not have tearing as long as you stay within their FPS limits.

"After the first frame from the video card is sent the monitor has nothing to show for the 2nd frame of its 10 hertz cycle so it just doubles the last frame sent making a pattern like"

I guess I should have been clearer with my pronouns/nouns; I did mean the graphics card repeats the same frame, not the monitor.