derekullo :
The monitor will show 10 frames every second regardless of what the video card says, commonly or rather uncommonly known as 10 hertz.
After the first frame from the video card is sent, the monitor has nothing to show for the 2nd frame of its 10 hertz cycle, so it just doubles the last frame sent, making a pattern like:
00112233445566778899
Screen tearing is not an issue when using either G-SYNC or FreeSync. Regardless of whether the screen can run at up to 120 Hz, 75 Hz, or even 60 Hz, if you have one of those anti-tearing technologies you don't get tearing; eliminating it is exactly what they do. When using G-SYNC or FreeSync, the screen refreshes at the frame rate of the graphics card, so no matter how fast the screen is, it will never refresh faster than the actual frame rate. If your game is outputting 68 FPS, both screens refresh at only 68 Hz when using G-SYNC or FreeSync. The only benefit a faster screen has over a slower one, when both have anti-tearing technology, is for the frames generated above what the slower of the two screens can show. If your game is running between 76 - 120 FPS, the 120 Hz screen can show the extra 1 - 45 frames per second that the 75 Hz screen cannot.
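To put rough numbers on it, here's a tiny Python sketch of that "refresh at min(frame rate, max refresh)" behaviour; the FPS figures are purely illustrative:

```python
# Rough sketch of variable refresh rate (G-SYNC / FreeSync) behaviour:
# the panel refreshes when a new frame arrives, so it never runs faster
# than the GPU's actual frame rate. All numbers are illustrative.

def effective_refresh(frame_rate_fps, max_refresh_hz):
    return min(frame_rate_fps, max_refresh_hz)

# Game outputs 68 FPS: both a 75 Hz and a 120 Hz panel refresh at 68 Hz.
print(effective_refresh(68, 75))    # 68
print(effective_refresh(68, 120))   # 68

# Game outputs 110 FPS: only the faster panel shows the frames above 75.
print(effective_refresh(110, 75))   # 75
print(effective_refresh(110, 120))  # 110
```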
The input lag of a screen has nothing to do with the frame rate being sent to it; it comes from the screen's signal processing time. Input lag is the time the screen takes to process a frame after receiving it, and then finally to display that processed frame.
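As a toy illustration (the 15 ms figure is made up, not a spec for any real monitor), the screen's processing delay stays constant while the frame time changes with FPS:

```python
# Toy illustration: the display's input lag (its processing time) does not
# change with the frame rate it is being sent. The 15 ms figure is made up.

PROCESSING_MS = 15.0  # hypothetical signal-processing delay of the screen

for fps in (30, 60, 120):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} FPS -> frame time {frame_time_ms:5.1f} ms, "
          f"screen input lag {PROCESSING_MS:.1f} ms")
```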
Also, in your example above, the display is not responsible for frame doubling. The 10 Hz screen in your example will only show the same frame twice if the graphics device sends it that frame a second time. A 10 Hz display will happily show a blank image, or anything else it is sent, for its 10 unique updates every second.
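Here's a quick Python sketch of that fixed-refresh case, assuming made-up numbers (a 10 Hz display fed by a source that finishes frames at 5 FPS): on every refresh the display just shows whatever the graphics side hands it, and it's the re-sent frames that produce your doubled pattern:

```python
# Sketch of the fixed-refresh case: the display shows exactly what it is
# handed on each refresh; the graphics side re-sends the last completed
# frame when no new one is ready. Made-up numbers: a 10 Hz display fed by
# a source that finishes frames at 5 FPS.

REFRESH_HZ = 10
RENDER_FPS = 5
SECONDS = 2

scanout = []
for tick in range(REFRESH_HZ * SECONDS):
    t = tick / REFRESH_HZ                 # time of this refresh
    newest_ready = int(t * RENDER_FPS)    # newest frame the renderer has finished
    scanout.append(str(newest_ready))     # display shows whatever it was sent

print("".join(scanout))  # 00112233445566778899
```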
If you are experiencing some sort of lag due to low frame rates, and it's not because of input lag in the screen, then it sounds like the game is tying its simulation to the frame rate, which is common because it's an easier way to code a game. A 120 Hz screen isn't going to make a game like this suddenly double its input rate just because its frames are being doubled; it doesn't work that way. The input rate is tied to the actual frame rate, not to the rate at which frames may be doubled and output, which is not necessarily consistent. It doesn't work that way for games that decouple the world simulation from the frame rate either. In those, the world simulation runs at a much higher speed, and should therefore be able to handle your input at whatever rate your underlying hardware can process it, or at the game engine's internal maximum rate.
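For reference, here's a minimal sketch of that decoupled style (a fixed-timestep loop); all the names and rates are illustrative, not taken from any particular engine:

```python
# Minimal sketch of the decoupled style: the world simulation (and input
# polling) steps at a fixed rate no matter how fast frames are rendered.
# All names and rates are illustrative.

import time

SIM_DT = 1.0 / 240.0   # fixed simulation/input step: 240 updates per second

def poll_input():
    pass               # placeholder: read keyboard/mouse/controller state

def update_world(dt):
    pass               # placeholder: advance the game state by dt seconds

def render_frame():
    pass               # placeholder: draw the current state at whatever FPS the GPU manages

accumulator = 0.0
previous = time.perf_counter()
end_time = previous + 1.0               # run the sketch for about one second

while time.perf_counter() < end_time:
    now = time.perf_counter()
    accumulator += now - previous
    previous = now

    # Step the simulation at a fixed rate, possibly several times per
    # rendered frame if rendering is slow. Input is handled here, so its
    # rate is independent of the display's refresh rate.
    while accumulator >= SIM_DT:
        poll_input()
        update_world(SIM_DT)
        accumulator -= SIM_DT

    render_frame()
```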