[SOLVED] Screen Tearing even with GSync?

ps2cho

So I recently upgraded to a 27" HP Omen 27i 165Hz IPS display, paired with my 2080 Super.

When running PUBG I am still noticing screen tearing - I can see the horizontal tear lines in motion.
I have G-SYNC enabled in the Nvidia Control Panel, and the monitor is set to its 165Hz refresh rate.

Is it normal to see this? I thought G-Sync was supposed to eliminate tearing completely. Anything to test?
 
Solution
According to https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/15/ , you still need to enable V-SYNC to avoid tearing if you're hitting the G-SYNC ceiling:

Wait, why should I enable V-SYNC with G-SYNC again? And why am I still seeing tearing with G-SYNC enabled and V-SYNC disabled? Isn’t G-SYNC supposed to fix that?
(LAST UPDATED: 05/02/2019)

The answer is frametime variances.

“Frametime” denotes how long a single frame takes to render. “Framerate” is how many frames are rendered within a one-second period, which is the inverse of the average frametime.
At 144Hz, a single frame takes 6.9ms to display (the exact number depends on the max refresh rate of the display), so if the framerate is 144 frames per second, then the average frametime at 144 FPS is 6.9ms per frame.
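To put numbers on that, here is a minimal Python sketch (the helper name is my own, not from the article):

```python
# One refresh interval in milliseconds: how long the display spends on a single frame.
def frametime_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

print(f"{frametime_ms(144):.2f} ms")  # ~6.94 ms per frame at 144Hz
print(f"{frametime_ms(165):.2f} ms")  # ~6.06 ms per frame at 165Hz (the OP's Omen 27i)
```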

In reality, however, frametime varies from frame to frame, so just because an average framerate of 144 per second has an average frametime of 6.9ms per frame doesn’t mean all 144 of those frames take an exact 6.9ms each; one frame could render in 10ms, the next in 6ms, but by the end of each second, enough hit the 6.9ms render target to average 144 FPS.
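To make the variance point concrete, a quick sketch with made-up per-frame times (values chosen to average out right at the 144 FPS target):

```python
# Hypothetical per-frame render times (ms): some slower, some faster than the 6.9ms target.
frametimes_ms = [10.0, 6.0, 6.9, 5.8, 6.0]

avg_frametime = sum(frametimes_ms) / len(frametimes_ms)
avg_fps = 1000.0 / avg_frametime

print(f"average frametime: {avg_frametime:.2f} ms -> {avg_fps:.1f} FPS")
# Prints ~6.94 ms -> ~144.1 FPS: the average sits at the target
# even though individual frames landed well above and below it.
```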

So what happens when just one of those 144 frames renders in, say, 6.8ms (147 FPS average) instead of 6.9ms (144 FPS average) at 144Hz? The affected frame becomes ready too early, and begins to scan itself into the current “scanout” cycle (the process that physically draws each frame, pixel by pixel, left to right, top to bottom on-screen) before the previous frame has a chance to fully display (a.k.a. tearing).

G-SYNC + V-SYNC “Off” allows these instances to occur, even within the G-SYNC range. G-SYNC + V-SYNC “On” (what I call “frametime compensation” in this article) instead allows the module (with average framerates within the G-SYNC range) to time delivery of the affected frames to the start of the next scanout cycle, letting the previous frame finish in the existing cycle and thus preventing tearing in all instances.
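For intuition, here is a toy timing model of that behavior (my own simplification, not code from Blur Busters; the numbers mirror the 6.8ms example above):

```python
# Toy model: one scanout cycle at 144Hz lasts ~6.94 ms.
SCANOUT_MS = 1000.0 / 144

def present(frame_ready_at_ms: float, scanout_end_ms: float, vsync_on: bool) -> str:
    """What happens when a new frame is ready relative to the current scanout's end."""
    if frame_ready_at_ms >= scanout_end_ms:
        return "clean: previous frame finished before the new one arrived"
    if vsync_on:
        # "Frametime compensation": hold the early frame until the next cycle starts.
        return f"held {scanout_end_ms - frame_ready_at_ms:.2f} ms -> no tear"
    # V-SYNC off: the early frame scans into the in-progress cycle, splitting the image.
    return "tear: new frame replaced the previous one mid-scanout"

# A 6.8ms frame arrives ~0.14 ms before the current cycle ends:
print(present(6.8, SCANOUT_MS, vsync_on=False))  # tear
print(present(6.8, SCANOUT_MS, vsync_on=True))   # held 0.14 ms -> no tear
```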
 