Shouldn't low fps cause screen tearing in games too?

gebran bassil

Honorable
Feb 23, 2013
I understand that if the framerate the graphics card is producing is higher than the monitor's refresh rate, there will be screen tearing, because parts of multiple frames get drawn on top of each other during a single refresh. But say your framerate is lower than the refresh rate, and the refresh rate isn't a multiple of the framerate. For example, say the refresh rate is 60Hz and the graphics card is producing 40 frames per second; that means for every 3 refreshes the monitor performs, only 2 new frames come from the GPU.
So step by step, we have a 60Hz monitor and a 40fps game. First, let's say the graphics card is scanning frame 1 out of its front buffer onto the screen, and at the same time it's filling the back buffer with frame 2. Once the screen is done drawing frame 1, it has to draw frame 2, but frame 2 isn't finished yet, because the graphics card is drawing frames slower than the monitor refreshes; to be precise, it needs 1.5 monitor refreshes to render one frame into its back buffer. So the monitor doesn't get frame 2, and frame 1 is drawn again. But by the time frame 2 is finished in the back buffer, exactly half of frame 1 has already been drawn on the screen, and at that moment frame 2 starts being drawn instead. So we end up with half of the screen showing frame 1 and the other half showing frame 2, which is basically screen tearing.
So doesn't this mean framerates that are lower than the refresh rate cause screen tearing too? And if so, why don't they seem to?
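
Here's a little Python sketch of the timing I described; the perfectly steady 25ms render time and the instant buffer flip are my own assumptions:

```python
# Toy timeline of the scenario above: a 60Hz monitor with a GPU that
# finishes a frame every 25ms (40fps) and flips buffers the instant a
# frame is done (v-sync off). Times are in units of 1/3 ms so that both
# periods are exact integers.
REFRESH = 50                              # 16.7ms per 60Hz refresh
FRAME = 75                                # 25ms per 40fps frame
flips = [FRAME * n for n in range(1, 9)]  # moments each new frame is ready

for r in range(8):
    start, end = r * REFRESH, (r + 1) * REFRESH
    # a flip landing mid-scanout leaves a tear line part-way down the screen
    tears = [(t - start) / REFRESH for t in flips if start < t < end]
    if tears:
        print(f"refresh {r + 1}: tear {tears[0]:.0%} of the way down the screen")
    else:
        print(f"refresh {r + 1}: clean")
```

It prints a tear halfway down the screen on every third refresh (refreshes 2, 5, 8, ...), exactly like the half-frame-1, half-frame-2 situation above.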
 
As far as I know, the screen can actually tear at low fps as well.

A game running at a steady 40fps would be displaying a frame every 25ms.
A monitor displaying at 60Hz would refresh every 16.7ms.

Therefore, to stay in sync and prevent tearing, the display would need to spend two cycles on each frame, which is why the framerate drops to 30fps with v-sync on.
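
Here's a quick Python sketch of that math, assuming a perfectly steady render time (which real games never have):

```python
import math

# Simplified model of double-buffered v-sync: each frame has to be held
# for a whole number of refresh cycles, so the effective framerate snaps
# down to the refresh rate divided by that cycle count.
def vsync_fps(refresh_hz: float, render_fps: float) -> float:
    cycles = math.ceil(refresh_hz / render_fps)  # refreshes spent per frame
    return refresh_hz / cycles

print(vsync_fps(60, 40))  # 30.0 -> a 25ms frame spans two 16.7ms cycles
print(vsync_fps(60, 59))  # 30.0 -> even 59fps gets halved by v-sync
```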

As I understand it, if triple-buffering is on, the GPU has 1 active frame and 2 frames in the buffer, so it can keep feeding the monitor in sync at the cost of some input lag.
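
Here's a rough sketch of that difference, again assuming a steady 25ms render time (this is the classic swap-at-vblank model; actual drivers implement triple-buffering in different ways):

```python
# Sketch of why triple buffering keeps the framerate above 30fps under
# v-sync: the GPU renders continuously into a spare buffer instead of
# stalling until the next swap. Time units of 1/3 ms, as before.
REFRESH = 50  # 16.7ms at 60Hz
FRAME = 75    # 25ms at 40fps

def frames_shown(n_refreshes: int, triple: bool) -> list[int]:
    shown = []
    for r in range(1, n_refreshes + 1):
        if triple:
            # the monitor picks up the newest frame completed before this vblank
            shown.append(r * REFRESH // FRAME)
        else:
            # double buffering: the GPU stalls, so one new frame per 2 refreshes
            shown.append(r // 2)
    return shown

print(frames_shown(8, triple=False))  # [0, 1, 1, 2, 2, 3, 3, 4] -> 30fps
print(frames_shown(8, triple=True))   # [0, 1, 2, 2, 3, 4, 4, 5] -> 40fps
```

Either way the flips happen at vblank, so neither mode tears; triple-buffering just gets more distinct frames on screen.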

You probably already know all of the above - but as far as I know low fps still causes tearing; it may simply not be very noticeable, or you could have triple-buffering turned on.

The other possibility is that, say, 80fps on a 60Hz monitor means replacing frames faster than the monitor can display them, whereas 40fps on the same monitor means the monitor keeps the previous frame until the new one is ready. If that makes sense.
 
Thanks for the reply.
I use adaptive vsync, not triple buffering, which means vsync is automatically turned off whenever the framerate drops below 60fps, and automatically turned back on whenever the framerate reaches 60fps again.
And even if triple buffering were turned on, it should have no effect whatsoever, because vsync is off whenever the framerate is below 60fps. With vsync off, the front and back buffers constantly switch roles, meaning the graphics card never waits for the screen to finish a frame before unloading the next one: the instant the graphics card finishes drawing frame 2 in the second buffer, it immediately flips it onto the screen and starts filling the first buffer again, making a third buffer completely redundant.
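
Here's a toy model of the swap logic I mean (the buffer names are just for illustration):

```python
# Double buffering with v-sync off: the two buffers swap roles the
# instant the GPU finishes a frame, so the GPU never waits on the
# monitor and a third buffer would never be touched.
front, back = "A", "B"
for frame in range(1, 6):
    print(f"GPU renders frame {frame} into buffer {back}; scanout reads buffer {front}")
    front, back = back, front  # immediate flip, no waiting for vblank
```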
And yet, I never get screen tearing when the framerate dips to 40~50fps. I agree that if vsync and triple buffering were both turned on there would be no screen tearing below 60fps, but that's not my case, since I always have triple buffering disabled.
 
