I understand that if the framerate the graphics card is producing is higher than the monitor's refresh rate, there will be screen tearing, because parts of multiple frames get drawn on the screen during a single refresh. But say your framerate is lower than the refresh rate, and the refresh rate isn't a multiple of the framerate. For example, say the refresh rate is 60 Hz and the graphics card is producing 40 frames per second, so out of every 3 screen refreshes, only 2 new frames arrive from the GPU.
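To make what I mean concrete, here's a rough Python sketch of my mental model (just my own simplification, not real driver code): the GPU swaps the back buffer onto the screen the instant a frame finishes rendering, with no vsync, while the monitor scans out top-to-bottom at a fixed rate. The `tear_positions` function is just a name I made up; it reports how far down the screen each buffer swap lands.

```python
# My own simplified model (no vsync): the GPU finishes a frame every 1/fps
# seconds and swaps buffers immediately, while the monitor scans out
# top-to-bottom, taking 1/refresh_hz seconds per refresh. A swap that lands
# mid-scanout leaves a tear at whatever scanline is being drawn at that moment.

def tear_positions(fps, refresh_hz, duration_s=0.1):
    """Return (swap time in seconds, fraction of the screen already drawn)."""
    refresh_period = 1.0 / refresh_hz
    frame_time = 1.0 / fps
    swaps = []
    t = frame_time                      # the GPU finishes its first new frame here
    while t < duration_s:
        # fraction of the current refresh that has elapsed = tear position
        position = (t % refresh_period) / refresh_period
        swaps.append((t, position))
        t += frame_time
    return swaps
```

For example, running this with fps=90 and refresh_hz=60 shows most swaps landing about a third or two thirds of the way down the screen, which is the "frames drawn on top of each other" situation I described above.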
So step by step, we have a 60 Hz monitor and a 40 fps game. First, let's say the graphics card is sending frame 1 from its front buffer to the screen, and at the same time it's filling the back buffer with frame 2. Once the screen is done drawing frame 1, it should draw frame 2, but frame 2 isn't finished yet, because the graphics card renders frames slower than the monitor refreshes; to be precise, it needs 1.5 monitor refreshes to render one frame into the back buffer. So the monitor doesn't get frame 2, and frame 1 starts being drawn again. But by the time frame 2 is finished in the back buffer, exactly half of frame 1 has already been drawn on the screen, and from that moment on the monitor starts drawing frame 2. So we end up with half of the screen showing frame 1 and the other half showing frame 2, which is basically screen tearing.
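And here is the same sketch run on the exact case from this walkthrough, 40 fps on a 60 Hz screen:

```python
# The 60 Hz / 40 fps case from above: the GPU needs 1.5 refreshes per frame.
for t, pos in tear_positions(fps=40, refresh_hz=60, duration_s=0.1):
    print(f"swap at {t * 1000:5.1f} ms -> {pos:.0%} of the screen already drawn")

# In this model the output alternates: every other swap lands exactly halfway
# through a scanout (a tear in the middle of the screen), and the swaps in
# between happen to coincide with the start of a refresh.
```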
So doesn't this mean that framerates lower than the refresh rate also cause screen tearing? And if not, why don't they?