[citation][nom]Hetneo[/nom]Wrong, because of vsync basically. Monitor is refreshed every 1/60th of second, or 72nd or 75th or 120th depending the settings of monitor 60, 72, 75 or 120 Hz, it's controlled by crystals in it and there is no arbitration in it's frequency. Vsync makes sure that frames that are not in sync with refresh rate are not displayed at all, that's how vsync works. If you have screen tearing with vsync on, then there's something very odd going on with your monitor and people who know what they are talking about call such monitors by one pretty and short word, "broken".[/citation]
I think you misread what I wrote, or misunderstood it.
I was talking about what happens when v-sync is not on: you can get screen tearing even when your FPS is lower than the refresh rate.
You also do not understand how refresh rate works with LCDs. While the pixels are solid state and only change when told to, the monitor still redraws its image once every refresh interval: it reads a block of memory designated as the display buffer and scans it out to the screen. Without v-sync, the video card can be halfway through updating that buffer at the same time the monitor is scanning it out, which produces a screen tear.
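Here's a toy model of that mid-scanout overwrite (not real driver code, just an illustration; the "OLD"/"NEW" labels and the row count are made up):

```python
# Toy model of screen tearing: scanout reads the framebuffer row by row
# while the GPU overwrites it mid-scan (no v-sync).

ROWS = 8
framebuffer = ["OLD"] * ROWS  # frame currently being scanned out

displayed = []
for row in range(ROWS):
    # Without v-sync, the GPU may finish a new frame mid-scanout
    # and write it into the same buffer immediately.
    if row == 5:
        framebuffer = ["NEW"] * ROWS
    displayed.append(framebuffer[row])

print(displayed)
# The top of the screen shows the old frame, the bottom shows the new
# one; the boundary at row 5 is the visible "tear".
```

The tear line is exactly where the buffer swap landed relative to the scanout position, which is why it jumps around from frame to frame.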
As for v-sync only ignoring the extra frames: again, that is wrong the vast majority of the time. You may find that behavior in benchmarks, so they can still measure how many FPS your card can produce without tearing, and a game can be programmed that way to avoid tearing without limiting FPS. But that is not how the v-sync setting normally behaves, and it is never how it behaves when you turn it on in the Nvidia or AMD control panel. Normally the GPU renders a frame, and if the monitor is in the middle of updating the screen, the GPU waits until the refresh finishes before sending that image to the screen buffer.
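That wait-for-refresh behavior is also why the FPS cap falls out naturally. A rough timing sketch, assuming a 60 Hz display and a GPU that could otherwise render at 100 FPS (both numbers are just for illustration):

```python
# Toy timing model of v-sync: each finished frame waits for the next
# vertical blank before it is presented, so presents can't outpace
# the refresh rate.

import math

REFRESH_HZ = 60
RENDER_TIME = 1 / 100  # GPU needs 10 ms per frame, i.e. 100 FPS uncapped

t = 0.0
presented = 0
while t < 1.0:  # simulate one second of wall-clock time
    t += RENDER_TIME                        # render the frame
    t = math.ceil(t * REFRESH_HZ) / REFRESH_HZ  # stall until the next vblank
    presented += 1

print(presented)  # prints 60: effective FPS equals the refresh rate
```

Even though the GPU finishes each frame in 10 ms, every present is pushed out to a 1/60 s boundary, so exactly 60 frames reach the screen per second.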
Try turning on an FPS monitor sometime and test a variety of games. You'll see the FPS capped at your refresh rate in almost every case. I have seen an exception twice, so I'm not saying you are always wrong, but most of the time you are.