VSync, On or Off?

pcnoob101

Distinguished
Jun 15, 2009
So, I heard things about Vsync limiting frames, and thus lower power consumption and less tearing.
So does it really do that? I keep it off because I like to see the FPS all the time. BUT is it better on or off?
Less system stress?
 
V-sync keeps the frame rate the same as the refresh rate of your monitor, hence no tearing. If your GPU is pushing more FPS than the refresh rate, then V-sync is better turned on in my opinion, but then I don't like tearing in games....

It will have no effect on the power consumption if you turn it on...

You cannot see FPS... unless you are talking about something like Fraps, which shows you the FPS whilst you are playing. Once it goes over a certain amount, you do not see any more, as that is the speed your eyes and brain can see and process. I have heard people say that some can see a higher FPS than others, but I am yet to see proof of this. In my honest opinion, 45 FPS is no different to me than 60 FPS or 100 FPS, but then maybe that is all my brain can cope with lol...

30 FPS is commonly cited as the bare minimum a game should run at for smooth gameplay! I like this figure for RTS/RPG/MMO games but do prefer 45+ for shooters though.
 
Just as Moricon says, if you're producing more FPS than your screen can display, then turn Vsync on. This stops the next frame arriving from the graphics card before the screen has finished drawing the last one; without it, you end up with part of frame 1 showing alongside part of frame 2 (which shows as tearing).

If, however, you're producing the same or fewer FPS than your screen can display, turn Vsync OFF! When you drop under 60 FPS, Vsync will limit you to exact fractions of that number, so 30, 20, 15, 10 FPS, etc. This means if your card is able to produce approx. 55 FPS, Vsync will actually limit you to 30.
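
Quick way to see those fractions (a rough Python sketch, assuming a 60 Hz monitor and classic double-buffered Vsync; the refresh rate and the example frame rates are just illustrative numbers, not measurements):

import math

def vsync_fps(render_fps, refresh_hz=60.0):
    # With double-buffered vsync a finished frame has to wait for the next
    # vertical refresh, so the displayed rate snaps down to refresh_hz / n.
    frame_time = 1.0 / render_fps            # time the GPU needs per frame
    refresh_interval = 1.0 / refresh_hz      # time between refreshes
    waits = max(1, math.ceil(frame_time / refresh_interval))
    return refresh_hz / waits

for fps in (120, 60, 55, 45, 29):
    print(f"{fps:>3} FPS rendered -> {vsync_fps(fps):.0f} FPS displayed")

That prints 120 -> 60, 60 -> 60, 55 -> 30, 45 -> 30, 29 -> 20, i.e. a card that only just misses 60 FPS gets knocked all the way down to 30, exactly the 55-to-30 drop described above.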

Best advice IMHO: leave Vsync off by default, and enable it only if you notice tearing in a particular game.
 
VSYNC is only really worth it when screen tearing is excessive and constant, or if you are playing a game where you already get extremely high framerates.

It's not worth the performance hit when you often won't notice a difference in quality.
 
I suffer from screen tear a lot in Dirt 2, so I enable vsync. Like Moricon says, you can't see any difference above a certain frame rate anyway, and that frame rate is 24 FPS; any lower than that and you start to see the difference. I am led to believe the brain, or the eye, whatever it may be, cannot see the difference above 24 FPS. So if you are suffering screen tear, I would enable vsync.
 
Vsync requires 3 buffers to work correctly (without hiccups), so it will use up slightly more video memory.

Here's how a standard double-buffered (no vsync) renderer works.

The video card front end continuously streams Buffer 1 to your monitor. Your GPU is cranking away to render the next frame to Buffer 2. The frame finishes, and Buffers 1 and 2 are swapped. This swap more than likely happens partway through the monitor's refresh, and you see tearing.

In this case, the GPU is running full power to try to render to Buffer 2.
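
A rough sketch of that timing (assuming a 60 Hz monitor and a GPU that happens to render at 55 FPS; both numbers are just for illustration):

REFRESH_HZ = 60.0           # monitor refresh rate
RENDER_FPS = 55.0           # hypothetical GPU render rate
refresh = 1.0 / REFRESH_HZ  # one full scanout of the screen
frame = 1.0 / RENDER_FPS    # time the GPU needs per frame

t = 0.0
for n in range(1, 6):
    t += frame                            # GPU finishes frame n and swaps immediately
    progress = (t % refresh) / refresh    # how far the monitor is into its scanout
    print(f"frame {n} swapped {progress:.0%} of the way down the screen -> tear there")

Because the swap almost never lines up with the start of a refresh, each swap leaves a tear line partway down the screen, and that line drifts from frame to frame.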

Typical triple buffering (vsync on): Buffer 1 is ALWAYS in sync with your monitor and constantly streams data. Buffer 2 is not in sync with anything and is updated whenever the GPU is done rendering the current frame. Buffer 3 is actively rendered to by the GPU.

So, the GPU renders FULL SPEED to Buffer 3; once the GPU is done with the current frame, it swaps Buffers 3 and 2 and starts immediately on the next frame.

Buffer 1 may still be streaming to the monitor, so Buffers 1 and 2 don't swap yet. Now, it just so happens that the GPU finishes yet ANOTHER frame and again swaps Buffers 2 and 3. So, the previous frame got rendered but never displayed.

Now the monitor reaches its next refresh, Buffers 1 and 2 swap, and you get to see the newest frame.

Notice that with v-sync both on and off, the GPU is going flat out.

With v-sync, if your FPS is under your refresh rate, then you will see every frame rendered. If you produce more FPS than your refresh rate, some frames will just be dropped and replaced with newer data.
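
Here's a small sketch of that dropping behaviour (assuming a 60 Hz monitor and a hypothetical GPU rendering at 95 FPS; only the newest finished frame is shown at each refresh, older ones are silently dropped):

REFRESH_HZ = 60.0     # monitor refresh rate
RENDER_FPS = 95.0     # hypothetical GPU render rate, faster than the refresh
refresh = 1.0 / REFRESH_HZ
frame = 1.0 / RENDER_FPS

last_shown = 0
for vblank in range(1, 7):            # six refreshes = 0.1 s of display time
    now = vblank * refresh
    newest = int(now / frame)         # newest frame the GPU has finished by now
    if newest > last_shown:
        dropped = newest - last_shown - 1
        note = f" ({dropped} frame(s) rendered but never shown)" if dropped else ""
        print(f"refresh {vblank}: display frame {newest}{note}")
        last_shown = newest
    else:
        print(f"refresh {vblank}: no new frame, repeat frame {last_shown}")

Over those six refreshes the monitor shows six frames (60 FPS) while the GPU rendered nine, so roughly every third frame is rendered but never displayed.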

edit:

I can sense sub-50 FPS, but only in games where I can move the camera with the mouse, like FPS games.

If I suddenly swing the camera 180 degrees and I'm getting 30 FPS, I will notice gaps between objects on the screen. Any time there is fast motion going on, the gaps between updates of the objects on the screen are noticeable.

You can think of the frame rate as a strobe light. If something is moving fast enough with a slow strobe, you will see it jumping. I start to notice this jumping around 50 FPS, but only in faster-moving games. In games like WoW/StarCraft/Diablo I'm typically fine down to 30 FPS, but even then certain special effects may seem a bit choppy.

I usually don't have problems until I'm under 40 FPS. Around 40 FPS the choppiness can get bad enough that a rocket in close quarters chops too much to estimate its relative speed, and it's harder to predict its impact. So instead of a fast object moving towards you, it looks more like a rocket teleport-jumping towards you at an average speed, but it's hard to figure out the speed. Like someone throwing a baseball at you with a strobe running, you'll probably miss the ball.
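
To put rough numbers on the strobe analogy (a made-up example: an object crossing a 1920-pixel-wide screen in half a second; screen width and speed are just assumptions):

SCREEN_WIDTH_PX = 1920
CROSSING_TIME_S = 0.5
speed = SCREEN_WIDTH_PX / CROSSING_TIME_S   # 3840 pixels per second

for fps in (30, 50, 60, 120):
    print(f"{fps:>3} FPS -> object jumps {speed / fps:.0f} px between frames")

At 30 FPS that fast-moving object jumps 128 pixels between updates, which is why quick camera swings and rockets look like they're teleporting, while at 120 FPS the same motion only moves 32 pixels per frame.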