Am I missing something about G-Sync?

Denthil

Reputable
Aug 24, 2014
27
0
4,540
1
G-Sync fixes tearing by synchronizing your GPU's frame output with the refresh cycle of your monitor. But tearing occurs when your GPU is outputting frames at a faster speed than your monitor, and in response, your monitor displays two frames at once. So, what if you have a 144Hz monitor and aren't achieving frame rates that exceed 144 FPS? Why do you need V-Sync or G-Sync? Your monitor would just keep refreshing nothing, until your GPU had outputted the next frame. Is my logic flawed?

I'm asking because I'm shopping for a new monitor and need to know if it is worth it. No tearing and no stutters would be great, but (to me) it isn't worth the $200 premium.
 

chenw

Admirable
Oct 25, 2014
1,666
0
6,460
211
As far as I know, V-Sync and G-Sync are two ways to solve the same problem, but they solve it in completely opposite ways.

V-Sync forces your GPU to wait until your monitor's next refresh before displaying the next image. Because monitors have fixed refresh times, this causes input lag: the GPU has to hold the finished frame until the monitor is ready for it. V-Sync also causes visible stutter when your FPS is below your monitor's maximum refresh rate, because the interval between displayed frames is fixed, so a repeated frame tends to be more noticeable. And if the frame time variance is high, V-Sync can feel quite stuttery.

G-Sync works the other way round: it forces the monitor to wait until the GPU has finished rendering, then displays the image, so input lag is removed because the GPU no longer has to wait for anything. Games also feel smoother at lower FPS: while the frame rate may be the same under V-Sync and G-Sync, the times at which the frames are displayed are not, so the same FPS feels smoother on G-Sync than it does on V-Sync. This is why G-Sync costs as much as it does; the monitor has to have the G-Sync module in it to be able to vary its refresh like that.
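To put rough numbers on that, here's a toy timing sketch (my own illustration, not how the actual driver or module works): the GPU finishes frames at irregular times, V-Sync delays each one until the next fixed 60 Hz tick, and G-Sync (in this simplified model) shows each frame the moment it's done.

```python
import math

# Toy model only: compare when frames appear on screen under V-Sync
# vs G-Sync.  All frame times are hypothetical, in milliseconds.

REFRESH_MS = 1000 / 60  # fixed 60 Hz refresh interval for V-Sync

# GPU finishes rendering frames at these (irregular) times:
render_done = [0.0, 20.0, 45.0, 55.0, 90.0]

def vsync_display_times(done_times, refresh_ms=REFRESH_MS):
    """Each finished frame waits for the next fixed refresh tick."""
    return [math.ceil(t / refresh_ms) * refresh_ms for t in done_times]

def gsync_display_times(done_times):
    """Monitor refreshes as soon as the frame is ready (simplified)."""
    return list(done_times)

vs = vsync_display_times(render_done)
gs = gsync_display_times(render_done)

# Extra latency each frame picks up under V-Sync; zero in this G-Sync model:
vsync_lag = [round(d - r, 2) for d, r in zip(vs, render_done)]
print("V-Sync display times:", [round(t, 2) for t in vs])
print("G-Sync display times:", gs)
print("V-Sync added lag (ms):", vsync_lag)
```

Notice the added lag varies from frame to frame under V-Sync, which is exactly the uneven pacing that reads as stutter.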

If you don't like the cost of G-Sync, you can wait for AMD's FreeSync, which from what I gather is similar tech, but AMD isn't charging any royalties for the FreeSync modules. It only works with AMD cards, though.

On paper it does not seem that great; it's something you really have to see in person.
 

Denthil

Reputable
Aug 24, 2014
27
0
4,540
1


Yeah, I know. What I'm asking is: since G-Sync and V-Sync solve the same problem, and that problem occurs when your GPU is outputting a higher frame rate than your monitor can display, is making sure your monitor is faster than your GPU a viable solution?
Also, I have an nVidia GPU. FreeSync will (probably) only support AMD GPUs.
 


You completely misunderstand how GPUs and monitors work together.

Tearing is not caused by your GPU producing more frames than your monitor's refresh rate. Tearing occurs because the GPU and the monitor are not in sync, and it happens at every FPS, from 1 to 300. If you don't use V-Sync or G-Sync, you get tearing.

A basic rundown of what causes tearing:
The monitor updates its image every refresh, usually 60 times per second. This process takes time; it is not instant. At the same time, the GPU is creating new images. As soon as it finishes creating an image, it sends it to the front buffer so the monitor can display it. When the GPU sends a new image to that front buffer while the monitor is mid-update, you get a tear: the monitor starts its update with the previous image and finishes it with the new one. That results in two frames being displayed at once, with a visible tear line between them. Monitors spend most of their time updating, with only a small window when they are not, so without V-Sync or G-Sync, tearing occurs almost every time the GPU creates a new frame.

V-sync solves this by forcing the GPU to wait until the monitor is in vertical blanking mode (between refreshes), before it is allowed to change the front buffer.

G-sync solves this by forcing the monitor to wait to perform a refresh until the GPU has sent a new frame to the front buffer, with some exceptions if your FPS drops below 30 or goes beyond your refresh rate.
 
