Nvidia Responds to GeForce 600 Series V-Sync Stuttering Issue

Status
Not open for further replies.
[citation][nom]jasonpwns[/nom]The problem is, I get some annoying screen tearing if I don't turn v-sync on.[/citation]

How can you get tearing if you have FPS below 60? What is the refresh rate of your display? Are you sure that you are seeing tearing and not stuttering? The two are not the same and it seems like you have some sort of stutter, not tearing, unless your display has a low refresh rate.
 
[citation][nom]caedenv[/nom]what are you talking about? Vsync is great for reducing screen tearing issues, and helping up the graphical quality without issue (limited frame rate generally frees up more processing power for higher settings). If it is causing that many issues for you then I would suspect something to be wrong with your machine, not with vsync.[/citation]

Vsync causes an absurd amount of input lag. I seriously don't know how people don't notice. When I play an FPS on my PC, I can immediately tell whether it's on or not.
 
[citation][nom]blazorthon[/nom]How can you get tearing if you have FPS below 60? What is the refresh rate of your display? Are you sure that you are seeing tearing and not stuttering? The two are not the same and it seems like you have some sort of stutter, not tearing, if it happens at a lower FPS, unless your display has a low refresh rate.[/citation]

I maintain an FPS above 45 at nearly all times when v-sync is turned off, usually around 50-60, so it isn't stuttering caused by low FPS. I can tell the difference between stuttering and screen tearing, too.
 
[citation][nom]blazorthon[/nom]The only resources that would not be used would be parts of the GPU that then can't do anything else. For example, some cores might either idle or not work as hard, but what can you do about it? What would they be used for if you could take advantage of them? They might be needed to be held in reserve to keep FPS over 60 during intensive parts of a game or whatever. *Snip just to keep spam down*.[/citation]

I agree with you. Mostly I just wanted to point out that vsync can actually reduce strain on the GPU. It doesn't really mean you can take advantage of the extra GPU resources (maybe you can? I really have no clue). Of course, one nice thing would be that if the GPU is working less hard, then you are hopefully using less power and creating less heat.
 


I did not mean stuttering caused by low FPS. There are many kinds of stutter and more specifically, micro-stuttering might have been the problem. Regardless, there is absolutely no reason for a 60Hz display to have tearing when games that you play don't even go far above 60FPS in your current configuration. Tearing is caused by having FPS that is considerably above your display's refresh rate so if you have bad tearing at 40FPS to 60FPS, well then that is just plain weird, unless you have a bad display.
 
and will not be available until our next major driver release targeted for June (post-R300)

Amusingly, it's almost ten years since the ATi R300 launched. I know it's got nothing to do with the news article, but noteworthy nonetheless.
 
I am lucky enough to have both a crossfire 7970 setup and an SLI 680 setup (AMD on a 2600K, Nvidia on a 3770K). Let me just say that even with a 3 month head start (79xx over 680), I have FAR more issues with games and AMD drivers than Nvidia's. And that was pretty much consistent over the two years I had a 5970 in my system versus the 480 Fermi I had at the same time. It has been my experience that Nvidia simply writes better drivers. Maybe they have a much larger driver team?
 
[citation][nom]kniped[/nom]Vsync frequently makes a game unplayable with a mouse anyways so who does this really affect?[/citation]

Dude, stop playing your games on Intel GMAs.
 
680 owner here. I get this problem when I play Diablo 3. It only happens with adaptive v-sync. 99% of the time it's OK because my frames are over 60. On the rare occasion there are 4 players, everyone casting a bunch of spells, and like 30 monsters on the screen exploding, the frames will go below 60. That's when adaptive vsync turns itself off and there will be a slight stutter.
 
On the v-sync issue, I am wondering why your monitors only refresh at 60Hz and not at the 75Hz or higher that some gaming monitors can actually do. I ask this because you have such high-end GPUs but a lower-end monitor; you could cut some money from the GPU budget and get a better monitor if yours can only do 60Hz, as that is the minimum for many.
V-sync helps when your card can do your monitor's refresh rate + n, with n being the number of frames it has rendered but has not been able to actually display because the refresh rate is lower than the FPS.
As for this driver issue, it is not surprising, as the Kepler GPU design is fairly new and was bound to have problems like anything new. It is good that nVidia is fixing it, and good that it does not affect all users of the 600 series. It isn't fair to criticize AMD and/or nVidia here, as both companies make mistakes and try to fix them ASAP; nVidia just seems to be faster.
It would be a good thing for the future if nVidia, AMD, Intel and VIA did more testing of their graphics chips and processors before release so this kind of thing doesn't happen.
 
I had problems with v-sync for a while till I tried this: turn on v-sync, leave the card's settings at default, and set the target framerate to 65-70. It seemed to work for me.
 
[citation][nom]blazorthon[/nom]How can you get tearing if you have FPS below 60? What is the refresh rate of your display? Are you sure that you are seeing tearing and not stuttering? The two are not the same and it seems like you have some sort of stutter, not tearing, unless your display has a low refresh rate.[/citation]

It's a common misconception that you can't get screen tearing when your FPS is lower than your refresh rate. You can. Without V-sync, even below your refresh rate, there is nothing stopping the monitor from updating the screen at any time, including while the video card is part way through updating a frame. That causes tearing.
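One way to see why is a toy timing simulation (made-up numbers, not a measurement of any real system): a 60 Hz monitor refreshes every ~16.67 ms, while a GPU at 45 FPS swaps the front buffer every ~22.22 ms, at whatever moment the frame finishes. Any swap that lands mid-scanout tears that refresh.

```python
# Toy model: tearing can happen even when FPS is below the refresh rate,
# because without vsync a buffer swap can land in the middle of a scanout.
REFRESH_HZ = 60
FPS = 45                             # below the refresh rate
SCANOUT_MS = 1000.0 / REFRESH_HZ     # ~16.67 ms per refresh
FRAME_MS = 1000.0 / FPS              # ~22.22 ms per rendered frame

def torn_refreshes(duration_ms=1000.0):
    """Count refreshes during which a buffer swap occurred mid-scanout."""
    swaps = [i * FRAME_MS for i in range(int(duration_ms / FRAME_MS) + 1)]
    torn = 0
    t = 0.0
    while t < duration_ms:
        # A swap strictly inside this refresh interval splits the screen
        # between two different frames -> a visible tear line.
        if any(t < s < t + SCANOUT_MS for s in swaps):
            torn += 1
        t += SCANOUT_MS
    return torn

print(torn_refreshes())  # > 0: tearing despite rendering below 60 FPS
```

The point of the sketch: nothing in the model ties swap times to refresh boundaries, so tears occur regardless of whether FPS is above or below the refresh rate.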
 
[citation][nom]yumri[/nom]on the v-sync issue i am wondering why do your monitors only refresh at 60Hz and not at the 75Hz or higher that some of the gaming monitors can actually do? I ask this because you have such high end GPUs but a lower end monitor you should cut some money from the GPU budget and get a better monitor if you can only do 60Hz for refresh as that is the minimum for many. v-sync helps when your card is able to do your monitors refresh rate + n with n being the amount of frames it has rendered but has not had the capability to actually display because of the refresh rate is lower then the FPS.In this driver issue it is not surprising as the Kepler GPU design is fairly new and was bound to have problems like anything new. It is good that nVidia is fixing it while it is good that it does not affect all users of the 600 series. In this it is not fair to criticize AMD and/or nVidia as both companies make mistakes and try to fix them ASAP just nVidia seems to be faster.What would be a good thing for the future is for nVidia, AMD, intel and VIA to take more testing of their graphics chips and processors before the release so this kind of thing doesn't happen[/citation]

Don't forget there are those with 2560x1440 IPS monitors, which are 60hz and need more power to run, and there are those with 3 1920x1080(1200)p monitors, which also need a lot of power.
 
[citation][nom]jasonpwns[/nom]I always get a lower frame-rate when I enable v-sync than when I disable it.[/citation]
You might not be aware, but your monitor can only update the screen at its refresh rate. That means a 60hz monitor can only display 60 FPS. Anything beyond 60 FPS on a 60hz monitor will not be displayed or, worse, will result in partial images being displayed on the screen. V-sync, while reducing 'FPS', may also be displaying exactly the number of frames your monitor is capable of, and no more. This can keep heat and noise levels down from the GPU as well as removing screen tearing.
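That cap can be sketched numerically. This is a toy model assuming plain double-buffered vsync (not any driver's actual logic): a finished frame waits for the next vblank, so the effective frame interval rounds up to a whole number of refresh intervals.

```python
import math

REFRESH_HZ = 60
VBLANK_MS = 1000.0 / REFRESH_HZ      # one refresh interval, ~16.67 ms

def displayed_fps(render_ms):
    """With double-buffered vsync, each finished frame waits for the next
    vblank, so the frame interval rounds up to whole refresh intervals."""
    intervals = math.ceil(render_ms / VBLANK_MS)
    return 1000.0 / (intervals * VBLANK_MS)

# Rendering faster than the refresh -> capped at the refresh rate.
print(displayed_fps(8.0))    # ~60 FPS
# Rendering just slower than 16.67 ms -> halves to ~30 FPS
# (this quantization is also where some of the vsync stutter comes from).
print(displayed_fps(18.0))   # ~30 FPS
```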
 
[citation][nom]bystander[/nom]This can keep heat and noise levels down from the GPU as well as removing screen tearing.[/citation]

This is why I use the "Framerate Target" in Precision X. I too notice the input lag with Vsync on so I use the games built-in frame limiter or Precision X.

BF3 I keep it @ 70FPS(It has issues at 60) and it runs great and stays cooler as well.
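The idea behind a frame-rate target can be shown in a few lines. This is a minimal sketch of a generic frame limiter, not Precision X's or BF3's actual implementation; the function name and numbers are made up for illustration:

```python
import time

def run_with_frame_cap(render_frame, target_fps=70.0, frames=10):
    """Minimal frame limiter: sleep out the rest of each frame interval.
    Unlike vsync, it doesn't wait on the display's vblank, so input lag
    stays low and the GPU gets idle time -- but tearing is still possible."""
    interval = 1.0 / target_fps
    deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()                        # stand-in for real game work
        deadline += interval
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)

start = time.perf_counter()
run_with_frame_cap(lambda: None, target_fps=70.0, frames=10)
elapsed = time.perf_counter() - start
print(f"10 frames in {elapsed:.3f} s (~{10 / elapsed:.0f} FPS)")
```

Because the pacing comes from sleeping rather than from the display, the cooler-running behavior cliffro describes falls out naturally: the GPU idles for whatever is left of each interval.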
 
[citation][nom]cliffro[/nom]This is why I use the "Framerate Target" in Precision X. I too notice the input lag with Vsync on so I use the games built-in frame limiter or Precision X.BF3 I keep it @ 70FPS(It has issues at 60) and it runs great and stays cooler as well.[/citation]
I use V-sync (not adaptive), but I have a 120hz monitor. Unfortunately for you, if you use 70 FPS as a target FPS, you are going to see some tearing. Although it doesn't seem to bother some people.
 
[citation][nom]atikkur[/nom]was it regular vsync or adaptive vsync that affected? and do pre 600 series affected too? (500,400,200,9,8,7,6,fx,4,3,2,tnt?)[/citation]
Yes, in some cases they are affected. People on the 300.83 drivers don't have problems, but people on 301.10 or 301.34 do.
 
[citation][nom]ewood[/nom]how does v sync free up resources? because it limits the frame rate? if you up the settings beyond what your graphics card is capable of the frame rate will dip below the v sync frequency. v sync is meant to eliminate screen tearing and in no way does it free up resources[/citation]

My system (Core i7 2600k @4.6 and GTX680) saves about 100-130W when vsync is on, according to my UPS monitoring software. According to PrecisionX, my video card stays below 800MHz in WoW with 32X FXAA and adaptive vsync on. It does actually reduce processor and video card utilization, saving resources and power.
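The arithmetic behind that saving is straightforward, assuming power scales roughly linearly with frames rendered (which is only an approximation; all numbers below are made up for illustration, not measurements of any particular card):

```python
# Back-of-the-envelope: why capping frames saves power.
uncapped_fps = 130.0      # what the GPU could render with no cap
capped_fps = 60.0         # vsync cap on a 60 Hz display
board_power_w = 195.0     # assumed full-load draw of the card

duty = capped_fps / uncapped_fps         # fraction of the work still done
savings_w = board_power_w * (1 - duty)   # rough GPU-side saving

print(f"GPU does ~{duty:.0%} of the uncapped work")   # ~46%
print(f"rough saving: ~{savings_w:.0f} W")            # ~105 W
```

With these assumed figures the GPU-side saving alone lands around 105 W, in the same ballpark as the 100-130 W dgingeri reports at the wall (the CPU also does less work per second when fewer frames are prepared).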
 
[citation][nom]blazorthon[/nom]Complete lie and you know it. There was no bias in how Tom's did that AMD article and there is no bias in this one. In fact, Tom's clearly says that Nvidia's message was not very reassuring whereas AMD fixed their problem before Tom's addressed it, so if there is any bias, it'd against Nvidia, not against AMD. Fail troll is fail.[/citation]

Not true, blazorthon... the only problem is that Tom's in effect excoriated AMD for a paper launch of the 7970, and they did not do the same for the GTX 670, which they said was the greatest thing since sliced bread! When a spade is a spade, call it a spade, not a heart.
Question is: is it two weeks after the launch of the GTX 670? Yes. Are there any available? No! AMD had units available two weeks after the launch date. Excuses on Nvidia's part do not cut it; no availability means PAPER launch. And Tom's conclusion of the AMD 7970 article smacked of Nvidia "love", or hatred over the fact that they had to work harder because the launch date was moved up...
 
[citation][nom]redeye[/nom]not true blazorthon ... only problem is that tom's in effect excoriated AMD for a paper launch of the 7970, and they did not do the same for the GTX 670... which they said was the greatest thing since sliced bread!... when a spade is a spade call it a spade, not a heart. question is... is it two weeks after the launch of the gtx670?... Yes?... are there any available?... no!.. AMD had units available two weeks after the launch date.... excuses on Nvidia's part do not cut it, no availablilty means PAPER Launch... and tom's conclusion of the AMD 7970 article smacked of nvidia "love" or hatred over the fact that they had to work harder because the launch date was moved up...[/citation]
Radeon HD 7970 wasn't available for three weeks after AMD paper-launched it. GTX 670 was available for a week before it ran out of stock. If you wanted one, you could buy it on launch day.

Now that the thing is gone, that's a different story, but approaching the GTX 670 with a cautious tone was the right thing to do.

Remember, I'm the one who actually paid $550 each for two Radeon HD 7970s when they showed up on Newegg ;-)
 
[citation][nom]dgingeri[/nom]My system (Core i7 2600k @4.6 and GTX680) saves about 100-130W when vsync is on, according to my UPS monitoring software. According to PrecisionX, my video card stays below 800MHz in WoW with 32X FXAA and adaptive vsync on. It does actually reduce processor and video card utilization, saving resources and power.[/citation]
Kiddo, you don't know how vsync works. All frames are always rendered; it doesn't matter whether double buffering or triple buffering is used. With vsync, frames that are not in sync are discarded. Vsync doesn't limit the number of frames rendered, only the number of frames displayed.
 
[citation][nom]Hetneo[/nom]You kiddo don't know how vsync works. All frames are always rendered, it doesn't matter whether it's double buffering or triple buffering used. With vsync frames that are not in sync are discarded. Vsync doesn't limit number of frames rendered, but number of frames displayed.[/citation]
Actually, in most cases, he was correct. However, there are cases where you are correct, but they are far less common.
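Both descriptions can be right depending on the buffering mode. Here is a toy contrast (a deliberate simplification that ignores driver queueing and render-ahead):

```python
import math

REFRESH_MS = 1000.0 / 60   # 60 Hz refresh interval

def frames_rendered_per_sec(render_ms, triple_buffered):
    """Toy contrast of vsync buffering modes:
    - double buffering: the GPU stalls until the next vblank, so it
      renders fewer frames -> less work, less power (dgingeri's case).
    - triple buffering: the GPU keeps rendering into a spare buffer and
      stale frames are dropped at vblank -> full work (Hetneo's case)."""
    if triple_buffered:
        return 1000.0 / render_ms                    # never stalls
    intervals = math.ceil(render_ms / REFRESH_MS)    # wait for vblank
    return 1000.0 / (intervals * REFRESH_MS)

print(frames_rendered_per_sec(8.0, triple_buffered=False))  # ~60 rendered/s
print(frames_rendered_per_sec(8.0, triple_buffered=True))   # 125 rendered/s
```

So "all frames are always rendered and extras are discarded" describes triple buffering, while the power saving dgingeri measured is what you'd expect from double buffering, where the GPU genuinely sits idle between frames.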
 
[citation][nom]bystander[/nom]It's a common misconception that lower FPS than your refresh rate can't have screen tearing. However, you can. Withoug V-sync, even below your FPS, there is nothing stopping the monitor from updating the screen at any time, including when the video card is part way through updating a frame. This causes tearing.[/citation]
Wrong, because of vsync, basically. The monitor refreshes every 1/60th of a second (or 1/72nd, 1/75th, or 1/120th, depending on whether it is set to 60, 72, 75 or 120 Hz); that timing is controlled by crystals and there is no arbitration in its frequency. Vsync makes sure that frames that are not in sync with the refresh rate are not displayed at all; that's how vsync works. If you have screen tearing with vsync on, then there's something very odd going on with your monitor, and people who know what they're talking about call such monitors by one pretty short word: "broken".
 