Is there a difference between the way AMD and Nvidia process Vsync? I'm asking because I gamed with an HD6950 for about two years and used vsync the whole time because I hate screen tearing. With the 6950 it would cap at 60 FPS if it had the power, or just drop slightly lower if it couldn't make it. Now I have a GTX770 and vsync seems to go straight to 30 FPS if it can't hit 60. I have Triple Buffering on and have tried using Adaptive Vsync, but the tearing is bad under 60 FPS, so adaptive doesn't help. Do I have to choose between 30 FPS and tearing just because I purchased an Nvidia card?
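For what it's worth, here's my rough understanding of why it snaps to 30 (just a sketch of the math, not actual driver behavior, and it assumes a 60 Hz panel):

import math

REFRESH_HZ = 60
VBLANK_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms per refresh

def vsynced_fps(render_ms):
    # With plain double-buffered vsync the finished frame has to wait
    # for the next vblank, so the effective frame time rounds up to a
    # whole multiple of the refresh interval: 60 -> 30 -> 20 -> 15 FPS.
    intervals = math.ceil(render_ms / VBLANK_MS)
    return REFRESH_HZ / intervals

for ms in (14, 17, 25, 34):
    print(f"{ms} ms render -> {vsynced_fps(ms):.0f} FPS on screen")

So a frame that takes even 17 ms misses one vblank and displays at 30 FPS, which matches what I'm seeing on the GTX770. Triple buffering is supposed to let the rate land in between (e.g. 45 FPS), which is what the 6950 seemed to do.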