Nvidia G-SYNC Fixes Screen Tearing for Games on Kepler-Based GPUs

AMD could do it, maybe with a driver update on both sides, AMD and the monitor makers, but... don't you think Nvidia would be butthurt about this?

Besides, it still doesn't make any sense why this thing has to be GTX 650 Ti Boost and above. The GK107 is also a Kepler chip, and the GTX 650 Ti is built on the same family of silicon.


 


Fixed. I didn't mean to imply the refresh rate remained at 60. I'm just using a 60 Hz monitor as an example to show that the FPS won't exceed 60.
 


LOL
 
I read up quite a bit on this new tech. At first I didn't get many details, but after searching around I think I've figured it out. Traditionally, a monitor or screen device locks its 'refresh' rate to fixed, evenly spaced intervals (60 Hz, for instance, is 16.6 ms). What this tech basically does is do away with fixed intervals, mainly because LCDs don't really need them (it would be a little trickier to do on a beam-scanning device like a CRT). It remains to be seen what interval periods will be available. I think this tech will work better at lower 'refresh' rates, simply because if the fastest update interval is 6.94 ms (144 Hz), you won't get a finer/smaller interval than that, or else your ceiling would be higher than 144 Hz. That could be possible, but it would mean the native refresh rate would have to be higher than what the 'user' or device can actually select (say, 240 Hz or higher). Anyway, it's about time this type of monitor/screen interfacing came around.
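Here's a minimal Python sketch of the interval arithmetic above; the 144 Hz ceiling is just the example rate from this post (not a quoted spec), and the function names are made up for illustration:

```python
# Putting the interval arithmetic above into numbers. The 144 Hz ceiling is
# just the example rate from this post, not a quoted spec.

MAX_REFRESH_HZ = 144                       # panel's fastest native refresh (assumed)

def refresh_interval_ms(rate_hz: float) -> float:
    """Convert a refresh rate in Hz to a frame interval in milliseconds."""
    return 1000.0 / rate_hz

def displayed_interval_ms(frame_time_ms: float) -> float:
    """A variable-refresh panel can't show a new frame sooner than its
    minimum interval, so very fast frames still wait ~6.94 ms at 144 Hz."""
    return max(frame_time_ms, refresh_interval_ms(MAX_REFRESH_HZ))

print(refresh_interval_ms(60))       # 16.666... ms, the fixed 60 Hz interval
print(displayed_interval_ms(4.0))    # ~6.94 ms: the 144 Hz ceiling caps it
print(displayed_interval_ms(12.5))   # 12.5 ms: an 80 fps frame is shown as-is
```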
 
dragonsqrrl wrote:
Of course at a certain point a game won't look smooth no matter what. At 5 fps rendered graphics are going to look like a slideshow regardless (although those will be some clean, tear-free 5 frames per second), but I think G-Sync does give you much more flexibility with low frame-rates, but again only to a certain point. I'm willing to bet you could probably go as low as ~24 fps and still have a perfectly 'smooth' experience.
Unfortunately, you'll still need 30 FPS for this to work; that's the lowest refresh rate G-Sync allows. When the refresh rate gets down toward 15 Hz, monitors get light-variance problems, so a 30 Hz minimum refresh rate was put in place to prevent issues.
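A rough sketch of that 30 Hz floor, assuming (as an illustration, not a quoted spec) that the panel simply shows the previous frame again when the next one takes too long:

```python
# Illustration of the 30 Hz minimum mentioned above. If the next frame takes
# longer than the panel's maximum hold time, the display has to refresh anyway,
# presumably by showing the previous frame again (assumption, not a spec quote).

MIN_REFRESH_HZ = 30
MAX_HOLD_MS = 1000.0 / MIN_REFRESH_HZ      # ~33.3 ms

def panel_action(frame_time_ms: float) -> str:
    if frame_time_ms <= MAX_HOLD_MS:
        return f"hold {frame_time_ms:.1f} ms, then flip to the new frame"
    return f"refresh with the old frame; new frame only ready after {frame_time_ms:.1f} ms"

print(panel_action(28.6))   # ~35 fps: inside the window, shown on time
print(panel_action(50.0))   # 20 fps: below the 30 Hz floor, old frame repeated
```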
 
"Back then, the U.S. power grid was based on a 60 Hz AC power grid, thus setting the TV refresh rate to the same 60 Hz that made it easier to build for TVs"

1) What's with the "back then"? The USA has used 60 Hz ever since, so the "back then" should go between "that" and "made".

2) TV manufacturing wasn't any different between 50 Hz and 60 Hz, or even 85 Hz. Running 60 Hz TVs on 60 Hz power lines just made the power-factor issue go away (devices essentially turning off when the cycle dipped below a certain voltage), on both the TV (which couldn't store enough energy between cycles to keep its particle accelerator, i.e. the CRT, powered) and the camera (a similar issue).

"The PC industry simply inherited the refresh rate."

1) No, the PC industry started by using modified TVs for monitors, so the 60 Hz rate was a must. By the time PCs were popular, the major manufacturers simply used their standard TV designs and slapped on a VGA cable.

2) The reason we now have 60 Hz isn't because of that, but rather that there's a physical limit to how fast an LCD can be switched. Even the fastest grey-to-grey times are only good for 6-8 ms, and the average full-range switch is in the range of 16 ms (60 Hz). Going faster than 60 Hz requires overdriving (even 60 Hz often does), and that makes faster speeds practically pointless, since the panel can't fully display them anyway and will just smear the images together.
 


Many have answered, but the answer to this question is yes, we still care about FPS, probably even more than before. And no, 30 FPS on a G-Sync monitor will still look slightly less fluid than 60 FPS on a 60 Hz V-Sync monitor.

The main advantage here is that 35 FPS on a G-Synced monitor will look far smoother than 35 FPS on a regular 60 Hz monitor, since the monitor will actually draw 35 frames per second. With 35 FPS on a 60 Hz monitor, you have to decide whether you want screen tearing (and sort of see the 35 FPS) or delayed frames and judder.
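A small sketch of that comparison, using made-up but steady 35 fps frame times, shows why a fixed 60 Hz V-Sync schedule produces uneven hold times (judder) while a variable-refresh display holds every frame for the same ~28.6 ms:

```python
import math

# Steady 35 fps frames shown on a fixed 60 Hz V-Sync schedule vs. a display
# that can flip whenever a frame is ready. Numbers are illustrative only.

FRAME_TIME_MS = 1000.0 / 35    # ~28.57 ms per rendered frame
REFRESH_MS = 1000.0 / 60       # ~16.67 ms per 60 Hz refresh

def vsync_flip_times(n_frames: int) -> list[float]:
    """Each frame appears at the first 60 Hz refresh boundary after it's ready."""
    return [math.ceil(i * FRAME_TIME_MS / REFRESH_MS) * REFRESH_MS
            for i in range(n_frames)]

flips = vsync_flip_times(6)
hold_times = [round(b - a, 1) for a, b in zip(flips, flips[1:])]
print(hold_times)                 # [33.3, 33.3, 33.3, 16.7, 33.3] -> uneven = judder
print(round(FRAME_TIME_MS, 1))    # 28.6 ms every time on a variable-refresh display
```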

Another advantage that many people don't seem to be pointing out is that this seems to eliminate the need for triple buffering, since the sole purpose of triple buffering is to let the GPU work on new frames while other frames are being displayed, swapped, and cleared. This should reduce your input latency by a full frame, which should make games feel more responsive.
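Very rough back-of-the-envelope numbers for that latency point; the render time and the one-extra-refresh queue wait are assumptions for illustration, not measurements:

```python
# Crude model of the "one full frame" of extra latency from queueing a finished
# frame behind another buffered one at 60 Hz. All numbers here are assumptions.

REFRESH_MS = 1000.0 / 60    # 60 Hz refresh interval
RENDER_MS = 12.0            # assumed GPU render time for one frame

latency_buffered_queue = RENDER_MS + 2 * REFRESH_MS   # waits an extra refresh in the queue
latency_flip_when_ready = RENDER_MS + REFRESH_MS      # roughly one frame less

print(round(latency_buffered_queue, 1))   # ~45.3 ms
print(round(latency_flip_when_ready, 1))  # ~28.7 ms
```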

Smoother looking, no tearing, more responsive... pretty much everything gamers have been asking for. Too bad they didn't make it an open standard, though. Even so, for me it's time to start saving for a new monitor/video card combo, since this sounds fantastic on paper; hopefully it works as well in practice.
 
Mousemonkey, October 18, 2013 10:48 PM
'So why don't you just develop the hardware and or software required and then give it away for free? That way everyone will be happy and you'll get what you want.'

I must ask this: did Nvidia spend your money on this thing? Because you seem insulted by the idea of this being open. God forbid something be free and in favor of every man on Earth. But OK, that's usual for Nvidia fans.
 


No, they didn't spend my money on this, but why should it be open? Do you not understand the concept of business and competition?
 


Graphics card companies usually use an open standard if one exists, but if none exists, they make a proprietary one. There is no G-Sync-like standard, so they made their own.

If this is successful, or even if it isn't all that successful from a sales point of view, I bet a standard will be created, just like how 3D Vision and later HD3D came about.
 


That could happen, but both companies already have their own endorsed monitors out, and whilst I'm not too sure what special features the AMD ones have, I do know that the Nvidia 3D ones come with their own glasses because they have to be active, as opposed to AMD's passive setup. Whether this actually takes off or not is unknown, but as they already have a foot in the monitor market and are well known for being able to throw money at things, who knows?
 
This is basically fixing an issue they created anyway. Prior to digital displays there were these things called analog displays that had to be synced. With analog displays, the output device was synced when the frame was completed so the video card could update the video buffer accordingly. That sync signal was available for years; for proof, look at the Nintendo Light Gun. You can see what happens when people don't make use of it whenever displays and video screens flicker in live broadcasts and movies: that's un-synced video. For whatever reason they decided this wasn't necessary for digital displays. The only difference is that a digital display writes a screenful at a time, whereas the video buffer is filled sequentially. If they had simply paid attention to timing like they did with analog displays, this wouldn't be a problem.
 
So in the future you can only use monitors that work with your own GPU, and if you change your GPU you also have to buy a new monitor from a different producer so that it's compatible with your new hardware... nice...
This seems somewhat familiar: you can only use hardware that's compatible with other specific hardware... now I remember, it's called Apple!
Not having tearing in the picture is good, but only getting that with a very specific set of hardware has its disadvantages. A generic solution via DisplayPort would be a win-win for consumers, but that's so last season... or is it?
 

Nvidia plans to offer a kit to convert at least some existing monitors to use G-Sync. This may also mean AMD could possibly do the same.
 


I did say that earlier, but the AMD fans would rather make daft ad hominem attacks than useful or meaningful comments.
 


So some kind of expansion card that you can swap depending on what GPU you happen to own at the moment. That would be nice, but I have never seen this kind of upgradeable expansion card be cheap. They are used somewhat in AV equipment, where you can, for example, upgrade your AV amplifier to support new standards. They tend to cost quite a lot, but still less than a completely new amplifier, so there may be a future for that kind of equipment.

But all in all this is a problematic situation. I personally have an AMD GPU in my computer right now, and I already have an Nvidia card waiting for installation when I have time to do some other upgrades to my computer too. Who knows what GPU I will have after that. It may again be Nvidia, but if AMD is offering a better deal by that time I may switch back to AMD. That is the strong point of PC machines: you choose the parts that are best and most economical for your current situation! Having different monitors for different GPUs would make that much harder. So a monitor with a slot for a $100 AMD or Nvidia sync expansion card would be a really nice option, or would it be better to have an open standard for sending sync codes to the monitor... I prefer the latter, just for good competition, because the former would lead to more expensive monitors...
 


Unfortunately your wishes don't gel with the way business competition works.
 
Most true gamers will be able to push a steady 60+ fps at all times with their chosen hardware. I don't really see a reason to spend an extra $100 on G-Sync; I would rather buy a regular 144 or 120 Hz $299 monitor and put the extra $100 toward the GPU.
 


It appears the first G-Sync monitors are the 144 Hz models.

G-Sync lets you skip V-Sync, which causes latency and stutter, and still avoid tearing. I don't see how this isn't a good thing.
 
This is worthless to me. There's no way I'm going to buy all new monitors just to get a benefit that, if you're running GTX-class cards, isn't that big of a deal anyway.
 


"meaningful comments" ???
 