Nvidia G-SYNC Fixes Screen Tearing in Games on Kepler-Based GPUs



What you described is what V-sync already does. That system has drawbacks, though. Frames are created and then wait in the buffer until the display's next refresh comes around, which results in delay. Furthermore, the way DirectX handles V-sync, it buffers one additional frame, adding even more latency. OpenGL does it exactly as you described, though with an additional buffer, so writing doesn't happen while a frame is being displayed.

Furthermore, there is the issue of syncing the timing of your actions with when they get displayed. If frames are rendered in 8 ms but alternate between waiting 0 and 16 ms to reach the screen, that inconsistent delivery time relative to when the frames are actually created can cause stutter.

This method allows frames to be displayed the moment they finish rendering, resulting in no wasted GPU time and none of the additional latency caused by V-sync.
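As a rough illustration of that timing argument (not anything from the article, just a toy Python sketch with assumed numbers: 8 ms renders on a 60 Hz panel), a finished frame under V-sync sits until the next refresh tick, DirectX-style queuing adds roughly one more refresh, and a variable-refresh display can start scan-out immediately:

```python
import math

REFRESH_MS = 1000.0 / 60.0   # one refresh on a fixed 60 Hz display
RENDER_MS = 8.0              # assumed render time per frame

def vsync_wait(finish_ms, extra_buffered=0):
    """Delay from frame completion to scan-out on a fixed-refresh display.
    (Simplified: ignores the GPU stalling once the swap chain fills up.)"""
    next_refresh = math.ceil(finish_ms / REFRESH_MS) * REFRESH_MS
    return (next_refresh - finish_ms) + extra_buffered * REFRESH_MS

def variable_refresh_wait(finish_ms):
    """With a G-Sync-style variable refresh, scan-out begins right away."""
    return 0.0

for n in range(1, 6):
    done = n * RENDER_MS
    print(f"frame {n} done at {done:5.1f} ms: "
          f"V-sync wait {vsync_wait(done):5.2f} ms, "
          f"DX-style (+1 buffered frame) {vsync_wait(done, 1):5.2f} ms, "
          f"variable refresh {variable_refresh_wait(done):4.2f} ms")
```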
 
I watched the video dragonsqrrl posted and I didn't see any tearing or jarring difference until they started spinning the gazebo; then it was really apparent. The guy speaking was really excited about stuff that no one could see and had to tell the audience to say wow... Another good idea ruined by over-hyping its function. Good job, Nvidia.
 


I kinda understand, but I don't get why telling the display to wait or slow down on the fly is going to fix it. If the card is generating a reasonable number of FPS (and I think someone said G-Sync needs 30+), why is there noticeable lag? So, something like: you have buffer A and buffer B, and one is always ready to be read while the other is always being written to. Whichever was completed last is sent. If A was sent last and B still isn't ready, send A again. How is that different from the monitor slowing down to continue showing that last completed frame, or is that what the G-Sync hardware in the display is doing? Just wondering how it all works. I've felt lag for sure, but I thought it was way more than one or two or even three 60ths of a second. Thanks for the info.
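To illustrate the difference the question above is asking about (a hedged sketch only, with made-up render times): resending the last completed buffer keeps the panel on a rigid 16.7 ms grid, so every frame stays on screen for a whole number of ticks, whereas a variable-refresh panel simply holds each frame until the next one is actually ready:

```python
import math

REFRESH_MS = 1000.0 / 60.0                  # fixed 60 Hz refresh tick
render_ms = [14.0, 18.0, 15.0, 20.0, 16.0]  # assumed, slightly uneven render times

# "Send buffer A again until B is ready": every frame stays on screen for a
# whole number of refresh ticks, so display time snaps to 16.7 or 33.3 ms.
fixed = [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in render_ms]

# Variable refresh: the panel waits until the next frame is finished,
# so on-screen time tracks render time directly.
variable = list(render_ms)

for i, (f, v) in enumerate(zip(fixed, variable), start=1):
    print(f"frame {i}: fixed refresh {f:5.1f} ms on screen, "
          f"variable refresh {v:5.1f} ms")
```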
 



It's not really a big deal, tbh; this is what V-sync should have been in the first place.

Certainly not worth the premium
 


All I can say is that we can tell what an additional 16 ms of delay does. I actually get sick in first-person games until I reach 80+ FPS as a result of this delay.

There is a reason FPS gamers like 120 Hz gaming with V-sync off: there is a noticeable difference, especially when moving the mouse to change your view or track a target.
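One refresh at 60 Hz is about 16.7 ms, which is where that "additional 16 ms" figure comes from; at 120 Hz the same one-refresh penalty is roughly halved. A quick back-of-envelope sketch:

```python
# Back-of-envelope: the cost of waiting one extra refresh at common rates.
for hz in (60, 80, 120, 144):
    print(f"{hz:>3} Hz -> one refresh = {1000.0 / hz:5.1f} ms")
```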
 
Wow. I really feel bad for AMD. I don't see how they can compete with that absolutely GIANT leap in technology. It's not just an increment on something that already exists; it's a brand-new idea. Well, farewell AMD, and it's too bad, because Nvidia will squeeze us dry without competition.
 


Why? Because 60 fps graphics != a 60 Hz display. There are major timing issues involved, and that's why you can see screen tearing even at higher frame rates. The Radeon CrossFire runt frames are a perfect example of bad timing. For the extra $100 you aren't going to more than double your FPS (once you get above twice the monitor's refresh rate, the artifacts tend to go away), unless you had integrated graphics (in which case you can't use G-Sync anyway).
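A rough sketch of that "60 fps != 60 Hz" point, using assumed numbers only (a 75 fps source, a 60 Hz panel, 1080 scanlines): with V-sync off, each buffer flip lands at a different point in the scan-out, so the tear line wanders around the screen:

```python
REFRESH_MS = 1000.0 / 60.0   # 60 Hz panel
FRAME_MS = 1000.0 / 75.0     # GPU finishing frames at an assumed 75 fps
LINES = 1080                 # vertical resolution of the panel

for n in range(1, 7):
    flip = n * FRAME_MS                        # when the nth frame is ready
    phase = (flip % REFRESH_MS) / REFRESH_MS   # how far into a refresh it lands
    print(f"flip {n}: {phase:4.0%} into the scan-out -> "
          f"tear near line {int(phase * LINES)}")
```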
 


I think you are quite wrong, especially in this case. It isn't a solution that locks AMD out of anything except this specific feature, and it doesn't require any work from developers. It simply requires you to purchase the right hardware, and it works.

I expect AMD to either come out with their own version, or an open standard will be created that AMD will follow. This tech simply makes sense, and it's something people have said they wanted in the past.
 


I know you disagree, but I really think it should be handled in the card; this feels too similar to when companies tried to sell physics cards or cards that made your network connection faster. I don't think many people are going to run out and buy it; it will just be one of the many logos on the monitor or video card box that may or may not work. I still don't understand dual video cards that don't work. It seems like the game developers shouldn't have to do anything; AMD/Nvidia should just make it work. It's their hardware and drivers, yet you see reviews saying this time they got it right and every game (reviewed) gets almost 200% scaling.

Maybe it's really the only way to fix these issues, but it's still not a 100% fix. If it's really a physical chip that monitor manufacturers have to include and pay a royalty for, I can't see this working. It needs to be an industry standard that they can include to cover any video card. Same kind of thing with DisplayPort vs. HDMI: monitor manufacturers just want a connection that works, but middlemen want their nickel and dime. Just make a good product that works and I'll buy it, but please stop Apple-ing people.

It kind of locks AMD out. Are they going to be able to get monitor manufacturers to pay for or include yet another standard? On top of everything else, they really want to try to tie my monitor purchase to their card? Sweet, there's the first 30" 4K OLED 600 Hz monitor for under $1000 to replace my Dell 3011... Oh wait, it doesn't have G-Sync. 🙁
 


There is a HUGE difference between PhysX and this. G-Sync does not require any assistance from developers. This is more like when AMD included MLAA on their 6000 series of cards and Nvidia later came out with FXAA, only this requires a little help from monitor companies in the form of a chip.

This is a first step, and I'm certain that in a couple of years both companies will offer the same thing and the chip will be standard in most monitors. It makes too much sense not to.

It likely won't work on TVs for many years, if ever, but TVs aren't meant for PCs.

 
I agree that this is great technology and will probably have an open standard built around it to support any video card. However, G-Sync itself is a dead concept since it's locked down, just like Mantle is a dead concept. Saying Mantle will work on Nvidia cards as long as they are of GCN architecture is pretty much the same thing in my eyes.
What Nvidia is asking is for monitor makers to include its hardware in every monitor, and it won't have any benefit for the majority of the people who buy the monitor since it's locked to Nvidia's most recent offerings.
 


But they already have their 3D monitors out there, so it's not unprecedented.
 
G-SYNC actually includes a sequel to LightBoost:
http://www.blurbusters.com/confirmed-nvidia-g-sync-includes-a-strobe-backlight-upgrade/

So you can have your cake and eat it too. The "LightBoost" strobe backlight feature is becoming an officially sanctioned NVIDIA feature, easily enabled without hacks.
 
You guys keep throwing out "V-sync at 60 Hz" blah blah blah. You need to realize that the way a frame is buffered with V-sync on creates input lag (like your mouse feeling like it's lagging behind when you move it from one side to the other). G-Sync literally eliminates that feel and lets you sync your monitor with higher FPS so you get better response. It's a fantastic piece of technology that will greatly improve FPS gaming in particular.
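A back-of-envelope version of that input-lag chain, with assumed numbers only (input sampled at the start of a frame, an 8 ms render): V-sync adds a wait for the next refresh plus possibly one queued frame, while a variable-refresh panel scans the frame out as soon as it is done:

```python
RENDER_MS = 8.0              # assumed render time after input is sampled
REFRESH_MS = 1000.0 / 60.0   # one refresh on a 60 Hz panel

vsync_worst = RENDER_MS + REFRESH_MS + REFRESH_MS  # render + wait for tick + one queued frame
vsync_best = RENDER_MS                             # frame finishes right on a tick, nothing queued
variable = RENDER_MS                               # variable refresh: scan out as soon as it's done

print(f"V-sync input-to-photon: ~{vsync_best:.1f} to ~{vsync_worst:.1f} ms")
print(f"Variable refresh input-to-photon: ~{variable:.1f} ms")
```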

Nvidia is not expecting every monitor brand out there to install a G-Sync module. G-Sync will be a "feature" that will most likely be available on select monitors (particularly those geared toward gamers, obviously). There is so much hate for G-Sync and I just don't understand why. You should be happy that there is finally an option that lets us take FULL advantage of higher FPS in gaming while getting the response you love so much.

The last time I was able to achieve something like this was with my old CRT monitor and ReForce: I would lock the monitor at 100 Hz and lock my FPS in Counter-Strike 1.3 or 1.6 at 100, and I got perfect synchronization (I had a good video card then, btw). I've personally been dying to see an LCD/LED with a feature like this for a while. I for one am really happy Nvidia is now offering it.
 
"If they know the monitor is running at XXHz then why don’t they just make the card send a frame in sync with the monitor? You have a buffer that is read to send out a full frame even if it has not been updated or if it has been update 1-2-3-4 or more times since the last read. Just read and send the full frame each time in sync with the monitor. It just seems like they are more interested in software and hardware to sell and license. "
Isn't that just Vsync?
 