G-Sync Technology Preview: Quite Literally A Game Changer


Tanquen

Distinguished
Oct 20, 2008
256
8
18,785
I still don’t get why we need it. Many programs, movies, and games are silky smooth with no need for extra hardware in the monitor. The monitor doesn’t care; it’s the card that’s sending a messed-up frame. The monitor wants a frame every 60th of a second or so, depending on the refresh rate. Buffers A and B are always building frames. Only one is ready every 60th or 120th of a second. Send the one that is ready. Time to send a new frame and a new one is not ready? Send the same one again! “No, it’s better to have the monitor change its frame rate on the fly?” What? It’s the same thing. Slow the monitor down so it’s still showing the last frame. This should all be handled in the card, but then they couldn’t license a monitor to you too.
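Roughly, the scheme I mean looks like this - a toy Python sketch of my own (the names and the fixed 60Hz tick are purely illustrative, not any real driver code):

    # Toy model of double-buffered V-sync at a fixed 60Hz refresh: each tick,
    # scan out the newest completed frame; if nothing new is ready, repeat.
    REFRESH_HZ = 60
    TICK = 1.0 / REFRESH_HZ  # ~16.7ms between scan-outs

    def scan_out(frames_ready, last_shown):
        if frames_ready:
            return frames_ready.pop(0)  # a new frame finished in time
        return last_shown               # nothing ready: send the same one again

    shown = scan_out(["frame1"], "frame0")  # -> "frame1"
    shown = scan_out([], shown)             # nothing new -> "frame1" again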
 
wurkfur wrote:
I personally have a setup that handles 60+ fps in most games and just leave V-Sync on. For me 60 fps is perfectly acceptable, and even when I went to my friend's house, where he had a 120Hz monitor with SLI, I could hardly see much difference.

I applaud the advancement, but I have a perfectly functional 26 inch monitor and don't want to have to buy another one AND a compatible GPU just to stop tearing.

At that point I'm looking at $400 to $600 for a relatively paltry gain. If it comes standard on every monitor, I'll reconsider.
A big part of 120Hz is not what it looks like, but the latency it reduces. You feel much more connected to the view in first-person games. Just watching won't show you the difference. You also need good FPS to benefit.
 

Traciatim

Distinguished


If you truly believe that, then you don't understand.

Currently the choice is either to have it wait for the next refresh while displaying the same frame, which causes big discrepancies in your frame rate and the judder you see, or to draw the frame when it's ready even if the monitor is in the middle of drawing another one, which causes the screen to tear. There was no option that gave silky smoothness; now, with G-Sync, there is.
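To spell the dilemma out as code - a toy sketch (mine, not anyone's actual driver logic):

    # The whole V-sync dilemma in one branch: a frame finishes while the
    # monitor is mid-scan-out, so you either swap now or wait for the refresh.
    def present(vsync_on, mid_scanout):
        if vsync_on and mid_scanout:
            return "hold until next refresh"  # frame rate jumps around -> judder
        return "swap buffers immediately"     # image splits mid-screen -> tearing

    # G-Sync removes the branch entirely: the panel refreshes when the frame is ready.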

The reason movies look fine at 24fps and 30fps is the motion blur accurately recorded on the film. You aren't actually seeing silky-smooth motion but a blurry smear of the motion that happened during the 41ms or 33ms the film captured, which your brain is pretty good at stitching together as motion. Still, if you run a 24p movie on a 60Hz TV you get that unnatural-feeling motion, because some frames last longer than others, causing judder. Hence why 120Hz TVs attempt to fix it: 120Hz is a multiple of both 24 and 30, so the frames can always be displayed for exactly 5 or 4 scans respectively (or they use motion compensation to interpolate frames between the two actual frames . . . essentially making new frames that are a blend of the previous and next recorded ones - which sometimes works and sometimes looks awful).
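The cadence arithmetic is easy to check; a quick sketch (standard pulldown math, nothing beyond what's said above):

    # 24fps content on a 60Hz panel: 60/24 = 2.5, so frames must alternate
    # between 2 and 3 scans (3:2 pulldown). On 120Hz: 120/24 = 5 exactly.
    for hz in (60, 120):
        scans = hz / 24
        even = scans == int(scans)
        print(f"{hz}Hz: {scans} scans per film frame",
              "(even -> smooth)" if even else "(uneven -> judder)")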

Also, G-Sync will reduce the lag you feel because you don't need triple buffering enabled alongside V-sync to keep frames flowing as smoothly as possible. Currently, if you have V-sync and triple buffering on with a 60Hz screen, the frame you are seeing was generated 33ms ago, there is one in the buffer made 16ms ago, and one being worked on. That means when something happens in game, you won't see it on your screen for 33ms. G-Sync solves that problem by always flipping to the new frame when it's ready. In real-world terms: if two gamers can each react to something on screen in 120ms, the one with G-Sync reacts in that 120ms, while the non-G-Sync user waits 33ms first and then reacts, for a total of 153ms . . . that's over 20% faster. If you don't think that's a big deal, then you obviously don't play any games where reaction times have any bearing on the outcome.
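Here's the back-of-envelope arithmetic behind those figures (the 120ms reaction time is just the assumed number from above):

    # Triple buffering at 60Hz puts the displayed frame about two
    # frame-times behind the newest one: 2 * 16.7ms = ~33ms.
    frame_ms = 1000 / 60          # ~16.7ms per refresh
    buffer_delay = 2 * frame_ms   # ~33ms of pipeline latency
    reaction = 120                # assumed human reaction time, in ms
    without_gsync = reaction + buffer_delay
    advantage = buffer_delay / without_gsync * 100
    print(f"{without_gsync:.0f}ms vs {reaction}ms -> {advantage:.0f}% faster")  # ~22%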
 

BrookH

Honorable
Dec 12, 2013
4
0
10,510
I never turn off V-sync, because the tearing makes me nauseous. I'll take bad stutter over nausea any day. But I remember reading recently about another method for fixing this problem, one that reduces the detail of fast-moving artifacts to maintain a minimum framerate. This makes a lot of sense: I don't need a highly detailed rendering of something that is moving too fast to see clearly anyway. And in a fast-moving scene I will probably not even be looking at the static elements, so they can be slightly less detailed as well. But I can't find the article. Does anyone recognize it?
 

tanjo

Distinguished
Sep 24, 2011
272
1
18,810
The maximum stutter (and input lag) difference between G-Sync and triple-buffered V-sync is less than the duration of 1 frame (~16.666ms @ 60Hz or ~8.333ms @ 120Hz).
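Those frame durations are just the reciprocal of the refresh rate; a one-liner to check:

    # One refresh interval is the worst case V-sync can add over G-Sync.
    for hz in (60, 120):
        print(f"{hz}Hz -> {1000 / hz:.3f}ms per frame")  # 16.667ms / 8.333ms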

The advantage of G-Sync is a smoother look at low fps due to consistent frame times. You would hardly see any difference between G-Sync and V-sync with triple buffering at higher refresh rates (100+). The real problem with triple buffering is implementation.

If you can't get past 60 fps, you're better off upgrading your system (or dialing down the settings) than replacing your monitor... unless you're in the market for a new monitor anyway and the price difference between one with and without G-Sync is small, in which case, why not?

EDIT: If G-Sync will add $100+ to the price (vs. without), then it's better to put it on higher-resolution screens (1440p+) to make gameplay smoother below 60 fps (but above 40), since it's harder to crank up fps at higher resolutions.
 
If they won't open up the tech for everyone (not only license it, but make it PART of the standards), then I'll keep my pocket closed as well, with my money inside.

There's no way I'll trade a little tearing on my 120Hz monitor for losing the ability to choose between AMD and nVidia (being hardware-locked) for my next video card. That's a stupid-as-hell compromise I will not take.

Cheers!
 

theLiminator

Distinguished
Jan 16, 2010
108
0
18,690
Yeah, this is truly a huge innovation. I never game with V-sync on; the input lag is really noticeable, especially when playing games like CS, SC, or Dota.
 

fulle

Distinguished
May 31, 2008
968
0
19,010
This reminds me of Lucid's Virtual Vsync. Only, Lucid's solution was purely software based... and open enough that I think it was even licensed to next gen consoles.

Meanwhile this requires an Nvidia GPU and a monitor that has an Nvidia chip in it. I don't really get the excitement over this, to be honest.

Meanwhile, if someone didn't want to use Lucid's solution, or it didn't work well with a specific game... simply forcing triple buffering with V-sync on works great in most situations. So long as your FPS is at a decent level, the latency introduced isn't even noticeable... and you're less likely to run into issues than if you tried to use G-Sync.

Waste of time technology IMHO. Especially if Nvidia keeps it closed off.
 

Tanquen

Distinguished
Oct 20, 2008
256
8
18,785


Low frame rates with motion blur are not the issue. Sending the same frame 5 times, then the next 10 times, then the next 1 time, then the next 25 times, and/or with half of one frame and half of another (I think), is. There is no reason this could not be handled in the card. The review and NVidia even say it's not going to work for everything. The monitor is just displaying the same thing over and over until the onboard chipset driver tells it to change something. You have the max refresh/frequency of the light source, the max refresh of the liquid crystal display element, and then the chipset driver. The artifacts are caused by the messed-up frames from the card. It's just telling the monitor's onboard chipset driver to wait vs. sending the complete frame again.
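To put rough numbers on that uneven repetition (made-up repeat counts, echoing the ones above):

    # A 60Hz panel holding successive content frames for 5, 10, 1, and 25
    # refreshes, as described above.
    TICK_MS = 1000 / 60  # ~16.7ms per refresh
    for n in (5, 10, 1, 25):
        print(f"frame held for {n} scans = {n * TICK_MS:.0f}ms on screen")
    # Effective frame times swing from ~17ms to ~417ms; that swing is the judder.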

“G-Sync will be able to react in that 120ms, while the non-G-Sync user waits for 33ms first and then gets to react...” G-Sync is not magically making more frames. You still have to wait for the next one to be ready somewhere.
I’ve played lots of games: some have lots of mouse lag and some have very little or none that I can see. Some have lots of tearing and some have none that I can see. Some games with V-Sync on are really bad; some look/feel great. These variances all exist without G-Sync.

If it were not proprietary, didn’t add more hardware, fixed everything, and were free and open, it would sound good. This is not that. I just hope AMD or someone that knows what they are doing comes up with V-Sync 2.0 that just works.
 

therogerwilco

Distinguished
Dec 22, 2009
196
0
18,680
Until they add it to 2560x1600 panels and 4k res panels, I don't care.

Anyone complaining about 'tearing/stuttering' while using 1080p is being idiotic.
 

Bondfc11

Honorable
Sep 3, 2013
232
0
10,680
The 248 is the highest-selling gaming monitor on the planet - Nvidia is clearly going after that market, since their kits will be released soon as well. Nvidia is expecting people to shell out bucks to upgrade their existing monitor - although new ones are right around the corner.
 

lpedraja2002

Distinguished
Dec 8, 2007
620
0
18,990


Well, the only explanation is that you don't know what screen tearing looks like, or you've been playing every game with V-sync on.

I don't think you understand what screen tearing actually is. The people affected by it are the ones whose frame rates surpass the monitor's refresh rate (normally 60Hz), which in this day and age is almost everyone. Some people are not bothered by screen tearing at all; some don't even know it exists; but for me it's annoying as hell. I do agree that Nvidia should make this a standard, but I don't know how their business works, and given that they've been profitable, I'm sure they considered all the options.
 

hannibal

Distinguished
Good to see that this actually works. Hopefully we will see a standardised version of this some day... If not, well, hopefully it won't be too expensive for Nvidia users! It would not be nice to see them ripped off.
In the best case this can be a really good invention; in the bad case, an Apple-like good-but-expensive closed experience.
 

qlum

Distinguished
Aug 13, 2013
195
0
18,690
I wish AMD and Intel would work together on presenting an alternative to G-Sync; only with broad opposition can we murder the vile vendor lock-in here. Hopefully such a standard would be adopted by Nvidia in the end, and only then will we get broad adoption.
 

INick

Honorable
Dec 12, 2013
1
0
10,510
The three main complaints:
- hardware locked (only nvidia GPUs)
- more expensive hardware (monitors)
- focused on image quality & user experience improvement rather than performance improvement

These are the same cons that Apple products have. However, what makes people buy a product is the user experience. If the hardware is stronger (more fps) but the experience (image & video quality) is crappy (tearing/lag/stuttering/ghosting), then there is no reason to pay for "crappy" high technology if there is an alternative. And considering that a monitor is meant to be kept for quite a few years (just like an Apple product), I think it's a very smart move from Nvidia and in the right direction. What's missing is something like a "Steve Jobs" marketing campaign, and then "BOOM": a new era for monitor technologies.

But we also need competition in order not to be gouged by Nvidia. So AMD needs to invent something similar, and soon. AMD, if you are listening... HELP!!!
 

hannibal

Distinguished


I think that Intel is in a better position to offer an alternative to this, just because it is so much bigger. The good point for AMD is that they have been a little more keen on using OpenCL and those kinds of tools.
A big part also lies with the people who decide the DisplayPort and HDMI standards. One big solution is much better for the market, but we also need pioneers, and Nvidia is in that position now!
Who knows... maybe G-Sync will be killer stuff for Nvidia for the next 2-3 years, and after that it is possible that G-Sync dies under the pressure of a more open system that does the same thing.
 

Suvrojit

Honorable
Dec 12, 2013
1
0
10,510
I find it very pleasing that a new tech like G-SYNC has the advantages of peak performance and less input lag without the tearing, but there are some clear disadvantages with its implementation too, like requiring DisplayPort 1.2, and issues with surround configurations. Also, it's nowhere mentioned, but I would like to ask: will the G-SYNC module work with software media players, so that we can see 24fps movies at 24Hz? Current 60Hz monitors convert 24fps into 60Hz internally. If NVIDIA can make this happen, I'm expecting a slow but gradual revolution.
 

imsurgical

Distinguished
Oct 22, 2011
175
36
18,720
wurkfur: "I personally have a setup that handles 60+ fps in most games and just leave V-Sync on. For me 60 fps is perfectly acceptable, and even when I went to my friend's house, where he had a 120Hz monitor with SLI, I could hardly see much difference.

I applaud the advancement, but I have a perfectly functional 26 inch monitor and don't want to have to buy another one AND a compatible GPU just to stop tearing.

At that point I'm looking at $400 to $600 for a relatively paltry gain. If it comes standard on every monitor, I'll reconsider. "

I agree with this, as I'm still growing accustomed to my Asus 27" 1440p monitor. BUT aren't they making a separate G-Sync module to buy separately and use with legacy monitors that won't have it pre-installed? I thought NVIDIA stated that at one point. That being the case, if benchmarks down the road show the same gain (or close to it) as an integrated module built into the monitor, I'd be more than content with purchasing that as well.
 

MaCk0y

Distinguished
Apr 17, 2013
37
0
18,530
"Until now, your choices were V-sync on or V-sync off, each decision accompanied by compromises that detracted from the experience."

There is also the option to cap the frame rate.
 


IIRC, in the original article about G-Sync there was mention that if a monitor was retrofitted it would also get a year's warranty.
 

squirrelboy

Honorable
May 3, 2013
89
0
10,640
WHY THE HELL is it cheaper to buy one of these screens in America and ship it around the world to the Netherlands than to buy one here?!
Importing:
$250 (€180) for the screen, $108 (€80) for shipping, so a total of $358, or €260.
Buying here:
€295 ($405!!!) for the screen, €5 ($6.50) for shipping, so a total of €300, or $412.
How is this possible?
 