[SOLVED] Reality of G-Sync Compatible

Oct 25, 2019
Hi all!

I've been planning a rig around the RTX 2070 Super. At first I (foolishly) thought that, since I'm going with Nvidia, I'd just quickly pair it with a G-Sync monitor. I'm now realising that options are limited for a 27" 1440p 144-165Hz monitor that isn't wallet-breaking.

Enter G-Sync Compatible.

It opens up a whole range of other monitors to choose from. However, my concern is: what does it "really" mean in terms of gaming performance? I wouldn't want the $$ that went into the GPU to go to waste. I found a nice table that seemed to summarize the difference, but it felt a lot like marketing rather than a real explanation of the performance difference.

So, a question for people who have experience with GeForce GPUs and with both regular G-Sync and G-Sync Compatible monitors.

Is G-Sync Compatible:
  1. "G-Sync Equal" - no difference in performance
  2. "G-Sync Lite" - some minor reduction in performance or features, but barely discernible by the average gamer
  3. "G-Sync Compromised" - obvious reduction in performance and features; basically stay away or get an AMD GPU

Thanks!
 
Depends. Nvidia has a list of monitors that they 100% certify; then there is everything else. If you are worried, the best bet is to search for the specific FreeSync monitor model you are looking at along with G-Sync compatibility.

I think most monitors work just fine with little or no issues, though things like screen flickering on light or dark backgrounds can be a problem on some, so again it's up to you to look for the specifics.
 
Ah, the age-old tale of G-Sync vs FreeSync. A story of two companies racing to the peak of the variable refresh rate mountain at the same time from opposite sides.

In ~October 2013, Nvidia came out with G-Sync for variable refresh rate, mere months before variable refresh was ratified by VESA (AMD was showing FreeSync at CES in January 2014). Unfortunately, because Nvidia got in before the standards body, they had to employ a chip/module embedded in the monitor to enable the feature. That's true G-Sync. Also, G-Sync (for good reason) was proprietary and locked to Nvidia GPUs only.

Once VESA and AMD got variable refresh (VESA Adaptive-Sync) standardized (May 12, 2014), "FreeSync" was coined/born later that year. "FreeSync" uses no module because it works via the DisplayPort protocol (in layman's terms), so it adds zero cost to the BoM of a monitor (not that manufacturers can't/won't charge more for premium features like VRR compared to fixed refresh). There were some downsides to FreeSync early on (which Nvidia was keen to exploit), but that's all been shored up, and for the past 3+ years FreeSync has been functionally identical to G-Sync.
Now, Nvidia does hold G-Sync monitors to high standards (for good reason; more on that in a minute). G-Sync monitors all have roughly 40-144Hz refresh ranges (with low-framerate compensation) and low-ish latency (I'd have to look up the exact numbers). There may be some other secondary requirements, but let's go with this for now. Two main reasons for this:
  1. Those specs offer a "premium" gaming experience. Nobody's going to argue with that.
  2. Because the G-Sync module imparted a ~$250 increase (at first; closer to $150 later on) in the BoM of a monitor, Nvidia needed G-Sync monitors to fall in a price bracket where the hike wouldn't deter customers.
This became even more dire/pronounced when Nvidia briefly released the G-Sync HDR module, which carried a whopping ~$500 price tag on its own (before you add the cost of the monitor). Oops, FreeSync can do HDR VRR for free too!!

In that time, Nvidia and their monitor-manufacturer partners had a vested interest in clinging to that "dead horse" as long as possible: monitor makers have to pay Nvidia to use the G-Sync module, which Nvidia uses to pay off its G-Sync research investment, so both sides wanted to recoup their development costs. Nvidia have actually alluded to this strategy, saying they were monitoring (heh!) GPU and monitor market demand closely to choose when it was best to abandon ship. Obviously, in January of this year, they caved and enabled support for "FreeSync" (VESA Adaptive-Sync).

"G-Sync Compatible" is simply a categorization put forth by Nvidia on "FreeSync" monitors that meet similar specifications that G-Sync monitors follow. From Nvidia: "We will test monitors that deliver a baseline VRR experience"
  1. G-SYNC Compatible testing validates that the monitor does not show blanking, pulsing, flickering, ghosting or other artifacts during VRR gaming (not difficult to pass, given later requirements)
  2. supporting a VRR range of at least 2.4:1 (e.g. 60Hz-144Hz) (so now we're talking about 120Hz-or-better monitors; see the quick ratio check after this list)
  3. and offer the gamer a seamless experience by enabling VRR by default. (there's the kicker!! Monitors have to have VRR enabled by default out of the box)
  4. Even if you meet all the above requirements, you still need to get Nvidia to actually "validate" your product and put it on their list. Not saying there's anything nefarious going on, but there's a certain value that Nvidia controls in endowing a product with their "stamp of approval", and with absolute power....
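Requirement 2 is just arithmetic, by the way. Here's a trivial sketch of the ratio check (my own illustration, not Nvidia's actual test procedure):

```python
# Nvidia's G-Sync Compatible range rule: the max refresh of the VRR
# window must be at least 2.4x the minimum refresh.
def meets_range_rule(min_hz: float, max_hz: float) -> bool:
    return max_hz / min_hz >= 2.4

print(meets_range_rule(60, 144))  # True:  144/60 = 2.4 (Nvidia's own example)
print(meets_range_rule(48, 144))  # True:  144/48 = 3.0
print(meets_range_rule(40, 75))   # False: 75/40  = 1.875
```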
I have first-hand experience with a 40-75Hz VRR monitor and my personal 48-144Hz monitor, and I will agree that 48-144Hz (allowing for LFC) is certainly the better experience. AMD's take on that was/is "why limit VRR to only customers with deep pockets?" As time has progressed, the cost of 120+Hz monitors has decreased dramatically. Heck, you can get a 24" 48-144Hz monitor these days for $150 or less. So, where the limited refresh range was initially a way to bring VRR even to the low-budget masses, it's now being used by high-resolution monitors that haven't quite gained the adoption/bandwidth/tech to reach triple-digit refresh rates.
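Since I keep leaning on LFC (low-framerate compensation): the idea is that when your frame rate falls below the bottom of the VRR window, the driver/scaler repeats each frame 2x/3x/etc. so the physical refresh stays inside the window. A simplified sketch of that logic (again, my own illustration, not any vendor's actual implementation):

```python
def lfc_refresh(fps: float, min_hz: float, max_hz: float) -> float:
    """Pick a physical refresh rate for a given frame rate.

    Inside the VRR window the panel simply follows the frame rate.
    Below it, each frame is repeated until the multiplied rate lands
    back inside the window (low-framerate compensation).
    """
    if fps >= min_hz:
        return min(fps, max_hz)        # panel tracks the frame rate directly
    multiple = 2
    while fps * multiple < min_hz:     # find the smallest usable multiple
        multiple += 1
    refresh = fps * multiple
    if refresh > max_hz:
        raise ValueError("VRR range too narrow for LFC at this frame rate")
    return refresh

print(lfc_refresh(100, 48, 144))  # 100 - tracked directly
print(lfc_refresh(30, 48, 144))   # 60  - each frame shown twice
print(lfc_refresh(20, 48, 144))   # 60  - each frame shown three times
```

This is also why a narrow window like 40-75Hz can't do LFC reliably: at, say, 38 fps, doubling gives 76Hz, which is already above the 75Hz ceiling.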

Here are the real requirements you need to meet:
  1. GeForce GTX 10-Series, GeForce GTX 16-Series and GeForce RTX 20-Series or newer graphics card.
  2. A "FreeSync" monitor that has a Display Port input. (FreeSync over HDMI is proprietary to AMD currently)
A VERY COMMON MISCONCEPTION is that G-Sync is superior to "FreeSync". That's simply not true (and from a technical standpoint it could be considered the reverse these days). You have to consider that there are two parts to a VRR monitor:
  1. The spec of the panel itself (refresh rate, refresh range, response time, contrast, brightness, local dimming, etc.). As described above, since Nvidia only implemented its G-Sync modules in premium-tier panels, consumers have come to equate G-Sync with a superior product. I've heard people say "you just know it's good". True! Whereas, since FreeSync was FREE, manufacturers just threw it into everything, leaving consumers to decide what level of quality was acceptable to the market.
  2. The VRR "tech" making the panel operate. Again, G-Sync and FreeSync do the same thing.
One gripe I have with the market (jeez, it's been 5 years, let's get this figured out) is the lack of standardization in advertising VRR. Still, the best resource I know of is this list curated by AMD (obviously no true G-Sync monitors on that list, though).
 
Firstly. Best. Response. Ever.

Now on to the serious stuff. To make sure I understand you fully, for the average gamer like me: if I have a decent GeForce GPU (like the 2070 Super),

1. I'm going to get the same amount out of my GPU regardless of whether the monitor is pure G-Sync or G-Sync Compatible.

2. Actual performance depends on the individual monitor's specs, not on which adaptive-sync tech it employs.

3. The "benefit" of a pure G-Sync monitor is that it has been already "curated" by a team in Nvidia. So likely its high performing on its own.

4. If using an Nvidia GPU, it's still worth getting a G-Sync Compatible monitor, because at least someone has tested its compatibility with Nvidia GPUs (fewer bugs).

Did I get that right?

Side note: many things in this world really call out for standardisation. Metric units, phone chargers, wall sockets, etc. Pls vote for a machine overlord to make it happen.
 
  1. Yes
  2. Yes
  3. .... Yes.... (don't like your wording; it's more that G-Sync is equal to G-Sync Compatible in specs/performance)
  4. Nvidia's use of the term "Compatible" is misleading. All "FreeSync" monitors with a DisplayPort input are VRR-compatible with your GPU. The benefit of G-Sync Compatible is that you know you're getting a VRR experience that meets the standards mentioned earlier. Also, Nvidia enables variable refresh automatically for those certified monitors (otherwise you have to flip the switch in the driver menu yourself; not a big deal, you only do it once; rough steps below).
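For reference, the manual toggle lives in the NVIDIA Control Panel. From memory (so the exact labels may differ slightly between driver versions): Display > Set up G-SYNC, check "Enable G-SYNC, G-SYNC Compatible", select your display, check "Enable settings for the selected display model", then hit Apply.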
I can appreciate that Nvidia is taking the time with G-Sync Compatible certification to make it easy for consumers to find/buy a monitor that will deliver good performance without having to look up reviews and specs, and to have VRR enabled automatically. There are a number of good VRR monitors out there that can't qualify for GSC because they don't ship with VRR enabled out of the box, though. In the future, as more and more people own VRR-capable GPUs, this will become a non-issue.
 
Thank you for the thorough and clear explanation!
 
