[SOLVED] Does G-Sync still matter?

belo

Distinguished
Jul 29, 2008
140
1
18,715
Looking for a new monitor, and I intend to upgrade to a 30 series GPU in the future. As many know, G-Sync monitors cost a lot more than FreeSync ones. Most of the articles I've found on the subject are several years old, and I'm wondering whether VRR is no longer really brand-limited in 2021 or if I should still try to find a G-Sync monitor.

Thanks!
 
While a lot of displays are "compatible" now with the newer series of AMD & Nvidia cards,
there is still a noticeable difference, however slight, compared to using an Nvidia GPU with an actual G-Sync chip
or an AMD GPU with Freesync.

How noticeable the difference is between a compatible display and a real G-Sync one
may depend on the manufacturer and/or model, so reading user reviews may be the best way to find out.
 
  • Like
Reactions: belo
NVIDIA has a list of monitors that have a G-Sync module and those it has tested to work within its tolerances, whatever those may be.

Though in my limited experience, you can still tell the GPU to enable G-Sync on any variable refresh rate display. But the overall experience may not be ideal if it wasn't tested by NVIDIA. Also, it seems most FreeSync monitors don't go below 40Hz, while G-Sync covers the entire range. But I'm sure an RTX 30 series card should have no trouble keeping 40+ FPS.
 
  • Like
Reactions: belo
Solution
This monitor states "compatible," which makes me nervous that it's not an actual G-Sync monitor. Is there a definitive list out there somewhere?
https://www.newegg.com/p/N82E16824475097?Item=N82E16824475097
This monitor does not use an actual G-Sync chip; it just has some generic Adaptive Sync processor that fits the Nvidia "G-Sync Compatible" spec.

While recently trying to help a friend find a newer ultra-wide, high refresh rate G-Sync monitor, I found there really weren't a lot of options.
It seems the majority out there are currently Freesync / G-Sync Compatible.
 

falcon291

Honorable
Jul 17, 2019
647
145
13,290
I don't know how much actual difference there is between G-Sync and FreeSync. But it seems the price of the G-Sync chip just doesn't justify the difference, so buyers don't buy, and manufacturers don't produce, monitors with the G-Sync chip.

It seems like G-Sync will be the next Betamax. Price matters.
 
The original G-Sync requiring a hardware module has been rebranded as "G-Sync Ultimate"

nVidia threw in the towel and has offered Freesync over DisplayPort since GeForce driver 417.71 for GTX 10 series cards, which is possible because adaptive sync is part of the old DisplayPort 1.2a spec. This software solution is branded "G-Sync Compatible."

"G-Sync Compatible" HDMI drivers appeared for the HDMI 2.0 RTX 20 series and GTX 16xx cards in 2019 (it's really just the HDMI 2.1 spec's VRR feature, essentially the same adaptive refresh idea as AMD's FreeSync). Note, however, that it wasn't until the RTX 30 series that the cards got HDMI 2.1 itself, which allows 4K 144Hz.
 

belo

Distinguished
Jul 29, 2008
140
1
18,715
This monitor does not use an actual G-Sync chip; it just has some generic Adaptive Sync processor that fits the Nvidia "G-Sync Compatible" spec.

While recently trying to help a friend find a newer ultra-wide, high refresh rate G-Sync monitor, I found there really weren't a lot of options.
It seems the majority out there are currently Freesync / G-Sync Compatible.

This is my experience as well, which is why I'm scratching my head given the popularity of Nvidia cards. I asked this question because surely everyone isn't buying these expensive cards and just using basic monitors? So there must be some benefit to FreeSync?
 
This is my experience as well, which is why I'm scratching my head given the popularity of Nvidia cards. I asked this question because surely everyone isn't buying these expensive cards and just using basic monitors? So there must be some benefit to FreeSync?
Look at BFG's post. Newer Nvidia cards with the newest drivers can use FreeSync via DisplayPort and HDMI.
 
Freesync works fine, but supposedly the hardware chip of G-Sync Ultimate works better below 40 FPS, when both start to get into frame doubling (AMD calls that "low framerate compensation"). Until you get your new GPU, that monitor should work fine with your 1050 Ti, which can use G-Sync Compatible mode over DisplayPort and stay above that threshold when gaming at 720p.
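
For anyone curious how the frame-doubling part works, here's a simplified sketch of the idea (my own illustration, not AMD's or NVIDIA's actual algorithm; the 40-144Hz VRR range is just an example):

# Sketch of low framerate compensation: when FPS drops below the panel's
# VRR minimum, each frame is repeated so the refresh stays in range.
def lfc_refresh(fps: float, vrr_min: float = 40.0, vrr_max: float = 144.0):
    """Return (times each frame is shown, effective panel refresh in Hz)."""
    if fps >= vrr_min:
        return 1, min(fps, vrr_max)          # in range: refresh tracks frame rate
    repeats = 1
    while fps * repeats < vrr_min and fps * (repeats + 1) <= vrr_max:
        repeats += 1                         # repeat frames until back in range
    return repeats, fps * repeats

for fps in (120, 60, 35, 20):
    repeats, hz = lfc_refresh(fps)
    print(f"{fps:>3} FPS -> each frame shown {repeats}x, panel refreshes at {hz:.0f} Hz")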
 

belo

Distinguished
Jul 29, 2008
140
1
18,715
The original G-Sync requiring a hardware module has been rebranded as "G-Sync Ultimate"

nVidia threw in the towel and has offered Freesync over DisplayPort since GeForce driver 417.71 for GTX 10 series cards, which is possible because adaptive sync is part of the old DisplayPort 1.2a spec. This software solution is branded "G-Sync Compatible."

"G-Sync Compatible" HDMI drivers appeared for the HDMI 2.0 RTX 20 series and GTX 16xx cards in 2019 (it's really just the HDMI 2.1 spec's VRR feature, essentially the same adaptive refresh idea as AMD's FreeSync). Note, however, that it wasn't until the RTX 30 series that the cards got HDMI 2.1 itself, which allows 4K 144Hz.

Sorry for being a little slow. Is it fair to say that if you're looking for G-Sync you still need at least a G-Sync Compatible monitor in order to take advantage of the feature? And that not all FreeSync monitors are compatible by default?
 

Eximo

Titan
Ambassador
There wasn't a rebranding; G-Sync Ultimate supports more features found in newer monitors: HDR, 4K 144Hz, etc. FreeSync Premium is the new equivalent from AMD.

G-Sync is still G-Sync, and has slight advantages over FreeSync: the aforementioned lower VRR thresholds and a very minor latency advantage. (Most people can't tell the difference.)

I wouldn't go out of my way to buy an original G-Sync module monitor, since those are DP-only. G-Sync 2.0 has an additional HDMI port and about the same features, and is what most G-Sync monitors are: 1440p 165Hz, 4K 60Hz, or up to 1080p 300Hz.
 
  • Like
Reactions: belo

belo

Distinguished
Jul 29, 2008
140
1
18,715
It is kinda sad when the "best gaming monitor" is a <=$1500 OLED TV.

I bought a 65" LG C9 OLED over the summer for ~$1,200. It's crazy to see what a 27" non-4K monitor goes for when you consider the spec and size differences. I'm sure there's a reason for this, and I know we don't need 4K for a monitor, but it's still head-scratching for me.
 
Freesync is the standard non-hardware way to do adaptive refresh, and most Freesync monitors do work fine with nVidia cards. By buying a "G-Sync Compatible" monitor, you are getting a Freesync monitor that nVidia has certified will work for sure with their cards.

Given that all the latest hardware-module adaptive refresh monitors can also do HDR and high refresh rates, it's fair to say there's simply no way to charge the substantial premium for real G-Sync any more without offering premium features too. Note that AMD's Freesync Premium still does not require an extra hardware module; it's just branding indicating that such features are there. As with G-Sync Compatible, such a certification can be given retroactively.

One of the biggest advantages of doing things in software is that you can add new features later (which is how things like draft Wi-Fi products could be upgraded once the spec was finalized, or how Winmodems were upgraded to the final 56k spec with just a new driver), while merely adding high refresh rates required a respin of the G-Sync chip.

The proprietary module does assure the buyer of a certain standard level of performance; Freesync monitors came in a wide range of usable minimum refresh rates, some too high to be of much use. I think it's hilarious, though, that nVidia will charge you extra for a hardware scaler in a monitor, yet hasn't included hardware schedulers in its graphics cards since Fermi. That work is instead performed in software by the drivers, presumably so their GPUs appear more power-efficient, since all the heat from it comes off the CPU rather than the GPU. There's got to be a missed branding opportunity in there somewhere.
 
I bought a 65" LG C9 OLED over the summer for ~$1,200. It's crazy to see what a 27" non-4K monitor goes for when you consider the spec and size differences. I'm sure there's a reason for this, and I know we don't need 4K for a monitor, but it's still head-scratching for me.
I think it's simply due to a lack of demand for higher-DPI monitors, despite the fact that several boutique-grade computer systems (mostly from Apple and Microsoft, and maybe a few others) come with displays that have a higher DPI than the standard ~96. That, and the fact that Windows sucked at DPI scaling (I don't think it's as bad now).

I used a 4K monitor as my main for a few years and did enjoy the higher DPI for how crisp everything looked. And since it was a 27" monitor, I could lower the resolution in games to 1440p and it didn't really look much different from a native 27" 1440p monitor.
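
For rough context on the DPI numbers being thrown around, here's a quick pixels-per-inch calculation (the 24" 1080p entry is my own reference point, not something from this thread):

# Quick pixels-per-inch math for common panel sizes.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')   # ~163 PPI
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')   # ~109 PPI
print(f'24" 1080p: {ppi(1920, 1080, 24):.0f} PPI')   # ~92 PPI, near the "standard ~96"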

I think it's hilarious, though, that nVidia will charge you extra for a hardware scaler in a monitor, yet hasn't included hardware schedulers in its graphics cards since Fermi. That work is instead performed in software by the drivers, presumably so their GPUs appear more power-efficient, since all the heat from it comes off the CPU rather than the GPU. There's got to be a missed branding opportunity in there somewhere.
This isn't exactly true. What NVIDIA got rid of was the part of the scheduler that handled dependency checking and re-ordering, after realizing that a lot of that is predictable: since instruction latency is already known, there was no reason to have hardware figure out how to order instructions and handle dependencies. There's still a scheduler that handles distributing those instructions to the execution units themselves.

While I don't know exactly what goes into compiling GPU commands, I can't imagine it's actually that bad to begin with. After all, the bulk of this is already being done by your API of choice.
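
As a toy illustration of the static-scheduling point above (made-up opcodes and latencies, not NVIDIA's actual compiler or ISA): when every instruction's latency is fixed and known at compile time, the compiler can work out the stall counts itself instead of a hardware scoreboard resolving dependencies at run time.

# Toy static scheduler: with fixed, known latencies, stall counts are
# computed ahead of time rather than checked by hardware at run time.
LATENCY = {"LDG": 20, "FMUL": 4, "FADD": 4}   # cycles until the result is ready

# (opcode, destination register, source registers)
program = [
    ("LDG",  "r0", []),
    ("FMUL", "r1", ["r0"]),          # needs the loaded value
    ("FADD", "r2", ["r1", "r0"]),    # needs the multiply result
]

ready_at = {}   # register -> cycle its value becomes available
cycle = 0
for op, dst, srcs in program:
    issue = max([cycle] + [ready_at.get(r, 0) for r in srcs])
    print(f"{op:4} stalls {issue - cycle:2} cycles, issues at cycle {issue}")
    ready_at[dst] = issue + LATENCY[op]
    cycle = issue + 1                # one instruction issued per cycle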
 
While it was probably a reasonable design choice for DX11 (which used multi-core pretty inefficiently, so there was plenty of excess CPU power just being wasted), in DX12 it turns out to actually be pretty bad. As in bad enough for a $1,500 RTX 3090 to be regularly outperformed by a $279 RX 5600 XT unless you have the latest and fastest CPU.

Software can be faster than hardware if you have CPU to spare. It's just that even slow hardware will offload work from the CPU, effectively giving you more CPU to use for other things. AMD had no choice but to go with hardware schedulers because their console designs were based on slow Jaguar APUs, leaving very little CPU power available.

Yep, it's the instruction scheduler for preventing data hazards we're talking about here, which is handled by the compiler (and simplified to static scheduling and fixed-latency instructions for Kepler), not the much higher-level Hardware Accelerated GPU Scheduling (HAGS) or the WDDM GPU scheduler for issuing commands. Using the CPU to do such low-level work reminds me of those early Realtek NICs that moved every packet in software: performance was fine, but the CPU hit was tremendous back when single-core was the norm. Nowadays with multi-core CPUs, people will often disable those offloading features and interrupt moderation for "better performance."