NVIDIA GPU on a Freesync monitor

Oct 3, 2018
I plan on getting a new graphics card soon. With the release of Nvidia's 20 series I'm considering the 2080, or possibly falling back to the 1080 Ti. Because of this I also want to upgrade my display to an ultrawide 1440p monitor, but the problem I've found is that most monitors that fit my criteria have AMD's FreeSync.
Now, I don't intend to use the FreeSync feature via the workaround I've read about; I just want a 1440p monitor, but all the ones that are suitable and within my budget come with FreeSync.

I'd imagine that if I do get a FreeSync monitor but use an Nvidia GPU, the FreeSync feature will simply be disabled and I can carry on as normal. I just want to know if this is okay to do before I make any purchases.
 
Solution
The feature is not disabled, just not available, so you end up with a perfectly suitable monitor. You can still enable v-sync if you want to prevent tearing, but then you have to make sure your setup can sustain the monitor's refresh rate.
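
For anyone wondering what "enable v-sync" looks like from the software side, here's a minimal sketch using SDL2's OpenGL swap-interval call. The window title and size are just placeholders:

```c
/* Minimal sketch: turning on v-sync for an OpenGL context with SDL2. */
#include <SDL2/SDL.h>
#include <stdio.h>

int main(void) {
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    SDL_Window *win = SDL_CreateWindow("vsync demo",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        1280, 720, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    /* 1 = wait for the vertical refresh (v-sync on); 0 = immediate swaps.
       Returns -1 if the driver doesn't support the setting. */
    if (SDL_GL_SetSwapInterval(1) != 0) {
        fprintf(stderr, "v-sync not supported: %s\n", SDL_GetError());
    }

    /* ... render loop goes here: draw, then SDL_GL_SwapWindow(win)
       blocks until the monitor's next refresh ... */

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```

With v-sync on, the swap blocks until the next refresh, which is why your frame rate needs to keep up with the monitor or you'll feel the stutter.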

G-Sync requires a G-Sync module as the scaler in the monitor. Nvidia just hasn't bothered to directly support Adaptive-Sync via the DisplayPort standard, which is what FreeSync is.

The workaround is interesting, and it's one of those odd gray areas I doubt they will bother to patch. How many people have two GPUs, one from each brand? I don't think anyone has sat down and scientifically measured how well it performs.
 
The videos I've seen from people who have tried the workaround say it does work; it's just not something I'm willing to try. Like you said about the G-Sync scaler, it's nice to have, but it pushes the monitor's price up since it isn't as accessible as FreeSync.

As long as the monitor acts as a regular monitor, just without FreeSync, I don't mind getting one to go with the new graphics card.

Thanks for the help!
 
I've seen it working as well. No one I know of has taken a high-speed camera and measured whether it does as good a job as straight FreeSync or G-Sync.

I imagine both are using some form of packet that tells the monitor when to start pulling from the buffer and drawing the frame. Why reinvent the protocol when it's just sitting there? It might not even be something Nvidia could patch out without adding latency to check for authorized hardware.
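
To make that idea concrete, here's a toy model of how an adaptive-sync monitor might pace its refresh. This is my own sketch, not any real driver or VESA API, and all the numbers are illustrative: the monitor holds its refresh until a frame is ready, clamped between its minimum and maximum refresh periods.

```c
/* Conceptual sketch of variable-refresh pacing (not a real API).
   A hypothetical 40-144 Hz VRR range gives refresh periods of
   roughly 6.94 ms to 25 ms. */
#include <stdio.h>

#define MIN_PERIOD_MS  6.94   /* 144 Hz: fastest the panel can refresh */
#define MAX_PERIOD_MS 25.00   /*  40 Hz: longest the panel can wait    */

/* Given how long the GPU took to deliver a frame, return when the
   monitor would actually start scanout under adaptive sync. */
static double scanout_delay(double frame_time_ms) {
    if (frame_time_ms < MIN_PERIOD_MS)
        return MIN_PERIOD_MS;   /* frame came too fast: wait for the panel */
    if (frame_time_ms > MAX_PERIOD_MS)
        return MAX_PERIOD_MS;   /* frame too slow: panel repeats the last frame */
    return frame_time_ms;       /* in range: refresh the moment the frame is ready */
}

int main(void) {
    double frames[] = { 5.0, 10.0, 16.7, 30.0 }; /* hypothetical frame times */
    for (int i = 0; i < 4; i++)
        printf("frame took %5.1f ms -> scanout after %6.2f ms\n",
               frames[i], scanout_delay(frames[i]));
    return 0;
}
```

However the signaling is actually done on the wire, the effect is the same: the refresh follows the frame instead of the frame chasing a fixed refresh.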