G-Sync to FreeSync converter?

Elchi

Distinguished
Oct 29, 2008
Has anybody heard of a G-Sync to FreeSync converter (or vice versa)?

Considering that a good monitor is a considerable investment and usually outlasts a graphics card, this would make life easier for users.
I'm unsure whether this is technically possible, and if so, whether it could be done at a reasonable cost.

Thanks in advance for the answers
 
That's not true... G-Sync is more expensive because Nvidia chooses to charge for it...
 


G-Sync requires a piece of hardware called a scaler that is proprietary to Nvidia. In other words, you have to pay Nvidia each time you put it into a monitor, and you have to agree to their terms.

FreeSync also requires a scaler, but AMD does not control the sale or design of them. They have an open standard that gives companies an idea of how to make a FreeSync scaler, and that's it.

Replacing a scaler in a monitor is not something that can be done, so the answer to your question would be no, it is not possible.
 
Solution


Nvidia's solution is implemented as hardware in every monitor, so it controls the monitor and the GPU at the same time better and prevents "ghosting", which is an advantage over AMD, who rely mainly on drivers. Also, FreeSync suffers a bit at very high or very low FPS, while G-Sync works better.
 


FreeSync is implemented in the monitor hardware as well. If it weren't, nearly every monitor would be advertising FreeSync. The scaler in the monitor does not control the GPU; it just coordinates the frames.

Ghosting has nothing to do with adaptive sync, and Nvidia's G-Sync most certainly does not prevent it. Ghosting is when fast movement occurs on the screen and the pixels aren't able to refresh quickly enough, leaving a trail.
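To put rough numbers on that (my own illustrative figures, not from this thread): ghosting is basically a pixel transition taking longer than the gap between refreshes, so the previous frame is still partly visible when the next one is drawn.

```python
# Sketch with made-up response-time figures: a pixel "ghosts" when its
# transition takes longer than the time between refreshes.

def frame_time_ms(refresh_hz: float) -> float:
    """Time between refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

# (response time in ms, panel refresh rate in Hz) -- illustrative values only
for response_ms, refresh_hz in [(4, 144), (8, 144), (12, 60)]:
    ft = frame_time_ms(refresh_hz)
    verdict = "likely ghosting" if response_ms > ft else "transition completes in time"
    print(f"{refresh_hz} Hz panel, {response_ms} ms response: frame time {ft:.1f} ms -> {verdict}")
```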

Your last sentence is the opposite of reality. FreeSync can operate at higher and lower refresh rates than G-Sync. G-Sync only works as low as 30 FPS; FreeSync works as low as 7 FPS.
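If it helps make those ranges concrete, here is a tiny sketch using the floors quoted above (30 FPS for G-Sync, 7 FPS for FreeSync) and an assumed 144 Hz panel maximum; actual ranges depend on the specific monitor and scaler.

```python
# Illustrative only: the lower bounds are the figures quoted in this post,
# and the 144 Hz upper bound is an assumed panel maximum.

def in_vrr_range(fps: float, vrr_min: float, vrr_max: float) -> bool:
    """True if the panel can match this frame rate directly with adaptive sync."""
    return vrr_min <= fps <= vrr_max

GSYNC_RANGE = (30, 144)
FREESYNC_RANGE = (7, 144)

for fps in (20, 45, 160):
    g = "in range" if in_vrr_range(fps, *GSYNC_RANGE) else "outside range"
    f = "in range" if in_vrr_range(fps, *FREESYNC_RANGE) else "outside range"
    print(f"{fps} FPS -> G-Sync: {g}, FreeSync: {f}")
```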

You should be ashamed to be spreading such blatant misinformation. You need only Google the subject to verify this, the most basic of the information I have provided.
 


That first article you linked is trash. If I wanted to read an article that regurgitated Nvidia PR, I would just go to their website. That article does not have one iota of proof except for an Nvidia promotional video showing the difference in ghosting between multiple monitors. FYI, different monitors are going to have different response times and varying ghosting. This has nothing to do with G-Sync. This just proves they had no idea what they were talking about. At the very least they needed to use the same monitor with and without the G-Sync module, otherwise the test is worthless.

On your second article:

"Firstly, it’s just smoother, especially at lower frame rates.

This isn’t a night and day difference. Arguably, the delta is so small that you wouldn’t notice it unless consciously looking out for it."

Besides the author making a contradictory statement right off the bat, this article is old. AMD has since fixed the stuttering issue that occurred when frame rates fell below the refresh-rate threshold.

And once again, that article is comparing different screens to evaluate ghosting. Different screens are going to have varying degrees of ghosting regardless of G-Sync and FreeSync.

Any PR bullshit from Nvidia about G-Sync reducing ghosting is just that: bullshit. G-Sync is an adaptive refresh rate technology; it doesn't deal with response times. Just because G-Sync only shows up on monitors with high refresh rates does not mean it gets rid of ghosting; the monitor would have done that with or without G-Sync.
 


That probably isn't completely true. It could be done with an extra piece of currently non-existent hardware that would need to be capable of decoding the G-Sync signal (so it would need Nvidia's scaler chip) and then encoding a FreeSync signal. However, that would probably introduce a lot of latency, making it pointless, as you'd probably get lower latency with regular old V-Sync.

So you're kind of correct, but not technically correct.
 



This was my initial thinking. Theoretically, the latency should not exceed one frame, assuming you have a buffer storing the previous frame, including the sync signal.
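As a back-of-the-envelope check (my own numbers, nothing official): holding one full frame in a buffer adds roughly one frame time of delay, which scales with whatever refresh rate the panel is running at in that moment.

```python
# Rough sketch: extra delay from a converter that buffers N complete frames
# before re-emitting them. One frame at 60 Hz is ~16.7 ms, at 144 Hz ~6.9 ms.

def added_latency_ms(refresh_hz: float, frames_buffered: int = 1) -> float:
    """Worst-case extra delay from holding frames_buffered frames."""
    return frames_buffered * 1000.0 / refresh_hz

for hz in (30, 60, 144):
    print(f"{hz} Hz: ~{added_latency_ms(hz):.1f} ms added for a 1-frame buffer")
```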



 
G-Sync is bollocks. Then again, so is HDMI (why not DP!). It's a bit like the PhysX cards when they came out; I think they spun ASICs for it, but it's SIMD and that's where GPUs shine, so GPGPU made it fit nicely right on the GPU. Blimey... we have CPUs for a reason, don't spin an ASIC for every little programmable routine. Why do TVs have firmware (and update via USB port)? Because they are programmable. I think some northbridges in the past were FPGAs, but they went to ASICs and now it's basically all on the CPU. I'm guessing TVs/monitors (really no difference but size) have ASIC scalers, and some may not be programmable. What the G-Sync chip adds to a controller board is cost. Consider that "game mode" is "supposed" to bypass scaling and processing so it's like a direct connection... if the G-Sync chip added anything to that, then make a cheeky dongle to add to the cheeky GPU, or just tell people to tie their cables in knots if they want less expensive and more-to-the-point bollocks.
 


A buffer is what introduces latency, FYI. Nvidia even talked about these technical details in a live interview with AwesomeHardware (the guy from Newegg).