Question about upgrading the graphics card

rulerss

Hello everyone and nice to see you again!

I have a little problem and want to hear some suggestions.

I just bought a Samsung Odyssey G40, which is great and I like it. The problem is that it was advertised with G-Sync, but in the monitor's options there is no G-Sync, only FreeSync.

Is G-Sync that important, and if it is, is AMD's FreeSync as good as Nvidia's G-Sync?

Anyway, I want to upgrade in the near future from my 1660 Ti to a more powerful 3060 Ti, or something similar from Nvidia.

Should I switch to AMD because the monitor doesn't have G-Sync? What would be the AMD equivalent of a 3060 Ti?
 
You may need to enable G-SYNC in Nvidia Control Panel for it to show up as an option on your monitor.

G-SYNC has actual hardware in the monitor that is supposed to provide a better and wider range of adaptive sync.
 

rulerss

I don't have that option in the Nvidia Control Panel; it's missing. It's not in the monitor settings either. The monitor only shows FreeSync.

Thank you for explaining what G-Sync means.
 

rulerss

What do you mean? I plugged the cable from the monitor into the graphics card and that's it. After that I looked in the Nvidia Control Panel and in the monitor settings and found nothing. Only FreeSync appears in the monitor.
 

Eximo

What do you mean? I plugged the cable from the monitor into the graphics card and that's it. After that I looked in the Nvidia Control Panel and in the monitor settings and found nothing. Only FreeSync appears in the monitor.

Which cable?

GPUs generally have multiple ports, and so do monitors.

FreeSync over HDMI is quite common. G-Sync uses the DisplayPort standard, and will work with FreeSync-compatible monitors over DisplayPort.

I think that changes with HDMI 2.1, but your GPU doesn't have that either.
 

rulerss

There was an HDMI cable and a power cable included with the monitor. I don't know if the 1660 Ti supports G-Sync.
 

rulerss

You mean another cable between the monitor and the graphics card? But I already received one from the factory, and it's HDMI. I just checked in the monitor options; it says HDMI 2.0.
 

Eximo

Your GPU doesn't support G-Sync over HDMI at all.
Your monitor doesn't support G-Sync over HDMI at all.

Both your monitor and GPU support G-Sync Compatible mode via DisplayPort. You must use a DisplayPort cable. Yes, that means buying one.
 

rulerss

Hmm, I wondered why there are more ports on the back of the monitor. So it's a newer port than HDMI?
Like how, before HDMI, there was the VGA port.
One last question: does AMD's FreeSync work over HDMI?
 

Eximo

Hmm, I wondered why there are more ports on the back of the monitor. So it's a newer port than HDMI?
Like how, before HDMI, there was the VGA port.
One last question: does AMD's FreeSync work over HDMI?

I wouldn't call it new, exactly. HDMI was first specified for the home theater market back in 2002. DisplayPort came along in 2006 as a more advanced standard for desktop PCs, meant to replace DVI. VGA and DVI have more or less disappeared at this point, though older-style budget cards and some motherboards still come with VGA.

For the most part, yes; FreeSync over HDMI is the de facto cheap implementation. So even really cheap monitors that only have HDMI and VGA will work with FreeSync. HDMI 1.4 and HDMI 2.0 monitors that advertise FreeSync support can usually do it over HDMI. They can always do it over DisplayPort, since that implementation is older.

DisplayPort 1.2a has supported VRR (via the Adaptive-Sync amendment in 2014) for much longer. G-Sync was first to market for VRR, and Nvidia stuck pretty much to keeping it a DisplayPort exclusive until HDMI 2.1 came out and supported the protocol.

HDMI 2.1 (specified in 2017, but it only recently came to market with the latest console generation) supports VRR natively; HDMI 2.0 and earlier use the video card driver to enable the capability via software rather than in the protocol.

I should add that the early HDMI implementations of FreeSync often have a pretty limited VRR range. So on the low end they don't work as well, since they default to VRR off when under, say, 30 FPS. G-Sync with a proper G-Sync module instead starts doubling up frames, which hurts latency but keeps tearing away.
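
If it helps to picture the difference, here is a tiny sketch in Python. The VRR range and frame rates are made-up illustrative numbers, not your monitor's actual specs:

```python
import math

# Sketch of what happens below the VRR floor, under assumed numbers.
def effective_refresh_hz(fps, vrr_min=48, vrr_max=144, can_multiply_frames=True):
    """Return the refresh rate a VRR display would run at, or None if VRR drops out."""
    if vrr_min <= fps <= vrr_max:
        return fps  # in range: the refresh rate simply follows the frame rate
    if fps < vrr_min and can_multiply_frames:
        # G-Sync module behavior: repeat each frame enough times to re-enter the range
        repeats = math.ceil(vrr_min / fps)
        return fps * repeats  # e.g. 25 FPS -> each frame shown twice -> 50 Hz
    return None  # early HDMI FreeSync behavior: VRR off, fixed refresh, tearing returns

print(effective_refresh_hz(25))                             # 50
print(effective_refresh_hz(25, can_multiply_frames=False))  # None
```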
 

rulerss

Thank you for such a thorough answer!

Back in the day, when you bought a new monitor, you just plugged it in with a VGA or HDMI cable. Now you have to buy extra stuff for extra options. This is new to me, if I understand correctly.
So G-Sync technology has been around for a while, and I just didn't notice it because of the missing cable?

Yes, the monitor has HDMI 2.0. It can also be set to 1.4, but I think that would be worse.
Typical AMD marketing, making cheap stuff, like they did with the AMD Athlon and Phenom.
So if I understand correctly, FreeSync is weaker than G-Sync?

What do you mean by "hurts latency but keeps tearing away"?
 

Eximo

Thank you for such a thorough answer!

Back in the day, when you bought a new monitor, you just plugged it in with a VGA or HDMI cable. Now you have to buy extra stuff for extra options. This is new to me, if I understand correctly.
So G-Sync technology has been around for a while, and I just didn't notice it because of the missing cable?

Yes, the monitor has HDMI 2.0. It can also be set to 1.4, but I think that would be worse.
Typical AMD marketing, making cheap stuff, like they did with the AMD Athlon and Phenom.
So if I understand correctly, FreeSync is weaker than G-Sync?

What do you mean by "hurts latency but keeps tearing away"?

No, your monitor doesn't have G-Sync, so it's not because you're missing a cable. VRR is a built-in feature of DisplayPort, and a G-Sync Compatible card can make use of it. FreeSync over DisplayPort is just corporate branding of an existing feature. I want to say G-Sync came out around 2013 in the form of an upgrade kit for a certain ASUS monitor. The first G-Sync monitor, the ASUS PG278Q (1440p, 144 Hz), came out in July 2014. The first FreeSync monitor came out in December 2014, but it was 4K 60 Hz, so they didn't directly compete. 4K was pretty tough to run in 2014, so that made sense.

AMD's implementation over older HDMI is nearly a pure software solution, so there are processing penalties that add latency.

Total latency is a measure of how much time passes between the command to generate a frame and that frame actually being displayed. It's similar to input lag: how long an action you make takes to show up on screen.
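
As a rough worked example of what adds up to that total (every number below is an assumption for illustration, not a measurement of any real system):

```python
# Toy latency budget from input to photons; every figure here is an assumption.
stages_ms = {
    "input sampling":  2.0,   # time for the mouse click to be polled
    "game simulation": 8.0,   # one tick of game logic
    "GPU render":     10.0,   # drawing the frame
    "scanout":         7.0,   # roughly one refresh period at 144 Hz
}
total_ms = sum(stages_ms.values())
print(f"total latency ≈ {total_ms:.0f} ms")  # ≈ 27 ms from click to screen
```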

I believe it was mentioned before, but true G-Sync monitors have an Nvidia-designed chip that runs the monitor, and a G-Sync GPU can control it directly. Since it is Nvidia all the way from the GPU through the monitor, they did all they could to reduce the time it takes for the GPU to change the monitor's refresh rate and actually display an image. There is a lot of prediction involved in how many frames the GPU thinks it can produce, with the refresh rate changed to match.

Fully completed frames are delivered to the monitor, and since the GPU commands the refresh rate, it only draws a frame at the start of a refresh cycle. This means a single frame per cycle, i.e. no tearing.

Without V-Sync, G-Sync, or FreeSync, the monitor simply draws whatever is in the frame buffer, whether that is one, two, or a dozen renders. You can see the line between the different frames when this happens, which is known as tearing.

Below about 40 FPS, G-Sync keeps preventing tearing by showing the same frame multiple times (low framerate compensation). This effectively doubles the perceived latency, since no change is made to the scene between repeats. FreeSync Premium and Premium Pro also have this capability.

FreeSync is simpler. It uses the monitor's normal scaler. In the case of DisplayPort, the GPU can tell the monitor what refresh rate to use as it predicts how many frames it can deliver. FreeSync over HDMI is still hardware dependent: an additional chip is wired into the monitor's scaler to control the vertical blanking signal, but that extra software-to-hardware layer adds a little processing time.
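
For the curious, a minimal sketch of the vertical-blanking trick. The timing numbers are invented stand-ins loosely shaped like a 1440p 144 Hz mode, not a real monitor's EDID:

```python
# VRR stretches the vertical blanking between frames to change the refresh rate:
# refresh_hz = pixel_clock / (h_total * v_total)
PIXEL_CLOCK_HZ = 592_000_000  # assumed pixel clock
H_TOTAL = 2720                # active width plus horizontal blanking (assumed)
V_TOTAL_BASE = 1525           # active height plus minimum vertical blanking (assumed)

def refresh_hz(v_total: int) -> float:
    return PIXEL_CLOCK_HZ / (H_TOTAL * v_total)

def v_total_for(target_hz: float) -> int:
    # Extra blanking lines stretch one frame period to match the target rate.
    return round(PIXEL_CLOCK_HZ / (H_TOTAL * target_hz))

print(f"{refresh_hz(V_TOTAL_BASE):.1f} Hz")  # ~142.7 Hz with minimum blanking
print(v_total_for(60))  # ~3627 total lines to hold the panel at 60 Hz
print(v_total_for(48))  # ~4534 total lines to hold the panel at 48 Hz
```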

Blind tests have shown that people can't really tell the difference, but the differences can be measured.
 


rulerss

Thanks again for the answer.

You said that I don't have true G-Sync? Because there's no hardware chip in my monitor? It's only software, like an emulation? Only the 1660 Ti card provides it?
Why does Samsung's marketing say "G-Sync Compatible", then?

In the monitor settings I can now see "FreeSync Premium: On". Does that mean it's active, even though I don't have an AMD graphics card?

So FreeSync still works with the factory HDMI 2.0 cable? But it's partly software and not fully hardware? I know the difference: I used a 56k modem to connect to the internet, and the hardware one was better; you didn't have lag back in the day.

I don't see a DisplayPort 2.0 option, only HDMI 2.0, which I think I already have from the factory. I can only see DisplayPort 1.4 as a maximum setting, or 1.2.
 

rulerss

I have found something:


It says that G-Sync Compatible doesn't have the hardware module, but still works.

I guess I'll need that cable that was missing. Thanks again for your help.

You can close the thread now.
 

Eximo

You said that I don't have true G-Sync? Because there's no hardware chip in my monitor? It's only software, like an emulation? Only the 1660 Ti card provides it?
Why does Samsung's marketing say "G-Sync Compatible", then?

In the monitor settings I can now see "FreeSync Premium: On". Does that mean it's active, even though I don't have an AMD graphics card?

So FreeSync still works with the factory HDMI 2.0 cable? But it's partly software and not fully hardware? I know the difference: I used a 56k modem to connect to the internet, and the hardware one was better; you didn't have lag back in the day.

I don't see a DisplayPort 2.0 option, only HDMI 2.0, which I think I already have from the factory. I can only see DisplayPort 1.4 as a maximum setting, or 1.2.

Correct. Your monitor has DisplayPort, which supports variable refresh rate technology on its own. Nvidia can use this even without the G-Sync module; they introduced the module to control the entire process and sell it at a premium.

Monitor settings menus vary, but it won't work unless you enable G-Sync Compatible in the Nvidia Control Panel.

Yes, if you had an AMD GPU you could use FreeSync over HDMI. Basically, the driver does all the work; a small off-the-shelf chip wired up to the monitor's scaler controls the vertical blanking and gets its signal over HDMI.

I do vaguely recall software-based 56K modems and the AMR/CNR slots on cheap motherboards, but actual modems were so cheap by that point that I have no idea why they bothered.

You would just buy a DisplayPort 2.0 cable; it doesn't matter what version your monitor has, or your GPU for that matter. The higher-bandwidth cable supports all the previous versions.
 

rulerss

Yes, I understand now. Thanks for explaining how this new technology works.

Back in the day, there were software modems that fitted into a motherboard slot, and there were hardware modems, some of them external. The software ones were bad because they had lag. I know because I had one, and when I switched to a hardware modem, everything was fine.

The original question of this thread was whether I should upgrade to an AMD or an Nvidia graphics card in the future, given the new monitor I bought. I think the answer is to stick with Nvidia and get the new DisplayPort cable.

I don't know how to mark your answer as the solution and close the thread.