I recently got a new PC with a 5900x and RTX 3090.
When hooking it up to HDMI port 4 on my Samsung Q90T it worked fine at first.
But when I switch the display settings in the Nvidia Control Panel to 3840x2160 at 10-bit RGB Full, I encounter weird display-loss issues.
Keep in mind I also have Input Signal Plus enabled in my TV settings, as it is the only way to enable 10-bit and HDR.
For example, at those settings with HDR enabled, I will boot up FF15 and it works fine in borderless mode, but switching to fullscreen corrupts the image: the bottom three quarters of the screen are black while the top quarter displays visual garbage. Switching to 8-bit RGB Full HDR mode displays FF15 fine at fullscreen, but not in 10-bit.
If this were a bandwidth issue, then borderless mode should also fail at 10-bit, but it doesn't.
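For reference, a rough back-of-the-envelope estimate of the raw data rates involved (this ignores blanking intervals and FRL/TMDS encoding overhead, so the real link-rate requirements are somewhat higher than these numbers):

```python
# Rough uncompressed pixel data-rate estimate for a given video mode.
# NOTE: this is a simplification -- it ignores HDMI blanking intervals
# and encoding overhead, so actual link requirements are higher.

def data_rate_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    """Approximate raw pixel data rate in Gbit/s (RGB = 3 channels)."""
    bits_per_pixel = bits_per_channel * channels
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K 60Hz 10-bit RGB: already past what HDMI 2.0 can carry in practice,
# which is why the TV requires Input Signal Plus (HDMI 2.1 mode) for it
print(round(data_rate_gbps(3840, 2160, 60, 10), 1))   # ~14.9 Gbps raw

# 4K 120Hz 10-bit RGB: well within a 48 Gbps HDMI 2.1 link even with overhead
print(round(data_rate_gbps(3840, 2160, 120, 10), 1))  # ~29.9 Gbps raw
```

Even the 4K 120Hz 10-bit mode fits comfortably under a 48 Gbps HDMI 2.1 link, which supports the idea that this is a handshake/mode-switch problem rather than raw bandwidth.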
HDR seems to be working fine even in borderless mode in FF15...one of the few games that displays HDR well.
Also, if the PC sits idle long enough that the display turns off, the display signal is lost until I unplug the HDMI cable and plug it back in. Doing this resets the Input Signal Plus setting to off. Re-enabling Input Signal Plus brings back the options, and everything works fine until the above issue happens again.
Also, I was able to get 120Hz to work without enabling Game Mode by going into Display Settings (right-click on the desktop) instead of using the Nvidia Control Panel. Under Advanced Display Settings I was able to enable 120Hz, and then enable 10-bit RGB Full mode in the Nvidia Control Panel.
The only problem with 120Hz at these settings is that certain games still have issues. Crysis Remastered will outright drop the display signal on startup with these settings. The only way to fix the signal drop was to unplug and replug the cable, which automatically sets Input Signal Plus to off.
The HDMI cable I got (link below) is advertised as having the full 48 Gbps bandwidth and should handle 4K 120Hz fine. Not sure if this problem is specifically the cable.
https://www.amazon.com/dp/B081SQXPWB?psc=1&ref=ppx_yo2_dt_b_product_details