[SOLVED] HDMI 2.1 signal loss issues with RTX 3090

IronyTaken

Reputable
Jul 10, 2020
31
1
4,535
I recently got a new PC with a 5900X and an RTX 3090.
When hooking it up to HDMI port 4 on my Samsung Q90T, it worked fine at first.
But when I switch the display settings in the Nvidia Control Panel to 3840x2160 at 10-bit RGB Full, I run into weird display loss issues.
Keep in mind I also have Input Signal Plus enabled in my TV settings, as it is the only way to enable 10-bit and HDR.

For example, at those settings with HDR enabled, I will boot up FF15 and it will work fine in borderless mode, but switching to fullscreen messes up the image: the bottom three quarters of the screen are black while the top quarter shows weird visual garbage. Switching to 8-bit RGB Full HDR mode displays FF15 fine in fullscreen, just not in 10-bit.
If this were a bandwidth issue then even borderless mode should fail at 10-bit, but it doesn't (see the rough math below).
HDR itself seems to be working fine even in borderless mode in FF15...one of the few games that displays HDR well.
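For what it's worth, here's the back-of-the-envelope math I'm basing that on. This is my own rough sketch, not anything official: it assumes the standard CTA-861 4K timing (4400x2250 total pixels including blanking) and the commonly quoted effective rates after encoding overhead (~14.4Gbps for HDMI 2.0 TMDS after 8b/10b, ~42.7Gbps for HDMI 2.1 FRL after 16b/18b):

Code:
# Rough HDMI payload math for 3840x2160 modes. Assumptions (mine, not from
# this thread): CTA-861 4K timing of 4400x2250 total pixels with blanking;
# HDMI 2.0 = 18Gbps raw, ~14.4Gbps effective; HDMI 2.1 = 48Gbps raw,
# ~42.7Gbps effective.
H_TOTAL, V_TOTAL = 4400, 2250

def payload_gbps(refresh_hz: int, bits_per_channel: int) -> float:
    """Uncompressed RGB video payload in Gbit/s, blanking included."""
    return H_TOTAL * V_TOTAL * refresh_hz * bits_per_channel * 3 / 1e9

for hz in (60, 120):
    for depth in (8, 10):
        rate = payload_gbps(hz, depth)
        print(f"4K {hz}Hz {depth}-bit RGB: {rate:5.1f} Gbps | "
              f"fits HDMI 2.0: {rate <= 14.4} | fits HDMI 2.1: {rate <= 42.7}")

That puts 4K 120Hz 10-bit RGB at roughly 36Gbps, comfortably inside HDMI 2.1's effective rate, so a link that can genuinely do 48Gbps shouldn't care whether the game is borderless or fullscreen.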
Also, if the PC idles long enough for the display to turn off, the display signal is lost until I unplug the HDMI cable and plug it back in. Doing this resets the Input Signal Plus setting to off. Re-enabling Input Signal Plus brings back the options and everything works fine until I hit the above issue again.

Also, I was able to get 120Hz to work without enabling Game Mode by going into the display settings from right-clicking on the desktop instead of using the Nvidia Control Panel. Under advanced display settings I was able to enable 120Hz, and then enable 10-bit RGB Full mode in my Nvidia Control Panel settings (a scriptable version of this is sketched below).
The only problem with using 120Hz at these settings is that certain games still have issues. Crysis Remastered will outright drop the display signal when starting up with these settings. The only way to fix the signal drop was to unplug the cable and plug it back in, which automatically sets Input Signal Plus to off.
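Side note in case anyone wants to script that mode change instead of clicking through the dialogs: here's a rough, untested Windows-only Python/ctypes sketch of the same switch. The 3840x2160 @ 120 values are just my target mode, and the DEVMODE union is flattened to same-size display fields:

Code:
# Rough sketch (mine, untested beyond my reading of the API): switch the
# primary display to 3840x2160 @ 120Hz via the same Win32 call the
# "Advanced display settings" page uses.
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

# Constants from winuser.h / wingdi.h
ENUM_CURRENT_SETTINGS = -1
DM_PELSWIDTH = 0x00080000
DM_PELSHEIGHT = 0x00100000
DM_DISPLAYFREQUENCY = 0x00400000
CDS_TEST = 0x00000002
DISP_CHANGE_SUCCESSFUL = 0

class DEVMODE(ctypes.Structure):
    # DEVMODEW with the printer/display union flattened into
    # equivalently sized display fields.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

dm = DEVMODE()
dm.dmSize = ctypes.sizeof(DEVMODE)
if not user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm)):
    raise RuntimeError("could not read the current display mode")

dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency = 3840, 2160, 120
dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY

# CDS_TEST asks the driver whether the mode is valid without applying it.
if user32.ChangeDisplaySettingsW(ctypes.byref(dm), CDS_TEST) == DISP_CHANGE_SUCCESSFUL:
    user32.ChangeDisplaySettingsW(ctypes.byref(dm), 0)  # apply for real
else:
    print("3840x2160 @ 120Hz was not reported as a valid mode")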

The HDMI cable I got (link below) claims the full 48Gbps of bandwidth and should run 4K 120Hz fine. I'm not sure if this problem is specifically the cable.
https://www.amazon.com/dp/B081SQXPWB?psc=1&ref=ppx_yo2_dt_b_product_details
 

IronyTaken

Reputable
Jul 10, 2020
31
1
4,535
Here is a screenshot of what the display looks like:
[Screenshot: 20210525-022315.jpg]

This is what happens if I have the Nvidia Control Panel display settings set to 3840x2160 with 10-bit RGB Full. If I have Final Fantasy 15 set to borderless mode it works fine, but switching to fullscreen leaves the display looking like it does in the picture.
Going down to 8-bit fixes the issue, but this cable is supposed to support 48Gbps.
 

smartmantech47

Reputable
Jan 30, 2021
447
19
4,695
Either the monitor or the GPU is the problem. Put the GPU in another system and keep the same monitor. If it still doesn't work, swap the monitor: if it works after that, the monitor is the problem; if not, it's most probably the GPU.
 

smartmantech47

Reputable
Jan 30, 2021
447
19
4,695
With my 3080 and C9, the first 2.1 cable I had just wasn't good enough even though it claimed 48Gbps. I ended up replacing it with a cable that had good reviews from people actually running 4K 120Hz HDR. However, my problem was the screen just not displaying, or if it did display, intermittent loss of signal that sometimes required a hard reset of the PC and switching the TV off and on to get the signal back. From the little reading I did, it seems many cables claiming 2.1 48Gbps are just not actually capable of it.
 
Solution

IronyTaken

Reputable
Jul 10, 2020
31
1
4,535
Swap the cable. I saw reviews and there are lots of problems with it. Plus, no one would spend 180 dollars on a cable. Try this one: Zeskit Maya 8K 48Gbps Certified Ultra High Speed HDMI Cable, 6.5ft (Amazon.com). Even if it doesn't work, it's cheap. Not 180 bucks.

The reason the cable is so expensive is the length. Most cables beyond 15 feet won't properly carry a signal unless they're a certain type of cable.
My PC is in another room, so I just run a 30-foot cable to my living room TV.
I doubt it's an issue with the video card, since I was playing Final Fantasy 15 fine at 4K 120Hz 10-bit RGB Full with HDR on.
The only reason I switched back to 60Hz was to get Crysis Remastered working. If I play Crysis at 4K 120Hz, the TV loses signal when starting the game until I remove the HDMI cable and plug it back in. Removing and re-plugging the cable turns off the Input Signal Plus setting on my TV. Input Signal Plus allows the higher-bandwidth settings, so without it enabled the highest setting available is 4K 60Hz RGB Limited.
Crysis Remastered would work fine at the lowered display output, but switching Input Signal Plus back on would automatically set it back to 4K 120Hz, and the game would once again lose signal until I unplugged and re-plugged the HDMI cable.

It's really weird as some games work fine while others do not.

Final Fantasy 15 at 4K 60Hz 8-bit runs fine, but switching from fullscreen to borderless leaves no audio coming from the speakers despite the audio settings not changing.
Going back to fullscreen re-enables the audio.
 

IronyTaken

Reputable
Jul 10, 2020
31
1
4,535
I tried another cable, this time bringing my PC closer so I could use a shorter 8K HDMI 2.1 cable.
https://www.amazon.com/dp/B08QTNNRS4?psc=1&ref=ppx_yo2_dt_b_product_details

Still had the same issue in FF15, so I think I can rule out the cable being the issue.
I even tried putting HDMI 4 in Game Mode to see if that would help, and it didn't.

So it's either the Samsung Q90T display, the video card, or a software/driver issue.
I've heard of drivers or Windows updates causing display issues, so hopefully it's that and not my TV or video card.
I just got my exchange TV back after over a month of waiting, since my old Samsung Q90T display was covered in dead pixels.
Unfortunately I don't have another 3000-series card to test it with.
I only have my laptop with a 2070 Super. As far as I know, only the 3000 series supports HDMI 2.1.

My friend has the same TV as me, so the next thing I suppose I'll do is make the 40-minute drive there when I have free time and test his TV.
 
IronyTaken said: (quoting the post above)
I have an LG CX and use the same cables you linked, with 120Hz, 10-bit, HDR, and G-Sync, and have no problems. The only issue I very occasionally get is G-Sync causing the CX to do a weird flicker. A quick reset of G-Sync, either on the CX or in the Nvidia Control Panel, fixes it for me.
 
While it could be the cable, the fact that the issue only shows up in fullscreen makes me think either the display or the GPU/driver is at fault. The fact that you are losing the HDMI signal when the PC goes to sleep makes me think something is going wrong with the handshaking somewhere; LG had similar issues last year when NVIDIA first backported HDMI 2.1 support.
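If you want to sanity-check what the TV advertised during that handshake, you can dump the EDID that Windows caches in the registry. Here's a rough Python sketch (it assumes the usual Enum\DISPLAY location; I haven't tested it against a Q90T specifically):

Code:
# Rough sketch: dump every EDID Windows has cached for attached displays,
# to check what the TV reported during the HDMI handshake. Assumes the
# standard HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY location.
import winreg

ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    """Yield all subkey names of an open registry key."""
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
            i += 1
        except OSError:
            return

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as display:
    for monitor_id in subkeys(display):
        with winreg.OpenKey(display, monitor_id) as mon:
            for instance in subkeys(mon):
                try:
                    with winreg.OpenKey(mon, instance + r"\Device Parameters") as p:
                        edid, _ = winreg.QueryValueEx(p, "EDID")
                except OSError:
                    continue  # no cached EDID for this instance
                # Byte 126 of the base block is the extension count; HDMI
                # capability data lives in a CTA-861 extension (tag 0x02).
                ext_count = edid[126] if len(edid) > 126 else 0
                has_cta = len(edid) > 128 and edid[128] == 0x02
                print(f"{monitor_id}\\{instance}: {len(edid)} bytes, "
                      f"{ext_count} extension(s), CTA-861 block: {has_cta}")

If the CTA-861 block (or the whole EDID) looks different after a sleep/wake cycle, that points at the handshake rather than the cable.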

For something like this, your best bet is probably to either ping NVIDIA/Samsung or head over to somewhere like AVSForum and see if any Q90T owners are reporting issues. You'd think everyone would have worked out the HDMI 2.1 kinks by now, but it wouldn't shock me if Samsung in particular still has a few bugs to iron out.

The only thing I can really recommend is trying a different display driver (even an older one, just for testing) to see if it's a driver issue. I don't see anything else obvious it could be.
 