TV HDMI slot differences and impact on display resolution

George3356

Nov 18, 2016
Hey guys!

I would like to ask for help regarding my TV - Samsung UE55KU6172 settings.

My setup is PC->HDMI->TV.

The TV has 3 HDMI ports: HDMI 1/DVI, HDMI 2 and HDMI 3 (ARC).

When I plug the cable into HDMI 2 or 3, I get a picture and it works fine - I can set 4096x2160 or 3840x2160 (the latter shows as recommended, so I use it). And most importantly, I can select a 60 Hz refresh rate - at 30 Hz it is just terrible, a sore for the eyes :)

So far I have therefore been using this (HDMI2/3).

However, I have now tried the HDMI 1/DVI port, and it seems to "unlock" new options such as UHD, etc. The catch is that the only resolution offered is 4096x2160, and only at 30 Hz - unplayable, uncontrollable. And when I try to enable the UHD option in Windows or change the resolution to anything else, I get a black screen; after a while the change seems to cancel itself, the picture comes back, and UHD is still not enabled.

Could anyone please try to explain why this is happening, whether it is "normal", and possibly recommend the best settings on the PC/TV side to get the most out of the TV? My GPU is an MSI GTX 1070 Ti - I bought it specifically to get the most out of the new TV...

Oh, and a second thing - the display information in Windows says bit depth 8-bit and standard dynamic range (SDR) - but I read somewhere that the TV should be capable of much more?

Thanks a ton!!
George
 
TVs and PCs have never really played nice with each other. It's much better than it used to be (I run an LG OLED 55B6P myself), but you usually need to play with a setting or two on the TV end to make everything cooperate.

First off: if your TV lets you set a name for each HDMI input, make sure to select "PC" for whichever input the PC is using. That generally fixes the majority of problems, since it tells the TV that a PC signal is coming from the other end of the connection.

As for the resolution, specifically UHD: HDMI 2.0 simply doesn't have enough bandwidth for HDR at high resolutions. At 4K, HDR content is limited to 30 Hz when uncompressed 4:4:4 chroma is used; that's probably what you're seeing on that input. HDMI 2.0 does have enough bandwidth for 4K HDR @ 60 Hz with 4:2:2 chroma subsampling, which is typically how I have my PC set up with my TV. You can adjust these settings from within your GPU's control panel. [Note: HDR on Windows is VERY finicky right now; I leave it off within Windows and only enable it for games that need it.]
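The bandwidth limit above can be sanity-checked with rough back-of-envelope numbers - this is a sketch, assuming the standard 594 MHz pixel clock for 4K @ 60 Hz (CTA-861 timing, blanking included) and HDMI 2.0's ~14.4 Gbps effective data rate after 8b/10b TMDS encoding:

```python
# Back-of-envelope HDMI 2.0 bandwidth check (all figures are approximations).
PIXEL_CLOCK_4K60 = 594e6          # Hz, CTA-861 timing for 3840x2160 @ 60 Hz
HDMI20_EFFECTIVE = 18e9 * 8 / 10  # ~14.4 Gbps usable after 8b/10b encoding

def bits_per_pixel(bit_depth, subsampling):
    """Average bits per pixel for a given chroma subsampling scheme."""
    # 4:4:4 carries 3 full samples per pixel; 4:2:2 averages 2; 4:2:0 averages 1.5.
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    return bit_depth * samples

for depth, sub in [(8, "4:4:4"), (10, "4:4:4"), (10, "4:2:2")]:
    rate = PIXEL_CLOCK_4K60 * bits_per_pixel(depth, sub)
    verdict = "fits" if rate <= HDMI20_EFFECTIVE else "exceeds"
    print(f"{depth}-bit {sub}: {rate / 1e9:5.2f} Gbps -> {verdict} HDMI 2.0 limit")
```

The arithmetic shows why 4K60 HDR (10-bit) at 4:4:4 won't go through an HDMI 2.0 link, while 8-bit 4:4:4 SDR and 10-bit 4:2:2 HDR both fit - which matches the 30 Hz limit you're hitting.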

Also, there's usually a setting on the TV [usually 'Deep Color Settings' or some other nonsense like that] that needs to be enabled per HDMI input before HDR content will work; check your TV's manual for instructions on how to do it for your specific model.