[SOLVED] HDMI: no 4K picture; DVI: 4K picture. Is that right?

Jan 21, 2019
Hi, strange one here.
I have a Dell XPS 8300 desktop:
Intel i5
8GB RAM
OS running off an SSD
I've just put a GeForce GTX 650 graphics card in (it's better than the stock one, and it was free).
I have an LG 4K TV and a Panasonic HD Ready 720p TV hooked up. The problem I have is that I can't have the LG TV hooked up via HDMI: the TV says "unsupported format", though I had a picture until I ran GeForce Experience for drivers.
Now I have the LG TV hooked up by a DVI-to-HDMI adapter with a 4K HDMI cable to the TV, and it's running 4K at 60Hz, which I didn't think DVI could run that high, and the other TV is on HDMI. Does anyone have any ideas why I can't get a picture through HDMI on the 4K TV?
Oh, and with HDMI on the 4K TV the picture is there until I log on, when it loads the user settings.

Thanks for reading
 

Satan-IR

It depends on the HDMI version. You can get 4K over HDMI from HDMI 1.4 onward (at 30Hz). For 4K at 60Hz you need HDMI 2.0 or later.

The manufacturer's specification page here says the card has "3840x2160 at 30Hz or 4096x2160 at 24Hz supported over HDMI".

That DVI-to-HDMI adapter would have to be an active one that converts the signal.
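
To put rough numbers on that, here's a quick pixel-clock check (a Python sketch; the timing totals are the standard CEA-861 4K values, assumed rather than read from your TV) showing why 4K at 30Hz fits HDMI 1.4's 340 MHz TMDS limit while 4K at 60Hz needs HDMI 2.0:

# Why 4K@30 fits HDMI 1.4 but 4K@60 needs HDMI 2.0: compare the required
# TMDS pixel clock against each version's maximum. Timing totals are the
# standard CEA-861 values for 3840x2160 (an assumption; check your EDID).
H_TOTAL, V_TOTAL = 4400, 2250      # 3840x2160 active plus blanking

HDMI_MAX_MHZ = {"HDMI 1.4": 340.0, "HDMI 2.0": 600.0}

for hz in (30, 60):
    clock_mhz = H_TOTAL * V_TOTAL * hz / 1e6
    fits = [name for name, mhz in HDMI_MAX_MHZ.items() if clock_mhz <= mhz]
    print(f"3840x2160@{hz}Hz needs ~{clock_mhz:.0f} MHz -> ok on: {', '.join(fits) or 'neither'}")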
 
Jan 21, 2019
Thank you very much for your reply. I knew about HDMI 1.4 onwards for 4K; I had to upgrade cables when I got the TV. I'll try lowering the Hz when I get a chance later, to see if I get an image.
The bit that has stumped me is that, as far as I'm aware, I should not get a 4K resolution option through DVI regardless of the adapter that's in it; the control panel settings should stop at the highest DVI can do, like my other TV stops at 720p. So is it really 4K? The image is crisp.
Not as good as Sky Q or an Xbox One X, but I'm not expecting it to be, as it's an old GPU.
Is it worth trying to get the HDMI going, since it says it's putting 4K through DVI, or is that false? Would I get a better picture?
I won't be 4K gaming on it, only movies, TV etc.
Sorry if I come across as picky; I'm just trying to get the best out of what I've got and learn as much as possible at the same time.
Thank you

Satan-IR

No problem, glad to help. As far as I know, the maximum resolution you can get through a dual-link DVI port with a dual-link cable is 2560x1600 at 60Hz.

I don't think it can do 4K at 60Hz over a single dual-link DVI connection.

Maybe it is working at a lower refresh rate; I have seen 3840x2160 at 30Hz done over a single dual-link DVI connection with a DVI-to-HDMI adapter.
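
The same arithmetic works for DVI (again a Python sketch; the mode clocks are the usual CVT-RB/CEA-861 figures, not measurements): dual-link DVI tops out around 2x165 MHz, which is why 4K at 30Hz squeaks through but 4K at 60Hz cannot:

# Dual-link DVI is limited to roughly 2 x 165 MHz of TMDS pixel clock.
# Comparing common mode clocks (assumed CVT-RB/CEA values) shows why
# 2560x1600@60 and 4K@30 fit but 4K@60 does not.
DUAL_LINK_DVI_MAX_MHZ = 2 * 165.0

modes_mhz = {
    "2560x1600@60 (CVT-RB)": 268.5,
    "3840x2160@30 (CEA-861)": 297.0,
    "3840x2160@60 (CEA-861)": 594.0,
}
for mode, clock in modes_mhz.items():
    verdict = "fits" if clock <= DUAL_LINK_DVI_MAX_MHZ else "exceeds"
    print(f"{mode}: {clock} MHz {verdict} the ~{DUAL_LINK_DVI_MAX_MHZ:.0f} MHz dual-link limit")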

For the time being the reliable port-and-cable combo for 4K at 60Hz is DisplayPort, at least until HDMI 2.1 hardware is widespread.

What is your system's graphics card?
 
Jan 21, 2019
That's what I thought, but the GeForce control panel says it's running 4K at 60Hz (strange).
The graphics card is an EVGA GeForce GTX 650, I think 1GB GDDR5. It's an old card, but better than the stock one, which maxed out at 1080p on all outputs.
Could I be right in thinking that the DVI port can put out 4K and it's the cable that can't? Like audio: a DVI cable can't carry audio, but the port can through an HDMI adapter. Could the same apply to resolution? My adapter goes straight into the graphics card with no DVI cable, then a 4K HDMI cable.
The adapter is nothing special.
Thank you for your help

 

Satan-IR

No problem, glad to help.

As you said too, it can't be 4K at 60Hz. The manufacturer's specs page for that card says its max resolution via the digital output (DVI) is "Max Digital: 2560x1600 (Dual Link DVI Only)".

I'm not quite sure, but the NVIDIA control panel might be showing information it's getting from the display/TV in the EDID format.

EDID is a metadata format that display devices (TVs, monitors, etc.) use to tell the video source (graphics card) driving them what capabilities they have: supported timings, display size, luminance data, filter type and, for digital displays, pixel mapping data.
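
If you're curious what the TV is actually advertising, here's a minimal sketch that reads the raw EDID and decodes the first detailed timing descriptor (Python, assuming Linux; the connector name in the path is hypothetical, so list /sys/class/drm/ to find yours). That descriptor normally carries the display's preferred/native mode:

# Minimal EDID peek. Decodes the first 18-byte detailed timing descriptor,
# which usually holds the panel's preferred mode - the kind of data a driver
# control panel may echo back regardless of what the link can carry.
from pathlib import Path

edid = Path("/sys/class/drm/card0-HDMI-A-1/edid").read_bytes()  # hypothetical connector name
assert edid[:8] == bytes.fromhex("00FFFFFFFFFFFF00"), "not a valid EDID block"

d = edid[54:72]                                  # first detailed timing descriptor
pixel_clock_mhz = int.from_bytes(d[0:2], "little") / 100   # stored in 10 kHz units
h_active = d[2] | ((d[4] & 0xF0) << 4)
v_active = d[5] | ((d[7] & 0xF0) << 4)
print(f"Display's preferred mode: {h_active}x{v_active}, pixel clock {pixel_clock_mhz:.2f} MHz")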

 
Solution