[SOLVED] How do I output 4K to my 4K Ultra HD TV from my new workstation with a Palit GeForce RTX 2080 Ti / TR3 system?

spikeysonic · Prominent · Jul 23, 2018
I have just had a workstation built.

Threadripper 3960X, with a Palit GeForce RTX 2080 Ti graphics card and 64 GB RAM, so it has plenty of grunt.

I have a Samsung UE 6400 40-inch 4K TV, and with the new workstation I got a BenQ PD3200Q QHD colour-accurate monitor.

The graphics card has four DisplayPorts. I have a DisplayPort connection to the BenQ monitor, which is displaying at 2560 x 1440.


I got a DisplayPort-to-HDMI cable to connect the TV, which was previously connected to my laptop with a plain HDMI cable.


So why is the 4K TV, which should have the highest resolution, only displaying at 1920 x 1080 standard Full HD, as opposed to 4K on the TV and QHD on the monitor?


How do I get the TV displaying 4K Ultra HD resolution in HDR, and the monitor QHD in HDR?
 

spikeysonic · Prominent · Jul 23, 2018
I've added the manufacturer spec info for both displays.

I think this is the graphics card:
https://www.scan.co.uk/products/palit-geforce-rtx-2080-ti-gamingpro-oc-11gb-gddr6-vr-ready-graphics-card-4352-core

http://www.palit.com/palit/vgapro.php?id=3006



https://www.benq.eu/en-uk/monitor/designer/pd3200q.html

BenQ PD3200Q

Display 1, connected to NVIDIA GeForce RTX 2080 Ti
Desktop resolution: 2560 x 1440
Active signal resolution: 2560 x 1440
Refresh rate: 59 Hz
Bit depth: 8-bit (I think that should be 10-bit)
Colour format: RGB
Colour space: Standard Dynamic Range (SDR)




Display 2 (Samsung), connected to NVIDIA GeForce RTX 2080 Ti
Desktop resolution: 1920 x 1080
Active signal resolution: 1920 x 1080
Refresh rate: 60 Hz
Bit depth: 8-bit (should be 10-bit)
Colour format: RGB
Colour space: Standard Dynamic Range (SDR)


Does this help?

In theory I should have a QHD and a UHD display, both with 10-bit HDR.
 

spikeysonic · Prominent · Jul 23, 2018
Quote:
"Are you sure the DisplayPort to HDMI converter can handle the bandwidth needed? And have you set the correct output resolution and enabled HDR in Windows Display Settings and/or Nvidia Control Panel?"

No idea on the cable/converter, I just bought one from a phone shop. What should I look for in the cable in terms of bandwidth?

How do I set HDR in the display settings, and do I need Nvidia software to do it on the graphics card? Bit clueless here, I'm used to a crappy laptop. This is my first modern system and I'm still setting it up; I only got it Thursday evening.
 
Quote:
"No idea on the cable/converter, I just bought one from a phone shop. What should I look for in the cable in terms of bandwidth?"

The requirements should be listed on the packaging for both the adapter and any separate cable you may be using with it: explicit 4K / HDR capability. Older, lower-bandwidth revisions may not support either.
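If you want to sanity-check the numbers yourself, here is a minimal back-of-envelope sketch. The figures are my own assumptions, not from any packaging: pixel clocks are the standard CTA/VESA timings, and the link rates are approximate effective payload capacities after encoding overhead.

```python
# Rough check: does an uncompressed video mode fit a given link's payload capacity?

def data_rate_gbps(pixel_clock_mhz, bits_per_channel, channels=3):
    """Uncompressed RGB / YCbCr 4:4:4 video data rate in Gbit/s."""
    return pixel_clock_mhz * bits_per_channel * channels / 1000

# Effective payload capacity in Gbit/s (raw link rate is higher; 8b/10b
# encoding costs 20%).
links_gbps = {
    "HDMI 1.4": 8.16,        # 10.2 Gbit/s raw
    "HDMI 2.0": 14.40,       # 18.0 Gbit/s raw
    "DP 1.2 (HBR2)": 17.28,  # 21.6 Gbit/s raw
}

# Standard pixel clocks: ~241.5 MHz for QHD at 60 Hz, 594 MHz for 4K at 60 Hz.
modes_gbps = {
    "2560x1440 @ 60 Hz, 8-bit": data_rate_gbps(241.5, 8),
    "3840x2160 @ 60 Hz, 8-bit": data_rate_gbps(594.0, 8),
    "3840x2160 @ 60 Hz, 10-bit": data_rate_gbps(594.0, 10),
}

for mode, need in modes_gbps.items():
    fits = [name for name, cap in links_gbps.items() if cap >= need]
    print(f"{mode}: needs {need:.2f} Gbit/s -> fits: {', '.join(fits) or 'none'}")
```

The takeaway: QHD at 60 Hz (~5.8 Gbit/s) fits almost anything, but 4K60 8-bit RGB (~14.3 Gbit/s) only just fits HDMI 2.0, so an adapter built to an older spec will fall back to 1080p or 30 Hz. 4K60 at 10-bit RGB (~17.8 Gbit/s) doesn't fit HDMI 2.0 at all, which is why 4K60 HDR over HDMI normally uses chroma subsampling.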

Quote:
"How do I set HDR in the display settings, and do I need Nvidia software to do it on the graphics card?"

You should be able to set it in either, but I would make sure it is set in both.

Right-click on the Windows desktop > Display settings.

Download the standard driver package corresponding to your GPU via Nvidia's Advanced Driver Search, then go through each page of options in Nvidia Control Panel and make sure everything is set correctly for both displays.
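If you want to double-check what each output is actually running at (before and after changing anything), here is a minimal Windows-only sketch using Python's standard ctypes module and the Win32 EnumDisplayDevices / EnumDisplaySettings calls. Note it reports the desktop mode; the per-channel output depth (8-bit vs 10-bit) is only shown in Windows' advanced display info or in Nvidia Control Panel.

```python
# List each active display's current resolution and refresh rate (Windows only).
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32
ENUM_CURRENT_SETTINGS = -1              # ask for the mode in use right now
DISPLAY_DEVICE_ATTACHED = 0x00000001    # DISPLAY_DEVICE_ATTACHED_TO_DESKTOP

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

class DEVMODEW(ctypes.Structure):
    # Flattened layout of the Win32 DEVMODEW struct (display union members only).
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

i = 0
dd = DISPLAY_DEVICEW()
dd.cb = ctypes.sizeof(dd)
while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
    if dd.StateFlags & DISPLAY_DEVICE_ATTACHED:
        dm = DEVMODEW()
        dm.dmSize = ctypes.sizeof(dm)
        if user32.EnumDisplaySettingsW(dd.DeviceName, ENUM_CURRENT_SETTINGS,
                                       ctypes.byref(dm)):
            print(f"{dd.DeviceName} ({dd.DeviceString}): "
                  f"{dm.dmPelsWidth}x{dm.dmPelsHeight} "
                  f"@ {dm.dmDisplayFrequency} Hz")
    i += 1
```

If the Samsung still shows 1920x1080 here after you've set 3840x2160 in Display settings, the link (cable, adapter, or port) is the limiting factor, not Windows.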
 

spikeysonic · Prominent · Jul 23, 2018
Quote:
"Why not use a true HDMI cable?"

Very simple: I thought the fancy new workstation graphics card only had DisplayPort outputs. Following your post I had a much closer look and, feeling a bit of a tit, found that one of the ports was an HDMI port, so it's fixed with said cable. :) I thought all four were DisplayPort.
 
