Determining display native resolution

ScottForum

Distinguished
I can't seem to get fonts looking right on my monitor, and I suspect it's because my computer thinks the monitor's native resolution is 1920x1080, even though I'm using the equipment below. Why is my monitor only recognized as having a 1920x1080 native resolution?

Monitor: Sharp N6100U 4K TV, capable of 3840x2160 RGB resolution.
Graphics card: NVIDIA GeForce GT 730, capable of 3840x2160 resolution.
HDMI cable: 2.0, capable of 4K/UHD


Barty1884

Retired Moderator
Just a guess here, but potentially down to refresh rate.

Your TV is 4K @ 60Hz.

The 730 doesn't have HDMI 2.0 (it's 1.4 I believe), so cannot provide 4K @ 60Hz. Best it can do is 4K @ 30Hz.

Depending on how the TV actually works*, it may not be capable of matching the refresh rate.

As a result, it's settling on the highest mode the two have in common.
The TV can do 1080p @ 60Hz and the card can provide it, so that's what it's detecting.

*I'm thinking of something comparable to how the new 8K monitors will only run their full resolution/refresh rate from Apple products, due to how they detect and work with the signal.

Again, just a guess.
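
If you want to confirm which modes Windows is actually being offered over that connection, a rough sketch like the one below will list them. It assumes you're on Windows and uses the pywin32 package (pip install pywin32); adjust the device index if more than one monitor is attached.

import win32api

# First display adapter (index 0); change this if more than one is attached.
device = win32api.EnumDisplayDevices(None, 0)

# Walk the mode list Windows exposes for that device and collect
# unique (width, height, refresh rate) combinations.
modes = set()
i = 0
while True:
    try:
        dm = win32api.EnumDisplaySettings(device.DeviceName, i)
    except Exception:
        break  # no more modes to enumerate
    modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
    i += 1

# Print from largest to smallest. If 3840x2160 only shows up at 30Hz
# (or not at all), the link itself (GPU output, cable, dock, or KVM)
# is the limit, not the TV.
for w, h, hz in sorted(modes, reverse=True):
    print(f"{w}x{h} @ {hz}Hz")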
 

ScottForum

Distinguished


Thanks for the suggestion. A process of elimination that I should have done beforehand... it seems either my Dell E-Port dock or my KVM switch is to blame, since a direct connection from the laptop to the TV/monitor allows higher resolutions.
