[SOLVED] HDMI refresh rate can't go higher than 100hz ?

Nov 8, 2021
Hi team,
Happy New Year!:)
I'm currently testing an Asus RX580 8GB OC connected through an HDMI cable to one monitor.
In Windows display settings I see 60Hz, 100Hz, 120Hz and 144Hz options. At 60Hz and 100Hz everything works fine.
At 120Hz the screen flickers and has a sort of "grainy look", like video artifacts (black and white pixels flickering). At 144Hz the screen turns black and occasionally flashes a very grainy version of the image for a second.
At both 120Hz and 144Hz I cannot maintain the refresh rate and have to go back to 100Hz (the highest stable rate).

What could be causing this? The monitor is definitely 144Hz (I have used it with another video card with 144Hz). Is it possible the HDMI cable I have is too old? I know that different HDMI versions are important but can't tell if this could be the case. Any other alternatives?


Sidenote: I'm also having a weird problem with the DisplayPort connection, where the same video card does not push any video out at all. I tried both HDMI ports instead and they work correctly (except for the aforementioned refresh rate issue). Any clues why the DP connection isn't pushing video out?

Best,
 
Nov 8, 2021
Thanks Lutfij.
I can confirm the card was not used for mining (home use only).
Model of monitor: AOC G2590PX
I used the same monitor with another card and 144Hz worked.

Is there a test I can do to check if the card is half dead? I ran FurMark and it ran correctly for about 10 minutes without any crazy temperatures.

Best regards,
 

InvalidError

Titan
Moderator
Increasing the fixed refresh rate adds no meaningful load to the GPU, so that shouldn't be a factor.

Since everything works fine at lower refresh rates, it is most likely a flaky cable - it works fine until you attempt to push more bandwidth through it than it can reliably handle. Different input/output ports also vary in their ability to cope with sub-optimal cables and termination at the far end, so the GPU and monitor combination can also be factors.
 
Solution
Nov 8, 2021
Quick update. I have realized that the video card actually only outputs video "sometimes" through each of its ports, which suggests Lutfij may be right that the card is half dead. After restarting the PC 2 or 3 times the card finally works through its ports.
Is there any sort of software test I can do to verify this? Here is a screenshot of GPUz for additional details.

View: https://imgur.com/a/BaG8Sex
 
What resolution are you trying to use? You just might be pushing more pixels at a rate that HDMI 2.0 can't handle. Also, are you using HDR? If so that reduces the available bandwidth.

Assuming HDMI 2.0 without HDR, you should be able to do 1440p @ 144 without needing to resort to any chroma subsampling. If we assume HDR however, the most you can output before subsampling is required is...100Hz. You can achieve 120/144 via subsampling to 4:2:0 if that's the case.
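The arithmetic above can be sketched as a rough back-of-the-envelope check. This is only an estimate, not exact video timing math: it assumes 1440p (since the monitor's actual resolution wasn't stated), an approximate 10% blanking overhead (roughly what CVT reduced blanking adds), and HDMI 2.0's ~14.4 Gbps usable payload (18 Gbps TMDS rate minus 8b/10b encoding overhead):

```python
# Rough HDMI 2.0 bandwidth check. Assumptions (not from the thread):
# ~10% blanking overhead, 14.4 Gbps usable payload (18 Gbps TMDS / 8b10b).
HDMI_2_0_MAX_GBPS = 14.4

def required_gbps(width, height, hz, bits_per_pixel=24, blanking=1.10):
    """Estimate the data rate a given mode needs, in Gbps."""
    pixels_per_second = width * height * hz * blanking
    return pixels_per_second * bits_per_pixel / 1e9

for hz in (60, 100, 120, 144):
    for bpp, label in ((24, "SDR 8-bit"), (30, "HDR 10-bit")):
        need = required_gbps(2560, 1440, hz, bpp)
        verdict = "OK" if need <= HDMI_2_0_MAX_GBPS else "needs 4:2:0"
        print(f"{hz:3d}Hz {label}: {need:5.1f} Gbps -> {verdict}")
```

With these assumptions, 1440p @ 144Hz at 8-bit SDR squeaks in under the limit, while 10-bit HDR already exceeds it at 120Hz - consistent with 100Hz being the ceiling before chroma subsampling kicks in.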

Also remember the cables: if you are using older cables, they may not be able to handle the necessary data rate, and that could also be the problem.