[SOLVED] HDMI 2.1 cable for HDMI 2.0 display

modeonoff

Honorable
Jul 16, 2017
Hi, given that the Nvidia 2080 Ti GPU has DisplayPort 1.4 ports and I have a DisplayPort 1.4 to HDMI 2.1 adapter, but the display only supports HDMI 2.0 at 4K 60 Hz, is there any advantage in using an HDMI 2.1 cable rather than an older HDMI 2.0 cable? I read somewhere that using an HDMI 2.1 cable would still give better image quality, but I'm not sure whether that is correct.
 

kanewolf

Titan
Moderator
Hi, given that the Nvidia 2080 Ti GPU has DisplayPort 1.4 ports and I have a DisplayPort 1.4 to HDMI 2.1 adapter, but the display only supports HDMI 2.0 at 4K 60 Hz, is there any advantage in using an HDMI 2.1 cable rather than an older HDMI 2.0 cable? I read somewhere that using an HDMI 2.1 cable would still give better image quality, but I'm not sure whether that is correct.
I would be very skeptical of claims of image-quality improvements from fancier cables. The same rhetoric has been used for years with speaker cables (and those are analog signals).
 

Eximo

Titan
Ambassador
Not image quality, but a higher-end cable may be capable of a longer run before it starts getting too many errors. The more bandwidth that works, the higher the settings CAN go, but the hardware in your devices will set the maximum limits. And that would only apply if the cable were near its limits.

Generally there is no significant difference between HDMI cables over 1 - 2 meters. There are certainly crappy ones that don't use all the wires or proper shielding, but that is often reflected in the price.
 
Solution
I would be very skeptical of claims of image-quality improvements from fancier cables. The same rhetoric has been used for years with speaker cables (and those are analog signals).

For analog signals there is an argument for using better cables, due to EMI and signal-quality concerns. But digital is digital; the signal either arrives or it doesn't. There's no advantage to using a cable rated for higher bandwidth if the device doesn't support it.
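To put some rough numbers on that: a quick back-of-envelope check (a sketch, assuming the standard CTA-861 4K60 timing of 4400 x 2250 total pixels and HDMI's 8b/10b-style TMDS coding) shows that a 4K 60 Hz 8-bit RGB signal already fits inside HDMI 2.0's 18 Gbit/s ceiling, so a 2.1-rated cable has no extra bits to carry here:

```python
# Rough TMDS bandwidth check for 4K60 8-bit RGB over HDMI 2.0.
# Timing figures assumed from the common CTA-861 4K60 mode:
# 3840x2160 active plus blanking = 4400x2250 total, 60 Hz refresh.
H_TOTAL, V_TOTAL = 4400, 2250
REFRESH_HZ = 60

pixel_clock = H_TOTAL * V_TOTAL * REFRESH_HZ  # pixels per second

# 3 TMDS data lanes, 10 bits on the wire per 8-bit symbol.
bitrate = pixel_clock * 3 * 10  # bits per second

print(f"pixel clock : {pixel_clock / 1e6:.0f} MHz")
print(f"TMDS bitrate: {bitrate / 1e9:.2f} Gbit/s (HDMI 2.0 limit: 18)")
```

With these assumed figures it comes out to a 594 MHz pixel clock and about 17.82 Gbit/s, just under the HDMI 2.0 limit, which is why the display's HDMI 2.0 port is the bottleneck regardless of the cable's rating.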
 

kanewolf

Titan
Moderator
For analog signals there is an argument for using better cables, due to EMI and signal-quality concerns. But digital is digital; the signal either arrives or it doesn't. There's no advantage to using a cable rated for higher bandwidth if the device doesn't support it.
Even for analog speakers, it is a low-frequency signal, and the difference between 12 AWG finely stranded copper and a "99.999% silver" cable will be inaudible, IMO.
 
