[SOLVED] HDMI 2.0b output to HDMI 1.4b input with HDMI 2.1 cable

cronik93

Distinguished
Apr 15, 2010
Bought an HDMI 2.1 cable off Amazon because why not? It's only $10.

So I have a GTX 1080 with an HDMI 2.0b port. My TV only has HDMI 1.4b ports. I'm using an HDMI 2.1 cable.

What I want to know is what's happening between the two. Is the GPU sending a 2.0b signal that the TV then converts to 1.4b? Or is the TV telling my GPU to only send a 1.4b signal? Is anybody here able to tell me what is happening?
 

larkspur

Distinguished
HDMI is backwards-compatible, so your TV's HDMI 1.4b port limits the link to HDMI 1.4 bandwidth. Nothing is being converted: the TV reports the modes it supports to the GPU (via its EDID), and the GPU only sends a signal the TV can accept.

HDMI 1.4 can carry 4K (3840x2160) @ 30 Hz and 1920x1080 @ 120 Hz (and a bunch of in-between resolutions and refresh rates).

HDMI 2.1 can carry 10K (10240x4320) @ 120 Hz (with Display Stream Compression) and everything "below" that.

There's nothing wrong with using an HDMI 2.1 cable; your TV just isn't capable of fully utilizing it. Likewise, your TV isn't capable of fully utilizing your GPU's theoretical maximum resolutions and refresh rates.

Example (made up because I don't know what TV you have): your TV maxes out at 1920x1080 with a 120 Hz refresh rate. Your GPU (and the cable you bought) work perfectly well with that TV, but they would also work with a fancier TV running 4K (3840x2160) @ 60 Hz.

There are also features such as HDR that the newer HDMI standards support and the older standards do not.
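
If you want to sanity-check the resolution/refresh-rate claims, a rough back-of-the-envelope calculation works: pixels per frame x refresh rate x bits per pixel, plus some blanking overhead, has to fit under the link's usable data rate (roughly 8.16 Gbit/s for HDMI 1.4, 14.4 Gbit/s for 2.0, and about 42 Gbit/s for 2.1). Here's a small Python sketch; the flat 20% blanking allowance and 8-bit RGB are simplifying assumptions, not exact spec timings:

    # Approximate usable video data rates (Gbit/s) after encoding overhead.
    MAX_DATA_RATE_GBPS = {
        "HDMI 1.4": 8.16,   # 10.2 Gbit/s TMDS, 8b/10b encoding
        "HDMI 2.0": 14.4,   # 18 Gbit/s TMDS, 8b/10b encoding
        "HDMI 2.1": 42.6,   # 48 Gbit/s FRL, 16b/18b encoding
    }

    def required_gbps(width, height, hz, bpp=24, blanking=1.2):
        """Uncompressed video data rate with a flat ~20% blanking allowance."""
        return width * height * hz * bpp * blanking / 1e9

    modes = {
        "1080p @ 120 Hz": (1920, 1080, 120),
        "4K @ 30 Hz":     (3840, 2160, 30),
        "4K @ 60 Hz":     (3840, 2160, 60),
    }

    for name, mode in modes.items():
        need = required_gbps(*mode)
        fits = [ver for ver, cap in MAX_DATA_RATE_GBPS.items() if need <= cap]
        print(f"{name}: ~{need:.1f} Gbit/s -> fits on {', '.join(fits)}")

Running that, 4K @ 30 Hz and 1080p @ 120 Hz come in around 7 Gbit/s and squeeze into HDMI 1.4, while 4K @ 60 Hz lands around 14 Gbit/s and needs HDMI 2.0 or newer, which matches the numbers above.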
 
Solution

cronik93

Distinguished
Apr 15, 2010

Am I getting any benefit from using an HDMI 2.1 cable other than being able to upgrade from my 1080p 60 Hz TV to a display with a higher resolution and refresh rate? Like less latency?
 

larkspur

Distinguished
No other benefit. Your TV determines the latency, and TVs often have high latency; gaming monitors are what you want for low latency. Either way, you can use your cable in any configuration that calls for HDMI 2.1 or below. For a gaming monitor, I would generally use DisplayPort instead of HDMI.
 