To be frank, we have little to no control over sponcon. Often it just shows up. So presumably someone at HDMI wanted to post this to promote HDMI 2.1 end-to-end solutions. We could edit the article, but someone paid to post it as is, and I think it's better for people in the know to leave comments and critiques than for us to try to control whatever an advertiser wants to say.
That's fine. There is no harm in promoting something, unless many products are riddled with bugs, dysfunctional features and arrogant marketing, which is often the case with HDMI 2.1 adoption across the tech ecosystem. Promo text, for obvious reasons, does not tell us the other side of the story, as it is geared towards the holiday season...
This is exactly why I am posting here: to make your tech outlet aware of what is going on, in the hope that you and your colleagues will take on this topic and publish more about it, to paint a more accurate picture of reality than the promo text wishes to tell us.
It is the perfect time for you folks, following the promo text, to expand on this topic and investigate the adoption of HDMI 2.1 in the PC environment. Here are several reasons for all of us to find out more:
- Nvidia has fully adopted it on the RTX 3000 (Ampere) series, with the full 48 Gbps speed (FRL6 data rate). Those GPUs even support DSC 1.2 over HDMI 2.1, which is brilliant.
- The AMD Radeon 6000 (RDNA2) series adopted somewhat slower 40 Gbps ports (FRL5 data rate), like LG TVs from 2020/2021. I wrote several emails to AMD asking why not 48 Gbps; no reply so far. It would be good to know why bandwidth adoption differs, no? Different display chips on the GPUs? This matters because a 4K/120 10-bit RGB signal works over 40 Gbps, but 4K/144 10-bit RGB already needs the full 48 Gbps. Consumers and you tech journalists need to know this, ask questions and publish the answers. AMD never published on its website that its 6000-series GPUs have 40 Gbps ports rather than 48 Gbps. Why is that? They also never published the actual FRL rate, FRL5, a clear indicator of port speed. Their website reads "HDMI 2.1 VRR and FRL", which is correct but tells us little about the important details. I have a 6800 XT at home and only found out about the port speed after extensive testing with an LG C9 TV. It is not a big deal today, but for anyone buying a 4K/144 10-bit monitor it will become an issue over HDMI, as owners will need to drop from RGB to chroma subsampling. The public should know this.
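To make the bandwidth arithmetic above concrete, here is a rough back-of-the-envelope check in Python. The ~8% blanking overhead is my assumption for illustration (the real figure depends on the exact video timing used), and the 16b/18b figure is FRL's line coding; this is a sketch, not an official HDMI calculation.

```python
# Rough HDMI bandwidth estimate: does a given video mode fit a given FRL rate?
# Assumptions: ~8% blanking overhead (illustrative), FRL 16b/18b line coding.

FRL_LINK_GBPS = {"FRL5": 40, "FRL6": 48}   # raw link rates in Gbit/s
FRL_CODING_EFFICIENCY = 16 / 18            # 16b/18b coding leaves this fraction as payload

def mode_gbps(width, height, hz, bits_per_component, components=3,
              blanking_overhead=0.08):
    """Approximate uncompressed data rate of a video mode in Gbit/s."""
    active = width * height * hz * bits_per_component * components
    return active * (1 + blanking_overhead) / 1e9

def fits(mode_rate_gbps, frl):
    """True if the mode's data rate fits the usable payload of the FRL rate."""
    payload = FRL_LINK_GBPS[frl] * FRL_CODING_EFFICIENCY
    return mode_rate_gbps <= payload

uhd120 = mode_gbps(3840, 2160, 120, 10)    # 4K/120 10-bit RGB
uhd144 = mode_gbps(3840, 2160, 144, 10)    # 4K/144 10-bit RGB

print(f"4K/120 10-bit RGB ~{uhd120:.1f} Gbps, fits FRL5 (40 Gbps): {fits(uhd120, 'FRL5')}")
print(f"4K/144 10-bit RGB ~{uhd144:.1f} Gbps, fits FRL5 (40 Gbps): {fits(uhd144, 'FRL5')}")
print(f"4K/144 10-bit RGB fits FRL6 (48 Gbps): {fits(uhd144, 'FRL6')}")
```

Under these assumptions the numbers line up with the claim: roughly 32 Gbps for 4K/120 10-bit RGB (fits FRL5), but roughly 39 Gbps for 4K/144 10-bit RGB, which exceeds FRL5's usable payload and needs FRL6.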
- Motherboard vendors will shortly start selling models with HDMI 2.1 ports that use level shifter chips, e.g. from Parade Technologies or Realtek, to enable HDMI 2.1 bandwidth and DSC pass-through from the APU. Some vendors, Asus among them, already advertise bogus HDMI 2.1 on their motherboards, such as the ProArt B550-Creator. Does the board have a level shifter chip? It is doubtful, as they never confirmed it when I asked them to clarify. Can that HDMI port output more than 10 Gbps? Highly doubtful, yet it is advertised as "HDMI 2.1 4K/60". It is embarrassing, misleading and silly to do that, especially for such a brand. Please check this with them.
https://www.asus.com/us/Motherboards-Components/Motherboards/ProArt/ProArt-B550-CREATOR/
- The new Gigabyte AORUS FI32U monitor is geared towards gamers. Gigabyte states on its website that the monitor has two "HDMI 2.1 inputs" that support, quote: "PS5 and Xbox Series X at 4K UHD@120Hz (4:2:0)". It is an odd way to advertise a monitor's port capability, and it immediately attracts suspicion. Does it support 4K/120 over an 8-bit signal only, or over 10-bit too? It is a 10-bit (8-bit + FRC) monitor, but we do not know whether it hosts two standard HDMI 2.0b inputs running at 18 Gbps over the older TMDS protocol, which would be enough for a 4K/120 8-bit 4:2:0 image, or real HDMI 2.1 inputs with over 20 Gbps of bandwidth, necessary for a 4K/120 10-bit 4:2:0 image over the new, faster FRL link. A 4K/120 4:2:0 image could be delivered over either link, depending on bit depth. Gigabyte does not tell us the bit depth of the supported signal. This is a problem.
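The 8-bit vs 10-bit distinction can be sketched with the same kind of rough arithmetic. The ~8% blanking overhead is my illustrative assumption; the 8b/10b figure is TMDS's line coding, which leaves roughly 14.4 Gbps of payload on an 18 Gbps HDMI 2.0b link.

```python
# Rough check: which 4K/120 4:2:0 modes fit an HDMI 2.0b TMDS link?
# Assumptions: ~8% blanking overhead (illustrative), TMDS 8b/10b coding.

TMDS_PAYLOAD_GBPS = 18 * 8 / 10   # usable payload of an 18 Gbps TMDS link

def yuv420_gbps(width, height, hz, bits_per_component,
                blanking_overhead=0.08):
    """Approximate data rate of a 4:2:0 mode: 1.5 samples per pixel."""
    active = width * height * hz * bits_per_component * 1.5
    return active * (1 + blanking_overhead) / 1e9

for bpc in (8, 10):
    rate = yuv420_gbps(3840, 2160, 120, bpc)
    verdict = ("fits HDMI 2.0b TMDS" if rate <= TMDS_PAYLOAD_GBPS
               else "needs HDMI 2.1 FRL")
    print(f"4K/120 {bpc}-bit 4:2:0 ~{rate:.1f} Gbps -> {verdict}")
```

Under these assumptions, 4K/120 8-bit 4:2:0 comes out around 13 Gbps and fits an HDMI 2.0b port, while the 10-bit variant exceeds the TMDS payload and would need a real FRL link, which is exactly why the undisclosed bit depth matters.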
Gigabyte also states that the refresh rate is, quote: "144Hz and 120Hz for Console Game". This is tricky too. 4K/144 Hz can be delivered through the DP port, but not necessarily through the HDMI ports without a special entry in the EDID. The HDMI Forum's 2.1 spec officially lists refresh rates up to 4K/120 Hz, so 4K/144 Hz is outside the spec and might only be added to the EDID in special cases. Can this monitor really accept 4K/144 Hz over HDMI? Would you not be interested in investigating this for PC users?
https://www.gigabyte.com/Monitor/AORUS-FI32U/sp#sp
There are more examples of the HDMI 2.1 mess in the PC world, and I worry there will be even more if we do not educate consumers about what to pay attention to when buying PC components, and if we do not challenge vendors to be VERY clear about their specs.