News End-To-End HDMI 2.1 Systems Simplify Home Entertainment

tek-check

Reputable
Aug 31, 2021
37
25
4,535
Hahaha! Who is the author of this ignorant article? Someone arriving from another galaxy?

It is true that an HDMI 2.1 "end-to-end" (whatever that means...) system could be put together, but the author is completely silent about the scandalous bugs that untested HDMI 2.1 devices exhibit at the moment, especially in the AVR industry.

Here is everything we currently know about the state of play of HDMI 2.1 in AVRs.
It is not a happy-ending story. The 2020 AVR models from Sound United and Yamaha had faulty HDMI 2.1 chips and did not work properly with HDMI 2.1 sources. There are thousands of unhappy owners, recalls and complaints. Yamaha receivers, although sold as "HDMI 2.1", do not have working features and speeds above 18 Gbps. There is a free board swap programme for the faulty receivers. Only some new low-end receivers have multi-input ports with 40 Gbps speed.

Please read the thread below to find out more.

https://www.avsforum.com/threads/hd...ansition-to-40-48-gbps.3199232/#post-60742079
 
You're telling me everyone in THIS galaxy knows about all the ins and outs of HDMI 2.1? No way! In fact, probably only a fraction of people know about it to any meaningful degree. But cheers for the information on what isn't working, and don't blame the normal writers at Tom's Hardware for this sponcon. We just work here. :-D
 

Sleepy_Hollowed

Distinguished
Jan 1, 2017
536
234
19,270
No thanks. HDMI is bug-ridden and extremely proprietary, which causes odd issues between endpoints due to implementation differences.

I will keep DisplayPort monitors and USB/fiber sound devices.

Too bad I can’t get away from it on consoles.
 

tek-check

Reputable
Aug 31, 2021
37
25
4,535
don't blame the normal writers at Tom's Hardware for this sponcon. We just work here. :-D
With all respect, Jarred, there is no blame on people. The article itself was ignorant. Mistakes happen. We are here to give constructive criticism. People who work at Tom's just need to fact-check sponcon before publishing content. A link for fact-checking AVRs was provided, and the best course of action is to research the current situation surrounding the roll-out of the HDMI 2.1 ecosystem, edit the article and republish it so that the public is better informed. Simple.

Some motherboard manufacturers, including Asus, falsely market some motherboards (ProArt B550) as "HDMI 2.1" without any evidence. When challenged to justify the features or state which level-shifter chip the board has to enable high speeds, their customer service says 10 Gbps (HDMI 1.4). Suffice to say, no APU on the market is able to natively support an HDMI 2.1 signal, so level-shifter chips must be installed if such a motherboard were ever to support speeds up to 48 Gbps.

Several monitor vendors, such as Gigabyte, also market some displays as "HDMI 2.1" without providing any meaningful information about the features that take those monitors beyond 18 Gbps (HDMI 2.0). And so on. There are more examples of this.

The only devices that currently support HDMI 2.1 properly in the PC environment are new GPUs from Nvidia (48 Gbps) and AMD (40 Gbps). In consumer electronics, it's several TVs and the two consoles. Often there is a mess with features, such as the combination of 4K+120+10-bit+HDR+RGB+VRR. On many displays, all these features cannot work together, for a variety of reasons that should be investigated more closely.
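To make the bandwidth point concrete, here is a rough back-of-the-envelope check in Python. It counts active pixels only and ignores blanking overhead, so real figures run somewhat higher; the port speeds are the raw FRL link rates, and 16b/18b is the coding overhead the HDMI 2.1 spec defines for FRL:

```python
# Back-of-the-envelope check: which HDMI 2.1 FRL port speeds can carry
# a given uncompressed video format. Active pixels only -- real HDMI
# timings add blanking overhead, so results near a limit are marginal.

def frl_payload_gbps(raw_gbps):
    # FRL uses 16b/18b coding, so usable payload is 16/18 of the raw link rate
    return raw_gbps * 16 / 18

def video_gbps(width, height, hz, bits_per_component, components=3):
    # uncompressed data rate in Gbps (RGB / 4:4:4 -> 3 components per pixel)
    return width * height * hz * bits_per_component * components / 1e9

formats = {
    "4K/120 10-bit RGB": video_gbps(3840, 2160, 120, 10),  # ~29.9 Gbps
    "4K/144 10-bit RGB": video_gbps(3840, 2160, 144, 10),  # ~35.8 Gbps
}
for name, rate in formats.items():
    for port in (32, 40, 48):  # PS5 / AMD & Xbox / Nvidia port speeds
        if rate <= frl_payload_gbps(port):
            print(f"{name} needs at least a {port} Gbps port")
            break
```

On this rough arithmetic, 4K/120 10-bit RGB squeezes into a 40 Gbps port but not a 32 Gbps one, and 4K/144 10-bit RGB needs the full 48 Gbps, consistent with what owners have reported.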

It's an utter mess out there, full of aggressive marketing traps. Tom's workers are warmly invited to investigate this and publish a more coherent, granular and accurate narrative about the state of the game in the HDMI 2.1 world. Please encourage such an investigation and inform the public about your findings, so that vendors are properly scrutinised for the products they are trying to sell.
 
To be frank, we have little to no control over sponcon. Often it just shows up. So presumably someone at HDMI wanted to post this to promote HDMI 2.1 end-to-end solutions. We could edit the article, but someone paid to post it as is, and I think it's better for people in the know to leave comments and critiques than for us to try to control whatever an advertiser wants to say.

As for testing all of this stuff, we're mostly PC focused and don't do a lot with high-end AV testing, unfortunately. I don't even own any HDMI 2.1 displays, because they cost quite a bit, nor do I own any HDMI 2.1 capable home theater equipment. PC monitors still tend to be best with DisplayPort, since that has been well defined for a while now. I'd still like 4K 144Hz without DSC or 4:2:2 chroma subsampling, which will require DP 2.0 or HDMI 2.1, but I can wait a while.
 
Given how buggy the first couple of releases were (gen 1), with faulty interfaces that didn't support the full spec, I would say they are pumping to dump the old stock.

It kind of puts them in a conundrum. Everything that they advertised (gen 1) as being HDMI 2.1 compliant, but isn't, is quite pricey. The new stuff coming in, while better, is also cheaper.

How do you dump the older high-end pricey stock? By telling the uninformed customer base it's "the best thing since sliced bread" without being honest about teething issues.

Same thing happened when 4K first came out. There wasn't even a reliable standard for 4K/60, and it had to use a proprietary interface. And the first HDMI 2.0 projectors for home theater enthusiasts couldn't do the 18 Gbps that HDMI 2.0 supported, so many high-quality video formats were not supported at all.

And it wasn't just Yamaha. Every first gen receiver had HDMI 2.1 flaws.
 

tek-check

Reputable
Aug 31, 2021
37
25
4,535
It kind of puts them in a conundrum. Everything that they advertised (Gen 1) as being HDMI 2.1 compliant that isn't, is quite pricey. New stuff coming in while better is also cheaper.
Some TVs work just fine. LG's 2019 models ("Gen 1") have all the important features and a 48 Gbps interface. LG's 2020/2021 models have even more features and 40 Gbps interfaces. Most GPUs with HDMI 2.1 outputs work fine too. The two consoles have 32 Gbps and 40 Gbps ports respectively. The PS5 does not have VRR at the moment.
And it wasn't just Yamaha. Every first gen receiver had HDMI 2.1 flaws.
Not correct. The first gen of Onkyo AVRs with HDMI 2.1 works fine. Tested and published by Vincent Teoh on YouTube. Even Dolby Vision 4K/120 pass-through works fine.

As for generations, if you visit the link I gave in response to Jarred's post, you will find that it is the HDMI 2.1 chips that are classified into generations, rather than the AVRs, as those chips go through development cycles of 18-24 months and dictate the adoption timeline in AVRs. From that point of view, you are right. The first-gen chips adopted by Sound United and Yamaha were faulty. Second-gen chips with 40 Gbps ports have improved the situation (the adapter-box chip for Denons) and were adopted by Onkyo in their first-gen AVRs.
 

tek-check

Reputable
Aug 31, 2021
37
25
4,535
To be frank, we have little to no control over sponcon. Often it just shows up. So presumably someone at HDMI wanted to post this to promote HDMI 2.1 end-to-end solutions. We could edit the article, but someone paid to post it as is, and I think it's better for people in the know to leave comments and critiques than for us to try to control whatever an advertiser wants to say.
That's fine. There is no harm in promoting something, unless many products are riddled with bugs, dysfunctional features and arrogant marketing, which is often the case with HDMI 2.1 adoption across the tech ecosystem. The promo text, for obvious reasons, does not tell us the other side of the story, as it is geared towards the holiday season...

we're mostly PC focused
This is exactly why I am posting here: to make your tech outlet aware of what is going on, in the hope that you and your colleagues will take on this topic and publish more about it, to paint a more accurate picture of reality than the promo text wishes to tell us.

It is the perfect time for you folks, following the promo text, to expand on this topic and investigate the adoption of HDMI 2.1 in the PC environment. Here are several reasons for all of us to find out more:
- Nvidia has fully adopted it on the Ampere 3000 series, with the full speed of 48 Gbps (FRL6 data rate). Those GPUs even support DSC 1.2 over HDMI 2.1, which is brilliant.

- AMD's RDNA2 6000 series adopted slightly slower 40 Gbps ports (FRL5 data rate), like LG TVs from 2020/2021. I wrote several emails asking them why not 48 Gbps. No reply so far. It would be good to know why there are differences in bandwidth adoption, no? Different chips on the GPUs? This matters because a 4K/120 10-bit RGB video signal works over 40 Gbps, but 4K/144 10-bit RGB already needs the full bandwidth of 48 Gbps. Consumers and you tech journalists need to know this, ask questions and publish it. AMD never published on its website that their 6000-series GPUs feature 40 Gbps rather than 48 Gbps ports. Why is that? They also never published the exact FRL rate, which is FRL5, a clear indicator of port speed. Their website reads "HDMI 2.1 VRR and FRL", which is correct, but tells us little about the important details. I have a 6800 XT at home and only found out about the port speed after extensive testing with an LG C9 TV. It's not a big deal, but if anyone wants to buy a 4K/144 10-bit monitor, it will become an issue over HDMI, as owners will need to drop from RGB to chroma subsampling. The public should know this.

- Motherboard vendors will shortly start selling models with HDMI 2.1 ports that feature level-shifter chips, e.g. from Parade Tech or Realtek, that enable HDMI 2.1 bandwidth and DSC pass-through from the APU. Some, as I mentioned with Asus, already advertise bogus HDMI 2.1 on their motherboards, such as the ProArt B550. Does the board have a level-shifter chip? It's doubtful, as they never confirmed it when I asked them to clarify. Can that HDMI port output more than 10 Gbps? Highly doubtful, yet it is advertised as "HDMI 2.1 4K/60". It's embarrassing, misleading and silly to do that, especially for such a brand. Please check this with them.
https://www.asus.com/us/Motherboards-Components/Motherboards/ProArt/ProArt-B550-CREATOR/

- The new Gigabyte monitor AORUS FI32U is geared towards gamers. Gigabyte states on their website that this monitor has two "HDMI 2.1 inputs" that support, quote: "PS5 and Xbox Series X at 4K UHD@120Hz (4:2:0)". It's an odd practice to advertise a monitor's port capability this way, and it immediately attracts suspicion. Does it support 4K/120 over an 8-bit signal only, or over 10-bit too? It's a 10-bit (8-bit+FRC) monitor, but we do not know whether it hosts two standard HDMI 2.0b inputs running at 18 Gbps over the older TMDS protocol, which would be enough for a 4K/120 8-bit 4:2:0 image, or a real HDMI 2.1 input with over 20 Gbps of bandwidth, necessary for a 4K/120 10-bit 4:2:0 image over the new, faster FRL link. A 4K/120 4:2:0 image could be delivered with either bandwidth, depending on bit depth. They do not tell us the bit depth of the supported signal. This is a problem.

Gigabyte also states that the refresh rate is, quote: "144Hz and 120Hz for Console Game". This is tricky too. 4K/144 Hz can be delivered through the DP port, but not necessarily through the HDMI ports without a special entry in the EDID. The HDMI Forum's 2.1 spec officially supports up to a 4K/120 Hz refresh rate, so 4K/144 Hz is out of spec and might only be added to the EDID in special cases. Can this monitor really accept 4K/144 Hz over HDMI? Would you not be interested in investigating this for PC users?
https://www.gigabyte.com/Monitor/AORUS-FI32U/sp#sp
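The bit-depth question above can be checked with the same kind of rough arithmetic. This sketch counts active pixels only (real timings add blanking overhead, so actual figures run a bit higher), and shows why 8-bit 4:2:0 squeezes through an 18 Gbps TMDS link while 10-bit does not:

```python
# 4:2:0 carries full-resolution luma plus quarter-resolution Cb and Cr,
# i.e. 1.5 components per pixel on average. Active pixels only; real
# timings add blanking overhead, so actual figures run a bit higher.

def yuv420_gbps(width, height, hz, bits_per_component):
    return width * height * hz * bits_per_component * 1.5 / 1e9

# HDMI 2.0 TMDS uses 8b/10b coding: 18 Gbps raw -> 14.4 Gbps usable
tmds_payload = 18 * 8 / 10

eight_bit = yuv420_gbps(3840, 2160, 120, 8)    # ~11.9 Gbps
ten_bit   = yuv420_gbps(3840, 2160, 120, 10)   # ~14.9 Gbps

print(eight_bit <= tmds_payload)   # 8-bit fits an 18 Gbps HDMI 2.0b input
print(ten_bit <= tmds_payload)     # 10-bit does not; it needs an FRL link
```

So Gigabyte's "4K UHD@120Hz (4:2:0)" claim is compatible with plain HDMI 2.0b hardware if the signal is 8-bit, which is exactly why the missing bit-depth detail matters.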

There are more examples of the HDMI 2.1 mess in the PC world, and I am worried that there will be even more if we do not educate consumers about what to pay attention to when buying PC components, and if we do not challenge vendors to be VERY clear about the specs.