News HDMI 2.2 is here with new 'Ultra96' Cables — up to 16K resolution, higher maximum 96 Gbps bandwidth than DisplayPort, backwards compatibility & more

What I don't get and don't see (or maybe I overlooked it, or just don't understand the basics? If so, please help out or point me somewhere):
It seems the HDMI Forum is quite stingy with details. I went through a lot of what they announced, hoping for a better understanding of FRL, but I saw very little explanation of the technical implementation. So, I think we would need someone who's a member to explain how the underlying technology changed. Perhaps that's the sort of detail that will leak in EDA forums or electronics circles. Not sure if it'll reach us here.

What is the difference in the cable itself?
I have a rudimentary understanding of standard electrical installation and have crimped quite a few LAN cables. So I am trying to sort out my understanding of the actual difference that makes one cable 3x more expensive than another (we see this with USB-C as well).
Copper, not aluminum, and/or thicker gauge = less resistance = better signal?
More insulation layers = less interference / crosstalk?
When talking about high frequencies, I think it's not insulation but shielding that matters. You could even have a situation where each differential pair is shielded from the others. If you look at the HDMI connector pin-out, each differential pair has its own ground. So, I think that could connect to a shield around the pair.

Then, that opens up the matter of the quality of shielding used, not to mention things like the dielectric of the jackets used around the individual wires. Maybe there's been some progress in materials research, and newer formulations have come onto the market that improve performance in these areas (at least for the price), or are at least more flexible or durable.
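To put a rough number on the gauge = resistance intuition above, here's a quick back-of-the-envelope sketch using the standard AWG diameter formula and a textbook copper resistivity. (Caveat: at multi-GHz signalling rates, skin effect and dielectric loss dominate over DC resistance, but a thicker conductor still helps on both counts.)

```python
import math

# Back-of-the-envelope DC resistance of solid copper conductors by AWG.
# At multi-GHz rates, skin effect and dielectric loss dominate, but a
# thicker conductor still lowers loss on both counts.

RHO_CU = 1.68e-8  # resistivity of copper, ohm*m (textbook value)

def awg_diameter_m(awg: int) -> float:
    """Standard AWG formula: conductor diameter in meters."""
    return 0.127e-3 * 92 ** ((36 - awg) / 39)

def dc_resistance_per_m(awg: int) -> float:
    """DC resistance of a solid copper conductor, in ohms per meter."""
    area = math.pi * (awg_diameter_m(awg) / 2) ** 2
    return RHO_CU / area

# Thin "cheap cable" conductors vs. the 26 AWG used in long DACs
for awg in (34, 30, 26):
    print(f"AWG {awg}: {dc_resistance_per_m(awg) * 1000:6.1f} mOhm/m")
# -> roughly 830, 330, and 130 mOhm/m, respectively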

Also, I just want to point out that maybe there will be no difference, among the better-quality cables, between 48 Gbps and 96 Gbps; perhaps all they're doing is raising the minimum bar. Maybe the lower-quality cables that could handle 48 Gbps will attenuate signals too much at 96 Gbps, and thus wouldn't pass certification. Perhaps the better-engineered or better-built 48 Gbps cables will certify at 96 Gbps as is.

The Wikipedia page has a list of the different HDMI cable certification standards (but not the details you want to see). There are 5 different data rates. I'm glad to see the latest two tiers got rid of a separate hard-wired Ethernet channel. I assume those pins are now used as regular data channels and any Ethernet is just handled as another payload type.

I wish they hadn't used names like "High Speed" and "Premium High Speed". When I buy HDMI cables, I try to shop by bit rate. Searching for "Premium High Speed" seems like it can lead to too many false matches.

Obviously the protocols change with the level, but...
are the cables chipped to communicate their capabilities/limitations?
and/or is this mostly a gatekeeper for the patent royalties?
HDMI is one of those standards that charges royalties. If the cables do actively state their certification, I'd love to have an inexpensive cable tester that would just tell me what a cable can handle, because I now have lots of HDMI cables and they're not always labeled with their capability.

MCP will change everything radically in the next few years, and it was knocked out 'in a few months', while it took 'decades' (can't be bothered to look anything up, sorry) to update JPEG, USB, SATA... AI competitors are even harmonizing product names: e.g., Canvas is always kind of the same thing.
Okay, interesting.
 
Would also be a matter of availability of 96G-compatible optical cables, as large screens are usually mounted at a distance from the PC, and at even a little over 3 meters, 48G stability starts to suck badly on copper (I'm the proud owner of two 5 m copper cables = fail; one 7 m optical cable = huge success).
I find it interesting that you have stability issues at 48 Gb on copper. People use copper cables inside their server racks at 100 Gb port speeds for distances up to 5 m. I wonder if the cables you used just aren't quite thick enough for that distance, as 5 m 100 Gb DACs are 26 AWG.
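For a rough feel of why the same copper gets worse as the bit rate climbs, here's a quick sketch using the classic skin-depth formula. The 12 Gbps lane rate is HDMI 2.1 FRL's published rate; the doubled 24 Gbps NRZ lane is purely my assumption for illustration, since the HDMI Forum hasn't detailed 2.2's signalling:

```python
import math

# Skin depth in copper shrinks with frequency, so AC resistance (and
# attenuation) grows roughly with sqrt(f) -- the same cable gets lossier
# as the signalling rate rises.

RHO_CU = 1.68e-8            # resistivity of copper, ohm*m
MU_0 = 4 * math.pi * 1e-7   # permeability of free space, H/m

def skin_depth_m(freq_hz: float) -> float:
    """Classic skin-depth formula for a good conductor."""
    return math.sqrt(RHO_CU / (math.pi * freq_hz * MU_0))

# NRZ Nyquist frequency is half the lane bit rate: 12 Gbps/lane is the
# HDMI 2.1 FRL rate; 24 Gbps is only my guess at a doubled NRZ lane.
for lane_gbps in (12, 24):
    f_nyquist = lane_gbps / 2 * 1e9
    print(f"{lane_gbps} Gbps lane -> skin depth ~{skin_depth_m(f_nyquist) * 1e6:.2f} um")
# -> ~0.84 um at 6 GHz, ~0.59 um at 12 GHz: the current crowds into an
#    ever-thinner shell at the conductor surface, raising loss per meter.
```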
 
My guess? More rigorous testing, higher-end specs, and faster internals in the cables themselves (like repeaters), since at extremely high frequencies, to my understanding, these cables have to have built-in electronics in order to work beyond extremely short lengths. Mind you, this could be a huge scam. But I didn't get far enough in EECS to help on that matter.
I haven't followed the matter too closely, but I think DisplayPort has had active vs. passive cables since the previous standard (40 Gbps?) came out. I don't know if the active part was only located in the connectors, or if there was a sort of lump in the middle.

What I can say is that this just reinforces my preference for the extant professional-grade option that does away with the pesky "digital rights management".
I'm not sure the cable has any role in HDCP. However, if a host (e.g. graphics card, streaming box, etc.) fails to negotiate an HDCP link with the display, then it will simply refuse to show protected content. So, your choice is really between watching protected content or not.
 
If it were up to me, I'd pitch all the DRM overhead and go back to the pre-existing pro-grade version still used to pump tons of throughput for high-end cinema cameras. If you need more bandwidth, just snag optical. And if that still isn't enough, in the next ten or so years we may see graphene cables that can work for either electrical or optical cabling purposes.
Are you thinking of SDI? I think the OS will simply refuse to allow protected content to be played over it.

I have yet to find an HDMI cable that doesn't eventually bug out. I've been using them since the mid-2000s.
Huh. I think I've never had that happen. However, I always try to buy quality cables, I'm rarely messing with them, and I'm rarely pushing them to their limits.
 
It makes sense to upgrade the HDMI standards to use a single cable and be able to show 16K, and higher-frame-rate 8K (even for gaming).
Something nobody has brought up is the amount of energy required to send data at these speeds. 10G twisted-pair Ethernet requires a lot more energy than SFP+ and I assume the same applies to 96 Gbps HDMI. Imagine needing active cooling for your HDMI transceiver!

I guess there's a related point about how much it will cost to implement these data rates and how much more the cables will cost. If it's too expensive, it will not catch on.
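To put very rough numbers on the energy point: the wattages below are ballpark figures from memory, not measurements, so treat them purely as illustrative assumptions.

```python
# Crude energy-per-bit comparison. The wattages are ballpark figures from
# memory, NOT measurements -- treat them purely as illustrative assumptions.

links = {
    "10GBASE-T PHY (twisted pair)": (3.0, 10e9),  # several watts per port
    "SFP+ DAC (direct-attach)":     (0.5, 10e9),  # well under a watt
}

for name, (watts, bits_per_sec) in links.items():
    pj_per_bit = watts / bits_per_sec * 1e12
    print(f"{name}: ~{pj_per_bit:.0f} pJ/bit")
# The DSP needed to recover a signal from a lossy copper channel is what
# burns the power -- and a 96 Gbps HDMI link faces the same physics.
```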
 
I find it interesting that you have stability issues at 48 Gb on copper. People use copper cables inside their server racks at 100 Gb port speeds for distances up to 5 m. I wonder if the cables you used just aren't quite thick enough for that distance, as 5 m 100 Gb DACs are 26 AWG.
It'd be interesting if someone with expertise on this could weigh in, but I'd just point out that QSFP has literally twice as many pins as HDMI. It's also about 10 years newer than the original HDMI standard that locked in the connector design and basic pin assignments.
 
I find it interesting that you have stability issues at 48 Gb on copper. People use copper cables inside their server racks at 100 Gb port speeds for distances up to 5 m. I wonder if the cables you used just aren't quite thick enough for that distance, as 5 m 100 Gb DACs are 26 AWG.
and @bit_user.

They have implemented a new protocol, which enables higher speed and uses more data lanes than the prior scheme, while remaining backwards compatible (like adding PAM to PCIe).

 
and @bit_user.

They have implemented a new protocol, which enables higher speed and uses more data lanes than the prior scheme, while remaining backwards compatible (like adding PAM to PCIe).

Excellent! I was (unsuccessfully) searching for info on FRL. That did seem key to their recent bandwidth increases. After looking at that page, I can now see that the TMDS -> FRL transition unlocked 45.3% more throughput at the same per-wire bit rate. That means the per-wire bit rate only had to increase by 83.5% to go from 18 to 48 Gbps.

I would just point out that this was introduced in HDMI 2.1. So, I'm not sure how much it explains the speed-doubling in HDMI 2.2. Perhaps it enabled them to do something like adopting PAM4?

| HDMI Version | Release Date | Signalling Method | Max Bandwidth (Gbps) |
|---|---|---|---|
| 1.0 | 2002-12-09 | TMDS | 2.2 |
| 1.3 | 2006-06-22 | TMDS | 10.2 |
| 2.0 | 2013-09-04 | TMDS | 18.0 |
| 2.1 | 2017-11-28 | FRL | 48.0 |
| 2.2 | 2025-06-25 | FRL | 96.0 |
BTW, I've omitted all the in-between standards that didn't introduce any new speeds.
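Out of curiosity, I sanity-checked that math with a quick script. The lane counts and line codes below are the public TMDS/FRL parameters; I haven't modeled FRL's Reed-Solomon FEC overhead, which is presumably why my first-order gain (~48%) lands a bit above the 45.3% figure from that page:

```python
# Sanity-checking the TMDS -> FRL arithmetic. Lane counts and line codes
# are public (HDMI 2.0: 3 lanes @ 6 Gbps, 8b/10b; HDMI 2.1 FRL: 4 lanes
# @ 12 Gbps, 16b/18b). FRL's Reed-Solomon FEC overhead is NOT modeled here.

def payload_gbps(lanes: int, lane_gbps: float, data_bits: int, coded_bits: int) -> float:
    """Payload throughput after line-code overhead."""
    return lanes * lane_gbps * data_bits / coded_bits

tmds = payload_gbps(3, 6.0, 8, 10)                  # 14.40 Gbps
frl = payload_gbps(4, 12.0, 16, 18)                 # 42.67 Gbps
frl_same_wire_rate = payload_gbps(4, 6.0, 16, 18)   # FRL at TMDS's 6 Gbps/wire

print(f"TMDS payload:               {tmds:5.2f} Gbps")
print(f"FRL payload:                {frl:5.2f} Gbps")
print(f"FRL gain at same wire rate: {frl_same_wire_rate / tmds - 1:.1%}")
# -> ~48% to first order; FEC overhead presumably explains the ~45.3% figure.
```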
 