HDMI 2.2 is here with new 'Ultra96' cables — up to 16K resolution, a higher maximum bandwidth (96 Gbps) than DisplayPort, backwards compatibility & more

This garbage should just die.

Are you sure? I think they were forbidden to support it on Linux.
Yeah, probably this. AMD tried for years to work with the HDMI consortium to find some way to support HDMI 2.1+ in their open source drivers, but got shut down at every attempt. So it's probably only by maintaining a closed-source driver option (with only their workstation cards being "officially" supported) that AMD can offer anything higher.

Someone else (I think Intel?) did manage to support HDMI 2.1 with open source drivers, so the issue appears to be how the display controller is designed and how much of its functionality gets handled in closed-source firmware vs. open source drivers.

Basically, what the HDMI consortium wants to avoid is open source drivers containing so many details of the specification that other implementers could study and replicate it without themselves having to pay a license fee.

And, you know, this all comes down to patents. If HDMI were patent-free, then there wouldn't be a need for such onerous royalties and then there would be no need for such secrecy.
 
It is not about games; as a matter of fact, no hardware can play games at 16K resolution ... but movies will come at that resolution soon.
The digital cinemas I've gone to are still using 4k and that seemed like enough resolution to me. I can't imagine any benefit for them going above 8k, unless we're talking about something like Sphere.

I think it's more about wall-sized displays for visualization and realtime monitoring. Think: lots of charts, or maybe lots of video feeds from security cameras, etc.

As a matter of fact, I know some security cameras are already doing 8k, so maybe 16k isn't far off?
 
It is not about games; as a matter of fact, no hardware can play games at 16K resolution ... but movies will come at that resolution soon.
Nope, they won't. While 4K gained widespread adoption by both consumers and the movie industry, there is no push whatsoever to go beyond 4K. Hell, even most "4K" movies are just upscaled 2K!

Additionally, the benefits of 8K are only really visible on TVs over 85"; 16K would require at LEAST a 120" screen to see any noticeable difference over 4K!

Now for gaming and/or other computing applications, there is a need for higher resolution support. You will sit MUCH closer to a monitor than a TV/movie screen, and for certain applications (medical for one), every pixel counts!

You also have to take into consideration, especially for gaming, that this is not only "16K" capable, but will enable 4K at high refresh rates.
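For a rough sense of why the extra bandwidth matters for refresh rate as much as resolution, here's a back-of-the-envelope sketch. It ignores blanking overhead, assumes 10-bit RGB / 4:4:4 video, and approximates the usable payload as the raw link rate times FRL's 16b/18b coding efficiency; the exact figures in the spec will differ a bit.

```python
# Rough uncompressed-bandwidth check (ignores blanking overhead, which adds a
# few percent in practice). Assumes 10-bit-per-channel RGB / 4:4:4.
def uncompressed_gbps(width, height, hz, bits_per_channel=10, channels=3):
    return width * height * hz * bits_per_channel * channels / 1e9

modes = {
    "4K @ 120 Hz": (3840, 2160, 120),
    "4K @ 240 Hz": (3840, 2160, 240),
    "8K @  60 Hz": (7680, 4320, 60),
}

# Approximate usable payload after FRL's 16b/18b line coding:
payload_48g = 48 * 16 / 18   # ~42.7 Gbps (HDMI 2.1)
payload_96g = 96 * 16 / 18   # ~85.3 Gbps (HDMI 2.2 "Ultra96")

for name, (w, h, hz) in modes.items():
    need = uncompressed_gbps(w, h, hz)
    print(f"{name}: ~{need:.1f} Gbps needed "
          f"(fits 48G: {need < payload_48g}, fits 96G: {need < payload_96g})")
```

By this rough count, 4K@240 and 8K@60 both land around 60 Gbps uncompressed, which is why they need DSC on a 48G link but should fit uncompressed within 96G.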
 
Hell, even most "4K" movies are just upscaled 2K!
Depends, but that's not true of many recent movies, or of true classics that are based on new transfers.

Here's another good resource:


Additionally, the benefits of 8K are only really visible on TVs over 85"; 16K would require at LEAST a 120" screen to see any noticeable difference over 4K!
It all depends on how far you sit. I sit about 10 feet away from my 65" screen and I think even 4k is overkill for me.
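Just to put numbers on the "depends how far you sit" point, here's the usual 1-arcminute (20/20 acuity) rule of thumb applied to a 65" 16:9 panel at about 10 feet. It's only an approximation; edge/line hyperacuity can resolve finer detail than this, which may be why others in the thread still see a difference at the same distance.

```python
import math

# ~1 arcminute is the commonly quoted resolving limit for 20/20 vision.
ACUITY_ARCMIN = 1.0

diag_m = 65 * 0.0254                          # 65" diagonal in meters
width_m = diag_m * 16 / math.hypot(16, 9)     # 16:9 panel width, ~1.44 m

def pixel_arcmin(h_pixels, distance_m):
    """Angle subtended by one pixel, in arcminutes."""
    pitch = width_m / h_pixels
    return math.degrees(2 * math.atan(pitch / (2 * distance_m))) * 60

for label, px in [("4K", 3840), ("8K", 7680)]:
    a = pixel_arcmin(px, 3.0)   # 3 m, roughly 10 feet
    print(f"{label} pixel at 3 m: {a:.2f} arcmin "
          f"({'below' if a < ACUITY_ARCMIN else 'above'} the ~1 arcmin threshold)")
```

At that distance a 4K pixel subtends roughly 0.4 arcminutes, already under the rule-of-thumb threshold, which is consistent with 4K feeling like overkill there.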
 
Out of curiosity, does anybody here look at a game in 4K and say "that's just ugly, it needs more resolution"?
Yes. Me. 4K->8K DLSS looks much better (and yes, I mean a lot; those crisp thin edges do matter) than raw 4K in static / near-static / slow-mo scenes (fast scenes look just the same as 4K due to DLSS dropping resolution).

8K 65" screen @ 3 meters (LG QNED 96) / nV 4090.

I was just "YAY!!!!!!" when they added DLSS to Trails into Daybreak and I could go full throttle, maxed out, at 8K there.
That made not just my day but my whole month.

Anime upscaling with nV superres actually also looks better in 8K than 4K. Same reason again: the edge lines and corners end up smooth and upscale pixelation becomes unnoticeable.
 
Having 8K@60 without DSC would actually be paramount. Pity it would require a new screen and GPU though, so probably not until 2.3/2.4 is out 😀 It would also be a matter of availability of 96G-compatible optical cables, as large screens are usually mounted at a distance from the PC, and at even a little over 3 meters even 48G stability starts to suck badly on copper (proud owner of two 5 m copper cables = fail, 7 m optical cable = huge success).
 
Additionally, the benefits of 8K are only really visible on TVs over 85"; 16K would require at LEAST a 120" screen to see any noticeable difference over 4K!
For me personally, there is a huge difference between 8K and 4K on 65" @ 3 meters.
At 8K, pixelated edges can't be seen, while at 4K they are clearly present everywhere.
 
It would also be a matter of availability of 96G-compatible optical cables, as large screens are usually mounted at a distance from the PC, and at even a little over 3 meters even 48G stability starts to suck badly on copper
One thing I'm interested in is the transition from TMDS (Transition-Minimized Differential Signalling) to FRL (Fixed-Rate Link) that occurred in HDMI 2.1, and whether it played any significant role in the bandwidth doubling we see in 2.2.

The Wikipedia page lists the data rates available as a function of the FRL rate. I'm not yet clear whether it's fundamental or incidental that the FRL rates each seem to be double the TMDS rates.
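For anyone else digging into this, here's the FRL rate table as I understand it from the public HDMI 2.1 summaries (treat it as a sketch, not the spec): each FRL mode is lanes x per-lane rate with 16b/18b line coding, versus TMDS's three data lanes at up to 6 Gbps with 8b/10b coding. Part of the jump comes from FRL repurposing the old TMDS clock pair as a fourth data lane.

```python
# FRL modes as (lanes, Gbps per lane), per the public HDMI 2.1 summaries.
frl_modes = {
    "FRL1": (3, 3),
    "FRL2": (3, 6),
    "FRL3": (4, 6),
    "FRL4": (4, 8),
    "FRL5": (4, 10),
    "FRL6": (4, 12),
}

# HDMI 2.0 TMDS for comparison: 3 data lanes at up to 6 Gbps, 8b/10b coded.
tmds_raw = 3 * 6
print(f"TMDS (HDMI 2.0): {tmds_raw} Gbps raw, {tmds_raw * 8 / 10:.1f} Gbps payload (8b/10b)")

for name, (lanes, gbps) in frl_modes.items():
    raw = lanes * gbps
    print(f"{name}: {lanes} lanes x {gbps:>2} Gbps = {raw:>2} Gbps raw, "
          f"{raw * 16 / 18:.1f} Gbps payload (16b/18b)")
```

Notably, FRL2's 18 Gbps raw lines up with TMDS's 18 Gbps maximum, so the "doubling" appearance seems to come from the extra lane plus the higher per-lane rates rather than anything about the coding itself.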

FWIW, I found this reference for FRL rates, in HDMI 2.1:
 
Just like with HDMI 2.1, I'm sure the 2.2 moniker will not guarantee that every listed feature is supported. It will continue to be necessary for reviewers to test whether TVs boasting HDMI 2.2 inputs are actually capable of everything.
 
16K TVs/monitors would require vastly larger storage for movies. Buying a movie at 16K on disc would require tech we don't have. A 4K disc is 100 GB max, which works fine for 4K. A 16K movie is 16 times more data, so we would need a disc that holds roughly 1,600 GB. I think we will need a different format than discs for that. Please correct my math if I am confused. (Long night last night.)
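Quick sanity check on that math, taking "16K" as 15360x8640, i.e. 4x the width and 4x the height of 4K UHD's 3840x2160:

```python
# 16K has 4x the width and 4x the height of 4K UHD, so 16x the pixels.
pixels_4k  = 3840 * 2160      #  ~8.3 megapixels
pixels_16k = 15360 * 8640     # ~132.7 megapixels

ratio = pixels_16k / pixels_4k
print(f"Pixel ratio: {ratio:.0f}x -> naive scaling of a 100 GB disc: {100 * ratio:,.0f} GB")
```

So at constant bits per pixel, the 1,600 GB figure checks out.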

I have never been one of those who say things like "no one needs that much memory or GPU or CPU power". I also have never seen a monitor that was "Too Large". (Exception, I was like 11 and watched a movie in the front row of an old school movie theater in the early 70s. I only tried that once). 16K will have uses, but I doubt we will see home applications in the near future. I am totally thrilled with the new HDMI spec though!
 
Then, maybe enable anti-aliasing?
: D
4K->8K+DLSS is much better than 4K/8xMSAA, that's the thing. The edge pixelation does not come from aliasing; it comes from the 2x2 pixel size that is noticeable on a 65" screen @ 3 m, and that's it. 8K is the only way to make hairs hair-thin.
 
16K TVs/monitors would require vastly larger storage for movies. Buying a movie at 16K on disc would require tech we don't have. A 4K disc is 100 GB max, which works fine for 4K. A 16K movie is 16 times more data, so we would need a disc that holds roughly 1,600 GB.
That would be silly, IMO. Since nobody can see per-pixel detail at 16K the way they could at 4K, you could crank up the quantization, block sizes, etc. Also, it should compress better, because you're not encoding uncorrelated random pixels, but rather information that's highly correlated with what would be encoded at lower resolutions. Lastly, there are better codecs now than we had when the UHD Blu-ray standard was finalized.

Probably nobody would complain about the quality if it were only 400 GB. Consider that 4K Blu-ray discs are probably less than 2x the size of 2K discs, yet they have 4x the pixels. I don't know about you, but I haven't heard anyone complaining about the visual fidelity of the UHD Blu-ray format.
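To illustrate that with the same numbers, here's a toy calculation, not a codec benchmark: it assumes bits-per-pixel halves with each 4x jump in pixel count, roughly the 2K-to-4K trend mentioned above (about 2x the disc for 4x the pixels). The 0.5-per-step factor is just an assumption to show the shape of the scaling.

```python
# Toy model: bits per pixel halves with each 4x jump in pixel count
# (roughly the 2K -> 4K Blu-ray trend: ~2x the disc for 4x the pixels).
disc_4k_gb  = 100            # ballpark UHD Blu-ray size from the post above
pixel_ratio = 16             # 4K -> 16K pixel count
bpp_factor  = 0.5 * 0.5      # two 4x steps: 4K -> 8K -> 16K

print(f"Constant bits/pixel:  {disc_4k_gb * pixel_ratio:,.0f} GB")               # 1,600 GB
print(f"Halved bpp each step: {disc_4k_gb * pixel_ratio * bpp_factor:,.0f} GB")  # 400 GB
```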

I also have never seen a monitor that was "Too Large".
The 32" screen that I use at work is about the max for me. If it were any bigger, my neck would get sore from having to move my head too much.

I'm not saying nobody needs a bigger screen, but I do think you start to get into more specialized niches, when you go way bigger than that.

(Exception, I was like 11 and watched a movie in the front row of an old school movie theater in the early 70s. I only tried that once).
Exactly. Most movies are shot with the expectation that you can comfortably fit most of the frame in your field of vision. So, if you can only justify a 16k screen by sitting way close to it, then I think you're probably not using it to watch regular movies.

In contrast, Omnimax (IIRC) is designed to create a more immersive experience and actually should fill your entire field of view.

I am totally thrilled with the new HDMI spec though!
I'm pleased, if it means you can get more bandwidth out of lower-spec cables. However, I'm not sure if that's actually true.
 
What I don't get and don't see (or maybe I overlooked it, or just don't understand the basics? If so, please help out or point me somewhere):

What is the difference in the cable itself?
I have a rudimentary understanding of standard electrical installation and have crimped quite a few LAN cables, so I am trying to sort out my understanding of the actual difference that makes one cable 3x more expensive than another (we see this with USB-C as well).
Copper not aluminum and/or heavier gauge = less resistance = better signal?
More insulation layers = less interference / crosstalk?
Obviously the protocols change with the level, but...
Are the cables chipped to communicate their capabilities / limitations?
And/or is this mostly a gatekeeper for the patent royalties?

/aside
I heard recently (Superintelligent? AI Daily Brief? Something like that) that they ran a deep-research comparison of the tech standard battles of the last 70 years against everything happening now with AI: MCP will change everything radically in the next few years and was knocked out 'in a few months', while it took 'decades' (can't be bothered to look anything up, sorry) to update JPEG, USB, SATA... AI competitors are even harmonizing product names, e.g. 'canvas' is always kind of the same thing.

/// aside
where there is a will, there is a way. And where there is a need, there is a bush
(for example the 'Magnoliopsida Socienta necessata' or the 'Magnoliopsida Frutex coercitus' but hopefully not the 'Magnoliopsida Cohortia fatalis' ).
 
What I don't get and don't see (or maybe I overlooked it, or just don't understand the basics? If so, please help out or point me somewhere):

What is the difference in the cable itself?
I have a rudimentary understanding of standard electrical installation and have crimped quite a few LAN cables, so I am trying to sort out my understanding of the actual difference that makes one cable 3x more expensive than another (we see this with USB-C as well).
Copper not aluminum and/or heavier gauge = less resistance = better signal?
More insulation layers = less interference / crosstalk?
Obviously the protocols change with the level, but...
Are the cables chipped to communicate their capabilities / limitations?
And/or is this mostly a gatekeeper for the patent royalties?
My guess? More rigorous testing, higher-end specs, and faster internals in the cables themselves (like repeaters), since at extremely high frequencies, to my understanding, these cables have to have built-in electronics in order to work beyond extremely short lengths. Mind you, this could be a huge scam. But I didn't get far enough in EECS to help on that matter.

What I can say is that this just reinforces my preference for the extant professional-grade option that does away with the pesky "digital rights management" bullshit, which has been bullshit since they first really started implementing it in the 1990s.

We all know that people who want to circumvent it have a relatively easy time doing so, and this really just comes down to "You will own nothing and be glad for it! [Chorus]"

Beyond specific repeaters, the rest is likely what you suggested—better equipment.

At some point, they'll probably be forced to move to optical cables again. Optical with graphene electrical would be my guess, with the optical signal possibly being sent through graphene or traditional glass filament.
 
If it were up to me, I'd pitch all the DRM overhead and go back to the pre-existing pro-grade version still used to pump tons of throughput for high-end cinema cameras. If you need more bandwidth, just snag optical. And if that still isn't enough, in the next ten or so years we may see graphene cables that can work for either electrical or optical cabling purposes.

The pre-existing standard is great because it securely locks onto the device AND you don't get glitches like you do with HDMI. I have yet to find an HDMI cable that won't eventually bug out at some point. Been using them since the mid-2000s.
Having 8K@60 without DSC would actually be paramount. Pity it would require a new screen and GPU though, so probably not until 2.3/2.4 is out 😀 It would also be a matter of availability of 96G-compatible optical cables, as large screens are usually mounted at a distance from the PC, and at even a little over 3 meters even 48G stability starts to suck badly on copper (proud owner of two 5 m copper cables = fail, 7 m optical cable = huge success)
 
NASA released an 8K video several years ago, and there's a bit of 8K content on YouTube if you search for it:

Sony Crystal LED video walls are several years old, and 8K television is broadcast in Japan; it's a simple doubling of each dimension to upscale to 16K:

There is new technology, such as Innolux N3D, which displays 3D without glasses; that benefits from higher HDMI resolution:

It makes sense to upgrade the HDMI standard so that a single cable can carry 16K, as well as higher-frame-rate 8K (even for gaming).