News Acer Introduces Its First HDMI 2.1 Monitor

Giroro

Splendid
So HDMI 2.1 is only capable of 4K at 120Hz and 8K at 60Hz? I'm not sure that follows. 8K (7,680 x 4,320) is supposed to be 4x the resolution of 4K (3,840 x 2,160), right? So wouldn't that require 4x the bandwidth?
I would expect that if you wanted to game at 8K60, then you would need an interface capable of driving 4K at 240Hz (and also a GPU, lol). Or, an interface that topped out at 4K 120Hz should only be able to push 8K30.
Is there some black magic devilry at play? Can HDMI 2.1 run 4K faster than it is currently being used, or does it run 8K slower? Or are other features like HDR factoring in somehow?
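Just to put rough numbers on that, here's a quick Python sketch using raw active-pixel rates only (ignoring bit depth, chroma subsampling, and blanking intervals):

```python
# Raw pixel-rate comparison (active pixels only; ignores bit depth,
# chroma subsampling, and blanking intervals).
modes = {
    "4K/120": (3840, 2160, 120),
    "4K/240": (3840, 2160, 240),
    "8K/30":  (7680, 4320, 30),
    "8K/60":  (7680, 4320, 60),
}
for name, (w, h, hz) in modes.items():
    print(f"{name}: {w * h * hz / 1e9:.2f} Gpixels/s")
# 4K/120 and 8K/30 both land around 1.0 Gpixels/s;
# 8K/60 is about 2.0 Gpixels/s, the same as 4K/240.
```

So by raw throughput alone, 8K30 matches 4K120 and 8K60 matches 4K240, at least before bit depth or compression comes into it.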

Also, I like the look of the stand for the Predator XB273U NX a lot. Hopefully (for the price) it is made of high quality materials like metal, because it would probably be pretty flimsy in plastic.
 

spongiemaster

Admirable
Is there some black magic devilry at play? Can HDMI 2.1 run 4K faster than it is currently being used, or does it run 8K slower? Or are other features like HDR factoring in somehow?
No devilry, but determining supported resolutions/refresh rates can get very complicated. You also have to account for bit depth, whether chroma subsampling is used, HDR support, and whether the link is using Display Stream Compression.

HDMI 2.1 can support 4K/120Hz with chroma 4:4:4 (no subsampling) at 12-bit color depth. In order to support 8K/60Hz at the same chroma 4:4:4 and 12-bit color depth, Display Stream Compression needs to be used. Otherwise, 4:2:0 subsampling has to be used to bring the required bandwidth under HDMI 2.1's 48Gbps limit and eliminate the need for DSC.

HDMI 2.1 can support 4K 240Hz if desired, but you would have to reduce bandwidth elsewhere (chroma subsampling or a lower color bit depth) or use DSC. Using DSC, HDMI 2.1 can theoretically support 10K 120Hz with 4:2:0 subsampling.
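If you want to sanity-check those numbers yourself, here's a rough back-of-the-envelope Python sketch. It only counts active pixels (no blanking intervals, audio, or DSC math) and assumes HDMI 2.1's full 48Gbps FRL link with 16b/18b encoding, which leaves roughly 42.7Gbps for video data:

```python
# Rough HDMI bandwidth check: active pixels only, ignoring blanking
# intervals, audio, and DSC. Numbers are illustrative, not spec-exact.

CHROMA_BITS_FACTOR = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

# HDMI 2.1 FRL signals 48 Gbps raw; 16b/18b encoding leaves ~42.7 Gbps for data.
HDMI_2_1_EFFECTIVE_GBPS = 48 * 16 / 18

def required_gbps(width, height, refresh_hz, bit_depth, chroma):
    bits_per_pixel = bit_depth * CHROMA_BITS_FACTOR[chroma]
    return width * height * refresh_hz * bits_per_pixel / 1e9

modes = {
    "4K/120 12-bit 4:4:4": (3840, 2160, 120, 12, "4:4:4"),
    "8K/60  12-bit 4:4:4": (7680, 4320, 60, 12, "4:4:4"),
    "8K/60  12-bit 4:2:0": (7680, 4320, 60, 12, "4:2:0"),
    "4K/240 12-bit 4:4:4": (3840, 2160, 240, 12, "4:4:4"),
}
for name, args in modes.items():
    gbps = required_gbps(*args)
    verdict = "fits" if gbps <= HDMI_2_1_EFFECTIVE_GBPS else "needs DSC or subsampling"
    print(f"{name}: {gbps:5.1f} Gbps -> {verdict}")
```

With those simplifications, 4K/120 4:4:4 and 8K/60 4:2:0 at 12 bits both come in around 36Gbps and fit uncompressed, while 8K/60 4:4:4 and 4K/240 4:4:4 land near 72Gbps, which is why they need DSC or some other bandwidth reduction.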
 

mac_angel

Distinguished
So, this again brings up a question I've asked repeatedly that neither Tom's Hardware nor any other tech review site has answered.
This is an HDMI 2.1 4K monitor, without G-Sync. HDMI 2.1 certification requires Variable Refresh Rate as part of the standard. NVidia's Ampere cards are HDMI 2.1, which means they are also supposed to have VRR built in. This makes G-Sync obsolete. There are many, many displays, both monitors and even TVs, that support VRR. Why is no one testing this technology?
 

spongiemaster

Admirable
This is an HDMI 2.1 4K monitor, without G-Sync. HDMI 2.1 certification requires Variable Refresh Rate as part of the standard. NVidia's Ampere cards are HDMI 2.1, which means they are also supposed to have VRR built in. This makes G-Sync obsolete. There are many, many displays, both monitors and even TVs, that support VRR. Why is no one testing this technology?

HDMI VRR is not tied to HDMI 2.1. There are HDMI 2.0b devices that support it, and not all 2.1 ports do. The link below lists the feature support of many HDMI 2.1 TVs, and you'll see quite a few with no current or announced VRR support:

List: 4K TVs and 8K TVs with HDMI 2.1

HDMI VRR doesn't make G-Sync any more obsolete than FreeSync tried and failed to.
 
I'm sure that by May, more monitors around 32" with the same HDMI capabilities and high refresh rates will be available from brands other than Acer, thanks to the new consoles. And then I think they'll get cheaper.
 

prolfe

Distinguished
No devilry, but determining supported resolutions/refresh rates can get very complicated. You also have to account for bit depth, whether chroma subsampling is used, HDR support, and whether the link is using Display Stream Compression.

HDMI 2.1 can support 4K/120Hz with chroma 4:4:4 (no subsampling) at 12-bit color depth. In order to support 8K/60Hz at the same chroma 4:4:4 and 12-bit color depth, Display Stream Compression needs to be used. Otherwise, 4:2:0 subsampling has to be used to bring the required bandwidth under HDMI 2.1's 48Gbps limit and eliminate the need for DSC.

HDMI 2.1 can support 4K 240Hz if desired, but you would have to reduce bandwidth elsewhere (chroma subsampling or a lower color bit depth) or use DSC. Using DSC, HDMI 2.1 can theoretically support 10K 120Hz with 4:2:0 subsampling.
@spongiemaster Thank you for a thorough explanation without the flame or snark so common in online comments these days. I too was wondering how it was possible and I really appreciated your explanation.
 

mac_angel

Distinguished
HDMI VRR is not tied to HDMI 2.1. There are HDMI 2.0b devices that support it, and not all 2.1 ports do. The link below lists the feature support of many HDMI 2.1 TVs, and you'll see quite a few with no current or announced VRR support:

List: 4K TVs and 8K TVs with HDMI 2.1

HDMI VRR doesn't make G-Sync any more obsolete than FreeSync tried and failed to.

HDMI VRR is meant to be part of the HDMI 2.1 standard.

https://www.hdmi.org/spec/hdmi2_1

And, yes, there are HDMI 2.0b devices that have HDMI VRR added on, which is a very large part of my argument in asking WHY HDMI VRR is not being mentioned, let alone tested, with NVidia's new Ampere GPUs. There are a great number of 4K TVs that do have VRR built in, namely the Samsung Q-series TVs, as well as the RU8000 (2019). There are a great many display options with VRR technology that are way cheaper than buying a G-Sync certified TV.
HDMI.org really messed up with their certification by allowing the video signal to be compressed OR uncompressed and letting there be an incompatibility between the two. Hence the HDMI 2.1 bug on A/V receivers and next-gen consoles. But all cables and devices have to match the spec and be tested to support these features before they are allowed to put HDMI 2.1 on their package or page. HDMI VRR is one of these things.
NVidia also promotes the VRR technology in their 30-series GPUs:

https://www.nvidia.com/en-us/geforce/news/rtx-30-series-hdmi-2-1/

Yet, for some reason, no tech review outlet even mentions this, let alone does any testing. If you search Google, you'll find lots of people asking, as well as mentions from NVidia and HDMI.org, but no reviews even mention it, let alone try it.