Source: https://www.corsair.com/us/en/explo...21b-and-hdmi-21b-everything-you-need-to-know/

> The 2.1b variant that is included with Nvidia 50-series cards and was announced at CES supports longer cables for UHBR20, which stands for Ultra-High Bit Rate at 20 Gb/s per lane, with four lanes on tap for up to 80 Gb/s of bandwidth.
> The new cables were unveiled at CES by VESA, the industry group that ratifies new display standards. The new cables are called DP80LL, with the two L's in the name signifying "low loss" and the rest of the name translating to DisplayPort 80 Gb/s. Previously, there were just passive UHBR20 cables that were restricted to one meter in length, but the new DP80LL cables are active and can be up to three meters in length. Suffice to say, Nvidia 50-series owners should not have a problem with bandwidth when trying to game at 4K using DisplayPort 2.1b, but note you will need a DP80LL cable to enable high-bandwidth transmission at three meters.
> What does the 50 series and 40 series use?

RTX 40-series: DP 1.4a.
> Can the 9070 XT handle UHBR20? Can the 5070 do it?

UHBR20 is just bandwidth, not an actual measure of GPU performance.
> 40 series uses 1.4a

Could you link a media for resolution stats?
2.1a officially supports UHBR 13.5
2.1b officially supports UHBR 20
So basically it's just an upshift in the maximum refresh rate available at all resolutions. 4K 240 Hz 10-bit should be doable on 2.1b, and is a maybe on 2.1a.
That said, at that point you are very much dependent on the monitor and cable.
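To see roughly where 4K 240 Hz 10-bit lands, here is a sketch of the arithmetic. This assumes CVT-RB v2 timing (80-pixel horizontal blank, a minimum 460 µs vertical blank) and 128b/132b link encoding for the UHBR rates; a specific monitor's timings may differ slightly.

```python
# Approximate uncompressed data rate for a video mode, assuming
# CVT-RB v2 timing (80-pixel horizontal blank, >= 460 us vertical blank).
def data_rate_gbps(h, v, hz, bpc):
    h_total = h + 80                       # active width + fixed blanking
    v_total = v / (1 - 460e-6 * hz)        # solve v_total = v + 460us * hz * v_total
    pixel_clock = h_total * v_total * hz   # pixels per second
    return pixel_clock * bpc * 3 / 1e9     # 3 color channels (RGB)

need = data_rate_gbps(3840, 2160, 240, 10)   # 4K 240 Hz 10-bit
# Usable payload after 128b/132b encoding, 4 lanes:
uhbr13_5 = 4 * 13.5 * 128 / 132              # ~52.4 Gbit/s (DP 2.1a max)
uhbr20 = 4 * 20.0 * 128 / 132                # ~77.6 Gbit/s (DP 2.1b max)
print(f"needed: {need:.1f} Gbit/s")          # ~68.5 Gbit/s
print("fits UHBR13.5:", need <= uhbr13_5)    # False -> needs DSC on 2.1a
print("fits UHBR20:", need <= uhbr20)        # True  -> uncompressed on 2.1b
```

So UHBR20 can carry 4K 240 Hz 10-bit uncompressed, while UHBR13.5 would need DSC, matching the "doable on 2.1b, a maybe on 2.1a" summary above.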
> Could you link a media for resolution stats?

Here is a link I found when I initially wrote my reply, but I didn't post it then.
Also, just found this:

> Will 2560x1440 (2K) @ 240 Hz work WITHOUT DSC using an AMD 9070 XT with DP 2.1a?
> Will the RX 9070 work fine with the monitor using DP 1.4? Of course the DP cable would be 1.4 as well, or UHBR 13.5 certified, but I see no point in doing so since it would fall back to 1.4 anyway?

The monitor would work fine, but that's up to what DP 1.4 can do.
Question is, would the monitor run with DSC or no?
Source: https://en.wikipedia.org/wiki/Display_Stream_Compression#History

> Using DSC with HBR3 transmission rates, DisplayPort 1.4 can support 8K UHD (7680 × 4320) at 60 Hz or 4K UHD (3840 × 2160) at 240 Hz with 30 bit/px RGB color and HDR. 4K at 96 Hz 30 bit/px RGB/HDR can be achieved without the need for DSC. On displays which do not support DSC, the maximum limits are unchanged from DisplayPort 1.3 (4K 120 Hz, 5K 60 Hz, 8K 30 Hz).
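As a rough sanity check of the Wikipedia numbers: 4K 240 Hz at 30 bit/px needs far more than HBR3's 25.92 Gbit/s payload, but the required compression ratio is within what DSC is commonly quoted as providing (around 3:1, visually lossless). A sketch, assuming a ~68.5 Gbit/s uncompressed figure derived from CVT-RB v2 timing:

```python
# Rough DSC requirement for 4K 240 Hz 10 bpc over DP 1.4 (HBR3).
uncompressed = 68.5   # Gbit/s, approx. CVT-RB v2 timing (assumption)
hbr3_payload = 25.92  # Gbit/s usable after 8b/10b encoding
ratio = uncompressed / hbr3_payload
print(f"needs ~{ratio:.2f}:1 compression")  # ~2.64:1, inside DSC's ~3:1 range
```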
> DSC is optional and its usage depends on the specific monitor.

Question:
With HDR10 enabled, DSC would also be enabled.
Without HDR, using SDR 10-bit color, DSC would be disabled.
Is this correct?
source: https://en.wikipedia.org/wiki/DisplayPort#Refresh_frequency_limits_for_HDR_video

> HDR extensions were defined in version 1.4 of the DisplayPort standard. Some displays support these HDR extensions, but may only implement HBR2 transmission mode if the extra bandwidth of HBR3 is unnecessary (for example, on 4K 60 Hz HDR displays).
> Since there is no definition of what constitutes a "DisplayPort 1.4" device, some manufacturers may choose to label these as "DP 1.2" devices despite their support for DP 1.4 HDR extensions. As a result, DisplayPort "version numbers" should not be used as an indicator of HDR support.
The monitor would work fine, but that's up to what DP 1.4 can do.
Source: https://en.wikipedia.org/wiki/Display_Stream_Compression#History
Your monitor has DP 1.4 DSC;
specs: https://rog.asus.com/monitors/27-to-31-5-inches/rog-strix-oled-xg27aqdmg/spec/
So, your monitor can run with DSC.
DSC is optional and its usage depends on the specific monitor.
source: https://en.wikipedia.org/wiki/DisplayPort#Refresh_frequency_limits_for_HDR_video
> The question was whether DSC would always be active, be enabled automatically if the bandwidth limit is exceeded, or have to be enabled manually.

Since DSC is an optional feature of DP 1.4 and depends on the specific monitor, it's best to contact Asus and ask them directly. That way, you'll get a definitive answer.
> Since DSC is an optional feature of DP 1.4 and depends on the specific monitor, it's best to contact Asus and ask them directly.

Oh no please, contacting Asus support. Just no...
> DP 2.1a vs 2.1b
> What's the difference?
> AMD RX 9070 XT uses 2.1a but Nvidia uses 2.1b, or? What do the 50 series and 40 series use?
> Can the 9070 XT handle UHBR20? Can the 5070 do it?

Effectively nothing for most people. It's only people pushing silly refresh rates at 4K and 8K.
> DSC will be used automatically, only if the format cannot be transmitted uncompressed.

So the switch from non-DSC to DSC happens automatically, for example when playing a video game that exceeds the bandwidth? Does DSC happen from one frame to another? What if in one second the bandwidth limit isn't exceeded, but in the next second it is, and in the second after that it isn't anymore, does DSC switch from one second to another? Does it cause input lag or increased frametime/response time? How exactly does it work, simplified?
> So the switch from non-DSC to DSC happens automatically, for example when playing a video game that exceeds the bandwidth? ...

Data rate for a display interface is a constant value that depends on the video format; it doesn't vary during usage.
I really just want to know if I can run this monitor without using DSC.
Help me do the maths?
DP 1.4, which the monitor uses, can do HBR3 (32.4 Gbit/s) max.
Is this max bandwidth enough for 2560x1440 @ 240 Hz and 10-bit color depth, or will DSC be used?
Trying to understand this...
Scenario:
2560x1440 default resolution is enabled
240 Hz default refresh rate is enabled
10-bit default color depth is enabled
A graphics-intensive game rendered at 2560x1440 with an uncapped variable refresh rate is running
DisplayPort 1.4 is being used
Can the monitor, under any circumstances, exceed the DP 1.4 HBR3 32.4 Gbit/s bandwidth in the above scenario?
> Data rate for a display interface is a constant value that depends on the video format, it doesn't vary during usage.

Thanks for elaborating.
The maximum data rate of DP 1.4 is 25.92 Gbit/s. 2560×1440 at 240 Hz 10 bpc will exceed this.
https://linustechtips.com/topic/729...s-v2/?section=calc&H=2560&V=1440&F=240&bpc=10
> Thanks for elaborating.

The calculation is shown in the link, 30.77 Gbit/s. It is 1.19 times the 25.92 Gbit/s maximum.
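For reference, the 30.77 Gbit/s figure can be reproduced with an approximate CVT-RB v2 timing calculation (80-pixel horizontal blank, 460 µs minimum vertical blank; the monitor's actual timings may differ slightly):

```python
# Approximate data rate for 2560x1440 with CVT-RB v2 timing.
def data_rate_gbps(h, v, hz, bpc):
    h_total = h + 80                   # horizontal blanking
    v_total = v / (1 - 460e-6 * hz)    # 460 us minimum vertical blank
    return h_total * v_total * hz * bpc * 3 / 1e9

print(f"{data_rate_gbps(2560, 1440, 240, 10):.2f} Gbit/s")  # ~30.77, exceeds 25.92
print(f"{data_rate_gbps(2560, 1440, 240, 8):.2f} Gbit/s")   # ~24.61, fits in 25.92
```

Note that the 8 bpc result fits within HBR3's 25.92 Gbit/s payload, which is why dropping to 8-bit color is the usual way to run 1440p 240 Hz without DSC on DP 1.4.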
By how much will it exceed?
> The calculation is shown in the link, 30.77 Gbit/s. It is 1.19 times the 25.92 Gbit/s maximum.

30.77 Gbit/s is with HDR, but I would use the monitor with SDR, because HDR10 is worthless so I would turn it off. Doesn't that mean I would remain within the bandwidth limit, so DSC won't be active?
> 30.77 Gbit/s is with HDR, but I would use the monitor with SDR... Doesn't that mean I would remain within the bandwidth limit, so DSC won't be active?

HDR itself doesn't factor into the calculation. It's just about whether you are transmitting with 8 bpc or 10 bpc color depth. 10 bpc with HDR is the same as 10 bpc without HDR for the purposes of calculating data rate.
How far would you have to lower the refresh rate with HDR to stay within the non-DSC range?
How do you calculate it?
On Wikipedia there is no chart providing information for a setup that uses 10 bpc but without HDR.
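One way to sketch an answer to the "how far would you have to lower the refresh rate" question, under the same assumptions behind the 30.77 Gbit/s figure (approximate CVT-RB v2 timing; a real monitor only exposes the discrete modes in its EDID, so treat this as an estimate):

```python
# Find the highest refresh rate at 2560x1440 10 bpc that fits within
# DP 1.4's 25.92 Gbit/s payload, assuming CVT-RB v2 timing.
def data_rate_gbps(h, v, hz, bpc=10):
    h_total = h + 80                   # horizontal blanking
    v_total = v / (1 - 460e-6 * hz)    # 460 us minimum vertical blank
    return h_total * v_total * hz * bpc * 3 / 1e9

HBR3_PAYLOAD = 25.92  # Gbit/s usable after 8b/10b encoding
hz = 240
while data_rate_gbps(2560, 1440, hz) > HBR3_PAYLOAD:
    hz -= 1
print(hz)  # roughly the 10 bpc cutoff without DSC, around 205 Hz
```

Since HDR doesn't change the data rate at a given bit depth (see above), the same cutoff applies with or without HDR at 10 bpc.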