Question: DisplayPort 2.1a versus 2.1b?

Elliah246

DP 2.1a vs 2.1b
What's the difference?
The AMD RX 9070 XT uses 2.1a but Nvidia uses 2.1b, or?
What do the 50 series and 40 series use?
Can the 9070 XT handle UHBR20? Can the 5070 do it?
 
The 2.1b variant that is included with Nvidia 50-series cards and was announced at CES supports longer cables for UHBR20, which stands for Ultra-High Bit Rate at 20 Gb/s per lane, with four lanes on tap for up to 80 Gb/s of bandwidth.

The new cables were unveiled at CES by VESA, which is the industry group that ratifies new display standards. The new cables are called DP80LL, with the two L’s in the name signifying “low loss” and the rest of the name translating to DisplayPort 80Gb/s. Previously, there were just passive UHBR20 cables that were restricted to one meter in length, but the new DP80LL cables are active, and can be up to three meters in length. Suffice to say Nvidia 50-series owners should not have a problem with bandwidth when trying to game at 4K using DisplayPort 2.1b, but note you will need a DP80LL cable to enable high-bandwidth transmission at three meters.
Source: https://www.corsair.com/us/en/explo...21b-and-hdmi-21b-everything-you-need-to-know/
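For reference, the 80 Gb/s headline figure is just the per-lane rate times four lanes; below is a rough Python sketch of that arithmetic. The 128b/132b factor is the DP 2.x line coding, and the result still ignores FEC and other link overhead, so the real usable figures end up slightly lower.

```python
# Raw DisplayPort link bandwidth per UHBR tier, across four lanes.
# 128b/132b is the DP 2.x line coding; the result still ignores FEC and other
# link overhead, so real-world usable figures are slightly lower.

LANES = 4
ENCODING = 128 / 132  # 128b/132b encoding efficiency

for name, per_lane_gbps in [("UHBR10", 10.0), ("UHBR13.5", 13.5), ("UHBR20", 20.0)]:
    raw = per_lane_gbps * LANES
    usable = raw * ENCODING
    print(f"{name}: {raw:.0f} Gbit/s raw, ~{usable:.1f} Gbit/s before FEC overhead")
```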

What do the 50 series and 40 series use?
RTX 40-series: DP 1.4a
RTX 50-series: DP 2.1b

DP 2.1 and 2.1a weren't out when the RTX 40-series was released.

Can the 9070 XT handle UHBR20? Can the 5070 do it?
UHBR20 is just bandwidth, not a measure of actual GPU performance.

Look at UHBR20 as an 8-lane-wide highway and UHBR10 as a 4-lane-wide highway. Just because UHBR20 can sustain more cars on the road at any given moment (more data throughput) doesn't mean that the cars drive faster or that there are more cars in the first place. It's just headroom for things to come.

E.g. DP 2.1b can do 4K at 480 Hz. NONE of the current GPUs out there can game at 4K 480 Hz. An RTX 5090, at best, gets ~100 FPS at 4K.
 
The 40 series uses 1.4a.

2.1a officially supports UHBR13.5
2.1b officially supports UHBR20

So basically it's just an upshift in the max refresh rate available at all resolutions. 4K 240 Hz 10-bit should be doable on 2.1b, and is a maybe on 2.1a (rough check below).

That said, at that point you are very much dependent on the monitor and cable.
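A rough check of that 4K 240 Hz 10-bit claim; this is only a sketch, assuming ~15% blanking overhead (roughly what CVT-RBv2 timing adds at high refresh rates) and the approximate usable rates of UHBR13.5 and UHBR20. A proper timing calculator gives exact figures.

```python
# Ballpark: does 4K 240 Hz 10-bit fit uncompressed on UHBR13.5 vs UHBR20?
# BLANKING_OVERHEAD is a rough assumption (~15%, in the right range for
# CVT-RBv2 at high refresh rates); a timing calculator gives exact numbers.

H, V, HZ, BPC = 3840, 2160, 240, 10
BLANKING_OVERHEAD = 1.15

needed = H * V * HZ * (BPC * 3) * BLANKING_OVERHEAD / 1e9   # Gbit/s
print(f"4K {HZ} Hz {BPC}-bit needs roughly {needed:.0f} Gbit/s uncompressed")
print(f"Fits UHBR13.5 (~52 Gbit/s usable): {needed <= 52}")
print(f"Fits UHBR20   (~77 Gbit/s usable): {needed <= 77}")
```

By that estimate, uncompressed 4K 240 Hz 10-bit is a UHBR20-class format, while a UHBR13.5 link would need DSC for it.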
 
Could you link a source for resolution stats?

Will 2560x1440 (2K) @ 240 Hz work WITHOUT DSC using an AMD 9070 XT with DP 2.1a?
 
Could you link a source for resolution stats?
Here is a link I found when I initially wrote my reply, but I didn't post it then.
Article: https://www.pcworld.com/article/619...he-new-ultra-high-bit-rate-certification.html

Will 2560x1440 (2K) @ 240 Hz work WITHOUT DSC using an AMD 9070 XT with DP 2.1a?
Also, I just found this:
link: https://linustechtips.com/topic/729232-guide-to-display-cables-adapters-v2/

You need to fill in the output and input ports to see the data, which is then shown in a nice table format.
At the bottom of the 1st reply there is also a calculator that you can use to get your answer.
 
According to Wikipedia:
2560x1440 @ 240 Hz requires a 30.77 Gbit/s data rate - HDR (10 bpc)
2560x1440 @ 240 Hz requires a 24.62 Gbit/s data rate - SDR (8 bpc)
UHBR13.5 can do 52.22 Gbit/s.
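A rough sketch of where those Wikipedia figures come from, assuming CVT-RBv2 timing (an extra 80 horizontal pixels of blanking and a 460 µs minimum vertical blank); the "HDR" and "SDR" rows differ only in bit depth, 10 bpc vs 8 bpc.

```python
import math

# Approximate uncompressed video data rate, assuming CVT-RBv2 timing:
# +80 pixels of horizontal blanking and a 460 us minimum vertical blanking period.

def data_rate_gbps(h_active, v_active, refresh_hz, bpc):
    h_total = h_active + 80                                    # fixed horizontal blank
    v_total = math.ceil(v_active / (1 - 460e-6 * refresh_hz))  # lines incl. vertical blank
    pixel_clock = h_total * v_total * refresh_hz               # pixels/s incl. blanking
    return pixel_clock * bpc * 3 / 1e9                         # 3 color channels -> Gbit/s

for bpc, label in [(10, "10 bpc (the 'HDR' figure)"), (8, "8 bpc (the 'SDR' figure)")]:
    print(f"2560x1440 @ 240 Hz, {label}: {data_rate_gbps(2560, 1440, 240, bpc):.2f} Gbit/s")
```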

The AMD Radeon RX 9070 uses DisplayPort 2.1a.
The monitor I'm interested in, the ASUS ROG Strix OLED XG27AQDMG, uses DisplayPort 1.4.
If it matters, the monitor also uses 10-bit color depth.

Will the RX 9070 work fine with the monitor using DP 1.4? Of course the DP cable would be 1.4 as well, or UHBR13.5 certified, but I see no point in doing so since it would fall back to 1.4 anyway.

The question is, would the monitor run with DSC or not?

Sources:
https://en.wikipedia.org/wiki/DisplayPort


Edit:
Wikipedia says,

DisplayPort version 1.4 was published 1 March 2016. No new transmission modes are defined, so HBR3 (32.4 Gbit/s) as introduced in version 1.3 still remains as the highest available mode. DisplayPort 1.4 adds support for Display Stream Compression 1.2 (DSC), Forward Error Correction, HDR10 metadata defined in CTA-861.3, including static and dynamic metadata and the Rec. 2020 color space, for HDMI interoperability, and extends the maximum number of inline audio channels to 32.

Question:
With HDR10 enabled, DSC would also be enabled.
Without HDR, using SDR 10-bit colors, DSC would be disabled.
Is this correct?

 
Will the RX 9070 work fine with the monitor using DP 1.4? Of course the DP cable would be 1.4 as well, or UHBR13.5 certified, but I see no point in doing so since it would fall back to 1.4 anyway.
The monitor would work fine, but only up to what DP 1.4 can do.

The question is, would the monitor run with DSC or not?
Using DSC with HBR3 transmission rates, DisplayPort 1.4 can support 8K UHD (7680 × 4320) at 60 Hz or 4K UHD (3840 × 2160) at 240 Hz with 30 bit/px RGB color and HDR. 4K at 96 Hz 30 bit/px RGB/HDR can be achieved without the need for DSC. On displays which do not support DSC, the maximum limits are unchanged from DisplayPort 1.3 (4K 120 Hz, 5K 60 Hz, 8K 30 Hz).
Source: https://en.wikipedia.org/wiki/Display_Stream_Compression#History
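A ballpark check of that quoted claim; this sketch assumes ~15% blanking overhead and roughly 3:1 DSC compression (a commonly quoted figure; the actual DSC target bitrate is configurable and varies by implementation).

```python
# Ballpark check of the quoted claim: 4K @ 240 Hz, 30 bit/px over HBR3 with DSC.
# Assumes ~15% blanking overhead and roughly 3:1 DSC compression (a commonly
# quoted figure; the real target bitrate is configurable).

HBR3_USABLE_GBPS = 25.92   # 32.4 Gbit/s raw minus 8b/10b encoding overhead
DSC_RATIO = 3.0            # assumption

uncompressed = 3840 * 2160 * 240 * 30 * 1.15 / 1e9
compressed = uncompressed / DSC_RATIO
print(f"Uncompressed: ~{uncompressed:.0f} Gbit/s -> fits HBR3: {uncompressed <= HBR3_USABLE_GBPS}")
print(f"With ~3:1 DSC: ~{compressed:.0f} Gbit/s -> fits HBR3: {compressed <= HBR3_USABLE_GBPS}")
```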

Your monitor has DP 1.4 DSC;
specs: https://rog.asus.com/monitors/27-to-31-5-inches/rog-strix-oled-xg27aqdmg/spec/

So, your monitor can run with DSC.

Question:
With HDR10 enabled, DSC would also be enabled.
Without HDR, using SDR 10-bit colors, DSC would be disabled.
Is this correct?
DSC is optional and its usage depends on the specific monitor.

HDR extensions were defined in version 1.4 of the DisplayPort standard. Some displays support these HDR extensions, but may only implement HBR2 transmission mode if the extra bandwidth of HBR3 is unnecessary (for example, on 4K 60 Hz HDR displays).
Since there is no definition of what constitutes a "DisplayPort 1.4" device, some manufacturers may choose to label these as "DP 1.2" devices despite their support for DP 1.4 HDR extensions. As a result, DisplayPort "version numbers" should not be used as an indicator of HDR support.
source: https://en.wikipedia.org/wiki/DisplayPort#Refresh_frequency_limits_for_HDR_video
 

I saw that the monitor uses DP 1.4 DSC.
The question was whether DSC is always active, automatically enabled when the bandwidth limit is exceeded, or has to be manually enabled.
Obviously I want to run this monitor WITHOUT DSC, but I did not yet understand whether this is possible using its DP 1.4 at 2560x1440 240 Hz 10-bit without HDR.
 
DSC will be used automatically, but only if the format cannot be transmitted uncompressed.
So the switch from non-DSC to DSC happens automatically, for example when playing a video game that exceeds the bandwidth? Does DSC switch from one frame to another? What if in one second the bandwidth limit isn't exceeded, but in the next second it is, and in the second after that it isn't anymore: does DSC switch from one second to another? Does it cause input lag or increased frametime/response time? How exactly does it work, simplified?

I really just want to know if I can run this monitor without using DSC.
Can you help me do the maths?
DP 1.4, which the monitor uses, can do HBR3 (32.4 Gbit/s) max.
Is this max bandwidth enough for 2560x1440 @ 240 Hz and 10-bit color depth, or will DSC be used?
Trying to understand this...

Scenario:
2560x1440 default resolution is enabled.
240 Hz default refresh rate is enabled.
10-bit default color depth is enabled.
A graphics-intensive game rendered at 2560x1440 with an uncapped variable refresh rate is running.
DisplayPort 1.4 is being used.
Can the monitor under any circumstances exceed the DP 1.4 HBR3 32.4 Gbit/s bandwidth, considering the above scenario?
 
The data rate for a display interface is a constant value that depends on the video format; it doesn't vary during usage.

The maximum data rate of DP 1.4 is 25.92 Gbit/s. 2560×1440 at 240 Hz 10 bpc will exceed this.

https://linustechtips.com/topic/729...s-v2/?section=calc&H=2560&V=1440&F=240&bpc=10
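For context on that figure, a short sketch of where the 25.92 Gbit/s comes from: HBR3 runs at 8.1 Gbit/s per lane across four lanes, and DP 1.x uses 8b/10b encoding, so only 8/10 of the raw link rate carries video data.

```python
# Where DP 1.4's 25.92 Gbit/s comes from: HBR3 is 8.1 Gbit/s per lane over
# four lanes, and DP 1.x uses 8b/10b encoding, so only 8/10 of that carries video.

hbr3_raw = 8.1 * 4               # 32.4 Gbit/s total link rate
hbr3_usable = hbr3_raw * 8 / 10  # 25.92 Gbit/s available for video data
print(f"HBR3: {hbr3_raw:.1f} Gbit/s raw, {hbr3_usable:.2f} Gbit/s usable")
```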
 
The calculation is shown in the link, 30.77 Gbit/s. It is 1.19 times the 25.92 Gbit/s maximum.
30.77 Gbit/s is with HDR, but I would use the monitor with SDR because HDR10 is worthless to me, so I would turn it off. Doesn't that mean I would remain within the bandwidth limit, so DSC won't be active?
How far would you have to lower the refresh rate with HDR to stay within the non-DSC range?
How do you calculate it?

On Wikipedia there is no chart providing information for a setup that uses 10 bpc but without HDR.
 
HDR itself doesn't factor into the calculation. It's just about whether you are transmitting with 8 bpc or 10 bpc color depth. 10 bpc with HDR is the same as 10 bpc without HDR for the purposes of calculating data rate.

You can find the maximum refresh frequency with given settings by selecting the "Max refresh frequency" mode at the top of the calculator.
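As a rough illustration of what that mode would report for this monitor's format, here is a sketch reusing the CVT-RBv2-style approximation from earlier (+80 horizontal pixels, 460 µs minimum vertical blank); the linked calculator remains the authoritative source.

```python
import math

# Rough estimate: highest refresh rate at 2560x1440 that still fits DP 1.4's
# ~25.92 Gbit/s without DSC, for 8 bpc and 10 bpc, assuming CVT-RBv2-style timing.

def data_rate_gbps(h, v, hz, bpc):
    h_total = h + 80                                # fixed horizontal blank
    v_total = math.ceil(v / (1 - 460e-6 * hz))      # lines needed for a 460 us vertical blank
    return h_total * v_total * hz * bpc * 3 / 1e9   # 3 color channels -> Gbit/s

DP14_USABLE_GBPS = 25.92                            # HBR3 after 8b/10b encoding

for bpc in (8, 10):
    hz = 240
    while hz > 0 and data_rate_gbps(2560, 1440, hz, bpc) > DP14_USABLE_GBPS:
        hz -= 1
    print(f"2560x1440 at {bpc} bpc: ~{hz} Hz max without DSC over DP 1.4")
```

By this estimate, 2560x1440 at 240 Hz fits without DSC at 8 bpc, while 10 bpc tops out somewhere around 200 Hz uncompressed.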