DisplayPort vs. HDMI: Which Is Better For Gaming?


csm101

Distinguished
Aug 8, 2007
Here's my personal experience with this. I got an Acer Predator XB3, the same model that was reviewed by TH (https://www.tomshardware.com/reviews/acer-predator-xb273k-4k-144hz-gaming-monitor-hdr,5998.html), and it turns out that to play some older games that support HDR you need HDMI, so I bought an HDMI 2.0 cable; the monitor did not come with one. Games like BF1, Mass Effect Andromeda, AC Origins, HITMAN and HITMAN 2 demand an HDMI connection in order to enable HDR in-game. On the other hand, games like Gears 5, SOTTR, FF XV, COD WWII and Metro Exodus are capable of handling HDR over a DP connection. So until this BS settles down in the gaming industry, anyone out there with an Nvidia card and a G-Sync monitor had better make sure to have an HDMI 2.0 cable so you can play games in HDR if they demand an HDMI connection.
 
Sorry, but this is absolutely incorrect about needing HDMI 2.0 to do HDR in at least some of those games. I have that monitor as well, and it supports HDR modes in all the games I've tried that have HDR support, using DP 1.4. I have tested it with Hitman 2, BF1, and others. I can't think of any technical reason why HDR wouldn't work over DP in some of the games you listed -- could be some specific resolution / refresh rate combinations don't work, but HDR in general should be fine.

Of course, 4K 144Hz with 10-bit color needs 39.19 Gbps (plus audio) of bandwidth, so it can't work over DP 1.4, which is why you can't select that resolution without using 4:2:2 YCrCb. That cuts the bandwidth requirement by 33%, which allows 4K 144Hz at 10-bit just fine. This is Hitman 2, tested just now:

[attached screenshot: Hitman 2]
 

bit_user

Polypheme
Ambassador
4:2:2 YCrCb. That basically cuts the bandwidth requirement in half,
4:2:2 means 2 samples in each of the chroma channels for every 4 luma samples. So, it's 2/3rds the bandwidth of 4:4:4. When you go to 4:2:0 (which also breaks the naming convention I just explained), it drops to half of 4:4:4.

The rationale behind 4:2:0 is supposedly that it's 4:2:2 on half of the scanlines and 4:0:0 on the other half. Since 4:1:1 was already taken, someone decided to call this 4:2:0. Genius (not).
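
To make those ratios concrete, here's a small Python sketch (mine, not from the article) that computes the average bits per pixel for each subsampling mode; the two-thirds and one-half factors fall straight out of the sample counts.

```python
# Average bits per pixel for each chroma subsampling mode, following the
# J:a:b notation over a 4x2 block of pixels: 'a' chroma samples in the top
# row and 'b' in the bottom row, per chroma channel.
SUBSAMPLING = {
    "4:4:4": (4, 4),   # full chroma resolution
    "4:2:2": (2, 2),   # chroma halved horizontally
    "4:2:0": (2, 0),   # chroma halved horizontally and vertically
}

def bits_per_pixel(bits_per_channel, mode):
    a, b = SUBSAMPLING[mode]
    luma_samples = 4 * 2               # one luma sample per pixel in the 4x2 block
    chroma_samples = 2 * (a + b)       # two chroma channels (Cb and Cr)
    return (luma_samples + chroma_samples) * bits_per_channel / 8  # 8 pixels per block

for bpc in (8, 10):
    for mode in SUBSAMPLING:
        bpp = bits_per_pixel(bpc, mode)
        ratio = bpp / bits_per_pixel(bpc, "4:4:4")
        print(f"{bpc} bpc {mode}: {bpp:.0f} bpp ({ratio:.0%} of 4:4:4)")
```

At 10 bpc that works out to 30, 20 and 15 bpp, which is where the 33% (and 50%) savings come from.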
 
Oops, I put half (thinking of 4:2:0) but you're right. I'll correct that / edit. Thanks!
 
Apr 8, 2020
Thunderbolt just uses DisplayPort routed over the connection. Thunderbolt 2 supports DP 1.2 resolutions, and Thunderbolt 3 supports DP 1.4. I'll add a note in the article, though.

It is not as simple as that. Intel Alpine Ridge Thunderbolt 3 controllers, first released in 2016, are DP 1.2 only (as I discovered in my 2019 Dell XPS 13 9380). Also, some implementations of Thunderbolt 3 controllers only route x2 PCIe to the Thunderbolt port, as full x4 PCIe lanes are optional, not required. All the newer 2018 Titan Ridge controllers support DP 1.4. USB4 is basically Thunderbolt 3 rebranded, with several of the optional features in the earlier spec, including DP 1.4, now mandatory. Thunderbolt 4 is not released yet, so it is still a work in progress, but rumors say it will be based on PCIe 4 and thus support its faster link rates.

https://ark.intel.com/content/www/u...400,97401,97402,94031,87402,94032,87401,94030

https://en.wikipedia.org/wiki/Thunderbolt_(interface)#Thunderbolt_3
 

Deicidium369

Mar 4, 2020
This is really ho-hum. I only moved from VGA to HDMI when the new desktop had a GPU that allowed it.
Aside from that ... no difference. I had no flickering or noise with VGA. It worked perfectly. The HDMI is no better.
Once again ... ho-hum.
As opposed to an Exciting Video Connection? Pretty much most things that, you know, just work are ho-hum.
 

a.henriquedsj

Commendable
Apr 23, 2020
Nice article.

I believe that I can't use 2560x1440 + 144Hz + G-Sync on my external monitor because my notebook port is USB 3.1 Gen 2 Type-C with bandwidth up to 10 Gbps. Or am I wrong?

My monitor is the Dell S2716DG, and I use a DisplayPort cable into the Dell DA300 adapter, which connects to the Type-C port on the Asus ROG Zephyrus G GA502.

I currently have this connection running at 2560x1440 + 120Hz. I would love to use G-Sync; would I need a Thunderbolt 3 port with 40 Gbps for that?

I am sorry for my English.
 
If your port tops out at 10 Gbps, then that will limit your resolution and refresh rate, yeah. The GPU driving the port also matters, though. Is it coming off the dedicated GPU (GTX 1660 Ti?), or is it running off the Intel integrated GPU (UHD Graphics 630)? Theoretically, it's supposed to be capable of DP1.4 connectivity, but with laptops there are lots of other potential factors. If you can do 1440p 120Hz but not 144Hz, you're probably maxed out on the port.
 
Apr 25, 2020
"8b/10b encoding for example means for every 8 bits of data, 10 bits are actually transmitted, with the extra bits used for data correction."

8b/10b encoding is used to ensure that the DC bias of the signal is zero. Capacitors and inductors in the signal path can otherwise build up charges that would affect the signal. While it will expose some errors, 8b/10b is a crappy error correction/detection scheme.
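
Whatever its (lack of) value for error handling, the 8b/10b overhead itself is easy to quantify. A quick sketch of the arithmetic -- my numbers, assuming DP 1.4's four HBR3 lanes at 8.1 Gbps each and HDMI 2.0's three TMDS channels at 6 Gbps each, both using 8b/10b-style coding:

```python
# 8b/10b coding sends 10 bits on the wire for every 8 bits of data, so only
# 80% of the raw link rate is available as payload.
def usable_gbps(lanes, gbps_per_lane, efficiency=8 / 10):
    return lanes * gbps_per_lane * efficiency

dp14 = usable_gbps(4, 8.1)    # DP 1.3/1.4: four HBR3 lanes at 8.1 Gbps each
hdmi20 = usable_gbps(3, 6.0)  # HDMI 2.0: three TMDS channels at 6.0 Gbps each

print(f"DP 1.4:   {4 * 8.1:.1f} Gbps raw -> {dp14:.2f} Gbps of data")
print(f"HDMI 2.0: {3 * 6.0:.1f} Gbps raw -> {hdmi20:.2f} Gbps of data")
```

That's where the familiar 25.92 Gbps and 14.4 Gbps "effective" figures come from.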
 

a.henriquedsj

Commendable
Apr 23, 2020


When I connect through the HDMI port it uses the AMD card, but when I go through the Type-C port it connects to the Nvidia card (dGPU).

At least that's what shows up in the Nvidia control panel, lol.

https://ibb.co/TDXF6Rq
 
So it's a USB Type-C cable on one end and DisplayPort on the other? Seems like that should work, but it's probably just something specific to the laptop.
 

a.henriquedsj

Commendable
Apr 23, 2020


The connection is set up like this:

Dell S2716DG monitor + Dell DA300 adapter + Asus ROG Zephyrus G GA502

I'm not sure whether the limit is the bandwidth of the Type-C port or some limitation in the Dell adapter.
 
I suspect, after looking into it further, that the weak link is the DA300 adapter. I could be wrong, but here's the data.

2560x1440 @ 144 Hz would require 14.08 Gbps. If your adapter is limited to 10 Gbps, that's out of the question. However, 2560x1440 @ 120Hz is still going to need 11.59 Gbps -- only 100Hz would actually get you below 10 Gbps (9.57 Gbps). Maybe Dell just fudged some things, or maybe I'm missing some factor, but even DP1.2 should be able to do 1440p @ 144Hz. So, either your laptop or the DA300 (probably the DA300) is somehow limiting actual available bandwidth.
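
For anyone wondering where figures like that come from, here's a rough sketch of the timing math (mine, not Dell's), assuming CVT-R2 reduced-blanking timings with an 80-pixel horizontal blank and a minimum 460 microsecond vertical blanking interval. It reproduces the numbers above to within rounding:

```python
import math

def cvt_r2_gbps(h_active, v_active, refresh_hz, bits_per_pixel=24):
    """Approximate uncompressed video bandwidth with CVT-R2 (reduced blanking v2)
    timing: an 80-pixel horizontal blank, and vertical blanking stretched to at
    least 460 microseconds."""
    h_total = h_active + 80
    vblank_fraction = 460e-6 * refresh_hz          # share of each frame spent in vblank
    v_blank = math.ceil(vblank_fraction * v_active / (1 - vblank_fraction))
    v_total = v_active + v_blank
    pixel_clock = h_total * v_total * refresh_hz   # pixels per second
    return pixel_clock * bits_per_pixel / 1e9      # Gbps

for hz in (100, 120, 144):
    print(f"2560x1440 @ {hz} Hz, 8-bit RGB: {cvt_r2_gbps(2560, 1440, hz):.2f} Gbps")
```

That prints roughly 9.57, 11.59 and 14.08 Gbps, so 100 Hz really is the most a clean 10 Gbps link should manage at this resolution.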
 

a.henriquedsj

Commendable
Apr 23, 2020



Dell reported that even at 120 Hz it is already running outside the standard. Asus said that G-Sync does not work over Type-C. Both responses were very general.
In my next notebook I will prioritize having a Mini DisplayPort.

Thank you for your help.
 
May 8, 2020
Hi there - I am not super tech savvy and I was wondering if someone could help me with a decision.

I recently purchased the Samsung UR59C 32" 16:9 4K Curved LCD Monitor (link: https://www.samsung.com/us/computing/monitors/curved/32-ur59c-curved-4k-uhd-monitor-lu32r590cwnxza/ )

I am trying to decide between using a DisplayPort 1.2 cable or an HDMI 2.0 cable. The prices are pretty much the same and do not affect my decision. I have a MacBook Air 2017, which supports 4K UHD screen resolution (surprising, right?). What cable do you think would be best? I know that both of the cables will work, but I want to know which would be better. I have provided the links to the cables below. I just want to know if there is any difference in screen resolution. I am using this monitor to do work, but I also want it to run and look as good as possible. I appreciate any input! Thank you.

HDMI 2.0 to Thunderbolt 2.0: https://ezq.com/X40084-mini-displayport-to-HDMI-4K-60HZ-cable.html

Display port 1.2 to Thunderbolt 2.0: https://www.amazon.com/gp/product/B07VCCTX3N/ref=ppx_yo_dt_b_asin_title_o01_s00?ie=UTF8&psc=1
 

bit_user

Polypheme
Ambassador
Hi there - I am not super tech savvy
FWIW, I'm not super mac savvy.

I am trying to decide between using a Display port 1.2 cable, or an HDMI 2.0 cable.
Either should work fine, if you're okay with "Millions of Colors" (i.e. 16.8 million; 8 bits per channel; 24-bit color). If you want to utilize the full color capability of your display (i.e. 1 billion colors; 10 bits per channel; 30-bit color) at the full 60 Hz, then you need to use DisplayPort - HDMI 2.0 cannot carry that much information.

I cannot vouch for this specific cable. Make sure the connector matches the one on your laptop, and ideally check that others are using it with a 4K display.
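
To put rough numbers on the color-depth point: the sketch below is my own check, assuming the standard CTA-861 4K60 timing HDMI uses (594 MHz pixel clock) and HDMI 2.0's roughly 14.4 Gbps of usable data after TMDS coding.

```python
# 4K60 over HDMI 2.0: 8-bit vs. 10-bit RGB.
PIXEL_CLOCK_4K60 = 594e6      # Hz, the standard CTA-861 4K60 timing (4400 x 2250)
HDMI20_USABLE = 14.4e9        # bits/s of video data (18 Gbps raw minus TMDS coding)

for bpc in (8, 10):
    required = PIXEL_CLOCK_4K60 * 3 * bpc    # RGB, no chroma subsampling
    verdict = "fits" if required <= HDMI20_USABLE else "does NOT fit"
    print(f"4K60 RGB at {bpc} bpc: {required / 1e9:.2f} Gbps -> {verdict} in HDMI 2.0")
```

Over DisplayPort, reduced-blanking timings bring the 10-bit signal down to roughly 15.7 Gbps, which is why it can fit within even DP 1.2's ~17.3 Gbps.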
 
Aug 9, 2020
This was a great article - thank you for the depth. My question is about the history: what brought DisplayPort into being in the first place? HDMI was already in place for four years when DisplayPort came out, and its backers seemed ready and willing to update it with better bandwidth to support future upgrades. Seeing that HDMI was already almost ubiquitous, what was the compelling reason for an entirely different standard? It's not necessarily a bad thing, of course.
 
HDMI was defined by a corporation that charges royalty fees for its use, HDMI Licensing LLC. DisplayPort was defined by VESA as an open standard with no licensing fee. Besides cost, DP was created to push higher resolutions and refresh rates, as HDMI was lagging behind in this area.

Dual-link DVI was able to drive 2560x1600 at 60 Hz from 1999 (though I don't think the first such displays arrived until several years later), and HDMI didn't reach that capability until version 1.3/1.4. DP started with slightly higher bitrates than both DL DVI-D and HDMI 1.3, and my recollection is that it was adopted and advanced more quickly than HDMI.

I'm not certain that's 100% accurate, but there are GPUs that support much higher resolutions on DP than on HDMI. Whether that was a technical limitation of the time, or simply choosing to support higher bitrates on DP, I'm not sure.
 

bit_user

Polypheme
Ambassador
My question is about the history - what brought DisplayPort into being in the first place?
To add to Jarred's excellent answer, I would also observe that HDMI is more complex than DisplayPort - at least, initially. I could be wrong on this, but I think HDMI mandated support of both YUV and RGB colorspaces and possibly interlaced video formats.

HDMI also had a lot more features that are really oriented towards home theater, and of limited usefulness (if any), for computers, such as CEC (Consumer Electronics Control) and support for more audio formats. And later, things like ARC (Audio Return Channel) and embedded Ethernet.

I don't know how complete these lists are, but see:
 

pilotsh

Distinguished
May 1, 2014
Soooo, Nvidia just revealed HDMI 2.1 and DisplayPort 1.4a on their RTX 30 Series graphics cards. I am guessing this means that for these graphics cards, an HDMI 2.1 connection wins, once we start seeing HDMI 2.1 monitors?

Is my thinking correct?
 
Yeah, HDMI 2.1 is superior to DP1.4 -- obviously provided you have an HDMI 2.1 capable display. Pretty surprised DP2.0 didn't make it into Ampere! No DP2.0 displays yet anyway, whereas there are HDMI2.1 displays, so maybe that's it.
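
For reference, a rough comparison of what each link delivers after line-coding overhead (my rounding, ignoring other link-layer overhead; DP 2.0 is included only for comparison, since neither Ampere nor any shipping display supports it yet):

```python
# Raw link rate vs. usable data rate for the interfaces in question:
# (lanes or channels, Gbps per lane, data bits per coded bit)
LINKS = {
    "HDMI 2.0":        (3, 6.0,  8 / 10),     # TMDS coding
    "DP 1.4 (HBR3)":   (4, 8.1,  8 / 10),     # 8b/10b coding
    "HDMI 2.1 (FRL6)": (4, 12.0, 16 / 18),    # 16b/18b coding
    "DP 2.0 (UHBR20)": (4, 20.0, 128 / 132),  # 128b/132b coding
}

for name, (lanes, gbps_per_lane, efficiency) in LINKS.items():
    raw = lanes * gbps_per_lane
    usable = raw * efficiency
    print(f"{name:16s} {raw:5.1f} Gbps raw -> ~{usable:5.1f} Gbps usable")
```

So HDMI 2.1 roughly doubles DP 1.4's usable bandwidth, and DP 2.0 would roughly double that again.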
 

pilotsh

Distinguished
May 1, 2014

I have a GTX 780 and will be buying a 30 series card... because Nvidia is releasing first AND they have absolutely shaken up the pricing for the better, for consumers. Many people are in my "specs boat", I would have thought (780, 7xx, 8xx, 9xx, 10xx).

If AMD has any hope of beating Nvidia, it would be to include DP2.0, which would beat even HDMI2.1, but by then (November?) many of their customers may have already bought the Nvidia 30 series.

Confusing times... if I was AMD I would launch NOW too (to stop customers just pre-ordering Nvidia cards), but maybe it's a case of "missed it by that much" for AMD this time?

Relevance to OP:
It is interesting to be in a time when people think: "that monitor/peripheral is perfect EXCEPT it will limit my output because of the connection type/version, so I will keep looking." As an example, for performance I was looking at the Samsung G7 Odyssey monitor, and I almost bought it, but now if I get it I would bottleneck the graphics card, as the monitor is only HDMI 2.0 and DisplayPort 1.4 (not even "1.4a"), and many people would think along similar lines and not buy a product.
Manufacturers must hate it right now!
 

bit_user

Polypheme
Ambassador
I was looking at the Samsung G7 Odyssey monitor, I almost bought it.....
Me too, but the Amazon reviews are looking very mixed.

but now if I get it I would bottleneck the graphics card, as the monitor is only HDMI 2.0 and Display Port 1.4 (not even '1.4a')...
According to this, 1.4a just improved DSC (Display Stream Compression), which I don't even want.

https://en.wikipedia.org/wiki/DisplayPort#1.4a

What's more important is whether that monitor supports HBR3, which was introduced back in DisplayPort 1.3, yet which most 1.4 monitors still don't implement. Unfortunately, I haven't been able to confirm whether that or the other monitors I'm considering support HBR3. The only ones I've seen advertise it are Gigabyte's AORUS FI27Q-P and CV27Q-SA.
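
For context on why HBR3 (rather than DSC) is the spec to look for: a short sketch, assuming the G7 is the 2560x1440, 240 Hz model and using the published DP 1.3/1.4 lane rates.

```python
# Usable DisplayPort bandwidth at the two highest link-rate tiers
# (4 lanes, 8b/10b coding, so 80% of the raw rate is payload).
for name, lane_gbps in (("HBR2", 5.4), ("HBR3", 8.1)):
    print(f"{name}: {4 * lane_gbps * 0.8:.2f} Gbps usable")

# 2560x1440 @ 240 Hz with 8-bit RGB needs roughly 24-25 Gbps with reduced
# blanking, so a "DP 1.4" monitor that only implements HBR2 (17.28 Gbps)
# can't be driven uncompressed, while an HBR3 link (25.92 Gbps) can.
```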
 

anonymuos

Distinguished
Oct 5, 2010
One thing I like about HDMI devices is Consumer Electronics Control (CEC). DisplayPort can also have it via the AUX channel but no devices seem to actually support it.
 