Question: Confusion over HDMI data rate vs. PCI Express 3.0 (or even 2.0)

majidok

Distinguished
Oct 25, 2009
66
0
18,540
1
Hi everyone,
I was actually gonna get a 2.0 hdmi cable(well one with 48 Gigabits per second throughput) even though I didnt need that bandwidth that I realized the older hdmi cables have like 10 Gigabits per second which is abit more than 1.2 Giga bytes per second then i realized even Pci express 2.0 is 1 Giga Bytes per second per lane. so thats like 16 Giga Bytes per second for even pci express 2.0 at 16 lanes . So my question is how does 16 gigabytes (Well in the maximum demanding scenario) of data gets transmitted over a 10 Gigabit per second hdmi cable

So the most demanding scene in a game would require up to 8 gigabytes per second in the case of PCIe 2.0, or about 16 gigabytes per second if a GPU manages to use all the bandwidth of PCI Express 3.0, yet the final rendered result has to go over an HDMI link with only 10 or even 18 gigabits per second of bandwidth.
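To put rough numbers on the two links being compared, here is a back-of-envelope sketch. The PCIe figures assume x16 slots and ignore protocol/packet overhead, and the HDMI figures are nominal signaling rates, so treat these as ballpark values, not exact throughput:

```python
# Back-of-envelope link-bandwidth comparison (nominal rates, overhead ignored).

def pcie_gb_per_s(gigatransfers, enc_payload, enc_total, lanes):
    """Usable bandwidth in GB/s: line rate x encoding efficiency / 8 bits x lanes."""
    return gigatransfers * enc_payload / enc_total / 8 * lanes

pcie2_x16 = pcie_gb_per_s(5, 8, 10, 16)     # PCIe 2.0: 5 GT/s, 8b/10b encoding
pcie3_x16 = pcie_gb_per_s(8, 128, 130, 16)  # PCIe 3.0: 8 GT/s, 128b/130b encoding

def hdmi_gb_per_s(gbit):
    """Convert a link rate in Gbit/s to GB/s (8 bits per byte)."""
    return gbit / 8

print(f"PCIe 2.0 x16: {pcie2_x16:.2f} GB/s")               # ~8.00
print(f"PCIe 3.0 x16: {pcie3_x16:.2f} GB/s")               # ~15.75
print(f"10 Gbit/s HDMI: {hdmi_gb_per_s(10):.2f} GB/s")     # 1.25
```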
 
Last edited:

DSzymborski

Polypheme
Moderator
It's a little hard to know for sure as the question is worded rather confusingly, but it appears that you're conflating the bandwidth of the PCIE slot (your GPU communicating with your PC) with the bandwidth of your HDMI port (your GPU's final output communicating with your monitor). These are two very different things. In addition, you're also mashing gigabits and gigabytes together; they're not the same thing and there are eight bits in a byte.
 

Jacozeelie

Respectable
Mar 1, 2019
583
81
1,970
1
The input to your GPU over PCIe is the data it has to crunch to produce an image, or many images per second. The end result is sent via the HDMI cable to your monitor. The higher the resolution and refresh rate, the higher the bitrate.
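As a rough sketch of that relationship, the raw bitrate of the output signal scales with resolution, color depth, and refresh rate (this ignores blanking intervals and link encoding overhead, so real HDMI signals need somewhat more):

```python
def video_gbit_per_s(width, height, bits_per_pixel, refresh_hz):
    """Raw pixel bitrate in Gbit/s for an uncompressed video signal."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

# 24-bit color (8 bits per RGB channel):
print(video_gbit_per_s(1920, 1080, 24, 60))   # ~2.99  Gbit/s (1080p @ 60 Hz)
print(video_gbit_per_s(3840, 2160, 24, 60))   # ~11.94 Gbit/s (4K @ 60 Hz)
print(video_gbit_per_s(3840, 2160, 24, 120))  # ~23.89 Gbit/s (4K @ 120 Hz)
```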
 

majidok

It's a little hard to know for sure as the question is worded rather confusingly, but it appears that you're conflating the bandwidth of the PCIE slot (your GPU communicating with your PC) with the bandwidth of your HDMI port (your GPU's final output communicating with your monitor). These are two very different things. In addition, you're also mashing gigabits and gigabytes together; they're not the same thing and there are eight bits in a byte.
I know the difference between bytes and bits; that's how I came up with the question anyway.

I edited the question to make more sense
 
Last edited:

DSzymborski

I know the difference between bytes and bits; that's how I came up with the question anyway.

I edited the question to make more sense
The answer to that part is still the same. You're conflating two different types of data. The bandwidth of the PCI Express slot and the bandwidth of the HDMI port have nothing to do with each other. It's completely different data of a completely different purpose going to completely different areas.
 

majidok

Sorry, I just got back after a long time. So are you saying that 1.2 gigabytes of data or less are being pushed through the HDMI port to the monitor every second?
 

DSzymborski

Sorry, I just got back after a long time. So are you saying that 1.2 gigabytes of data or less are being pushed through the HDMI port to the monitor every second?
The newest HDMI has a bandwidth cap several times that. It depends on the resolution, color format, etc.

But again, the PCIE speed has nothing to do with the speed that the communication with the monitor happens. It's not the same data or even the same type of data.
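To illustrate how resolution and color format decide whether a given HDMI link is fast enough, here is a small sketch comparing required raw pixel bitrates against approximate maximum rates of common HDMI versions (these are nominal signaling rates; usable payload is somewhat lower due to encoding overhead):

```python
# Approximate nominal maximum HDMI link rates in Gbit/s (not usable payload).
HDMI_MAX_GBIT = {"HDMI 1.4": 10.2, "HDMI 2.0": 18.0, "HDMI 2.1": 48.0}

def required_gbit(width, height, bits_per_pixel, refresh_hz):
    """Raw pixel bitrate in Gbit/s, ignoring blanking and encoding overhead."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

def supported_versions(width, height, bits_per_pixel, refresh_hz):
    """Which HDMI versions have enough nominal bandwidth for this mode."""
    need = required_gbit(width, height, bits_per_pixel, refresh_hz)
    return [v for v, cap in HDMI_MAX_GBIT.items() if need <= cap]

# 4K @ 60 Hz with 24-bit color needs ~11.9 Gbit/s, too much for HDMI 1.4:
print(supported_versions(3840, 2160, 24, 60))  # ['HDMI 2.0', 'HDMI 2.1']
```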
 
