Hi everyone,
I was actually going to get an HDMI 2.1 cable (one rated for 48 Gbit/s of throughput) even though I didn't need that bandwidth, when I noticed that older HDMI cables carry around 10 Gbit/s, which is a bit more than 1.2 GB/s. Then I realized that even PCI Express 3.0 is roughly 1 GB/s per lane, so that's about 16 GB/s at 16 lanes. So my question is: how does up to 16 GB/s of data (in the most demanding scenario) get transmitted over a 10 Gbit/s HDMI cable?
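Here's my napkin math in Python so the unit conversions are explicit (these are just the headline spec numbers; real throughput is a bit lower because of encoding overhead):

```python
# Napkin math for the link rates above. Headline spec numbers only;
# real-world throughput is a bit lower due to encoding overhead.

def gbit_to_gbyte(gbit_per_s):
    """Convert Gbit/s to GB/s (8 bits per byte)."""
    return gbit_per_s / 8

hdmi_old  = 10.2      # Gbit/s, High Speed HDMI (1.4)
hdmi_2_1  = 48.0      # Gbit/s, Ultra High Speed HDMI (2.1)
pcie3_x16 = 16 * 8.0  # Gbit/s: ~1 GB/s per lane * 16 lanes * 8 bits

print(f"old HDMI:     {gbit_to_gbyte(hdmi_old):.2f} GB/s")   # ~1.28
print(f"HDMI 2.1:     {gbit_to_gbyte(hdmi_2_1):.2f} GB/s")   # 6.00
print(f"PCIe 3.0 x16: {gbit_to_gbyte(pcie3_x16):.1f} GB/s")  # 16.0
```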
So the most demanding scene in a game could require 16 GB/s in the case of PCIe 3.0, or 32 GB/s if a GPU managed to use all the bandwidth of a PCIe 4.0 x16 link, yet that has to be transmitted over a 10 or even 18 Gbit/s HDMI cable. The final rendered result has to go through HDMI with that much lower bandwidth.
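For comparison, here's a rough sketch of what the final rendered frames alone would need on the cable, assuming plain uncompressed 24-bit RGB and ignoring blanking intervals and TMDS/FRL encoding overhead (which add more on top):

```python
# Raw pixel data rate of the final frames going out over HDMI,
# assuming uncompressed 24-bit RGB (no blanking/encoding overhead).

def frame_stream_gbit(width, height, bits_per_pixel, refresh_hz):
    """Raw pixel data rate of one video stream, in Gbit/s."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

print(f"1080p @ 60 Hz: {frame_stream_gbit(1920, 1080, 24, 60):.2f} Gbit/s")  # ~2.99
print(f"4K    @ 60 Hz: {frame_stream_gbit(3840, 2160, 24, 60):.2f} Gbit/s")  # ~11.94
```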