[SOLVED] Does it matter in what PCIe x16 slot i put in my GPU?

That will depend on which motherboard you have. Some don't run the second slot at the full 16 lanes, and some disable PCIe lanes on other devices when a card is present.
Assuming both PCIe x16 slots perform equally no matter how many cards are present, would it make a difference which one I use? I'm hearing that the length of the PCIe traces could cause issues with signal degradation and whatnot, but does a few more inches really make a difference?
I can't answer either of those questions and be 100% positive; my knowledge of motherboard design isn't that deep. If someone wiser than I am doesn't come up with an answer, I suggest you contact the board manufacturer.

My suspicion is that signal degradation is a non-issue. Splitting lanes across more cards, however, may make a tiny difference.
 
The signal degradation issue is bunk. PCIe is broken into lanes, and the multiplier (x1, x4, x16) defines how many lanes are in use. If signal degradation were a real problem, plugging an NVMe PCIe drive, which pulls considerably more bandwidth per lane than any video card, into a lower slot would make it inoperable. The real issue is that the number of lanes available on many mainstream chipset/CPU configurations is limited, often to 20 or fewer. So if you only have 20 PCIe lanes and two x16 slots, there simply aren't enough lanes to go around. Depending on the board, you have to configure where lanes are active; lower slots often can't run beyond a certain link width, and sometimes slot 2 steals lanes from lower slots or from internal controllers like M.2 (which itself uses 4 PCIe lanes), which makes configuration a mess. Also, depending on the board, things just don't fit well when you're trying to shove a massive GPU in: slot 1 is usually laid out to fit the GPU, while slot 2 may have clearance issues.
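To make the trade-off concrete, here's a rough sketch of how a limited lane budget forces slot bifurcation. The 20-lane budget, the 4-lane M.2 cost, and the fallback widths are generic illustrative assumptions, not the behavior of any specific chipset (real boards document this in their manuals):

```python
# Illustration of how a limited lane budget constrains slot widths.
# All numbers here are generic assumptions, not any specific board.
TOTAL_LANES = 20   # typical mainstream CPU+chipset budget
M2_LANES = 4       # an M.2 NVMe slot typically consumes 4 lanes

def slot_widths(m2_populated: bool) -> tuple[int, int]:
    """Return hypothetical (slot1, slot2) link widths for a two-slot board."""
    budget = TOTAL_LANES - (M2_LANES if m2_populated else 0)
    if budget >= 32:
        return 16, 16   # both slots full: needs more lanes than we have
    if budget >= 20:
        return 16, 4    # slot 1 full width, slot 2 limited to x4
    return 16, 0        # slot 2 loses its lanes entirely

print(slot_widths(m2_populated=False))  # (16, 4)
print(slot_widths(m2_populated=True))   # (16, 0)
```

The point of the sketch: populating one device (the M.2 drive) can silently change what another slot gets, which is exactly the "slot 2 steals lanes" behavior described above.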

This is part of the reason people pay through the nose for HEDT processors. Last time I looked, Threadripper and its chipset provide something like 60 PCIe lanes. To even run a dual-GPU configuration (assuming x16 for each), you would need 32 lanes to begin with. Throw in the SATA controller, some NVMe drives, a USB controller, and a PCIe capture card, and the count rises pretty fast. Even without a second GPU, if you're running NVMe drives the lane count can add up quickly.
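The arithmetic above can be tallied in a few lines. The per-device lane counts below are typical examples (assumptions on my part), not figures from any particular platform:

```python
# Back-of-the-envelope PCIe lane tally. Per-device lane counts are
# typical examples (assumptions), not from any specific motherboard.
devices = {
    "GPU 1 (x16)": 16,
    "GPU 2 (x16)": 16,
    "NVMe drive (x4)": 4,
    "capture card (x4)": 4,
    "USB controller (x1)": 1,
}

hedt_lanes = 60        # roughly what a Threadripper platform offers
mainstream_lanes = 20  # typical mainstream CPU+chipset budget

used = sum(devices.values())
print(f"Lanes requested: {used}")
print(f"HEDT headroom: {hedt_lanes - used} lanes to spare")
print(f"Mainstream shortfall: {used - mainstream_lanes} lanes over budget")
```

On the HEDT budget everything fits with room left over; on the mainstream budget the same device list oversubscribes the bus, which is why slots end up dropping to x8 or x4.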
 