[SOLVED] Can someone explain the PCI-E slot variations & sizes, motherboard to GPU? Read specs below.

Mar 24, 2019
Okay, so I have a Gigabyte Z370M D3H motherboard. I was reading its specs here about the Expansion Slots, and I can't understand most of it. What does "x16" mean? I measured the slot's size, which is about 8 cm.
But what's the meaning of
  • 1x PCI-E 3.0 x16 @ x16
  • 1x PCI-E 3.0 x16 @ x4
  • 2x PCI-E 3.0 x1 @ x1
I only see 2 slots for a GPU/sound card, so why are there 3 bullet points? Another thing: I have a GeForce GT 710 GPU (fanless version), but it doesn't state the PCIe slot size anywhere. So what term do I search for to find the size of the GPU's connector? I'm planning to buy a GTX 1050 Ti or better, but I don't know if it will fit...
Please explain. Thanks in advance.
 

NoMercyBeAst

Okay, so I have a Gigabyte Z370M D3H motherboard. I was reading its specs here about the Expansion Slots, and I can't understand most of it. What's "x16"?
...
Typically they will all be PCI Express, but for a graphics card you need a PCI Express x16 slot. There are three versions of this slot (your motherboard has only one, and it's 3.0), but they're backwards compatible, so a modern PCI Express 3.0 graphics card will work in a motherboard with a PCI Express 2.0 x16 slot. (You have a 3.0 slot, so that's best.)

It's most common to use the uppermost one for a graphics card, but if you're fitting two cards in an Nvidia SLI or AMD CrossFire setup, you'll need both. Check which standard your motherboard supports before investing in a pair of cards, though.
 
Typically they will all be PCI Express, but for a graphics card you need a PCI Express x16 slot.
...
Thanks a lot, Man!
So, I have 3.0, which is the latest? If so, nice! But I can't find the size of a GPU's PCIe connector. Say, for the 1050 Ti, or the GT 710 I own. I know mine is 8 cm, but I want to see it in the online specs. So what should I search for?
Also, do newer GPUs only support x16 and not x4? (Newer meaning maybe the GTX 1050 Ti series and up.)

And yeah, my motherboard specs say it supports CrossFire, not SLI, so should I invest in AMD? I mean, can I use my GT 710 in the lower PCIe slot (x4) and a new GPU in the upper PCIe slot (x16)? Illustrated by me: View: https://imgur.com/a/vcOW99a

Thanks again; hope to get the answers soon!
 
I'll just chime in....

In each listing below, the "3.0" tells us the PCI-E generation/standard/revision, the "x16" or "x1" before the "@" tells us the physical length of the slot itself, and the "x16", "x4" or "x1" after the "@" tells us the bandwidth (lane count) available to the installed component:
  • 1x PCI-E 3.0 x16 @ x16
  • 1x PCI-E 3.0 x16 @ x4
  • 2x PCI-E 3.0 x1 @ x1
As NoMercyBeAst mentions, there are different 'versions' of the PCI-E slot, each with a different theoretical maximum data transfer rate. This is a handy reference: https://en.wikipedia.org/wiki/PCI_Express#History_and_revisions
 
Okay, so I have a Gigabyte Z370M D3H motherboard. I was reading its specs here about the Expansion Slots, and I can't understand most of it. What's "x16"?
...
  • 1x PCI-E 3.0 x16 @ x16 - means there is a single PCI-e slot that is 16 lanes 'big' physically and wired for all 16 lanes. That's the 'fastest' slot on the motherboard and should be used for the GPU.
  • 1x PCI-E 3.0 x16 @ x4 - means there is a single PCI-e slot that is 16 lanes 'big' but is actually wired for only 4 lanes. That means it's only 1/4th the speed capability of the other x16 slot but that doesn't really matter for a second GPU, in SLI for instance.
  • 2x PCI-E 3.0 x1 @ x1 - means there are two PCI-e slots that are each 1 lane 'big' and wired for the 1 lane. These are the slowest and smallest since they only have one lane for data traffic.
A 'lane' is a lane of data traffic across the PCI-e bus. Think of it as a lane of a superhighway: more lanes, more traffic that can pass simultaneously.

The 3.0 means they are all PCI-e Generation 3.0, the latest iteration of the PCI-e standard in mass production. It offers faster data transfer rates than Generation 2 and earlier. Don't worry about compatibility, though: devices and slots of any generation work together, running at the lower of the two generations.
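As a rough sketch of the lane arithmetic (the per-lane rates are approximate figures from the public PCIe revisions, after encoding overhead; the helper name is mine, not any real API):

```python
# Approximate usable bandwidth per lane for each PCIe generation, in MB/s,
# after encoding overhead (8b/10b for Gen1/Gen2, 128b/130b for Gen3).
PER_LANE_MB_S = {1: 250, 2: 500, 3: 985}

def slot_bandwidth_mb_s(generation: int, wired_lanes: int) -> int:
    """Theoretical one-direction bandwidth of a slot: lanes x per-lane rate."""
    return PER_LANE_MB_S[generation] * wired_lanes

# The three slot types from the Z370M D3H spec sheet:
print(slot_bandwidth_mb_s(3, 16))  # x16 @ x16 -> 15760 MB/s (~15.8 GB/s)
print(slot_bandwidth_mb_s(3, 4))   # x16 @ x4  ->  3940 MB/s (~3.9 GB/s)
print(slot_bandwidth_mb_s(3, 1))   # x1  @ x1  ->   985 MB/s (~1 GB/s)
```

The point of the superhighway analogy shows up directly in the multiplication: same generation, four times the lanes, four times the traffic.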
 
I'll just chime in....
...
So how do I know which PCI-E connector a GPU has? Because when I google it, nothing comes up. Help, please.
And are all x16 PCIe slots the same size, roughly 8 cm?

Also while you're at it, can you answer the below questions, please?


I can't find the size of a GPU's PCIe connector. Say, for the 1050 Ti, or the GT 710 I own. I know mine is 8 cm, but I want to see it in the online specs. So what should I search for?
Also, do newer GPUs only support x16 and not x4? (Newer meaning maybe the GTX 1050 Ti series and up.)

And yeah, my motherboard specs say it supports CrossFire, not SLI, so should I invest in AMD? I mean, can I use my GT 710 in the lower PCIe slot (x4) and a new GPU in the upper PCIe slot (x16)? Illustrated by me: View: https://imgur.com/a/vcOW99a

Thanks again; hope to get the answers soon!
 
So how do I know which GPU has which PCI-e slots? ...
And does all x16 lane PCIe slots are of the same size? ...
I'm not sure what your concern is with GPU slot size: any modern GPU uses a PCI-e x16 slot, and they are all the same size. All you should be concerned with is using the one that is also wired for x16. Your motherboard manual will tell you which slot to use for the GPU, and it's almost always, if not always, the one closest to the CPU.

Don't concern yourself with SLI or Crossfire; just select whichever GPU makes the best value sense for you.

To use either of those (SLI or Crossfire) you have to buy two matched GPUs. Even then, performance doesn't match simply buying a single higher-performing GPU, which is probably cheaper than the two combined (short of a heinously expensive 2080 Ti). And what's worse: SLI and Crossfire really only work in certain select games.
 
You're fundamentally misunderstanding something: a graphics card won't have a PCI-E slot, because it isn't a component you slot something into; you slot the graphics card into a PCI-E slot on the motherboard. Effectively, you're looking for something that isn't there.

It isn't so much the graphics card that supports a particular data transfer rate; it's the PCI-E slot that does. It's possible to install an RTX 2080 Ti into a PCI-E 1.0 x16 slot; it just won't reach the theoretical maximum data transfer rate it would have in a PCI-E 3.0 x16 slot (all else being equal). NoMercyBeAst already explained this aspect of compatibility.

The PCI-E standards I linked are standards for a reason: they conform to certain physical specifications. I would imagine the vast majority of graphics card buyers are more concerned about the length of the card itself than the length of the PCI-E slot it was made to fit into.

drea.drechsler further explains the data transfer aspect. A physical x16 slot can run at x4 speeds depending on usage and wiring. It's why, with multiple-graphics-card setups, you may see things like x16/x8 or x8/x8/x4 for data transfer rates.

Whether you want to use multiple graphics cards is up to you. Current standard advice is don't bother if it's for gaming because very few games are coded to use multiple graphics cards; it's best to get the most powerful graphics card you can afford.
 
it's best to get the most powerful graphics card you can afford.
I'm not sure what your concern is with GPU slot size...any modern GPU will use a PCI-e x16 slot and they are all the same size.
I now understand, but what if the GPU's version is 4.0 and my motherboard's PCI-E slot is 3.0? Will it limit the bandwidth? And how can I find the PCI-E version of a card?
I'm a little new to this stuff, so I don't want to mess something up.

And another thing: will the GT 710 (fanless edition) work in the PCI-E 3.0 x16 @ x4 slot with a GTX 1050 Ti in the x16 slot? I mean, the drivers won't interfere, right? I don't want to waste the GT 710 if I buy a new GPU, so I want to plug it into the x4 slot and record with it in OBS, or get some rendering boost, etc. (BTW, what else can I do with 2 GPUs and no SLI/Crossfire?)

Thanks a lot for these replies, I learned a lot today. Thanks again!
 
I now understand, but what if the GPU's version is 4.0 and my Motherboard's pci-e slot is 3.0. ...
And another thing, will GT 710(no fan edition) work on the 1x PCI-E 3.0 x4 with a GTX 1050ti on the x16 slot? ...
EDIT ADD: BTW, I assume this is theoretical since there are no Gen 4 devices I know of right now...

If your GPU supports PCI-e Gen 4 and you put it in a PCI-e Gen 3 slot, it will operate at Gen 3. It won't cripple the CPU any more than that: every other PCIe lane from the CPU will operate at the maximum generation of the device attached to it, up to Gen 3. The card's specifications should say the maximum PCI-e generation it supports.
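The negotiation rule can be sketched in a couple of lines (a toy model, not a real API; a PCIe link simply trains to the lower generation and the narrower width shared by the two sides):

```python
def negotiated_link(card_gen: int, card_lanes: int,
                    slot_gen: int, slot_lanes: int) -> tuple[int, int]:
    """A PCIe link trains to the lower generation and narrower lane
    width common to the card and the slot."""
    return min(card_gen, slot_gen), min(card_lanes, slot_lanes)

# A hypothetical Gen4 x16 card in this board's Gen3 x16 slot:
print(negotiated_link(4, 16, 3, 16))  # -> (3, 16): runs as Gen3 x16
```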

The drivers will install and use whatever compatible card they find, in whichever slot it's in.

Plugging the GT710 into the second slot when you buy a new video card, and actually using it, is pretty much getting into advanced operating modes with dual GPUs. While it can possibly be done, it gets a bit complicated installing drivers to handle both cards simultaneously. Each system will handle it differently, and you DO have to worry about one conflicting with the other. Don't get me wrong, it can (usually) be done, but it's well out of the ordinary. You can try it, but don't be surprised if it doesn't work as expected, and getting help will be dicey.

But once you have that, you can set up dual virtual machines and dedicate the video resources to specific machines. It's been done in Linux for quite some time, and lately in Windows 10 as well.
 
If your GPU support PCI-e Gen 4 and you put it in a PCI-e Gen 3 slot it will operate at Gen 3. It won't cripple the CPU any more than that: all other PCIe lanes from the CPU will operate at the maximum generation
I see then it's fine.

Plugging the GT710 in the second slot when you buy a new video card and using it is pretty much getting into advanced operating modes with dual GPU's. ...
Yes, I kind of know that, because I tried to run my GT 630 in the x4 slot before I knew any of this. After I plugged it in, installed the 630 driver, and then updated the 710 driver, the GT 630 was not detected at all. But the 630 is broken anyway, so I don't mind.

So can you provide me a guide to do this? I really want to use 2 GPUs, with the weak one for recording, for editing work.

you can set up dual virtual machines and dedicate the video resources to specific machines. It's been something they've done in Linux for quite some time and lately doing in Windows 10.
I actually have VMware but I don't use it often, only for specific purposes. But I really want to improve day-to-day stuff, like gpu1 for a game while gpu0 runs the browser, or gpu0 rendering while gpu1 is for playing, etc.
(gpu0 = 1st GPU and gpu1 = 2nd GPU.)

Edit: BTW, it's getting late here, so I'll only be able to reply in 6-8 hours. Sorry, and thanks again xD!
 
...
So can you provide me a guide to do this, I really wanna use 2 gpus for editing and using 1 (weak one) for recording.
...
I actually have VM Ware but I don't use it often, only for specific purposes. But I really want to improve day to day stuff, like gpu1 for game, while gpu0 for browser. Or gpu0 while rendering and gpu1 for playing, etc.
gpu0 = 1st GPU and gpu1 = 2nd GPU.
...
I have no clue where to find a guide for fool-proof ways to set up a dual-GPU system; maybe someone else will pitch in here. Even if there were one, success would be highly variable, since different motherboards and different GPU models requiring different driver packages create so many variables. Not many people are likely to have your configuration, or even a similar one.

I'd connect a second monitor to my GPU before I'd put in a second one to do what you want. Most modern GPUs will easily handle a second display showing benign stuff like web browsing, word processing, spreadsheets, email, etc., while playing a game on the main display. CPU rendering (as opposed to GPU rendering using OpenCL or something) shouldn't be a problem either, if you have a powerful enough system.
 
guide for fool-proof ways
Full? Or you meant I'm a fool xD...

Most modern GPU's will easily handle a second display interfacing benign stuff like web browsing, word processing, spreadsheets, email and etc
So everything on the second screen will only be powered by the second GPU if I plug the monitor into the second card? Sweet. So that means rendering and recording. I have an i5-8400 with the UHD 630 iGPU, so it's not very powerful when it comes to rendering.

Thanks again!
 
Most modern GPU's will easily handle a second display interfacing benign stuff like web browsing, word processing, spreadsheets, email and etc
Whether you want to use multiple graphics cards is up to you
Obakasama or drea.drechsler, can you answer the question quoted below, please? Thanks in advance :p

So everything on the second screen will be powered only by the second GPU if I plug the monitor into the second card? Including rendering, recording, and anything else?
 
So everything on the second screen will only be powered by the second GPU if i plugged it to the second card? Sweet. So that means rendering and recording. I have a i5-8400 with iGPU UHD 630 so it's not so powerful when it comes to rendering.
You're probably getting a bit confused; you need to understand that the iGPU isn't a discrete GPU. I can't say whether you can operate the iGPU simultaneously with a discrete GPU, and it probably depends on the specific model anyway, so this answer assumes you've disabled the iGPU once you've installed a discrete GPU.

If you plug two monitors into a single discrete GPU, there is no need for a second discrete GPU at all. The single GPU will provide output for both monitors. Windows 10 is perfectly capable of dealing with such a dual-display system: it provides options for putting separate desktops on each screen, extending one desktop across both screens, or mirroring one desktop on both screens. By doing it this way you completely side-step any and all issues with getting two driver packages working comfortably together.

Set up Windows for two desktops and play a game on one desktop (screen) while browsing on the other desktop (screen). Most any modern GPU is very comfortable doing this, and you'll still get very near its best performance in the game, assuming low-impact usage on the other screen.
 
You're probably getting a bit confused and need to understand that the iGPU isn't a discrete GPU. ...
Yes, I know quite well that one GPU is enough for 2 monitors, assuming I do low-impact work on the other monitor. But I do want to use 2 GPUs, so that one can be used for gameplay and the other for recording, which is high-impact work. I don't want to sell or waste my old one because it works fine, just not with the speed of a newer model of GPU.
 
... But I Do want to use 2 gpus...
If that's your goal then by all means proceed with the experiment.

I'm not sure how much better a GT 630 (or GT 710) will render while you're playing a game on a 1050 Ti vs. just using the 1050 Ti for both, since there is also the CPU impact (i5-8400 = 6 threads maximum, so not a great multi-tasking CPU) to consider. But it's probably going to be a zero-(additional)-cost endeavour, so why not try it?

EDIT add:

OK... let's consider something else too. You'll be plugging that GT710 (I'm assuming that's the one you'd want as your second discrete GPU) into the second PCI-e x16 slot, the one that's wired for x4. That means it will be hobbled at 1/4 the maximum potential PCIe speed (x4 bandwidth vs. x16), and it already operates at PCIe Gen2 speed, which is about half the speed of Gen3. In other words: you'll have very slow PCIe bandwidth to that GT710.

A GT710 surely can't render very fast, but GPU rendering (using Luxmark, in my case) uses the PCIe bus quite heavily. It would be interesting to know how much that impacts overall rendering performance in that scenario. I'd actually consider putting the 1050ti in the second PCIe x16 slot instead, since games use the PCIe bus very lightly in the midst of gaming action.
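To put rough numbers on that scenario (per-lane rates are approximate figures from the public PCIe revisions; the thread above establishes the GT 710 as a Gen2 card, so in the @x4 slot its link runs at Gen2 x4):

```python
# Approximate usable bandwidth per lane, in MB/s, per PCIe generation.
PER_LANE_MB_S = {1: 250, 2: 500, 3: 985}

def link_mb_s(gen: int, lanes: int) -> int:
    """Theoretical one-direction bandwidth of a negotiated link."""
    return PER_LANE_MB_S[gen] * lanes

gt710_in_x4_slot = link_mb_s(2, 4)    # Gen2 x 4 lanes  =  2000 MB/s
new_card_in_x16 = link_mb_s(3, 16)    # Gen3 x 16 lanes = 15760 MB/s
print(gt710_in_x4_slot, new_card_in_x16)  # roughly an 8x difference
```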
 
There has been mention on the forums of getting separate displays using a discrete graphics card and Intel's integrated graphics. I'm not too sure about two discrete graphics cards. For display-only cases, as drea.drechsler states, it's easier to just use the one graphics card.

But it sounds like your aim is to use two different discrete graphics cards for different purposes. I suspect this may depend on the software you're using. For example, Blender can choose between devices to use for rendering.

As a note on the GT 710 and bandwidth... I'm not too sure about that. On a theoretical level it sounds right, but looking at the graphics card's interface, it doesn't look like its edge connector spans the entire length of an x16 slot. If it doesn't span the entire length of a PCI-E x16 slot, then it wouldn't be able to use the maximum bandwidth anyway.
 
If that's your goal then by all means proceed with the experiment. ...
There has been mention on the forums of getting separate displays using a discrete graphics card and the Intel's integrated graphics. ...
Thanks to both of you for helping me and answering all my questions, even as I kept asking more and making the thread longer. I'll just have to experiment with both slots and GPUs. My concern was whether it would be supported; now I know there's a chance it won't work, so I don't mind.

Thanks again. I look forward to getting more answers in the future, and I will be closing the thread. :) 👍
 
