PCIe - (x16+x4) vs (x8+x8)

Majestic One

Can someone please explain what I can't seem to Google up at all about MoBos with 2 PCIe sockets.

x16 speed with a single GPU, but the second PCIe/GPU socket runs @ x4 speed (seems common).

I was wondering if this is a big concern, because I have found a couple of boards that run x16 with a single GPU but with 2 GPUs split evenly to x8 + x8 for each GPU socket.

1] Not too critical, but I was wondering why this happens/is like this.
But more importantly,
2] is this anything that should affect my decision on my next MoBo purchase?

thanks crew

EDIT:
I just found this article HERE @ Tom's, similar to my question; can anyone elaborate any more?

I SWEAR, every time I go a-lookin' for my solution here @ Tom's, I never seem to score close enough - and this time I didn't even look at all, and there was the above-mentioned thread right at the top, in my face.

My sincerest apologies for that MODS. Lesson learned.
 
As I understand it, an x16 PCIe link runs at the full bandwidth that a GPU is capable of using. If you have a dual-slot x8/x8 setup, then each GPU is allotted half that bandwidth. An x16/x4 board uses an x16 link for the first GPU and a secondary x4 link for the second.

On a single-monitor rig there is no perceptible difference between an x16/x16 mobo and an x8/x8 mobo. The monitor has its own bandwidth limitation too (and I think it is around the x16 limit). It is only with multiple monitors that you see any perceptible difference, because each GPU can then use an x16 link by itself.
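If it helps to put rough numbers on that, here's a quick back-of-the-envelope sketch; the per-lane figure is the approximate PCIe 2.0 value, not a measurement:

```python
# Rough numbers only: PCIe 2.0 carries roughly 500 MB/s per lane in each
# direction, so the slot width caps CPU <-> GPU transfers, not what the
# card can render once the data is on board.
PCIE2_PER_LANE_MBPS = 500  # approximate usable bandwidth per lane, one direction

for lanes in (16, 8, 4):
    print(f"x{lanes}: ~{lanes * PCIE2_PER_LANE_MBPS / 1000:.0f} GB/s each way")

# x16: ~8 GB/s each way
# x8:  ~4 GB/s each way
# x4:  ~2 GB/s each way
```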

 
Do I understand this correctly:

If I do NOT SLI and use a monitor for EACH GPU, then they both will run @ x16?

Yeah, I guess I kinda failed to mention if and when I'd be in SLI mode.
AND I failed to mention: YES, PCIe 2.0 slots...

Does this still change/alter anything?

(Man, I HATE starting a thread unorganized - sorry 'bout that, folks.)
 
No. As I understand it, as soon as a device is plugged into that slot, the lanes get cut down. Meaning if you have an 8x/8x board, as soon as a card gets plugged into that second slot the first goes from 16x to 8x - even if it's a 1x sound card.

It doesn't matter if you enable CF/SLI. It's a matter of PCIe lanes, and we still don't have enough of them. Even if you leave CF/SLI turned off, you won't get 16x on both.
 
Not to be stuck in a rut, but the #1 slot will always be x16?

AND is it better to have an x8+x8 board if this is the case?

Man, seems like people are getting ripped off if they spend 5 bills plus on some GPUs and get HALF their intended usage!!!

Sheeit, I might as well go with a single PCIe 2.0 slot, or what??

THIS is crazy - what a RIP!!!

Like you said - they need to fix or quit this x16+x4 crap - or even x8+x8!!!!

What the hell ELSE would you put in a 2.0 slot??
 
but the #1 slot will always be x16?

Yes, unless you have an 8x/8x board. If you have an 8x/8x board it will be 16x if you have nothing in the other slot, but will go down to 8x as soon as you do. If you have a 16x/4x board then the primary slot will always be 16x.

is it better to have an x8+x8 board if this is the case?

I would say yes, as there are some cases where 16x/4x doesn't provide frame rates as good as 8x/8x. I haven't really seen any cases where 16x/4x provides horrible frame rates, but I suppose there could be a rare case where it does. I just know it's usually lower.

man, seems like people are getting ripped off if they spend 5 bills plus on some GPUs and get HALF their intended usage!!!

It's not like the frame rates get cut in half. As the CF/SLI scaling articles show, there is nearly no difference between 16x/16x and 8x/8x. Only the bandwidth between the CPU and the GPUs gets cut, and thankfully that is a minor piece of the graphics puzzle.

What the hell ELSE would you put in a 2.0 slot??

You can put all kinds of things in a PCIe slot. A 16x slot will take anything, even the more common 1x cards. Sound card? RAID card?
 
Solution
A PCIe slot is great for a wireless card or, as someone mentioned above, a sound card. Initially I plugged in a Creative sound card but could not hear any difference between it and on-board sound. Maybe bad ears. No intention of seeing the ear doctor, because I do not want improved hearing and being able to hear everything that others have to say about me. :)

I used mine for a USB 3 host card in one computer, and for a wireless card in another computer.
 
Am I correct in saying that EVEN IF I only plug a measly LAN card into the second slot of an x8/x8 board, my primary gets cut to x8? But with an x16+x4 board the primary WILL stay x16? And is that why most boards have this x16+x4 config?

If this IS the case - what would you guys choose:

A] An x16+x4 with onboard video for backup (I will be on one GPU to start) ($100, plus 8-core support [AM3+])
http://www.newegg.com/Product/Product.aspx?Item=N82E16813157243
or
B] An x8+x8 with no onboard video for backup, so I'd be down completely if the GPU fails ($90)
http://www.newegg.com/Product/Product.aspx?Item=N82E16813157198&cm_re=x8%2fx8_mode-_-13-157-198-_-Product

The x16+x4 board is quite a bit better for $10 more.

I'm tryin' here, guys. Thanks.

@Ubrales:
Oh, BTW, Uber, I was gonna tell you in that thread the other day about the Dead Kennedys - Jello Biafra actually ran for mayor of San Francisco in 1979 - thus all your honking that time on the freeway in the Bay Area about your licence being Uber alles <- can you believe Firefox thinks these are misspelled words??
 

Misspelled in English, yes! :)
 
RIGHT ON, men! You all just saved me another $10!!! LOL

You know what's funny, but the absolute truth - I sometimes don't know crap about a specific something (slot speeds, in this thread), but I'll have chosen a product that seems the best in all the other areas I do know about or have recently learned about, thanks to most of you right here on this thread. I always seem to go back to the original choice I made from the very beginning.

I am like everyone else, learning and trying to get the best pop for the buck. I.e., I saw the 8-core support - it caught my attention - and I really like all the features on the ASRock middle-to-higher-end stuff, but then I saw that x16+x4 again and went to solving that riddle. Thanks to all of you; you make life not only easier but fun and quick.

So my hat's off to you all once again. Thank you.
 
nVidia will never allow an x4 PCIe 2.x slot to be used as an SLI port (it's enforced through the SLI key in the BIOS), whereas AMD CF will work on an x4 PCIe slot. The frame rate loss depends upon the amount of 'raw' data being pushed to the GPU. In other words, 0xAA shows the 'raw' data limitations, whereas 8xAA or 16xAA is a computational limitation of the GPU.

To confuse you more, there are different flavors of PCIe; the newer chipsets, e.g. the P67 and Z68, also have PCIe lanes that are NOT 'GPU' PCIe lanes. The LGA 1155 CPUs have 16 PCIe lanes going directly to the CPU, so the chipset is not an intermediary the way it is on the X58. Bridges like the NF200 add 'pseudo' lanes but also add latency, similar to the X58.

(Image: P67 block diagram)


Most high-end GPUs come close to saturating x8 PCIe 2.x; however, with the next revision of PCIe, namely PCIe 3.x, nVidia 'might' allow x4 PCIe 3.x, since it's essentially the same potential bandwidth as x8 PCIe 2.x.

Here's a nice article -> http://www.techpowerup.com/reviews/NVIDIA/GTX_480_PCI-Express_Scaling/1.html
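To put rough numbers on the "essentially the same bandwidth" point above, here's a quick sketch using the published line rates and encoding overheads; the MB/s figures are theoretical, not benchmarks:

```python
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding    -> ~500 MB/s per lane
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding -> ~985 MB/s per lane
per_lane_mbps = {
    "2.0": 5e9 * (8 / 10) / 8 / 1e6,     # ~500 MB/s
    "3.0": 8e9 * (128 / 130) / 8 / 1e6,  # ~985 MB/s
}

print(f"x8 PCIe 2.0: ~{8 * per_lane_mbps['2.0']:.0f} MB/s per direction")  # ~4000
print(f"x4 PCIe 3.0: ~{4 * per_lane_mbps['3.0']:.0f} MB/s per direction")  # ~3938
```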
 
OH, you know, one more thing - if you were to choose a single PCIe x16-slotted board, what would you choose? Because it may be a short while before I match my GTS 450 with its twin.

I am off to see the wizard of humpty (dumpty). <- stupid/lame-o Newegg joke LOL

I am a building freak - I have had a month and a half SOLID of browsing and shopping - it's true - it IS an addiction!!!
 
Take a look @ THIS board.
PCI Express 2.0 x16 = 3 (PCIE2/PCIE3: single at x16 or dual at x8/x8 mode; PCIE4: x4 mode).
(Supports ATI Quad CrossFireX, 3-Way CrossFireX, CrossFireX and Hybrid CrossFireX):
http://www.newegg.com/Product/Product.aspx?Item=N82E16813157207&cm_re=x8_mode-_-13-157-207-_-Product

Let's see if I get this PCIe x16 language:

Slot # (if & when used, in order from the CPU down, filling them up):

1] only = x16
+2] SLI/CF (or whatever) = x8 + x8
+3] is standalone and runs @ x4 regardless of whether slots 1 & 2 are filled? (rough sketch below)
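In other words, here's how I'm reading that spec line - a rough sketch only, going by the Newegg description quoted above; the slot names and behaviour are my assumption, not ASRock's manual:

```python
# Assumed behaviour from the Newegg spec line: PCIE2/PCIE3 share one x16 link
# (single at x16 or dual at x8/x8), while PCIE4 is a fixed x4 on its own.
def slot_widths(pcie2_used, pcie3_used, pcie4_used):
    widths = {}
    if pcie2_used and pcie3_used:
        widths["PCIE2"], widths["PCIE3"] = 8, 8  # shared x16 link splits in half
    elif pcie2_used:
        widths["PCIE2"] = 16                     # alone, it keeps all 16 lanes
    elif pcie3_used:
        widths["PCIE3"] = 16
    if pcie4_used:
        widths["PCIE4"] = 4                      # always x4, whatever else is filled
    return widths

print(slot_widths(True, False, False))  # {'PCIE2': 16}
print(slot_widths(True, True, False))   # {'PCIE2': 8, 'PCIE3': 8}
print(slot_widths(True, True, True))    # {'PCIE2': 8, 'PCIE3': 8, 'PCIE4': 4}
```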
 
^
That's an X-Fire-only board..

SHUTE!!! LOL - I just checked my email for responses and was going to ask if what you said was in fact CF-only - that's what the reviews are saying anyway.

GHEEZE - it's time for you guys to kick down all your super-duper links to good places to buy MoBos (and I don't do TigerDirect too easily - BAAAAD experiences w/ that crappy company: incorrect photos, descriptions, and overall SHITE customer svc over the last year). If I HAVE to I will, because Newegg only has a select FEW x8+x8 AMD boards, man. SUCKS!
x8 mode search @ Newegg (ONLY 3 AMD mobos!):
http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&Description=x8%20mode&bop=And&Order=PRICE&PageSize=20

BTW, considering I have an unopened GTS 450 sitting here, I think I need to find something else. QUICK! I am ordering on the 15th of this month.
 
No way, Mal. Why would you say send it back?
EVGA Superclocked edition for $99! Can't beat that!

I have realized I am stuck in a rut looking for x8/x8 boards... then it hit me, WTF, man!!! Why not just look for damned x16/x16 boards! LOL

and I found this:
-890FXE
-4 x PCI-E x16 2.0 slots (x16, x16, x4, x1)
-1600(OC)
-16GB
-$139.99
http://www.newegg.com/Product/Product.aspx?Item=N82E16813138193

EDIT:
CRAPOLA, is this damned thing CF-only TOO!??

I am beginning to see what you're saying, MalMental...!!! CRAP, man, I have already registered the GPU, got the bonus software, and that means the box has been cut for the UPC code for the rebate...


No SLI, but I may just invest in this board anyway. I will X-Fire later on.
 
Heeeey, MAN... that is a SLICK MOBO FOR THE PRICE - and it's an ASUS TOO!!!
THAT'S THE ONE!!!
http://www.newegg.com/Product/Product.aspx?Item=N82E16813131636
(Image: ASUS M4N98TD-E board)


I REALLY like the heat-pipe situation too! Copper vs. aluminum - man, can't go wrong with that!
Shute, and the black and blue colors keep with my current theme also! LOL!!!
8+1 phase power (not 8+2 - but MAN, can't always have it all!!!)

THANK YOU, MalMental - I will be purchasing this board for sure - it looks to be a very good investment for my needs and wants right now. ALSO, just to let you know, I will be slapping a 955BE into this board w/ my Cooler Master V6 GT, into my Dragon Rider case. Looks - well, I am satisfied - and when that is satisfied... that means all the performance I need is there too.

I have realized I have been hell-bent on AMD chipsets and have flat-out refused to look at the nForce stuff. I don't KNOW why I do some things - because I was under the impression nForce stuff was not up to par these days.

Also, I don't have a CLUE what Bulldozer is - but I am gonna go find out, RIGHT NOW!

 
Most people avoid the Nvidia chipsets because they run too hot. AMD chipsets don't need all that copper tubing to keep the parts cool. Nvidia chipsets need more power and emit more heat compared to AMD's. Nothing horribly wrong with using that one, but I'd rather have the "new" AMD chipset so I can run CF or SLI. I like having options, not being tied to just one.
 

Aaaaand what IS the "new" AMD chipset called???
Because THIS I am definitely interested in!!
 
