GeForce And Radeon On Intel's P67: PCIe Scaling Explored


jdwn1976

Distinguished
Apr 6, 2011
Well, this is surely a good review for my upcoming builds...

I hope to see similar reviews for the upcoming Z68 board(s) & LGA 2011.

 

Zeh

Distinguished
Dec 7, 2010
Would be very interesting to know how two "lesser" cards would behave in x16/x8/x4, and in XFire at x8/x8 versus x16/x4, like the HD 6850.

Also, a subjective opinion would be appreciated: did the performance drop significantly using x16/x4 instead of x8/x8? Any freezes, delays, artifacts, unexpected framerate drops, things like that?
 

Crashman

Polypheme
Former Staff
[citation][nom]Zeh[/nom]Would be very interesting to know how two "lesser" cards would behave in x16/x8/x4, and in XFire at x8/x8 versus x16/x4, like the HD 6850. Also, a subjective opinion would be appreciated: did the performance drop significantly using x16/x4 instead of x8/x8? Any freezes, delays, artifacts, unexpected framerate drops, things like that?[/citation]Bandwidth bottlenecks tend to affect the faster parts of the game most, whether the bottleneck is CPU, DRAM, or PCIe bandwidth.
 

Zeh

Distinguished
Dec 7, 2010
[citation][nom]Crashman[/nom]Bandwidth bottlenecks tend to affect the faster parts of the game most, whether the bottleneck is CPU, DRAM, or PCIe bandwidth.[/citation]

But how often - if at all - is PCIe x4 bandwidth the bottleneck that causes microstuttering?

As an example, I could tolerate a 30 fps minimum instead of 33, and a 40 fps average instead of 50. Minimum framerates and microstuttering are quite important, imho.
I'd rather have 30 min / 45 avg fps than 15 min / 60 avg fps.
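
For reference, Zeh's point about minimums can be made concrete by looking at frame times rather than averages. Here's a minimal sketch (Python, with made-up frame-time numbers, not data from the article) showing how a couple of long frames barely move the average fps while still feeling like hitches:

[code]
# Crude stutter metric over a hypothetical frame-time capture (milliseconds).
# Real numbers would come from a frame-time logger such as FRAPS.
from statistics import median

frame_times_ms = [16, 17, 16, 18, 65, 16, 17, 64, 16, 17]  # made-up sample

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)   # ~38 fps
worst_fps = 1000 / max(frame_times_ms)                       # ~15 fps
spikes = [t for t in frame_times_ms if t > 2 * median(frame_times_ms)]

print(f"average: {avg_fps:.0f} fps, worst frame: {worst_fps:.0f} fps, "
      f"spikes over 2x median: {len(spikes)}")
[/code]

The two 60+ ms frames only pull the average down to about 38 fps, but each one is a visible hitch - exactly why a 30 min / 45 avg profile can feel better than 15 min / 60 avg.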
 

CyberAngel

Distinguished
Dec 11, 2008
[citation][nom]wrxchris[/nom]In the next article, you guys should address the performance hit taken by a multi-GFX-card, multi-monitor setup when switching from x16/x16 to x8/x8, as it is quite significant. This is what has kept me from upgrading my long-in-the-tooth Q9550 to a 2600K. I'm waiting for LGA 2011 or Bulldozer.[/citation]

Me too.

I wonder how PCIe x32 would do in v3.0? E.g., quadruple the bandwidth.

Since we are going to have 8GB or 12GB systems anyway, the bigger RAM will hold bigger textures, so bandwidth becomes more important. CF/SLI is the only way to look at these things, but do people run two similar cards, or do they buy a single bigger card? So how about a test with a fast single-GPU card together with last-generation mainstream cards?
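
For anyone who wants the raw numbers behind that "quadruple bandwidth" guess, here's a minimal sketch (Python) using the published per-generation signaling rates and encoding overheads. These are theoretical maximums, not measured throughput:

[code]
# Theoretical one-way PCIe link bandwidth per generation and lane count.
GT_PER_S = {"1.x": 2.5, "2.0": 5.0, "3.0": 8.0}        # signaling rate, GT/s
ENCODING = {"1.x": 8/10, "2.0": 8/10, "3.0": 128/130}  # usable fraction of raw bits

def link_gbps(gen, lanes):
    """Effective one-way bandwidth in GB/s for a given generation and width."""
    return GT_PER_S[gen] * ENCODING[gen] * lanes / 8

for gen in ("1.x", "2.0", "3.0"):
    for lanes in (4, 8, 16, 32):
        print(f"PCIe {gen} x{lanes}: {link_gbps(gen, lanes):5.1f} GB/s")
[/code]

PCIe 2.0 x16 works out to 8 GB/s and PCIe 3.0 x32 to roughly 31.5 GB/s, so "quadruple" is about right. The same table shows a 1.x lane carrying half of what a 2.0 lane does (250 vs. 500 MB/s), which comes up again further down the thread.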
 

bak0n

Distinguished
Dec 4, 2009
I don't understand testing extreme GPUs in an x4 slot. Why not run the test on a motherboard that only has an x4 slot, with a few different GPUs, to see where the bandwidth cap of the slot is?
 

Zeh

Distinguished
Dec 7, 2010
[citation][nom]bak0n[/nom]I don't understand testing extreme GPUs in an x4 slot. Why not run the test on a motherboard that only has an x4 slot, with a few different GPUs, to see where the bandwidth cap of the slot is?[/citation]

Exactly what I want.
x4 may not be enough for two 6950s, but is it enough for two 6850s or 460s?
 

neiroatopelcc

Distinguished
Oct 3, 2006
[citation][nom]Zeh[/nom]Exactly what I want. x4 may not be enough for two 6950s, but is it enough for two 6850s or 460s?[/citation]

It isn't.

At work we've got some (500+) HP systems with the motherboard and everything else on the 'wrong' side. This puts the PCIe x16 slot in the uppermost position, facing the wrong way, so double-width graphics cards can't fit - the retaining grid blocks them and the CPU cooler is in the way. Those systems feature a secondary PCIe x16 slot with x4 wiring. I tried running a GeForce 9600 GT 512MB card in the x4 and in the x16 (modified the cooling, removed the back bezel). The result was something in the order of a 30% performance reduction. And we're talking about bandwidth-starving an old midrange card on a Q45-based board.

I did this test in 2009, so it's a fair bet that a modern mainstream card will be killed in an x4 slot as well.
 

Zeh

Distinguished
Dec 7, 2010
Maybe it was a PCIe 1.0 board? I believe it has half the bandwidth per lane that a PCIe 2.0 board has. I might be wrong, though.

Because looking at the charts on the Bonus page, two 6950s take only a 2-4 fps drop most of the time.
 

Crashman

Polypheme
Former Staff
[citation][nom]Zeh[/nom]Maybe it was a PCIe 1.0 board? I believe it has half the bandwidth per lane that a PCIe 2.0 board has. I might be wrong, though. Because looking at the charts on the Bonus page, two 6950s take only a 2-4 fps drop most of the time.[/citation]It affects a single card more than a CrossFire pair because the more graphics cards you add, the more the CPU and RAM become the bottleneck.

Perhaps next time a 4.6GHz overclock is in order...
 

neiroatopelcc

Distinguished
Oct 3, 2006
[citation][nom]Zeh[/nom]Maybe it was a PCIe 1.0 board? I believe it has half the bandwidth per lane that a PCIe 2.0 board has. I might be wrong, though. Because looking at the charts on the Bonus page, two 6950s take only a 2-4 fps drop most of the time.[/citation]
The 945 and everything up to and including X38 is PCIe v1. Everything later is v2.
 

Crashman

Polypheme
Former Staff
[citation][nom]neiroatopelcc[/nom]The 945 and everything up to and including X38 is PCIe v1. Everything later is v2.[/citation]No. To begin with, the X48 came after the X38 even though it's the same chipset under a different name. And then there are the x4 slots on P55 boards, which are called PCIe 2.0 but operate at the PCIe 1.1 bitrate.
 

neiroatopelcc

Distinguished
Oct 3, 2006
[citation][nom]Crashman[/nom]No. To begin with, the X48 came after the X38 even though it's the same chipset under a different name. And then there are the x4 slots on P55 boards, which are called PCIe 2.0 but operate at the PCIe 1.1 bitrate.[/citation]

Yeah, forgot the X48 - but the P55 one is sort of pointless, as nobody would use that slot for graphics. It's not even always implemented, and when it is, not always with room for a physical x16 card...
 

Crashman

Polypheme
Former Staff
[citation][nom]neiroatopelcc[/nom]Yeah, forgot the X48 - but the P55 one is sort of pointless, as nobody would use that slot for graphics. It's not even always implemented, and when it is, not always with room for a physical x16 card...[/citation]Yes, but even though it's only in "some" implementations, it's the one we're constantly being asked about!
 
[citation][nom]baozhi[/nom]Would be nice to see this test on a GTX 560 Ti, since it has a lot of headroom for OC, then compare the OCed version vs. non-OCed. Also, this might be interesting on GPUs that have different versions with more and less RAM.[/citation]About 8 months late, buddy. Try a fresher review.
 

chargeit

Honorable
Oct 5, 2012
So basically, running at x4 or x8 is fine as long as you're OK with not getting 100% of the benefit from your added card.

That's what I had assumed. Too many people act like they know what they're talking about, when in reality they're just spewing BS that someone else told them.

I also think most don't realize, when someone is looking for a graphics boost, how much of a difference throwing a second card in will make, even at x4. I see so much poor advice on this subject that it makes me sick.

Just because you can't stand the idea of your second card being slightly handicapped doesn't mean it isn't a viable option for someone hoping to get an fps boost.
 

Crashman

Polypheme
Former Staff
Since this article was published, cards have gotten faster, and they consume data faster. On the other hand, we now have PCIe 3.0 available as x8/x4/x4 on some Z77 and Z87 boards.

What I'm saying is that people have even more reasons to stay away from PCIe 2.0 x4 for graphics arrays.
 

chargeit

Honorable
Oct 5, 2012


I personally wouldn't do it with a high-end card, but it does seem that if you're working with lower-end cards, you'd get a boost even at x4.

I think people overestimate what the average user would consider a boost. I'd think that even adding 15-20 fps would be a huge boost, well worth the $100-150 for a new lower-end card.

For instance, with my single HD 7850, playing PlanetSide 2 on medium settings, I get 40-45 fps while fighting (more in non-combat areas). Now add 15-20 fps to that, and all of a sudden I'm sitting at 60. Well worth $150 for a second 7850. Really, how much would I have to spend on a single new card to expect that kind of performance increase?

I'm not so sure the hardcore user would agree - they might want, and need, an extra 30+ fps from the second card - but that doesn't mean everyone expects such drastic results.

If adding a second card would get you from 40ish to 50+ fps, why not? Sure, it might be better to buy a new mobo with x16/x8, but I don't think missing out on a little performance should be enough to totally remove the option.
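
To put chargeit's back-of-the-envelope math in one place: a minimal sketch (Python; the $150 and 15-20 fps figures come from the post above, while the single-card comparison price is a purely hypothetical placeholder):

[code]
# Dollars per added frame. The CrossFire numbers are from the post above;
# the single-card upgrade price is an assumed placeholder for comparison.
def dollars_per_fps(price, fps_gain):
    return price / fps_gain

fps_gain = 17.5  # midpoint of the quoted 15-20 fps boost

print(f"second 7850 ($150): ${dollars_per_fps(150, fps_gain):.2f} per fps")
print(f"assumed single-card upgrade ($300): "
      f"${dollars_per_fps(300, fps_gain):.2f} per fps")
[/code]

On those assumptions, the second card costs about $8.60 per added frame, which is the whole argument: cheap fps, even with the x4 handicap.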

 