Multi-GPU Setups: The Basics Of CrossFire And SLI


zuke

Distinguished
Sep 11, 2008
27
0
18,530
Great article. I like the fact that you tested the performance hit the x16/x8 combo would potentially cause.

I'm still managing with two 8800 GTS cards in SLI on XP. I didn't know you couldn't run dual monitors with SLI enabled on XP until after I bought and installed the second card, so I essentially never use my second monitor now because I hate having to go in and change the setting every time.
 

singjai

Distinguished
Feb 2, 2009
9
0
18,510
I know this is more of a gaming article, but it's frustrating that the Cyberlink software bundled with my new Blu-ray drive doesn't support SLI or CrossFire. Cyberlink even advises me to disable SLI. Amazing.
 

evanuisance

Distinguished
Aug 5, 2010
1
0
18,510
Hi guys,

I was under the impression that if you have two cards with different memory capacities, then the card with the higher capacity is limited to the lower amount of the two.
i.e. an 8600 GT 512MB DDR2 and an 8600 GT 1GB DDR2 in SLI will result in only 512MB being used from the second card.
I am planning to use this setup with a second card my friend is offloading onto me.
Could you please clarify?
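
To put my understanding another way, here's a rough sketch of the arithmetic I'm assuming (the card names and figures are just my own example, nothing official):

[code]
# Rough sketch of how I understand SLI memory works: the framebuffer is
# mirrored across cards, so each GPU is limited to the smaller of the two.
def effective_sli_vram_mb(card_a_mb: int, card_b_mb: int) -> int:
    # Usable framebuffer per GPU -- the extra memory on the larger card goes unused.
    return min(card_a_mb, card_b_mb)

# 8600 GT 512MB paired with 8600 GT 1GB:
print(effective_sli_vram_mb(512, 1024))  # -> 512 (MB usable per GPU, not 1536)
[/code]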
 

Crashman

Polypheme
Former Staff
[citation][nom]zipzoomflyhigh[/nom]Again Tom's tests 16x vs 8x with xfire/sli but fails to test 16x/4x vs. 16x/8x or 16x/16x. With PCI-E 2.0 (double bandwidth of 1.0), many want to know how much loss is there from a second card in 4x. The burning question nobody can answer.[/citation]

With a performance loss of only 8% when moving from a single x16 to a single x4 v2.0 slot, testing x16/x4 at v2.0 wasn't a priority. Had the losses been bigger, it would have been.

The bigger problem is that the system was essentially CRIPPLED by the CPU in SLI, to the point where the 4% difference between x16 and x8 (seen with single cards) vanished at anything less than 2560x1600. With that difference gone, the next step down (x8 to x4) was also 4% with a single card, so you're looking at a difference of less than 4% at 1680x1050 and 1920x1080, again due to CPU limitations.

If we get a system that can stay at 5GHz for more than a few days without burning down the lab, we might consider further testing at resolutions most users can take advantage of. Otherwise, our future test plans are focused on going beyond 2560x1600 when testing multiple GTX 480 configurations.
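
For anyone who wants the back-of-the-envelope slot math behind those comparisons, the per-lane rates below are the standard published PCIe figures (a rough sketch, not something we benchmarked):

[code]
# Approximate one-way PCIe bandwidth per lane, after encoding overhead (GB/s).
PCIE_GBPS_PER_LANE = {"1.0": 0.25, "2.0": 0.5, "3.0": 0.985}

def slot_bandwidth_gbps(gen: str, lanes: int) -> float:
    """Rough one-way bandwidth of a PCIe slot in GB/s."""
    return PCIE_GBPS_PER_LANE[gen] * lanes

for lanes in (16, 8, 4):
    print(f"PCIe 2.0 x{lanes}: ~{slot_bandwidth_gbps('2.0', lanes):.1f} GB/s")
# x16 ~8.0, x8 ~4.0, x4 ~2.0 -- so a v2.0 x4 slot still matches a v1.0 x8 slot.
[/code]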
 

double_d1

Distinguished
Dec 5, 2010
34
0
18,530
I have two XFX Radeon HD 4850s running in CrossFire and have always wondered if they were both working; I honestly can't tell the difference when running Flight Simulator X. I do have the CrossFire logo lit up in the upper right-hand corner. I'm not impressed with it; I was expecting something more.
double_d1
 

technogiant

Distinguished
Oct 31, 2007
80
0
18,640
Sorry to drag this thread back to life... but now that we have PCIe 3.0-enabled motherboards and graphics cards with huge PCIe bandwidth, do we still really need SLI bridges? They only carry about 1 GB/s of data, which is tiny compared to the PCIe bandwidth.

The reason I ask is that Nvidia is being quite clever by only putting two-way SLI connections on its mid-range cards. If you didn't need the bridge, then with an appropriate motherboard you could run three- or four-way SLI with them.

Certainly AMD's HD 7750 doesn't seem to be hampered by the lack of a CrossFire bridge, according to the report below, which shows 100% scaling in many instances.

http://www.tweaktown.com/reviews/4588/amd_radeon_hd_7750_1gb_reference_video_cards_in_crossfire/index4.html

But the HD 7750's memory system is a little too weak, with a 128-bit memory bus and only 1GB of GDDR5, to be worthwhile in a three- or four-GPU setup.

But a more mid-range card with a 256-bit memory bus and 2+ GB of VRAM may be worth considering, if it would work without the bridge connector.

Three or four GPUs would produce a lot of graphics power, and I've read that three-way setups produce less micro-stutter than two-way setups.

Having said that, such a use would probably be locked out by the drivers, as AMD/Nvidia want us to buy the flagship products at a premium price.
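
Just to put a rough sense of scale on that (my own back-of-the-envelope numbers; the bridge figure is the commonly quoted one, not something I've measured):

[code]
# Back-of-the-envelope comparison of inter-GPU link bandwidth (GB/s, one direction).
SLI_BRIDGE_GBPS = 1.0          # commonly quoted figure for the standard SLI bridge
PCIE3_GBPS_PER_LANE = 0.985    # PCIe 3.0 per lane, after 128b/130b encoding overhead

for lanes in (8, 16):
    slot = PCIE3_GBPS_PER_LANE * lanes
    print(f"PCIe 3.0 x{lanes}: ~{slot:.1f} GB/s "
          f"(~{slot / SLI_BRIDGE_GBPS:.0f}x the bridge link)")
# x8 ~7.9 GB/s and x16 ~15.8 GB/s, versus ~1 GB/s over the bridge.
[/code]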
 