SLI / CrossFire FAQs


emp

Distinguished
Dec 15, 2004
2,593
0
20,780
Early testing shows the 8800GT in SLI consistently outperforming the GTX (which is double the price), but we have to wait and see if a price adjustment occurs in the upcoming weeks for the GT, GTS, and GTX before jumping to any conclusions.
 
A single 8800GTX costs a little more than two 8800GTs in SLI, and it consumes less power, produces less heat, and also has more memory; however, in games that benefit from SLI, two 8800GTs beat a single 8800GTX.
 

jive

Distinguished
Mar 10, 2007
213
0
18,690
Hi, I have a small question.

When you go SLI, is it better to have the same card twice, or is it possible to use cards from different companies? I already have an EVGA 8800GTS 640MB Superclocked and want to add a second one, but I found some at a lower price that aren't EVGA. Can you tell me if there are any problems you know of that could happen from mixing two cards?

Thanks.
Sorry, English is not my first language.
 
What about brands?
Well, the brand doesn't matter. Again, for example, you can use an XFX card with an EVGA card, or a SAPPHIRE card with a DIAMOND card; just make sure they have the same memory and same clocks.
 

kpo6969

Distinguished
May 6, 2007
1,144
0
19,290
Can I mix and match graphics cards with different sizes of memory?
While it is not recommended, NVIDIA does offer this flexibility using Coolbits. When purchasing a second graphics card, you should try to match the memory size so that you are ensured full value and performance from your purchase. For example, if your first card is a GeForce 6600 GT with 128MB of memory, you should purchase a second GeForce 6600 GT with 128MB of memory. However, using Coolbits (value set to 18), you can force both of the cards to use the lower of the two memory sizes and operate together in SLI mode. When dissimilar memory sizes are enabled to work together using Coolbits, the effective memory size for each card becomes the smaller of the two memory sizes. Instructions to enable this feature can be found here.

Can I mix and match graphics cards if one of them is overclocked by the manufacturer?
Yes. A GeForce 7800 GTX that is overclocked (for example a BFG GeForce 7800 GTX OC) can be mixed with a standard clocked GeForce 7800 GTX.

http://www.slizone.com/page/slizone_faq.html
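
The "found here" instructions from that quote aren't reproduced above, so for reference: on the old ForceWare drivers, Coolbits is just a registry DWORD. Below is a minimal sketch of setting it to 18 from Python, assuming the classic NVTweak key location (which can vary by driver version, and which reportedly doesn't apply to GF8 cards at all):

# Hedged sketch: set the Coolbits DWORD the SLI Zone FAQ mentions (18 = 0x12).
# Assumes the classic ForceWare "NVTweak" key; newer drivers may use a different
# location or ignore Coolbits entirely. Run from an elevated (admin) prompt.
import winreg

KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"  # assumed location

with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 18)

print("CoolBits set to 18 - reboot (or restart the driver) for it to take effect")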
 

ira176

Distinguished
Mar 19, 2006
240
0
18,680


I have two of the same model X1950 Pro 256 MB cards working together. Neither is branded as a crossfire edition. The two that I have are made by Sapphire. This may not be the case for all X1950 varieties out there though.
 

Blue frog

Distinguished
Nov 6, 2007
28
0
18,530
Thanks for all this info - really helpful but still a bit confused.
I have just upgraded to an Asus mobo as it has 2 PCIe slots which both run at x16, since I want to go to crossfire later. The manual says the board supports crossfire. Does this mean I don't need a crossfire version of the second graphics card and can just buy a second "standard" card which matches my current one?
 

ira176

Distinguished
Mar 19, 2006
240
0
18,680


Maziar,
I know it sounds strange, but maybe these cards are each natively "crossfire edition". Here's a pic of the exact same two card models out of the box and inside the box:


 

ira176

Distinguished
Mar 19, 2006
240
0
18,680
I guess this information from Wikipedia.org sums it up.


"X1900 and X1950 series
The X1900 and X1950 series fixes several flaws in the X1800 design and adds a significant pixel shading performance boost. The R580 core is pin compatible with the R520 PCBs meaning that a redesign of the X1800 PCB was not needed. The boards carry either 256 MiB or 512 MiB of onboard GDDR3 memory depending on the variant. The primary change between R580 and R520 is that ATI changed the pixel shader processor to texture processor ratio. The X1900 cards have 3 pixel shaders on each pipeline instead of 1, giving a total of 48 pixel shader units. ATI has taken this step with the expectation that future 3D software will be more pixel shader intensive.[10]

In the latter half of 2006, ATI introduced the Radeon X1950 XTX. This is a graphics board using a revised R580 GPU called R580+. R580+ is the same as R580 except for support of GDDR4 memory, a new graphics DRAM technology that offers lower power consumption per clock and offers a significantly higher clock rate ceiling. X1950 XTX clocks its RAM at 1 GHz (2 GHz DDR), providing 64.0 GB/s of memory bandwidth, a 29% advantage over the X1900 XTX. The card was launched on August 23, 2006. [11]

The X1950 Pro was released on October 17 2006 and was intended to replace the X1900GT in the competitive sub-$200 market segment. The X1950 Pro GPU is built from the ground up on the 80 nm RV570 core with only 12 texture units and 36 pixel shaders. The X1950 Pro is the first ATI card that supports native Crossfire implementation by a pair of internal Crossfire connectors, which eliminates the need for the unwieldy external dongle found in older Crossfire systems."
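
As a quick sanity check on the bandwidth figures in that quote (assuming the 256-bit memory bus both cards use, and the X1900 XTX's 775 MHz / 1.55 GHz effective GDDR3), the math works out:

# Rough check of the quoted numbers: bandwidth = effective transfer rate x bus width.
bus_bytes = 256 // 8                       # 256-bit bus = 32 bytes per transfer
x1950_xtx = 1.000e9 * 2 * bus_bytes / 1e9  # 1 GHz GDDR4, double data rate -> 64.0 GB/s
x1900_xtx = 0.775e9 * 2 * bus_bytes / 1e9  # 775 MHz GDDR3 (assumed)       -> 49.6 GB/s
print(round((x1950_xtx / x1900_xtx - 1) * 100))  # ~29, i.e. the quoted 29% advantage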


It seems that only the X1950 Pro, which is based on the RV570 core, supports crossfire natively. The other X1900/X1950 cards may still need the crossfire edition card and the external crossfire dongle cable.

 

Zeddicus

Distinguished
Nov 6, 2007
4
0
18,510
Hi everybody,

All this info is really nice, but doesn't solve my problem

I run Vista Ultimate x64 on an ASUS M2N32-SLI Deluxe motherboard with 2 Asus EN7950GX2.

I've tried everything and searched all over the web, but I just can't get any SLI options in the Nvidia Control Panel.

I've re-seated both cards, installed the latest driver, updated my motherboard's BIOS, and done all of Vista's updates. It's no use.

Anybody know the solution to this?


 

4745454b

Titan
Moderator
I thought it was a bit odd that you'd take the time to write this, as I've already posted such a thing several times in the forums. This website has answers to nearly all the SLI questions you could come up with: http://www.slizone.com/page/slizone_faq.html Frankly, I would have just told people to go there instead of trying to write what you wrote. As GGA said, there are some problems, so let's get at them.

BUT the 7800GTX 512 will lower its clock to operate with the 256 one, so it won't have its true power.

Incorrect. The 512MB card won't lower any of its clocks at all. It will simply use only half of its ram. The clocks will stay the same, so it will still use its "true power". I would also mention that to do this, you have to use coolbits. If this were to be a sticky, I would post what you need to do in coolbits to make this work. From what I've heard, GF8 cards don't work with coolbits, so I'm not sure this can be done. You also failed to mention whether this is possible with Crossfire.

just make sure they have the same memory and same clocks

Uhmmmm, I thought we said memory and clock speeds don't matter? I know what you are trying to say. Why spend the money on a card with more Vram if you aren't going to use the extra ram? If you already have a card with 256MBs of ram, buying another card with 256MBs of ram would be more cost efficient, as buying one with 512MB would cost more.

WARNING: Some RAMs have SLI or CrossFire logo on them.

Seeing as you brought these up, you need to explain (briefly) what these are.

WARNING: Therer are some Power Supplies which aren't in the List

There are, or There're

(I dont mean SLI or CrossFire wont be good for resolutions like 1600x1200 or lower , i am just saying that SLI or CrossFire shines in higher resolutions .

You have a starting bracket, but no ending one. (you are missing one of these ")" ) At the start of the next sentence you start with the word Also. This is a grammatical no no.

Do SLI or CrossFire double the memory ?

Does....

This may not be the case for all X1950 varieties out there though.

There are three x1xxx cards that support crossfire without a crossfire edition card. The first is the x1950pro, next is the x1950GT, and the last is the x1650XT. These are the last three cards to come out from ATI, and the only three that have the internal crossfire bridge. The only card(s) you can pair with for crossfire is another card with the same chipset. Read this for more info: http://ati.amd.com/technology/crossfire/howitworks.html (look for the squares that are colored red in the middle with a dot in them.)

Seeing as at one point you used the wrong abbreviation for Crossfire, I thought I'd point out the right one. The first few times I'd write Crossfire, but after that, you can switch to CF. I'd also try a bit more with the "2 vs 1" section. Getting two 8800GTs is cheaper than a single Ultra, and is probably a much better idea. Last, you seem to be writing this as an SLI page; if it's going to include CF information, include CF information. Reading this, it seems to be SLI first, with CF as an afterthought.
 

Ironnads

Distinguished
Sep 5, 2007
278
0
18,780
I noticed Tom's latest review of Crysis in SLI states "two GTS cards = 1500 meg - or whatever", but you say it doesn't work like that. I had also read that only one card's memory counts in dual GPU configurations. Where's the truth then?

Ryan
 

Ironnads

Distinguished
Sep 5, 2007
278
0
18,780
here:

• Two SLI-enabled Nvidia GeForce 8800 GTX @ 769 MB (total available graphics memory 1535 MB)

Is this shaboddle then?
 

4745454b

Titan
Moderator
There might be 1.5GBs of memory total across both cards, but when looking at the amount a SINGLE card has access to, then no. It's important to think of the frame buffer. Each card would only have 768MBs of space for the frame buffer. This is important because the more things you want to do in this frame buffer, the more space you need. If you want to enable 8 levels of AA, you are going to need more memory to handle that than if you wanted only 2 levels. 1600x1200 needs more memory than 1280x1024.

Is it "shaboddle"? You could say no, as each card has 768MBs of memory, with the "total available graphics memory 1535 MB". You could also say it is, as each card is still limited to 768MBs of memory.
 

kpo6969

Distinguished
May 6, 2007
1,144
0
19,290

That total includes the card's memory and shared system memory.
I have an 8800GT SC 512MB and 3GB system memory:
total available graphics memory: 1791 MB
dedicated graphics memory: 512 MB
dedicated system memory: 0 MB
shared available system memory: 1279 MB
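
In other words, Vista's "total available graphics memory" is just the sum of those lines: 512 MB dedicated + 0 MB dedicated system + 1279 MB shared system = 1791 MB.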
 