Gigabyte GTX 650: which other GPUs can I run SLI with?

Henrik Rye

Honorable
Apr 9, 2013
How do I find out which graphics cards I can use SLI with? What are the requirements for SLI to work? I have heard that as long as the clock speed is the same it should be fine. Do you agree? If not, what is it that actually matters?
 

ubercake

Splendid
Moderator
You can't run SLI with the GTX 650. So that's out of the question.

If you're asking how many watts you need to run two cards in an SLI configuration, it depends on the cards (not GTX 650s, because you can't run those in SLI). At any rate, a good 750W supply covers most situations with power to spare.
 

Onus

Titan
Moderator
A single GTX650 needs about 80W running flat out. A system with one of them can run on a quality 380W-400W PSU, so a second one can be added with a quality 500W PSU. The operative word in both cases is quality. Anything built by Seasonic (their own, XFX, some Antec, some PC Power & Cooling), Enermax/LEPA (their own), FSP (their own, some Antec), or Super Flower (Rosewill Capstone and new Kingwin) will be good.
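As a rough sketch of the sizing arithmetic above: the ~80W per-card figure comes from the post, while the system baseline and safety margin below are my own assumed numbers, chosen so the output lines up with the ~400W single-card and 500W two-card recommendations.

```python
CARD_WATTS = 80          # approximate full-load draw of one GTX 650 (from the post)
SYSTEM_BASE_WATTS = 250  # CPU, board, drives, fans, etc. (assumption)
HEADROOM = 1.2           # 20% safety margin for a quality PSU (assumption)

def recommended_psu(num_cards: int) -> int:
    """Return a PSU wattage suggestion, rounded up to the next 50W step."""
    needed = (SYSTEM_BASE_WATTS + num_cards * CARD_WATTS) * HEADROOM
    return int(-(-needed // 50) * 50)  # ceiling division to a 50W granularity

print(recommended_psu(1))  # one card  -> 400
print(recommended_psu(2))  # two cards -> 500
```

This is only a ballpark estimator; a proper PSU calculator accounts for the specific CPU, drives, and overclocking.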
 

ubercake

Splendid
Moderator


Again, just to be clear, you cannot run SLI with two standard GTX 650s. The Nvidia-spec GTX 650 Ti Boost editions are rated at a maximum of 134W each. Anything factory-overclocked will use more.

So if we're talking about two GTX 650 Ti Boost editions, a 500W-or-greater PSU from the brands Onus listed would still be good for most systems, but you should look for one with at least two 6-pin PCIe connectors.
 

ubercake

Splendid
Moderator


If you could, it would be for the sake of affordability, but I don't think Nvidia or AMD want most people to be completely satisfied with a $220 SLI video solution.
 

Henrik Rye

Honorable
Apr 9, 2013
Yes, but if I have two GPUs and I want to SLI them, and one of the cards needs 200V and the other one also needs 200V, do I need at least a 400V power supply when I have both? This is just an example; the numbers are made up.
 

Onus

Titan
Moderator
I think the unit you want is Watts (W), not Volts (V). Volts will depend on where you are located; e.g. in many parts of the world (including the USA), line voltages are ~115V-120V, but in other parts they may be 230V-240V; a few places have other voltages, such as 100V. What varies based on the graphics card(s) is wattage needed. The PSU sizes recommended are for the entire system, with a single card.
A quick and dirty way to identify the absolute maximum amount of power a graphics card needs is to count the number and type of PCIe power connections it has.
A PCIe slot can provide a maximum of 75W. A six pin PCIe connector can provide 75W, and an eight pin connector can provide 150W. In the case of a card with a single PCIe six-pin connector, the most that card might draw is 150W (75W from the slot plus 75W from the single 6-pin connector). So, if the PSU recommendation for one card is 400W, adding a second one means you will need no more than 550W. This number is obviously a high estimate, as the card may need as little as 76W (just enough to need that connector), in which case a 500W PSU would easily be sufficient.
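The connector-counting rule above can be written out directly; the 75W slot, 75W 6-pin, and 150W 8-pin limits are the figures from the post.

```python
SLOT_W = 75       # maximum from the PCIe slot itself
SIX_PIN_W = 75    # maximum per 6-pin PCIe power connector
EIGHT_PIN_W = 150 # maximum per 8-pin PCIe power connector

def max_board_power(six_pin: int = 0, eight_pin: int = 0) -> int:
    """Upper bound on a card's draw, given its auxiliary PCIe connectors."""
    return SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

print(max_board_power())             # slot only            -> 75
print(max_board_power(six_pin=1))    # one 6-pin connector  -> 150
print(max_board_power(eight_pin=1))  # one 8-pin connector  -> 225
```

Remember this is a ceiling, not a typical draw; as noted above, a card with a single 6-pin connector may actually need little more than 75W.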
The best thing to do is use a PSU sizing calculator such as the one at http://www.extreme.outervision.com/psucalculatorlite.jsp which has built-in tables of what graphics cards need, or find a review of the card in question and see if it states how much power it uses.
 
