Nvidia GeForce GTX 260 SLI

I guess we'd all need at least the same OS and a quad-core (around 3.2GHz) to get good comparisons. Try comparing your scores to others on the website; maybe you'll find someone with more or less your configuration.
 
Keep in mind he's running a new i7 920 with the X58 chipset. He's going to beat us both hands down anyhow. Seeing his scores, I believe my 680i chipset and older Q6600 are ultimately what limit my GPUs. I really can't see it any other way.
 
My friend has a Q6600 with two 9600GTs. With his CPU at stock speed (2.4GHz) he gets around 11k; when he overclocked it to 2.8GHz he got ~14.5k. Overclocking makes a huge difference. With my CPU at 3.7GHz and both GPUs mildly overclocked I can get around 22.5k in 3DMark06. Keep in mind, though, that I have the Core 216 variant, which allows for a little more performance.
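
To put rough numbers on that (just a back-of-the-envelope sketch using the approximate scores quoted above):

```python
# Rough scaling check on the quoted Q6600 numbers (approximate figures).
stock_clock, oc_clock = 2.4, 2.8          # GHz
stock_score, oc_score = 11_000, 14_500    # 3DMark06 overall score

clock_gain = oc_clock / stock_clock - 1   # ~0.17, i.e. about 17%
score_gain = oc_score / stock_score - 1   # ~0.32, i.e. about 32%
print(f"CPU clock up {clock_gain:.0%}, 3DMark06 score up {score_gain:.0%}")
# The score rising faster than the clock suggests the GPU tests themselves
# were CPU-limited at stock speed, not just the CPU sub-test improving.
```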
 
On a side note...
I'm finally getting a new video card 😍 for the rig I built 12 months ago (after waiting for years). I was using a friend's old Radeon X1600; it's the only thing I haven't really replaced.

Specs:
Gigabyte X48-DS4
Intel Core 2 Quad Q6600 @ 3.00GHz
Dell S2409W monitor
3328MB usable installed memory
Windows XP

So I see this conversation as relevant. I was planning on getting a GTX 260 for roughly $200 now and hopefully going SLI within six months, give or take, whenever I see a reasonable price drop. My real question is: do you guys think it's good value for the money, and is it the best fit for my system?
I just got RE5, and the X1600 is too old to play it on; mostly, though, I'm looking forward to spending a lot of close personal time with Dragon Age: Origins.
Any recommendations are helpful. I'm looking to purchase by the end of the month, with the top of my scale being $200-250.
 


At $200 the GTX 260 is way overpriced. Get an HD 4890: it's cheaper and is on par with the GTX 275 in most games and the GTX 285 in some. SLI really isn't cost effective right now, especially with a generation-old card. For more money there's a better option, the HD 5870 at $380 (or an HD 5850 for $260); and if you can wait [who knows how long], you could get Nvidia's 300 series whenever it comes out.
 
If I were to buy now I'd get a 5850, and if I couldn't find any of those I'd get two 4890s for CrossFire. A GTX 260 Core 216 should be well below $200 now. I would get one GPU and see how it plays all of your games. If you do SLI or CrossFire you need to overclock your CPU to get more out of it. So again, try one card; if it works great, you don't need another.
 
Hey bro, it's your CPU that has become the bottleneck now. Try running the same settings at a higher resolution; I bet you get nearly the same FPS...
 

So you're saying the CPU is the reason jerreece and I are getting similar 3DMark numbers while oneshot is getting much higher ones? (720 vs Q6600 vs i7) This is the first time I've run into a CPU bottleneck; I always thought bottlenecks would show up in some other hardware, like the hard drive or something.
 


A Q6600 at 3.0GHz is plenty for a GTX 260 SLI setup, so don't worry; maybe an OC to 3.2GHz will help a little, but not much.
 
@iode

System Specs:
AMD Phenom II X3 720 @ 3.6GHz
Nvidia GeForce GTX 260 SLI (one Core 192 card and one Core 216 card)
ASRock K10N780SLIX3 Motherboard
Corsair 750W Power Supply
4 GB DDR2 800 5-5-5-15

First off, I see a problem with you pairing a 192 SP card with a 216 SP card, even if it seems to be working OK in SLI. You're really only supposed to pair identical cards, i.e. same clock, same memory, and definitely the same SP (stream processor) count.

Also, 3DM06 favors Intel over AMD anyway; for an AMD vs Intel comparison you'll get truer results running 3DMark Vantage on either Vista or Win7. And when you ask others to post their scores to see if your SLI pairing is a problem, there's no true comparison unless they test on the exact same setup as yours.

You can get an idea of what's going on if you test 3DM06 on each card individually, hardware-wise, meaning only one card physically installed in the machine at a time. That will tell you how much the two cards differ in performance on their own.

I believe one card is hurting the other in SLI simply because they're not identical cards, and if you discover a major performance difference between the two cards individually, that would probably indicate the pairing is the problem.
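
If you want a quick look at what the driver reports for each card before pulling one out, something like this can work (just a sketch; it assumes your driver install includes nvidia-smi and supports these query fields, which cards and drivers of that era may not):

```python
# List each GPU's reported name and clocks via nvidia-smi (assumes nvidia-smi
# is on the PATH and the driver supports these query fields; older cards may
# just report "[Not Supported]").
import subprocess

result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=index,name,clocks.gr,clocks.mem",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)  # one line per card: index, name, core clock, memory clock
```

It won't replace running 3DM06 on each card by itself, but it makes any clock or model mismatch obvious at a glance.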
 



I ran the bench with the same settings as yours: 1440 x 900, 8x anti-aliasing, anisotropic filtering, all tests enabled.

Scores:
19,582 3DMarks
SM2.0 Score 8102
HDR/SM3.0 Score 8298
CPU Score 6248

Disregard my CPU score; as I already told you, 3DM06 favors Intel anyway. The reason I posted my scores is the SM2 and SM3 results: even with a lower CPU score, the SM2 and SM3 should be higher, because those tests are graphics dependent. Granted, the CPU does have an effect there, but it's mostly GPU in those tests, period.

FYI: when you run 3DM06 you really don't have to run all the feature and batch tests to get a score; only the GPU and CPU tests are required, which takes less time.

I still think that if your GPUs were identical your SM2 and SM3 would be higher. Are there any differences in GPU core clock speed and memory speed on those cards of yours? Just curious.

Sorry, I should have mentioned I'm running two MSI GTX 260 216 SP cards in SLI.
 

Here is the GPU-Z screenshot of both my GTX 260s.

As you can see, NVIDIA SLI is detected and enabled. The Core 192, with its higher GPU clock, seems to outperform the Core 216 in both pixel fillrate and texture fillrate, and has higher memory bandwidth as well. (Note: I never touched these cards out of the box; apparently my 192 was factory overclocked.) I read somewhere that Nvidia designed the 192 and the 216 so they can be paired without detriment and that they are essentially identical cards, so I'm not sure whether it's the difference between the 216 and the 192 shader cores that's causing possibly lower 3DMark values. I also read somewhere that when two cards are in SLI, their clock, memory, and shader speeds are aligned and made identical. Why isn't that the case here?

PS I'm running Windows 7 Ultimate Retail (as provided by my university)
 
The biggest negative I see in your screenshot is the fact that they're two completely different core manufacturing processes, 55nm vs 65nm, plus differences in stream processor count, core clock, memory clock, and shader clock.

Have you consulted Nvidia support?

Actually, it's amazing they'll even run in SLI. I'd say kudos to Nvidia's new drivers, but what are they actually clocking down to in order to run together in the first place?

Since they're two completely different GPU manufacturing processes, you cannot even BIOS flash them to the same settings. I'm surprised Nvidia allowed two different core manufacturing processes to carry the same GTX 260 branding, period.

I read somewhere that Nvidia designed the 192 and the 216 so they can be paired without detriment and that they are essentially identical cards

Post that article if you will, because in your screenshot they're definitely not 2 identical cards.

Also what are their individual 3DM06 scores?
 

No, I've never consulted Nvidia support, because I don't know what's wrong, or if there really is something wrong. I just wanted to know if my results were consistent with the rest of the world's. I don't want to be lacking in performance, especially after spending so much money on the cards; it would feel like a waste.

At your request I dug up the article where I read this: http://www.firingsquad.com/hardware/nvidia_geforce_gtx_260_216shader/page2.asp

They say that "If you’ve already got a GeForce GTX 260 and would like to purchase another for SLI, we can confirm that the new 216 shader GTX 260 boards are 100% compatible with the 192-shader GTX 260, allowing both GPUs to be combined together for SLI. Each board will run with all its shaders enabled, giving you a grand total of 408 shaders for the SLI system."

Is there a way to test individual graphics cards without manually taking them out of my motherboard? I'm currently in college and don't have the tools to perform maintenance on my computer at the moment.
 
Interesting article, thanks for looking that up!

Is there a way to test individual graphics cards without manually taking them out of my motherboard? I'm currently in college and don't have the tools to perform maintenance on my computer at the moment.

Well, you could disable SLI in the Nvidia control panel and swap the monitor connection back and forth.
 
One of my GPUs was factory overclocked even though I ordered the same version as the other one. The faster one just downclocks to the slower card's speeds, but I set them both to run at the overclocked card's speed, which is 626-1350-1053 (core-shader-memory); otherwise they run at 576-1242-999. I would take each card out and manually test each one to see if there really is a problem. You do have all the power connectors plugged into the GPUs, right?
 
A Q6600 at 3.0GHz is plenty for a GTX 260 SLI setup, so don't worry; maybe an OC to 3.2GHz will help a little, but not much.

I'm actually at 3.2GHz now, and unfortunately this particular 680i LT board (or CPU) won't let me go any higher. I haven't been able to get a stable OC beyond that. 🙁



I'd be interested in your CPU details, since you're running the same video card setup I am and my scores are far lower than yours. I'm guessing you just have a higher CPU clock speed?
 


He's using:

Intel Q9550 @ 3.83GHz with a Xigmatek HDT-S1283 cooler
 
Thanks. :) I didn't get the chance to check his member config. Being at work and interrupted every 2-3 minutes doesn't help with that sort of thing. :) The difference between 3.83GHz and 3.2GHz, I think, is the telling factor between his score and mine.

I think it's somewhat safe to say my SLI config is being limited by my CPU.
 

Did you look at the screenshots I posted earlier? My two 260s are running at different clock speeds even in SLI. Can anyone explain that anomaly to me?
Yep, I have two 6-pin power connectors plugged into each card.
 

Do you think mine is as well? I have my CPU clocked at 3.6GHz, yet you and I get similar numbers.
 
I guess that depends on how well 3DMark06 can use multiple cores. One could theorize that my four cores versus your three could make them equal. But I don't think 3DMark06 was optimized for quads... though I could certainly be wrong.
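
To put rough numbers on that theory (just a sketch; it assumes the score scales with cores times clock, which 3DMark06 only loosely does):

```python
# Crude cores-times-clock comparison (assumes near-perfect scaling across
# cores, which 3DMark06's CPU test only roughly approximates).
phenom_x3_720 = 3 * 3.6   # Phenom II X3 720 @ 3.6GHz -> 10.8 "core-GHz"
q6600         = 4 * 3.2   # Q6600 @ 3.2GHz            -> 12.8 "core-GHz"
print(f"X3 720: {phenom_x3_720:.1f} core-GHz, Q6600: {q6600:.1f} core-GHz")
# The two land within about 20% of each other, so similar overall scores
# aren't surprising if the CPU is the limiting factor in both systems.
```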

Although with a 192-core card and a 216-core card, I don't know whether one might theoretically force the other to run at the slower card's level to keep things seamless.
 


Mine also show different speeds in GPU-Z, but one is still downclocked to the slower card's speeds. The same thing applies when using different-speed RAM in a system. I use EVGA Precision for my GPUs, and I have them set to run at the overclocked card's speed; otherwise they would run at the slower one's. It does sound like a CPU limitation, but if your games play well it should be fine until it becomes a big issue.
 
