Poor performance -3570k+2 x 680gtx sli

shtubby

Honorable
Apr 22, 2012
19
0
10,510
I recently purchased a second rig for gaming and I am having a heck of a time figuring out what is going on. Now that I write this, it seems it may be more appropriate in the graphics forum, but I just can't see that being the cause.

Recently bought a 3570K, 16 GB of G.Skill PC3-12800, a Gigabyte UP7, and threw in my two used GTX 680s. Running at default speeds, performance is terrible. I am seeing numbers like 8500 from 3DMark at 1080p performance settings. The rig I just pulled them out of would do a minimum of 14,900. That processor is a little beefier (a 3930K, not OC'd), with slower 16 GB RAM on an Asus Sabertooth.

Really don't care about the benchmarks, but these rigs are used for race simulator games and it is almost unplayable in surround. I've checked and double-checked the BIOS, MB settings, latest drivers, etc. It's almost like only one card is working, but there are no errors anywhere that I can see, and I have monitors on both cards and they all have video.

Can the difference in CPU be all it is? I find it hard to believe, but maybe. As a test I put a 2700K in that rig (the UP7) and it performed about the same.

So I'm down to the MB being the culprit, or the cards. Put the 2700K back on the P8Z77; both are running Windows 7. The 2700K with one ATI 7970 beats the SLI 680s by a good 4000. Any idea how I can narrow it down further?

I read everywhere that not many games take advantage of the 3930K's horsepower, but I sure wish I had bought another one of those instead of this cheaper 3570K and the expensive UP7.

Latest drivers, even tried the beta. I'm about to toss the 680s in with the 3570, but I'm afraid that will make me nuts when I see it flyin' :).
 

shtubby

Honorable
Apr 22, 2012
19
0
10,510
So that is something I am not sure of. I've tried it both ways. Nvidia has an optimization guide that says no, they should be divided:

http://www.geforce.com/optimize/guides/how-to-correctly-configure-geforce-gtx-680-surround#3

but elsewhere on their site it does say they should all be on the master card. Right now I'm down to just one monitor being driven in SLI, and I'm still scoring low with a choppy frame rate in games at fairly high settings, but not maxed.

@Rockdpm - got her cranked up to 4.2 GHz and still seeing the same crap. I guess I got a bum card or something... a 3570K at 4.2 with SLI and you can't run a game like iRacing at high settings? That doesn't seem right.
 
Hmm, this is very strange. Have you ticked 'Prefer maximum performance' in the NVIDIA Control Panel, as opposed to 'Adaptive'? You'll find it under Global Settings, one of the first tabs.

Also, which drivers did you use? Did you also correctly uninstall the previous ones you were using? Just to clarify, OP, double-check this setting please:

Go into NVIDIA Control Panel -> Manage 3D Settings -> Global Settings tab -> Power Management Mode -> Prefer maximum performance

Also, what programs do you have running in the background? There is a known problem with older versions of Flash Player that can reduce your core clock speeds if something like a browser video or stream is running in the background. Could you also post a pic of CPU-Z? I want to check on another thing too. Also, have you tried running sfc /scannow to see if you have any errors?

Open Command Prompt as administrator, type sfc /scannow, and press Enter (there's a space between sfc and /scannow).
Have you tried Driver Sweeper?

http://www.guru3d.com/content_page/guru3d_driver_sweeper.html

Uninstall the Nvidia drivers, then run this.

Reboot into Safe Mode and run Sweeper again.

Then install the new driver.

Also, right-click the Nvidia driver installer and run it as administrator.
Do a full system scan with http://majorgeeks.com/download.php?det=5756


 
G

Guest

Guest
Why are you using two 680s with only a single 1080p display? Wasted. Spending that much on a new build, you should have gone with a 27" 2560 x 1440 or 1600 display. The display is the most important part, then the GPU, then the CPU on down. Sounds like a driver issue to me.

 

Mozart25

Distinguished
May 9, 2011
183
0
18,690
There IS a slight bottleneck, about 10%, going from PCIe 3.0 x16 to x8 at 1080p. That bottleneck only becomes worse as the resolution increases: about 26% at 1440p, up to 50% at 3x 1080p.

That said, it sounds like one of the PCIe slots is downclocking. I second what spentshells said: ensure neither of the PCIe slots is running at x4.
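For context, here is a rough sketch of the theoretical one-way bandwidth at each link setting, using the commonly cited per-lane figures (assumed: PCIe 2.0 ≈ 500 MB/s per lane after 8b/10b encoding, PCIe 3.0 ≈ 985 MB/s per lane after 128b/130b). Raw bandwidth alone doesn't translate directly into frame-rate loss, but it shows how much headroom each mode gives:

```python
# Approximate theoretical one-way PCIe bandwidth per lane (MB/s),
# after encoding overhead. Commonly cited figures, not exact spec math:
#   Gen 2: 5 GT/s with 8b/10b encoding   -> ~500 MB/s per lane
#   Gen 3: 8 GT/s with 128b/130b encoding -> ~985 MB/s per lane
PER_LANE_MB_S = {2: 500, 3: 985}

def pcie_bandwidth(gen, lanes):
    """Theoretical one-way bandwidth in MB/s for a given link."""
    return PER_LANE_MB_S[gen] * lanes

gen3_x16 = pcie_bandwidth(3, 16)   # 15760 MB/s
gen3_x8  = pcie_bandwidth(3, 8)    # 7880 MB/s
gen2_x8  = pcie_bandwidth(2, 8)    # 4000 MB/s

print(gen3_x16, gen3_x8, gen2_x8)
print(f"2.0 x8 is {gen2_x8 / gen3_x16:.0%} of 3.0 x16")  # 25%
```

So a slot silently falling back to 2.0 x8 leaves a card with roughly a quarter of the bus bandwidth it would have at 3.0 x16.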
 

Mozart25

Distinguished
May 9, 2011
183
0
18,690


Lol, show me that a quad-core CPU can't push two 680s. That's nonsense and pure speculation.

Meanwhile, there is plenty of proof that 3.0 @ x8 WILL bottleneck dual high-end GPUs.
 

I am saying at stock... of course a ******* 4.5 OC can push two 680s. I don't have to prove to you that a stock-clocked i5 can't push two 680s without bottlenecking; it's common sense.
 

shtubby

Honorable
Apr 22, 2012
19
0
10,510
The single monitor was for testing purposes. I actually run three 24-inch BenQ 3D gaming monitors with a 30-inch HP ZR30w on top for good measure. I really appreciate the feedback, as this is driving me crazy. I just got home from a day of family birthday parties and will post the results of the tests I was asked to run as soon as I finish this post and gather them.


A little more background: these two 680s were in my other box, a 3930K, which absolutely rocked with them in there. I recently decided to buy a second gaming rig so my buddies and I can race in the same room together. I didn't want to spend a ton (lol, why the hell did I buy that motherboard then :) ), and everything I've read said you would do well with the 3570K.

So knowing what a powerhouse the 3930K was, I bought the AMD card to put in it and then swapped the 680s into the 3570K rig. Fresh install of Windows 8, and now Windows 7, but still terrible performance.

Not having another SLI connector handy as I test (the one from the ASUS board is too short), I clipped on the three-way SLI connector, you know, the one that has three sets of two connectors, and tested. To my eye I thought I noticed an improvement in 3DMark, but it came out about the same.

The drivers I have tried are both the latest NVIDIA WHQL and the beta. 310.something, and I think the beta is 313 or 317? I'll post screenshots with the rest shortly.

Monitors are plugged into the 4 DVI connectors. I've read both approaches (all on one card, and splitting them between the cards) on the Nvidia website.

Re-installed Far Cry 3 as a test; the game engine defaulted to ultra settings and playability was OK, but with some framerate drops.

I do have a 2nd-gen 2700K here too to swap in and out of the game rig for testing, but that did not yield much difference. Believe it or not, I am starting to wonder about the motherboard, since nothing I do makes any difference.

Gimme 20 minutes or so to get the screenshots.

Thanks a ton!
 
Either the simulator you're playing doesn't like being optimized by your hardware, or something is bugged. For you to have spent all that money jumping from hardware to hardware and still not be able to play it right... sounds to me like it's the game, not your hardware.
 

shtubby

Honorable
Apr 22, 2012
19
0
10,510
Sorry for my late reply, I was called out of town to a customer site...

There must be something wrong with that Gigabyte board, I just don't know how to prove it.

After trying EVERYTHING, I went ahead and grabbed another 3930K and a P9X79-PRO. Dropped the two 680s in and BAM! Back to where it should be: 16,000 in 3DMark, and Ultra looks great in all of my games. An interesting thing occurred, though. After a reboot I went back to play a game and the performance was awful. Exited and started the benchmark, and damn if my numbers weren't back to where they were with the UP7.

I freaked out and screwed around with it for 4 or 5 hours before I realized I was back on the X79 chipset, and that you have to run ForceEnable to get PCIe 3.0 speeds.

Ran that and back to good times.

So this leads me to believe that the PCIe 3.0 slots on the MB are running at 2.0. But I ran GPU-Z and it reports 3.0 speeds, so I am stumped.

What do I do with that board? Can I send it back to Gigabyte? I'm sure that will be one big hassle when I tell them it's just 'slow'. But shoot, that thing cost me $400+.

Any advice for me on what I can do to either fix the issue or get my money back? Has anyone had an experience with an issue like this?

Thanks for all the help!
Chris
 

shtubby

Honorable
Apr 22, 2012
19
0
10,510
Yes, that is exactly right. Comparing apples to oranges here, but it's all I have for comparison, and hopefully the CPU difference shouldn't skew it too much. Here is what I see:

Gigabyte UP7, i5 3570K, BIOS reset, optimized default settings (except for RAID), single 680: 3DMark11 Basic Performance ≈ 6500

Gigabyte UP7, i5 3570K, same settings, dual SLI 680: ≈ 8500

Gigabyte UP7, i5 3570K, same settings, single 7970: ≈ 6500

Asus Sabertooth X79, i7 3930K, BIOS reset, optimized default settings (except for RAID, and ForceEnable to attain PCIe 3.0 speed), single 680: 10,500 - 11,500

Asus Sabertooth X79, i7 3930K, same settings, dual SLI 680: 15,500 - 16,600

Did not retry the 7970 (tired of pulling and installing cards :) )

But I did pop the 680s onto the P9X79-PRO (which also needs ForceEnable because of the X79 chipset) and saw about the same numbers. Made sure to use the same RAM (maybe that's a mistake) and the same make/model of hard drives (Intel SSD 530 series, RAID 0) in both rigs.
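As a sanity check on those numbers, the SLI scaling factor (dual-card score divided by single-card score) makes the problem visible on its own: healthy GTX 680 SLI is usually quoted as scaling well above 1.5x in 3DMark11, and here only the X79 rig gets anywhere close. A quick sketch of the arithmetic using the scores posted above (midpoints where a range was given):

```python
# SLI scaling = dual-card score / single-card score, using the approximate
# 3DMark11 Basic scores posted above (midpoint where a range was given).
up7_single, up7_sli = 6500, 8500      # Gigabyte UP7 + i5 3570K
x79_single, x79_sli = 11000, 16050    # Sabertooth X79 + 3930K (midpoints)

up7_scaling = up7_sli / up7_single
x79_scaling = x79_sli / x79_single

print(f"UP7 SLI scaling: {up7_scaling:.2f}x")   # UP7 SLI scaling: 1.31x
print(f"X79 SLI scaling: {x79_scaling:.2f}x")   # X79 SLI scaling: 1.46x
```

A second card that only adds ~31% on the UP7 versus ~46% on the X79 (from a much higher single-card baseline) is consistent with one slot running well below its rated link speed.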

I can't imagine what I am missing, but I suspect the PCIe bus is somehow lying to GPU-Z. If I force the X79 to run the cards at 2.0 x8, I see roughly the same numbers I do on the Gigabyte, which reports 3.0 x16 speed.

Will they RMA something like this? I have no experience with this sort of issue. I mean, the computer does run, boot, and not crash. Maybe there is a cracked circuit or something; heck, I don't know, but I do know I'd sure like to be rid of it. Probably a good brand, but I'll never buy one again.