980 SLI Enough?

bwrlane

Distinguished
Oct 5, 2010
449
0
18,860
I've long been a believer in two things.

1. There is no such thing as enough GPU power. Every GPU upgrade I have made has disappointed me. Yes it is better than my last but no, it is not the perfectly smooth experience I had been expecting.

2. Never believe reviews. Reviews quote accurate data but don't tell the full story. The true picture is what it actually feels like to the user, not the numbers you see on the charts.

Point 1 is now invalid. I have a 1920*1200 display (60Hz) and 2*GTX 980. There is no setting in any game that gives me better performance in SLI mode than in single-GPU mode. This tells me that a single 980 basically maxes out a 1080p-class 60Hz display. It's the first GPU I have had that has done so.

Time for a new display! I will probably get a gsync one.

My dilemma is this: should I get 4k or 1440p? Bear with me and my reasoning.

One 980 gives the same performance as two at 1080p. 1440p is about 1.8 times as many pixels as 1080p, so with the second card picking up the extra load I should get much the same smoothness at 1440p as at 1080p. This is the safe option.

Can I push it to 4K? The reviews say yes. They say two 980s are plenty for 4K, trouncing a single Titan X and delivering impressive FPS at 4K resolutions.

But really? My real concern is the 4GB frame buffer. A too-small frame buffer will often deliver good average results but slow down so much in the more complex scenes that you would not really consider it adequate. 4K resolution is 4 times the pixels of 1080p. I have no idea whether memory requirements scale linearly with resolution. If they did, then 4K on 4GB would be the same as 1080p on 1GB; 1GB at 1080p is nowhere near enough, so it would follow that 4GB at 4K isn't either. But how does memory requirement actually scale with resolution? I suspect it is less than linear.
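To put rough numbers on that scaling intuition, here's a back-of-the-envelope sketch (illustrative only: the bytes-per-pixel and buffer count are assumptions, and real VRAM use is dominated by textures and geometry, which barely scale with resolution):

```python
# Rough arithmetic sketch, not a measurement: raw render-target memory
# scales linearly with pixel count, but it is only a small slice of total
# VRAM use. Textures, geometry and driver overhead are mostly resolution-
# independent, which is why total usage grows sub-linearly.

RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
BYTES_PER_PIXEL = 4   # assumed 32-bit colour target
BUFFERS = 3           # assumed front + back + depth buffers, a simplification

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    target_mb = pixels * BYTES_PER_PIXEL * BUFFERS / 1024**2
    print(f"{name}: {pixels / 1e6:.1f} MP, ~{target_mb:.0f} MB of render targets")
```

Even at 4K the render targets themselves come to well under 100MB, so the "4GB at 4K equals 1GB at 1080p" extrapolation overstates things considerably.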

My question is therefore this. If you have 980SLI and 4k, how are you finding it? Can it deal with 4k settings comfortably in the real world?

Your answers will influence whether my display upgrade is 1440p or 4k.

Thanks!
 
1440p. At 4K resolutions we see a lot of complaints, which is why so many folks are happy to see the 980 Ti at the price point it's at: with 6GB of VRAM it's a good option in SLI for 4K gaming. A standard GTX 980 in SLI WILL game at 4K, but you will NOT be maxing anything out in any seriously demanding titles. I personally don't have either configuration, but I've dealt with enough related threads here, and a few customer builds that were similar enough, to feel pretty confident that, for example, a 980 SLI configuration at 4K playing Witcher 3 or GTA V is going to require you to disable some settings. We expect more and more future releases to require similar system and GPU resources for Ultra settings with the gamut of Gameworks features enabled.

Your CPU could be playing a role in this as well. What are your full system specs?
 


Thanks! That's a really good answer, and much in line with my suspicions. Basically, 2*980 have enough rendering power but not enough memory. Darn! Should have waited for the Ti, but there you go. Other relevant system specs: 5930K Haswell-E, 16GB RAM. I overclock the 5930K to about 4.3GHz but have yet to find a situation where this gives any different experience than stock. Measurable in benchmarks, yes; tangible in real life, I don't really think so.
 


Thanks! Do CPU requirements increase with resolution? I thought GPU requirements did but not so much the CPU.

Re overclocking, I have an Asus Rampage V Extreme and a Corsair H100i cooling block. The problem is I think I don't have a very good CPU sample; they always say mileage varies, but so far I've had little luck pushing it beyond 4.3. Temps don't seem to be a problem as it rarely hits 60 degrees. Admittedly I have not tried very hard, due to the lack of any real practical necessity. May give it another go.
 
CPU requirements don't specifically increase with resolution; if anything they usually fall relative to the GPU's, because more of the demand shifts onto the GPU and the CPU spends more time waiting. However, with SLI at a higher resolution, your GPUs will be delivering frames at a much higher rate, which then DOES increase the demands on the CPU, since it won't be waiting as long for things to be rendered.
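That interplay can be sketched with a toy frame-time model (the numbers are invented, purely for illustration): the slower of the CPU and GPU per-frame costs sets the frame rate, and splitting GPU work across two cards can push the bottleneck back onto the CPU.

```python
# Toy model, illustrative numbers only: each frame costs some CPU time and
# some GPU time, and the slower of the two sets the frame rate. Raising the
# resolution inflates the GPU cost; SLI roughly halves it, which can move
# the bottleneck back onto the CPU.

def fps(cpu_ms, gpu_ms, n_gpus=1):
    # AFR SLI approximated as an ideal split of GPU work across cards
    frame_ms = max(cpu_ms, gpu_ms / n_gpus)
    return 1000.0 / frame_ms

cpu_ms = 8.0                                # assumed per-frame CPU cost
print(fps(cpu_ms, gpu_ms=10.0))             # 1080p, one GPU: GPU-bound, 100 fps
print(fps(cpu_ms, gpu_ms=40.0))             # 4K, one GPU: heavily GPU-bound, 25 fps
print(fps(cpu_ms, gpu_ms=40.0, n_gpus=2))   # 4K SLI: 50 fps, GPU still the limit
print(fps(cpu_ms, gpu_ms=14.0, n_gpus=2))   # 1440p SLI: now CPU-bound at 125 fps
```

In the last case the CPU, not the GPUs, caps the frame rate, which is exactly the situation a high-refresh 1440p SLI setup can run into.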


Is the problem stability or thermal issues? It's likely you don't have enough voltage for the overclock, and adding the necessary voltage of course places a higher thermal demand on the cooling system. Be sure all case fan locations are populated, and I'd also consider a different cooling system at some point, since the H-series Corsair coolers are well-known underperformers. Plus it's only a 240mm cooler. A good 280mm cooler like the Cooler Master Nepton 280, one of the 280mm Swiftech coolers, or a custom loop would likely provide a much better cooling solution. I'd also be sure you're running the radiator fans in an intake configuration, pulling cool air from outside the case rather than trying to use already-heated air from inside it.

What is your case model?
 


Got it. My system is currently able to deliver 60fps while effectively "idle" (ie without any individual component stressed); the monitor's 60Hz refresh rate is the bottleneck. 1440p at a higher refresh rate will place higher demands on the CPU. Not only could the CPU constrain the >60fps frame rates the system is otherwise capable of delivering, it could also act as a constraint in other situations, such as in online games with many other players.

I am quite sure that the problem with overclocking is not a thermal one. My cooling seems very efficient and it rarely reaches 60°C even in Cinebench. (Case is the Phanteks Enthoo Primo, this one: http://www.scan.co.uk/products/phanteks-enthoo-primo-rev-2-full-tower-performance-case-with-side-window-e-atx-atx-w-o-psu).

The problem with overclocking is most likely due to the fact that I have never really tried. I have increased the multiplier until the system is not stable and the limit seems to be 4.3 GHz (all cores), won't go to 4.4. I have yet to touch the voltage because I have noticed that the BIOS automatically adjusts the voltage when I increase the multiplier. (But maybe not enough?) Maybe I need to examine the LLC and other settings.

Given my temps, I expect you would agree that the problem is most likely not a thermal one?
 
Oh, nice case. Really nice case. Really nice entire configuration actually. Somebody invested a couple of house payments on that rig. Heh.

You don't want any automatic voltage control going on. Use the following guide for overclocking. Start where you are and bump the voltage a notch, then multiplier until unstable, then multiplier, etc. as outlined here:


http://www.overclock.net/t/91/ultimate-overclocking-guide


You can worry about fine tuning things like load line calibration once things are relatively stable. I would turn off Turbo boost in the bios though. It's undesirable to have turbo boost enabled with a manual overclock. You want control of the voltage and multiplier, not the system. I'd probably turn off speedstep as well. System reductions in voltage might make an otherwise stable overclock exhibit issues.


Use Prime95 version 26.6 set to Small FFTs, and ONLY Prime95 version 26.6 (no other version), for testing stability and thermal limits. Thermal limits stabilize after about 15 minutes of Small FFTs. If the system is stable, you can move another .5 on the multiplier. Make sure your fan profiles for the case fans and radiator fans are where they need to be.


Prime 26.6: http://windows-downloads-center.blogspot.com/2011/04/prime95-266.html (Small FFT only)

Regardless of architecture, P95 v26.6 works equally well across all platforms. Steady-state is the key: how can anyone extrapolate accurate core temperatures from workloads that fluctuate like a bad day on the stock market?

I'm aware of 5 utilities with steady-state workloads. In order of load level they are:

(1) P95 v26.6 - Small FFT's
(2) HeavyLoad - Stress CPU
(3) FurMark - CPU Burner
(4) Intel Processor Diagnostic Tool - CPU Load
(5) AIDA64 - Tools - System Stability Test - Stress CPU

AIDA64's Stress CPU fails to load any overclocked / overvolted CPU to anywhere near TDP, and is therefore useless, except for giving naive users a false sense of security because their temps are so low.

HeavyLoad is the closest alternative. Temps and watts are within 3% of Small FFT's.
Since the radiator is taking up some of your case fan locations, are the remaining case fan locations all populated? If not, I'd recommend doing so. Where do you have the radiator mounted and in what configuration? Your temps are ok now, but that might change if you increase the level of the overclock so it could become rather important.
 



Thanks for overclocking guide! Will look at that. Re case fans yes I have all the original fans installed that came with the case, but there are still some spare slots. But that would be part of the solution if my problem were excessive heat, though I don't think it is. It's more about the correct voltage settings (I think).

Re my original question, the advice seems to be go for 1440p for 980 SLI. Any particular monitor recommendation? I am thinking about this one. https://www.overclockers.co.uk/showproduct.php?prodid=MO-077-AC&groupid=17&catid=948
 
I meant as a pre-emptive solution for heat, for when you go to overclock the chip further. Once you start having to add voltage, your thermal situation is going to change rapidly, and the preinstalled case fans may not be enough to compensate, especially since some of the locations are being used by the radiator and its fans, which also introduce additional heat into the case if you have them in an intake configuration.

Yep, that's a very nice monitor, but it's also very expensive, and it's 144Hz. I really dislike 144Hz displays. They often require dropping settings to maintain the frame rates necessary to avoid tearing and other issues. I'd probably recommend just a 1440p 60Hz unit, or maybe a 120Hz unit. Honestly, I think 144Hz is an unnecessary factor. I know it eliminates SOME ghosting and other issues, but it also introduces issues not present at lower refresh rates. It also has a 4ms response time, which is a little more than we'd like to see on a "gaming" monitor, but still certainly low enough. That's your call though; that monitor is a good choice if you don't mind the 144Hz. One of these would be good as well:

BenQ 60Hz- http://pcpartpicker.com/part/benq-monitor-gw2765ht

Acer 144hz- http://pcpartpicker.com/part/acer-monitor-umhg0ee001
 


Ah yes I see, well I'll see how the temps go after adjusting the voltage. Have to say I do love this Corsair cooler, seems very efficient and looks great in the case.

Re the monitor, my reasoning is this. I use my computer for much else besides games: work, video editing (hobbyist), photography etc. For this reason I prefer IPS despite the higher cost and the compromise on input latency. True, that Acer monitor is pricey, but it seems to tick most of the boxes: acceptable (though not stellar) latency, IPS panel, 1440p. It also has gsync, which I find quite attractive in theory despite having no practical experience of it. With regard to the 144Hz refresh rate, I had assumed that number is only relevant as a maximum, because the actual rate should sync to the GPU's output - after all, that is the main feature of gsync.

Many thanks for your excellent advice all round.
 
Yeah, that's true, but there are a number of issues people have with Gsync itself: screen flickering and, in some cases, major issues when using SLI configurations


http://www.overclock.net/t/1548582/nvidia-g-sync-and-sli-issues-solved

just as one example, plus micro-stutters and some other minor annoyances as well. If you google "problems with Gsync", there are pages and pages of examples. I can't say whether they've been rectified by newer drivers, but I've seen pretty recent examples of issues with Gsync monitors, so I don't know.
 
If you have 980s in SLI, then there's no question: get a 4K monitor. Having to always max out every detail at a given resolution is a bit of a sickness, because in my opinion ~high settings on a 4K monitor (nearly) always look better than max settings at 1440p. The same goes for high at 1440p vs. ultra at 1080p.
If you have invested over 1000 dollars in GPUs and get a mere 1440p monitor, then I think you're going to be really sorry you didn't get a 4K, as that hardware is really overkill for 1440p (unless you want to hit 120-144 fps). And 4GB is still enough for 4K; okay, in some memory-intensive games you have to put textures on high instead of ultra, for example, but like I said earlier, it still looks better than ultra on a lower-resolution screen.
 
Yes, you do want to hit those kinds of FPS, which isn't going to happen in games like Witcher 3 or GTA V, or probably ANY of the upcoming releases like Doom and a few others that are going to have the same kinds of resource demands as those titles. Everybody is entitled to their own opinion, so long as they don't confuse it with fact and acknowledge that not everybody is going to agree with them, which in this case, I don't. I think 1440p is the right resolution, considering a Titan X can't even max out Witcher 3 at 1080p and maintain 60FPS.


http://wccftech.com/witcher-3-initial-benchmarks/





 
I think those Witcher fps issues have been addressed with driver updates; I've seen recent benchmarks with the Titan X and 980 Ti hitting around 90fps average at 1080p, with ultra settings and HairWorks off.
But yeah, it is indeed just my opinion, although I do believe that 1440p with everything maxed out looks beautiful, a world of difference from 1080p. 4K and 1440p aren't that big of a difference, I think, but my preference is still 4K, even if it's only "high" settings (although with two 980s you can push the settings much closer to ultra most of the time, especially if you leave AA off, which you don't really need at 4K - maybe 2x in some games...).
 
Honestly, either way is fine. For me, if it was MY money, and considering how sharply the resource demands of recent titles have increased, I'd want to be sure I was going to get the most out of my hardware for what's yet to be released as well as what's out now. Since most of these cards were released AFTER the majority of games that are out, you can bet that unreleased titles are going to be at least somewhat more demanding of your hardware, and likely a lot more demanding, for top-notch settings anyhow.

My next monitor will likely be 1440p as I'm sitting on three 1080p monitors right now and haven't really found a way to convince myself that I can or need to spend the money on a 1440p or 4k monitor when I know I'll have to upgrade all three at some point. Maybe when prices come down somewhat. Probably during the holiday sales towards the end of the year. Then again, maybe not.
 
Hello people, just thought you would be interested in a quick update, since you provided me with such useful advice. Maybe my experience will be helpful to some of you, I hope so.

I was going to buy a 1440p gsync IPS monitor. However, I had a total change of heart and - while I cannot really entirely justify why - ended up getting this one. http://www.scan.co.uk/products/28-acer-predator-xb280hk-4k2k-g-sync-led-gaming-display-displayport-3840x2160-300cd-m2-100m1-1ms-usb


The original rationale for the 1440p solution was that 4K would be just a touch too much for my demanding, stutter intolerant and generally OCD eyes, so the plan was to go for a half-way house.

Well the more I thought about it, the more I reasoned that with judicious use of settings, it ought to be possible to squeeze more visual fidelity out of 4K than 1440p. 980 SLI, I figured, would not be GPU constrained but VRAM constrained. That should allow maxing all visual effects but turning down the odd memory chewing setting. Fortunately one of the worst culprits is AA, which can now be turned off completely. The higher pixel density means the jaggies have rather lost their jagged edge.

Result?

O.M.G.

Not since my wedding day have I clapped eyes on something so utterly beautiful. Frame rates are smooth as a baby's bottom and that clanging sound you can hear is my jaw on the floor. I assume some of the smoothness is due to gsync as my 1080p monitor was showing more stutter than this! I do have to turn down the odd setting to optimise the experience but the eye candy is a good 99% max!

The GeForce Experience "optimal settings" mostly does a good job. There is the odd clanger (like Witcher 3 being set to windowed rather than full screen). And it doesn't always get the balance between GPU and VRAM constraints right in my set up. I prefer to think of it as the starting point from which further tweaks can (and should) be made.

So if you are lucky enough to have a 980 SLI system, then that 4K monitor could not come more highly recommended.

Some games, like Sleeping Dogs and Witcher 3 really do have to be seen to be believed. And with all settings within a whisker of their maximum, you will not detect even a momentary judder, not even in the most complex scenes.
 
Nice. Glad to hear it worked out well for you, better in fact than was expected. It's a good bit of information to have as well, since many lab results have been entirely in contrast to yours, which is almost no surprise at all. Real-world situations with judicious use of settings often have a much different result than vanilla-flavoured settings on a test rig. Almost makes me want to pull the trigger on a second card and a higher-resolution display myself. Thanks for the feedback. I guess in this case the answer was: yes, 980 SLI IS enough.
 


Thank you. And yes it's all about understanding your system and its constraints. The good experience continued last night when I tried Crysis 3, normally a very demanding game. The Nvidia recommended settings included turning down object detail and having 2xAA. I just reversed these, turning off AA and maxing object detail. I also turned shadows down to medium but everything else maxed. The result truly is breathtaking. Massive, open expanses showing stunning detail and precisely zero judder. Like I said, I didn't even get this smooth an experience on my 1080p monitor so I do think gsync is playing a part in the experience.

I agree with you re testing. A standard test will typically involve 4x or 8x MSAA with every dial maxed. This is completely without regard to whether the settings make sense for the system.

Question answered: 980 SLI and 4k seem to get along just fine. :)
 