Nvidia's SLI Technology In 2015: What You Need To Know

Status
Not open for further replies.

PaulBags

Distinguished
Mar 14, 2015
199
0
18,680
Nice article. Looking forward to comparing it to a DX12+SLI article when that happens, to see how much DX12 changes the SLI game, since CPUs will be less likely to bottleneck.

Do you think we'll see 1080p monitors at 200 Hz+ in the future? Would it even make a difference to the human eye?
 

none12345

Distinguished
Apr 27, 2013
431
2
18,785
They really need to redesign the way multi-GPU works. Something is really wrong when two or more GPUs don't work half the time, or have higher latency than one GPU. The fact that this has persisted for something like 15 years now is an utter shame. SLI profiles and all the bugs and nonsense that come with SLI need to be fixed. A game shouldn't even be able to tell how many GPUs there are, and it certainly shouldn't be buggy on two GPUs but not on one.

I also believe that alternating frames is utter crap. The fact that it has become the go-to standard is a travesty. I don't care for inflated FPS at the expense of consistent frames or increased latency. If one card produces 60 FPS in a game, I would much rather have two cards produce 90 FPS by working on the same frame at the same time than have two cards produce 120 FPS on alternating frames.

The only time two GPUs should not be working on the same frame is 3D or VR, where you need two angles of the same scene generated each frame. Then, sure, have the cards work separately on their own perspective of the scene.
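The tradeoff being described can be sketched with a toy model. This is just an illustration of the idea, not how any driver actually works: it assumes one GPU renders a frame in 16.7 ms, and that split-frame rendering loses some efficiency (the 0.75 figure is an assumption) to load imbalance and compositing.

```python
# Toy comparison of Alternate Frame Rendering (AFR) vs Split Frame
# Rendering (SFR) on two GPUs. Numbers are illustrative only.

ONE_GPU_FRAME_MS = 16.7  # one GPU's render time for a frame (~60 FPS)

def afr_two_gpus(frame_ms=ONE_GPU_FRAME_MS):
    """Each GPU renders its own frame: throughput doubles, but any
    individual frame still takes one GPU's full render time."""
    throughput_fps = 2 * (1000 / frame_ms)
    latency_ms = frame_ms
    return throughput_fps, latency_ms

def sfr_two_gpus(frame_ms=ONE_GPU_FRAME_MS, efficiency=0.75):
    """Both GPUs split one frame: per-frame latency drops, but
    imperfect load balancing and compositing cost efficiency."""
    latency_ms = frame_ms / (2 * efficiency)
    throughput_fps = 1000 / latency_ms
    return throughput_fps, latency_ms

afr_fps, afr_lat = afr_two_gpus()
sfr_fps, sfr_lat = sfr_two_gpus()
print(f"AFR: {afr_fps:.0f} fps, {afr_lat:.1f} ms per frame")
print(f"SFR: {sfr_fps:.0f} fps, {sfr_lat:.1f} ms per frame")
```

With these assumed numbers you get roughly the 120 FPS (AFR) vs 90 FPS (SFR) split described above: AFR wins the FPS counter, SFR wins per-frame latency.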
 

PaulBags

Distinguished
Mar 14, 2015
199
0
18,680
Considering DX12, with optimized command queues and proper CPU scaling, is still to come later in the year, I'd hate to imagine how long it will be until drivers are universal and unambiguous for SLI.
 
The article is very nice.
However, if I need to buy two 980s to run a VR set or a 4K display, I'll just wait until the prices are more mainstream.
I mean, in order to have a good SLI 980 rig you need a lot of spare cash, not to mention buying a 4K display (those that are actually any good cost a fortune), a CPU that won't bottleneck the GPUs, etc.

Too rich for my blood. I'd rather stay on 1080p until those technologies are not only proven to be the next standard, but content is widely available.

For me, the right moment to upgrade my Q6600 will be after DX12 comes out, so I can see real performance tests on new platforms.
 

Luay

Distinguished
Sep 30, 2010
59
0
18,630
I thought 2K (1440p) resolutions were enough to take the load off an i5 and put it onto two high-end Maxwell cards in SLI, and now you show that even an i7 is bottlenecking at that resolution?

I had my eye on two Acer monitors, the curved 34" 21:9 75 Hz IPS and the 27" 144 Hz IPS, either one really for a future build, but this piece of info tells me my i5 will be a problem.

Could it be that Intel CPUs have stagnated in performance compared to GPUs, due to lack of competition?

Is there a way around this bottleneck at 1440p? Overclocking, upgrading to Haswell-E, or waiting for Skylake?
 
Really wish they had made 4GB 780 Tis; the overclock on those 980s is 370 MHz higher on the core and 337 MHz higher on the memory than my 780 Tis, and they barely beat them in Firestrike by a measly 888 points. While SLI is great 99% of the time, there are still AAA games out there that don't work with it, or worse, run better with SLI disabled, such as Watch Dogs and Warband. I would definitely be interested in a dual-GPU Titan X card, or even a 980 (less interested in the latter), because right now my Nvidia options for SLI on an mATX single-PCIe-slot board are limited to the scarce and overpriced Titan Z or the underwhelming Mars 760X2.
 

baracubra

Distinguished
Jan 24, 2008
312
0
18,790
I feel like it would be beneficial to clarify the statement that "you really need two *identical* cards to run in SLI."

While true from a certain perspective, it should be clarified that you need two cards of the same model designation, as in two 980s or two 970s. I fear that new system builders will hold off from going SLI because they can't find the same *brand* of card, or think they can't mix an overclocked 970 with a stock 970 (you can, but they will perform at the lower card's level).

PS. I run two 670s just fine (one stock EVGA and one OC Zotac).
 

jtd871

Distinguished
Jan 26, 2012
114
0
18,680
I'd have appreciated a bit of the in-depth "how" rather than just the "what". For example, some discussion of multi-GPU needing a separate physical bridge and/or communicating via the PCIe lanes, and the limitations of each method (theoretical and practical bandwidth, and how likely each channel is to be saturated depending on resolution or workload). I know it would take some effort, but has anybody ever hacked an SLI bridge to observe the actual traffic load (similar to your custom PCIe riser used to measure power)? It's flattering that you assume knowledge on the part of your audience, but some basic information would have made this piece more well-rounded and foundational for your upcoming comparison with AMD's performance and implementation.
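On the bandwidth question, a rough back-of-envelope is possible even without tapping the bridge. Under AFR the secondary card has to ship its finished frames to the primary for display. The sketch below assumes an uncompressed 32-bit framebuffer and the commonly cited ~1 GB/s figure for the classic SLI bridge; real traffic differs (overheads, possible compression, PCIe fallback), so treat these as order-of-magnitude estimates only.

```python
# Back-of-envelope: frame-transfer traffic under AFR, where the
# secondary GPU renders (and must transfer) half of all frames.
BYTES_PER_PIXEL = 4   # 32-bit RGBA framebuffer (assumption)
BRIDGE_GBPS = 1.0     # commonly cited classic SLI bridge bandwidth

def afr_traffic_gbps(width, height, target_fps):
    frames_over_bridge = target_fps / 2   # secondary's share of frames
    return width * height * BYTES_PER_PIXEL * frames_over_bridge / 1e9

for name, w, h, fps in [("1080p @ 60",  1920, 1080, 60),
                        ("1440p @ 144", 2560, 1440, 144),
                        ("2160p @ 60",  3840, 2160, 60)]:
    gbps = afr_traffic_gbps(w, h, fps)
    print(f"{name}: {gbps:.2f} GB/s ({gbps / BRIDGE_GBPS:.0%} of bridge)")
```

Even this crude estimate shows why 1080p/60 is comfortable (~0.25 GB/s) while 1440p/144 or 4K would saturate a ~1 GB/s link, which is presumably where resolution starts to matter for the bridge-vs-PCIe question.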
 

mechan

Distinguished
Jan 25, 2009
90
0
18,640
I feel like it would be beneficial to clarify the statement that "you really need two *identical* cards to run in SLI."

While true from a certain perspective, it should be clarified that you need two cards of the same model designation, as in two 980s or two 970s. I fear that new system builders will hold off from going SLI because they can't find the same *brand* of card, or think they can't mix an overclocked 970 with a stock 970 (you can, but they will perform at the lower card's level).

PS. I run two 670s just fine (one stock EVGA and one OC Zotac).

What you say -was- true with 6xx-class cards. With 9xx-class cards, the requirements for the cards to be identical have become much more stringent!
 

mechan

Distinguished
Jan 25, 2009
90
0
18,640


Keep in mind we're talking about ~120-150 FPS bottlenecks here, where most people would consider anything 30 FPS and above "playable" and the vast majority of displays top out at 60 Hz refresh rates. In my personal experience, the "best" mode of the Asus ROG Swift I use is the ULMB mode, and that is capped at 120 Hz (not 144 Hz). The i7 does fine in my experience overall.

The i7-4770K in the test was already running at a fair overclock of 4 GHz. Haswell-E will probably do a bit better, but I don't have such a platform on hand, so I can't say how much better.

Filippo
 

mechan

Distinguished
Jan 25, 2009
90
0
18,640


Don't think we'll see 200 Hz+ displays anytime soon. Most traditional video content is still recorded at 24/25/30 FPS. "High Frame Rate" content is 60 Hz (and got poor reviews in theaters). Anything above 60 FPS is solely the realm of gaming, and the quality/framerate tradeoff drops off quickly above 60 FPS.

As to whether it would make a difference... probably somewhat: video would appear even more "smooth" than it does at 120 Hz, but not enormously so.

Filippo
 
The only game in which I noticed microstuttering at 4K is Watch Dogs, and that is due to really bad game code.

Also, dropping your AA and texture settings removes all of it. The problem is the low amount of video memory at 2160p.
 
Here's what these cats had to say:

''Since this is our first multi-GPU review after new-generation consoles settled into the market, we noticed a worrying trend in game engines (particularly with cross-platform games for new-generation consoles) where deferred rendering doesn't take advantage of multi-GPU setups; in games such as Dead Rising, Assassin's Creed: Unity, or Ryse. A tell-tale sign of such an engine would be the lack of MSAA support. That could spell trouble for not just a SLI setup with affordable graphics cards as it could also hit NVIDIA's enthusiast-grade multi-GPU market hard, including its flagship dual-GPU solutions.''

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_960_SLI/25.html
 

Wes006

Honorable
Jul 9, 2014
92
0
10,660
They really need to redesign the way multi-GPU works. Something is really wrong when two or more GPUs don't work half the time, or have higher latency than one GPU. The fact that this has persisted for something like 15 years now is an utter shame. SLI profiles and all the bugs and nonsense that come with SLI need to be fixed. A game shouldn't even be able to tell how many GPUs there are, and it certainly shouldn't be buggy on two GPUs but not on one.

I also believe that alternating frames is utter crap. The fact that it has become the go-to standard is a travesty. I don't care for inflated FPS at the expense of consistent frames or increased latency. If one card produces 60 FPS in a game, I would much rather have two cards produce 90 FPS by working on the same frame at the same time than have two cards produce 120 FPS on alternating frames.

The only time two GPUs should not be working on the same frame is 3D or VR, where you need two angles of the same scene generated each frame. Then, sure, have the cards work separately on their own perspective of the scene.
 

endeavour37a

Honorable
Quote:
"In part two of this series, we'll look at the red team's rival multi-GPU technology called CrossFire (we've already lined up a couple of AMD's next-gen cards to make this one happen for you)."

Does this mean R9 3xx cards?
 

Wes006

Honorable
Jul 9, 2014
92
0
10,660
They really need to redesign the way multi-GPU works. Something is really wrong when two or more GPUs don't work half the time, or have higher latency than one GPU. The fact that this has persisted for something like 15 years now is an utter shame. SLI profiles and all the bugs and nonsense that come with SLI need to be fixed. A game shouldn't even be able to tell how many GPUs there are, and it certainly shouldn't be buggy on two GPUs but not on one.

I also believe that alternating frames is utter crap. The fact that it has become the go-to standard is a travesty. I don't care for inflated FPS at the expense of consistent frames or increased latency. If one card produces 60 FPS in a game, I would much rather have two cards produce 90 FPS by working on the same frame at the same time than have two cards produce 120 FPS on alternating frames.

The only time two GPUs should not be working on the same frame is 3D or VR, where you need two angles of the same scene generated each frame. Then, sure, have the cards work separately on their own perspective of the scene.

If you drive down the freeway, how many trucks do you see pulling the same load, the same trailer? I think you would be hard pressed to ever see that in your lifetime. I'm not sure you understand the technical nature of the problem. I absolutely do not intend any ill will, and I think you have the right to your opinion that the solution should be a much simpler process. There will still be improvements in the way information is fed to the GPUs. Anyway, I have friends who share your opinion and have asked me the same question, and I have to tell them in layman's terms why there are still issues with SLI/CrossFire. Again, I'm not saying this will always stay an issue, but until we have better intercommunication across all the interconnects in a computer, and changes in the way things are programmed, this will be an uphill battle.
 

mechan

Distinguished
Jan 25, 2009
90
0
18,640
Quote:
"In part two of this series, we'll look at the red team's rival multi-GPU technology called CrossFire (we've already lined up a couple of AMD's next-gen cards to make this one happen for you)."

Does this mean R9 3xx cards?

Yes.
 

baracubra

Distinguished
Jan 24, 2008
312
0
18,790
I feel like it would be beneficial to clarify the statement that "you really need two *identical* cards to run in SLI."

While true from a certain perspective, it should be clarified that you need two cards of the same model designation, as in two 980s or two 970s. I fear that new system builders will hold off from going SLI because they can't find the same *brand* of card, or think they can't mix an overclocked 970 with a stock 970 (you can, but they will perform at the lower card's level).

PS. I run two 670s just fine (one stock EVGA and one OC Zotac).

What you say -was- true with 6xx-class cards. With 9xx-class cards, the requirements for the cards to be identical have become much more stringent!

Not true. According to Nvidia's FAQ page:

"Can I mix and match graphics cards from different manufacturers?
Using 180 or later graphics drivers, NVIDIA graphics cards from different manufacturers can be used together in an SLI configuration. For example, a GeForce XXXGT from manufacturer ABC can be matched with a GeForce XXXGT from manufacturer XYZ.

Can I mix and match graphics cards if one of them is overclocked by the manufacturer?
Yes. A GeForce XXXX GTX that is overclocked can be mixed with a standard clocked GeForce XXXX GTX."

http://www.geforce.com/hardware/technology/sli/faq#c17
 

none12345

Distinguished
Apr 27, 2013
431
2
18,785
"If you drive down the freeway, how many trucks do you see pulling the same load, the same trailer? I think you would be hard pressed to ever see that in your lifetime. I'm not sure you understand the technical nature of the problem. I absolutely do not intend any ill will, and I think you have the right to your opinion that the solution should be a much simpler process. There will still be improvements in the way information is fed to the GPUs. Anyway, I have friends who share your opinion and have asked me the same question, and I have to tell them in layman's terms why there are still issues with SLI/CrossFire. Again, I'm not saying this will always stay an issue, but until we have better intercommunication across all the interconnects in a computer, and changes in the way things are programmed, this will be an uphill battle."

While I have not coded any graphics engines, I've been writing code for 25 years, though not as a primary profession. I've written well-balanced multithreaded code, and I've also tried to shoehorn into a multithreaded situation some things that aren't ideal for it. A lot of the time you need to completely rewrite code to get it to an acceptable level of parallelism. I do understand many of the technical problems, though certainly not all. I understand that each pixel in a frame has a different load to calculate, and that those loads can vary drastically from pixel to pixel. I also understand that certain parts of the graphics pipeline cannot be done in parallel, for instance compositing and some post-processing effects.

However, a large chunk of the compute load of a graphics frame is an inherently parallel problem. This is why graphics cards run thousands of compute engines in parallel. If you duplicate the memory on each card, which they do right now, then you are left with three real problems. First, splitting the load evenly; in the simplest case you can just checkerboard, which should be good enough most of the time. Second, compositing the two data streams into one, which is a distributed-load problem they already handle across thousands of compute cores. And third, post-processing effects that need the whole image to compute.
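The checkerboarding idea is easy to sketch. This is just an illustration of the tile assignment, not a renderer: alternating tiles between the two cards means an expensive region of the screen (say, a complex character model) gets shared between them instead of landing entirely on one.

```python
# Checkerboard assignment of screen tiles to two GPUs.

def checkerboard_owner(tile_x, tile_y):
    """GPU 0 gets the 'black' squares, GPU 1 the 'white' ones."""
    return (tile_x + tile_y) % 2

def split_screen(tiles_x, tiles_y):
    """Partition a tiles_x-by-tiles_y grid between the two GPUs."""
    gpu_tiles = {0: [], 1: []}
    for ty in range(tiles_y):
        for tx in range(tiles_x):
            gpu_tiles[checkerboard_owner(tx, ty)].append((tx, ty))
    return gpu_tiles

# An 8x4 tile grid: each GPU ends up with exactly half the tiles,
# and no tile shares an edge with another tile on the same GPU.
tiles = split_screen(8, 4)
print(len(tiles[0]), len(tiles[1]))
```

The remaining hard parts are exactly the ones listed above: merging the two halves into one frame and running whole-image post-processing afterwards.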

Even parts of the pipeline that are not easily parallelized can still be sped up across two cards by alternating which card does the processing. If you distribute the thermal load in this way, you can turbo the cores more.

The thing that really gets me about SLI/CrossFire, though, is the game profiles and the bugs. Either a feature should work with multi-GPU or it shouldn't; it should work for all games that request the feature, or for none. What's not fine is when one feature works for one game but not another because of a missing SLI profile... wtf? Additional microstutter when alternating frames, again, wtf? Why do we sometimes see 10-15 extra milliseconds of frame-pacing lag when adding a second card, when all they are doing is alternating frames? That's just nuts.

And yeah, nothing is simple in the real world. It's certainly not as cut and dried as just saying things shouldn't be the way they are. But it really seems that the entire history of multi-GPU has been shameful, to say the least. Hopefully DX12 improves things, since articles on this site have alluded to it being much more multi-GPU friendly. But we'll have to wait and see how that turns out.
 

op8

Distinguished
Jul 3, 2011
55
0
18,630
Can anyone tell me what kind of bottleneck I would get from my CPU (an i5-4670K) if I were to get a second 780 Ti? And is my MSI reference card compatible with EVGA reference cards (the normal ones, not the SC or above)?
Thanks.
 