Nvidia's SLI Technology In 2015: What You Need To Know

Status
Not open for further replies.

endeavour37a

Honorable


Thanks for the heads-up on this; it will be great to get a look at the new AMD cards' performance. Perhaps you could say which ones you will be testing? Considering that you used top-shelf NV cards like the 980s, would they maybe be the R9 380X cards?

I understand they are reworked Hawaii cores from the 290/290X cards, reportedly renamed Grenada. If so, then your test results would be pretty relevant to everyone running 290/290X cards now.
 

Evarin

Distinguished
Sep 7, 2011
Great article. I'm very much looking to center my next build around VR, so addressing SLI from that perspective was quite useful.

For now, the plan is likely to buy 1 980 ti and possibly get a second if it proves beneficial to VR gaming.
 

baracubra

Distinguished
Jan 24, 2008


I'll leave the bottleneck question to someone with a bit more information on the matter. Previously such cards have been compatible, but as of late I've heard that issues do sometimes crop up.
 
Excellent article! I was one of those early SLI adopters with a pair of Voodoo2 cards. However, when the TNT2 came out, I switched back to Nvidia, being an owner of the original TNT and Riva 128 cards, and I have gone between high-end single GPUs and mid-range SLI setups ever since.

I'm running a pair of EVGA Superclocked 970s with an i5 4690K running at 4.7GHz. They do fine for a 1440p monitor overclocked to 90Hz. Never had any micro-stuttering issues and most games are locked at V-sync. However, I would much prefer to have a single GPU solution again even if it costs more for the same performance. The amount of heat those cards pump out the back of my case is the most I've ever experienced, and that's even with them being throttled back by a 90fps cap. Unfortunately, Nvidia has pushed back the 980 Ti release into 2016 and these 970s will be worth half what they are now next year on the used market. No plans to move to 4K any time soon for gaming having just moved up to 1440p less than two years ago.
 
The motherboard details in section 2 aren't complete. First, not all LGA1150 boards with two x16 length slots can handle PCIe lane splitting to accommodate SLI or CFX. Only Z chipsets can split lanes to do this.

Second, regardless of whether it's a Z chipset, many boards have a second x16-length slot that's only wired for four lanes. In order to run SLI on LGA1150, you need a Z chipset with at least two x8-wired slots. Generally, if your motherboard doesn't come with an SLI bridge in the box, it doesn't support SLI.

Third, you didn't mention anything about AMD boards here. The slot and lane-wiring requirements are the same as LGA1150, but you need a 990 chipset for AM3+. The only FM2+ boards with the necessary lane support use the A88X chipset, but they don't seem to "officially" support SLI.
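For what it's worth, the rules in this post can be boiled down to a quick sanity check. This is just an illustrative Python sketch — the function and parameter names are made up, not any vendor's tool — encoding the stated LGA1150 requirements (Z-series chipset, at least two slots wired for x8 or better):

```python
# Illustrative sketch of the SLI rules described above for LGA1150 boards.
# Function and parameter names are hypothetical, not any real vendor API.
def supports_sli(chipset, slot_lanes):
    """True if the board meets the stated rules: a Z-series chipset
    and at least two slots electrically wired for 8+ PCIe lanes."""
    if not chipset.upper().startswith("Z"):  # only Z chipsets can split CPU lanes
        return False
    wide_slots = [lanes for lanes in slot_lanes if lanes >= 8]
    return len(wide_slots) >= 2

print(supports_sli("Z97", [16, 8]))  # True: two x8-or-better slots
print(supports_sli("H97", [16, 8]))  # False: non-Z chipset can't split lanes
print(supports_sli("Z97", [16, 4]))  # False: second slot only wired for x4
```

As the post says, the practical shortcut is simpler still: no SLI bridge in the box usually means no SLI support.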
 

mechan

Distinguished
Jan 25, 2009


Gigabyte has agreed to loan us two of the new AMD flagship cards upon launch. That's all I can say.

Filippo
 

mechan

Distinguished
Jan 25, 2009


The 180 driver release is a 2008 branch (i.e., seven years old). I believe that FAQ information is also outdated. We've had consistent issues in our labs getting 9x0s from different manufacturers working with each other.

Filippo
 

Gunbuster

Distinguished
Dec 31, 2007
"A motherboard with at least two free PCIe x16 slots, operating in at least in x8 mode (Nvidia does not support SLI on x4 links). Pretty much all LGA 2011, LGA 2011-v3 and LGA 1150 motherboards satisfy this requirement."

Since when? As far as I know, the motherboard MUST be SLI-certified by Nvidia, i.e., the Nvidia driver looks for a BIOS flag that says SLI is OK. No Nvidia SLI logo, no SLI for you! http://www.geforce.com/hardware/technology/sli/motherboards
 

TallestJon96

Distinguished
Dec 27, 2014
Great article and good analysis, but I have to disagree with your conclusion. Some games don't work with SLI for months, others have glitches, and scaling is usually between 60% and 80%. The only time I would consider SLI is for 4K, but when games don't launch with SLI support, you cannot reliably game at 4K.
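A quick back-of-the-envelope illustration of what that 60-80% scaling range means in practice (plain arithmetic on made-up frame rates, not benchmark data):

```python
# Effective frame rate for two cards, given the scaling the second card adds.
def sli_fps(single_card_fps, scaling):
    """E.g. scaling=0.7 means the second card adds 70% of one card's output."""
    return single_card_fps * (1 + scaling)

for scaling in (0.6, 0.8):
    print(f"{scaling:.0%} scaling: 60 fps -> {sli_fps(60, scaling):.0f} fps")
```

In other words, a second card never doubles your frame rate; at the low end of that range you pay 100% more for 60% more performance.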

I know Tom's had an article about AMD and Nvidia cards being compatible in tandem, and by extension different cards from the same manufacturer (say, a GTX 970 and a 680) would be compatible. This would make SLI much more viable, as you could keep one old graphics card, buy a new one, and always have your last graphics card and your newest one working together.
 

endeavour37a

Honorable
Thank you Filippo (mechan) for the info; from what I hear, they will be released sometime in June.

Quote:
"I know toms had an article about AMD and NVIDIA being SLI compatible, and by extension different cards for the same manufacturer (say a gtx 970 and 680) would be compatible."

TallestJon96, you cannot SLI a 970 and a 680 together; one is a GM204-200 and the other a GK104-400. You can CrossFire some AMD cards with different cores (I saw the list from AMD a while ago), but not NV cores.

So NV pushed back the 980 Ti to 2016, when they were going to get it out by the end of summer, dang it! I thought they were just holding it back waiting for AMD's new cards; it's not like they have to engineer it, as it's just a Titan X with 6GB instead of 12GB.

 

gallovfc

Distinguished
Oct 5, 2011
I have no problem using DSR while playing games like GTA 5, WoW, and Left 4 Dead 2. The only problem is that if you ALT+TAB, some games may come back at 1080p when you return to them.
 

BradleyJames

Reputable
Feb 18, 2014
I too really wish they had released a 4GB 780 Ti. I'm running two in SLI and they are great cards. I was not impressed by the mere 5% (if that) performance increase of the GTX 980.
 

Eggz

Distinguished


Yeah, I have that card too. It's got more horsepower than the VRAM can handle. Sad :-/
 
Thanks for the article.
I'll note, as others have: planning for, or counting on, a multi-GPU rig is not always in your best interest. By the time you are ready for that second card, most likely there will be a single card on the market that comes close to (or exceeds) the performance of your multi-GPU setup.

And while a comparison with Crossfire scaling will be greatly anticipated by many - especially with the new Radeons - the 'scaling comparison' seems a bit premature, as the Catalyst drivers are likely to be less than 'mature' (understanding that the Volcanic Islands refresh is quite an unknown commodity at this moment).

Years ago, THG did a comparison of 'early' Crossfire driver performance on new cards versus subsequent updates, and IIRC there was quite a nice boost.

Including 'mature' R9 290/X Crossfire scaling alongside that of the new varieties of 'green' Radeons would seem appropriate (not knowing, of course, whether AMD will open the door to 200-series Hawaii running in tandem with a new 300-series Hawaii 'refresh').

 

legokangpalla

Honorable
Feb 28, 2013
I still prefer CrossFire, as it has the SFR supertiling option. AFR is unplayable due to microstuttering, except for a few optimized games.
 

mapesdhs

Distinguished
Filippo,

S'funny, you have the same EVGAs I bought, which I chose precisely because
of the 980 behaving so well for Elite Dangerous. Coupla things though...

"...or an even more expensive LGA 2011-v3 platform if you want to go
beyond two-way SLI."

Nah, just buy a used X79 board, much cheaper! I bought a P9X79 Deluxe for
75 UKP (for my brother), and a Rampage IV Extreme for 113 UKP, beat that. :D
Bought another R4E for 103 for benching with a 4820K.

A couple of points about Elite Dangerous: a single 980 does indeed give
very high frame rates, but what I wanted was to be able to crank up the
detail as much as possible while always staying above 60fps. A single
980 achieves this nicely, with just one of the detail settings being one
level below max. Also, ED performance reviews showed that having a
strong main CPU can definitely help when there's a lot of onscreen
action, hence why I went with a used X79 build (3930K @ 4.8) instead
of a new Z97, which was much cheaper too. And when I eventually get
a 4K TV, the mbd will handle another in SLI just fine with no worries
about PCIe bandwidth limitations, etc.

Interesting to compare your stock SLI score against mine with 2x 980 @ stock:

http://www.3dmark.com/compare/fs/4058701/fs/3974252

Alas it shows that even Firestrike is now susceptible to distorted overall scores
because of a strong CPU, even though the Graphics Test numbers are not that
different (fair bit better for the Combined Test though of course).

Ian.

PS. Firestrike fun with 3-way:

http://www.3dmark.com/fs/3971612

and about the quickest P55 system you can have, hehe... (runs surprisingly
well with an i7 870, GFX Test 2 is only 3% slower on the P55, while GFX Test 1
only loses about 10%):

http://www.3dmark.com/fs/4099529

 

mechan

Distinguished
Jan 25, 2009


I am a big fan of creative solutions myself - hard to recommend used boards to a general audience, though. That being said, I myself have an old Core i7-950 which works perfectly fine for pretty much anything I throw at it.



That's why I haven't used numerical scores in the review. CPU and combined scores of Firestrike have always been pretty much meaningless from a real-world-performance perspective, and especially so when looking at GPUs.

Filippo
 

mapesdhs

Distinguished


There's a lot of utility in used parts IMO, and of course I would always suggest
only buying from reputable sources, sellers that accept returns, etc. Btw, not
just used, but also new parts sold via normal auction or even at bargain BIN prices.
E.g., sorting out a budget system for a friend atm, I bought a new Intel 520 240GB
for 45 UKP, a new Antec 300 case for an astonishing 17 UKP (that includes shipping!
Seller accepted an offer of just 10), and a used Z68 mbd for 41 from a good refurb
seller (Novatech). A while ago I bagged a completely new ASUS M4E for only 80.

However, I understand what you mean, it's not for everyone, and by definition
not everyone could or would want to. Given the nature of the audience here
though, it's worth mentioning, especially since sometimes supplies of good stuff
in quantity can be surprising. ASUS in particular chucks out a lot of refurb boards
with warranties, or at least they do in the UK.




I'd like to get hold of a 950, a more applicable X58 example than the X5570
I have atm.




Indeed, but now it's getting kinda worse just like 3DMark06 eventually became.
I suppose they could change the weightings, but of course that would invalidate
older scores. Pairing GPUs with ever more powerful CPUs means the main GFX
tests become less relevant when wrapped up in the overall score.

Ian.

 

Deyadissa

Reputable
Apr 11, 2015
People are running into lots of problems with SLI on the 900 series when using two different brands of cards. I do not recommend getting two different brands of the same card; avoiding that would have saved me at least 10 hours of my life spent troubleshooting.
 

Eggz

Distinguished
FOLLOW UP QUESTION FOR TOM'S

RE: SLI Bridge Bandwidth - Will Tom's think of, execute, and publish results of tests showing whether the very low bandwidth of the SLI Bridge poses a bottleneck? I have a strong feeling it does. The connectors are one of the few pieces (if not the only piece) in high-performance systems that rely on technology more than a decade old, and perhaps it's nearly two decades old.



But look at how long Nvidia's been using this.



PCI-e first came out the same year (2004) that Nvidia revived SLI; with PCI-e 1.0, the max bandwidth of an x16 slot was only 4 GB/s. So the SLI bridge's 1 GB/s provided up to a 25% increase. Okay. But now PCI-e slots have quadruple that bandwidth, at roughly 16 GB/s for PCI-e 3.0 x16.
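To put those figures side by side (the per-direction numbers below are the commonly cited ones; 15.75 GB/s is the precise value behind the "roughly 16 GB/s" above):

```python
# Commonly cited per-direction bandwidth of an x16 slot, by PCIe generation (GB/s).
PCIE_X16_GBPS = {"1.0": 4.0, "2.0": 8.0, "3.0": 15.75}
SLI_BRIDGE_GBPS = 1.0  # the bridge figure cited in this thread

for gen, bw in PCIE_X16_GBPS.items():
    extra = SLI_BRIDGE_GBPS / bw
    print(f"PCIe {gen} x16: {bw:5.2f} GB/s; a 1 GB/s bridge adds up to {extra:.0%}")
```

By this arithmetic, the bridge's relative contribution has shrunk from 25% of slot bandwidth on PCIe 1.0 to about 6% on PCIe 3.0, which is exactly the point being made here.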


The cable itself may actually be much older than 11 years. Today's SLI bridge may in fact use the same cable configuration that 3Dfx used (and 3Dfx started SLI) back in 1998.



If current SLI bridges are just prettier versions of the original SLI cables, that would make the modern SLI bridge a 17-year-old interface on cards like the Titan X! After researching without finding definitive information, it certainly does appear that the 3Dfx SLI cable and the Nvidia SLI bridge could differ only in appearance.

Regardless of the age, however, the fact remains that modern SLI bridges are limited to a measly 1 GB/s. That was originally to spare PCI bandwidth, but we've moved on to PCI-e, which is much faster. We have plenty of good information showing that PCI-e 3.0 provides more than enough bandwidth for GPUs beyond x8.


If PCI-e 3.0 x8 is enough for even the fastest cards, then x16 (with twice the bandwidth) is WAY more than enough. So why, then, would it make sense to use an SLI bridge when the PCI-e bus has the potential to do a far better job?

Perhaps the Nvidia purchase agreement used to acquire 3Dfx has a condition that requires the connector technology to stick around. Maybe there's not enough of an SLI market for Nvidia to allocate R&D toward the issue. And maybe 1 GB/s is actually plenty of bandwidth given the current state of things. Who knows.

The only thing that's clear to me on this topic is that the SLI bridge doesn't seem to fit. If possible, it should be updated. And if that's not possible for some reason, there should be information telling people when the issue preventing innovation will expire.
 