Why is SLI tech so underutilised?

KnowasI

The reason I ask is that I have been researching SLI builds and have noticed quite a few posts suggesting that it is used by quite a small minority of gamers.
Hence the delay in SLI profiles and the apparent lack of support from Nvidia when there are SLI issues.

What I don't understand is why more gamers don't take this option. Speaking for myself, I have seen huge gains in performance since buying my second GTX 680.
In fact, in most games I am seeing almost a 100% FPS increase. This includes Tomb Raider, BF4, Assassin's Creed: Black Flag and The Witcher 2. Everything maxed out, including Ubersampling in The Witcher 2 at 1080p, and nothing drops below 60 FPS.
Looking at other cards' benchmarks, my setup is comparable to the 780 Ti, the recent GTX 970 OC and the GTX 980. All of those cards would have come at a huge cost compared to the $230 I just dropped to get my second card.

Now, I have heard concerns about microstuttering and missing SLI profiles for new games, but I have not experienced either so far.
I just bought Shadow of Mordor, which has no SLI profile as yet, but after two minutes on Google I found a fix using Nvidia Inspector: assign the game an existing profile (F.E.A.R. 3), and I have had zero issues 15 hours in. No microstutter and no game crashes. Again, running everything maxed and getting over 60 FPS, no sweat. I am also using the High texture pack, which according to the developers cannot be used by cards with less than 3 GB of VRAM. My cards only have 2.
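
For anyone curious what that Inspector trick actually does, here is a minimal sketch, assuming the public NVAPI driver-settings (DRS) interface, of copying the SLI compatibility bits from a donor profile into another game's profile - which is essentially what Nvidia Inspector is doing for you. The setting ID and the profile names are assumptions taken from how Inspector labels things, not verified values, so treat this as an illustration rather than a drop-in tool:

```cpp
// Sketch only: copy the "SLI compatibility bits" from a donor driver profile
// to a target game's profile via NVAPI DRS. Build against nvapi.h / nvapi64.lib.
#include <cstdio>
#include <nvapi.h>

// Assumed ID: the value Nvidia Inspector displays next to "SLI compatibility bits".
static const NvU32 SLI_COMPAT_BITS_ID = 0x1095DEF8;

// NVAPI strings are NvU16 arrays, so copy a wide literal in by hand.
static void ToNvUnicode(NvAPI_UnicodeString dst, const wchar_t* src)
{
    size_t i = 0;
    for (; src[i] != L'\0' && i < NVAPI_UNICODE_STRING_MAX - 1; ++i)
        dst[i] = static_cast<NvU16>(src[i]);
    dst[i] = 0;
}

int main()
{
    if (NvAPI_Initialize() != NVAPI_OK) return 1;

    NvDRSSessionHandle session = nullptr;
    NvAPI_DRS_CreateSession(&session);
    NvAPI_DRS_LoadSettings(session);   // pull the current driver profile database

    NvAPI_UnicodeString donorName, targetName;
    ToNvUnicode(donorName,  L"F.E.A.R. 3");                       // assumed profile name
    ToNvUnicode(targetName, L"Middle-earth: Shadow of Mordor");   // assumed profile name

    NvDRSProfileHandle donor = nullptr, target = nullptr;
    if (NvAPI_DRS_FindProfileByName(session, donorName,  &donor)  == NVAPI_OK &&
        NvAPI_DRS_FindProfileByName(session, targetName, &target) == NVAPI_OK)
    {
        NVDRS_SETTING bits = {};
        bits.version = NVDRS_SETTING_VER;
        if (NvAPI_DRS_GetSetting(session, donor, SLI_COMPAT_BITS_ID, &bits) == NVAPI_OK)
        {
            NVDRS_SETTING copy = {};
            copy.version         = NVDRS_SETTING_VER;
            copy.settingId       = SLI_COMPAT_BITS_ID;
            copy.settingType     = NVDRS_DWORD_TYPE;
            copy.u32CurrentValue = bits.u32CurrentValue;   // reuse the donor's bits
            if (NvAPI_DRS_SetSetting(session, target, &copy) == NVAPI_OK)
            {
                NvAPI_DRS_SaveSettings(session);           // persist to the driver store
                printf("Copied SLI bits 0x%08X to the target profile.\n",
                       (unsigned)copy.u32CurrentValue);
            }
        }
    }

    NvAPI_DRS_DestroySession(session);
    NvAPI_Unload();
    return 0;
}
```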

So this begs the question: when a gamer can get almost twice the performance at such a low cost, why don't they?

Have I just been lucky so far?
Have I perhaps been misinformed about the number of gamers using SLI?

Are there loads of folks out there doing this? If so, then surely there would be more pressure on Nvidia to release day-one SLI profiles and give more support when there are issues. I mean, if someone out there can figure out that a previous profile works with a new game like Shadow of Mordor less than two days after release, surely Nvidia can do the same.

Well that's my ten cents on this issue.

Let me know what you think.
 
Many gamers will not want to, or be able to, set up SLI profiles to work with unsupported games. When you buy something, you want it to work out of the box. When I had my GTX 460 SLI setup, it worked fine after I found out how to modify the profile to work with BO2. I did buy COD: Ghosts - what a POS that was with SLI - it worked better with one card. Not long after that I had a GPU go bad, and I now use a single GTX 670 in that computer - far less heat, PSU load and hassle. SLI, when working, is awesome. Many games nowadays are built for consoles, and SLI is an afterthought.
-Bruce
 
The big problem is that it becomes "get a profile or update drivers EVERY time a new game comes out", and 99% of people don't really want to deal with that. The only people who have the patience are the ones who want the absolute best all the time and are willing to jump through any hoops to get there. It doesn't help that SLI/CrossFire are giant black boxes that developers have limited access to, if any, so you're entirely reliant on Nvidia/AMD to make it work.

Hopefully, with the movement in DirectX 12, Mantle, and other emerging frameworks to get developers closer to the hardware, we'll start to see developers being allowed to work with how multi-card rendering is done and get more optimization, but until then, you're left with a technology that's only beneficial 50-75% of the time and involves buying another graphics card.
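
To make that concrete, here is a rough sketch of what "closer to the hardware" looks like in the DirectX 12 style of explicit multi-adapter: the engine enumerates every GPU itself and decides what each one renders, instead of the driver silently alternating frames behind a profile. This is an illustrative outline, not code from any shipped game:

```cpp
// Rough sketch: enumerate every hardware adapter via DXGI and create a D3D12
// device on each - the starting point for explicit multi-GPU rendering.
// Link against dxgi.lib and d3d12.lib; requires the Windows 10 SDK.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    std::vector<ComPtr<ID3D12Device>> gpus;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;                                   // skip the WARP software adapter

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            wprintf(L"GPU %u: %ls (%llu MB VRAM)\n", i, desc.Description,
                    static_cast<unsigned long long>(desc.DedicatedVideoMemory) / (1024 * 1024));
            gpus.push_back(device);
            // From here the engine owns the split: e.g. alternate-frame rendering,
            // or pushing post-processing to the second GPU with explicit
            // cross-adapter copies - no driver profile needed.
        }
    }
    return gpus.size() > 1 ? 0 : 2;   // 0 if a multi-GPU split is even possible
}
```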
 
It's simple really: SLI setups require a bit of work and are not just going to work the moment you plug in the SLI bridge and play your games. You do get better results with SLI, but that's in optimised games (which, admittedly, are a lot of them). I have not played any game that is not optimised yet, and when I do run into one, a single card will probably be able to handle it extremely well anyway.

People new to gaming who just want a normal working system want a solid, easy-to-run PC, and a single-card setup is the best way to achieve that.
 
Well, if more people were using this tech, there would be more pressure on Nvidia to support it properly. Perhaps more developers would take it into consideration when making their games if that were the case.