News The End of SLI As We Know It: Nvidia Reveals New Model

Olle P

Distinguished
Apr 7, 2010
720
61
19,090
Given that SLI has been on a downward slope for about 20 years, it's more surprising that they didn't officially kill it sooner.
If anything, a well-functioning SLI could have been useful over the last five years or so, as cost/performance in graphics has stayed the same. It would have been nice to buy a $250 card, wait two years, buy another one of a newer version (only marginally better than the old card) and pair them up for better performance. But oh no, Nvidia wanted you to spend $500 to get that level of performance...
 

edwilson

Distinguished
Dec 6, 2007
52
1
18,630
I was a major SLI supporter all the way back to Voodoo 2, most recently with 2x 1070. However, I noticed a significant lack of vendor support starting a few years ago. It is my understanding that one person at Nvidia in particular was the driving force behind SLI gaming, and he left around that time. We were always a niche market, but it was fun while it lasted. I was able to extend the gaming lifespan of every machine I built by at least two years simply by adding a second card when the time came. When it worked correctly and was supported, it was like strapping a supercharger onto your PC. RIP SLI. I will pour one out for you.
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
Nvidia should then sell the SLI patents to AMD ... it is a shame for this innovation to be locked away from the rest of the world.

I really hate that some companies buy other companies and then lock away their patents and stop using them. The whole patent system is bad as it is today.
 
Nvidia should then sell the SLI patents to AMD ... it is a shame for this innovation to be locked away from the rest of the world.

I really hate that some companies buy other companies and then lock away their patents and stop using them. The whole patent system is bad as it is today.
Nvidia may own patents on SLI, but they don't have a patent on split-frame or alternate-frame rendering.

The article never said it directly, but having multiple GPUs working at the same time on the same game is part of DirectX 12.


It's an old 2016 link, but it's still DirectX 12.
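
Not from the article, but for anyone curious what "part of DirectX 12" means in practice: here is a minimal sketch, assuming only the standard Windows 10 SDK headers and libs, of how a DX12 title can see every GPU in the box for itself, with no SLI bridge or driver profile involved. The adapter loop and device creation are the real DXGI/D3D12 calls; everything around them is simplified.

```cpp
// Rough sketch only: enumerate every hardware GPU the way a DirectX 12 game
// can under explicit multi-adapter. Assumes the Windows 10 SDK and linking
// against dxgi.lib and d3d12.lib; error handling kept to a minimum.
#include <dxgi1_6.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory6> factory;
    if (FAILED(CreateDXGIFactory2(0, IID_PPV_ARGS(&factory))))
        return 1;

    // Every adapter the OS exposes shows up here; with two cards installed,
    // the game (not the driver) decides how to share work between them.
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapterByGpuPreference(
             i, DXGI_GPU_PREFERENCE_HIGH_PERFORMANCE,
             IID_PPV_ARGS(&adapter)) != DXGI_ERROR_NOT_FOUND;
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        // One D3D12 device per GPU; the app can then split work across their
        // command queues (alternate-frame, split-frame, or anything else).
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            wprintf(L"Usable GPU %u: %ls\n", i, desc.Description);
    }
    return 0;
}
```

Once the engine owns both devices, AFR, SFR or even asymmetric workloads (shadows on one card, the main frame on the other) are entirely up to the game.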
 
norman.rasmussen.co.za

Sep 18, 2020
1
0
10
SLI makes it hard or impossible for games to tweak a multi-GPU setup for the best performance, because it's implemented in hardware and tied to a specific driver setup, so continuing to support multi-GPU/SLI in dedicated hardware and kernel drivers no longer makes any sense.

User-mode graphics libraries like DirectX and Vulkan have supported multiple GPUs for more than a year now, and they allow the application to fine-tune how the rendering happens across multiple GPUs.
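
To make that concrete (my own sketch, nothing Nvidia-specific): with Vulkan 1.1 the application itself can ask the loader for device groups and create one logical device across the cards, which is roughly the job the SLI driver profile used to do behind your back. Assumes the stock Vulkan 1.1 headers and loader; the queue family choice is a simplification.

```cpp
// Minimal sketch of the "explicit" multi-GPU path: the app, not the driver,
// discovers the GPUs and decides how to use them. Vulkan 1.1 headers assumed.
#include <vulkan/vulkan.h>
#include <vector>
#include <cstdio>

int main()
{
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;

    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS)
        return 1;

    // Ask which physical devices can be driven as one logical device group
    // (roughly what SLI used to decide for you inside the driver).
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);
    std::vector<VkPhysicalDeviceGroupProperties> groups(
        groupCount, {VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES});
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    for (uint32_t i = 0; i < groupCount; ++i)
        printf("Device group %u: %u physical device(s)\n",
               i, groups[i].physicalDeviceCount);

    // Create one logical device over the whole group; the app now owns the
    // decision of how frames (or frame parts) are divided between GPUs.
    if (groupCount > 0 && groups[0].physicalDeviceCount > 1)
    {
        VkDeviceGroupDeviceCreateInfo groupInfo{
            VK_STRUCTURE_TYPE_DEVICE_GROUP_DEVICE_CREATE_INFO};
        groupInfo.physicalDeviceCount = groups[0].physicalDeviceCount;
        groupInfo.pPhysicalDevices = groups[0].physicalDevices;

        float priority = 1.0f;
        VkDeviceQueueCreateInfo queue{VK_STRUCTURE_TYPE_DEVICE_QUEUE_CREATE_INFO};
        queue.queueFamilyIndex = 0;  // simplification; a real app queries families
        queue.queueCount = 1;
        queue.pQueuePriorities = &priority;

        VkDeviceCreateInfo dci{VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO};
        dci.pNext = &groupInfo;
        dci.queueCreateInfoCount = 1;
        dci.pQueueCreateInfos = &queue;

        VkDevice device = VK_NULL_HANDLE;
        if (vkCreateDevice(groups[0].physicalDevices[0], &dci, nullptr,
                           &device) == VK_SUCCESS)
            vkDestroyDevice(device, nullptr);
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

From there the engine sets device masks on its command buffers, so alternate-frame, split-frame or completely asymmetric workloads become an engine decision instead of a driver profile.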
 

Endymio

Reputable
BANNED
Aug 3, 2020
725
264
5,270
Nvidia should then sell the SLI patents to AMD ... it is a shame for this innovation to be locked away from the rest of the world.
I doubt AMD has much interest, for the same reason Nvidia is dropping it. It's a huge, costly driver headache for very little benefit (to the company, that is).

In any case, the majority of those patents are expiring soon or have already expired.
 

g-unit1111

Titan
Moderator
Considering the power requirements for the 3090, I don't think there's a PSU on the market that could adequately power an SLI setup. You'd probably need the city to turn on the auxiliary power like the Griswolds did in Christmas Vacation if you were to attempt such a setup.
 

escksu

Reputable
BANNED
Aug 8, 2019
878
354
5,260
Nvidia should then sell the SLI patents to AMD ... it is a shame for this innovation to be locked away from the rest of the world.

I really hate that some companies buy other companies and then lock away their patents and stop using them. The whole patent system is bad as it is today.

AMD stopped supporting CrossFire quite some time ago... so SLI/CrossFire is basically dead.
 

escksu

Reputable
BANNED
Aug 8, 2019
878
354
5,260
Considering the power requirements for the 3090, I don't think there's a PSU on the market that could adequately power an SLI setup. You'd probably need the city to turn on the auxiliary power like the Griswolds did in Christmas Vacation if you were to attempt such a setup.

There is: the Corsair AX1600i.
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
Considering the power requirements for the 3090, I don't think there's a PSU on the market that could adequately power an SLI setup. You'd probably need the city to turn on the auxiliary power like the Griswolds did in Christmas Vacation if you were to attempt such a setup.

A 1200-watt power supply will be more than enough for RTX 3090 SLI...
 

spongiemaster

Admirable
Dec 12, 2019
2,273
1,277
7,560
Considering the power requirements for the 3090, I don't think there's a PSU on the market that could adequately power an SLI setup. You'd probably need the city to turn on the auxiliary power like the Griswolds did in Christmas Vacation if you were to attempt such a setup.
You used to be able to SLI three Titans, which coincidentally cost the same $3,000 as a dual 3090 setup would. Three Titans have a combined 750W TDP, which exceeds the 700W for the 3090s. If that wasn't enough, you could add a fourth Titan for a total TDP of 1000W. Two 3090s could probably be run with a 1000W PSU depending on your CPU. 1200W would be plenty.
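
Rough numbers behind that, as a back-of-the-envelope sketch: only the 350W per-card figure is a published spec, the CPU and rest-of-system allowances are just my assumptions.

```cpp
// Back-of-the-envelope power budget for a hypothetical dual-3090 build.
// Only the 350 W per-card board power is a published figure; the CPU and
// rest-of-system allowances below are rough assumptions.
#include <cstdio>

int main()
{
    const int gpu_tdp_w     = 350;  // RTX 3090 board power, per card
    const int gpu_count     = 2;
    const int cpu_budget_w  = 250;  // generous allowance for a high-end CPU
    const int rest_of_box_w = 100;  // motherboard, drives, fans, losses

    const int total_w = gpu_tdp_w * gpu_count + cpu_budget_w + rest_of_box_w;
    printf("Estimated peak load: %d W "
           "(1000 W is tight depending on the CPU, 1200 W is comfortable)\n",
           total_w);
    return 0;
}
```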
 

g-unit1111

Titan
Moderator
You used to be able to SLI three Titans, which coincidentally cost the same $3,000 as a dual 3090 setup would. Three Titans have a combined 750W TDP, which exceeds the 700W for the 3090s. If that wasn't enough, you could add a fourth Titan for a total TDP of 1000W. Two 3090s could probably be run with a 1000W PSU depending on your CPU. 1200W would be plenty.

Then add the power that an overclocked 10900K takes and you would probably exceed that 1200W input.

But my comment was mostly supposed to be a joke.
 

Gurg

Distinguished
Mar 13, 2013
515
61
19,070
This is also starting to look like the end of overclocking. The Asus TUF Gaming OC review found basically the same top-end performance and boost for that $50-more-expensive card with the larger three-fan cooler as for the slimmer FE. Overclocking headroom for CPUs also seems to be going away in favor of manufacturer-set boost controllers.
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
This is also starting to look like the end of overclocking. The Asus TUF Gaming OC review found basically the same top-end performance and boost for that card with the larger three-fan cooler as for the slimmer FE. Overclocking headroom for CPUs is also going away in favor of manufacturer-set boost.

Not really, it's the power needed for the card... Nvidia this time did not bother to lower the TDP much and added as many CUDA cores as it could...

Maybe the next generation will be 250 watts max for 50% more performance...