[SOLVED] 2 GPUs without SLI

Status
Not open for further replies.
May 6, 2021
I've recently bought a second GPU and added it to my motherboard. It is now running 2 GPUs without SLI, and surprisingly I can tell the difference when gaming. But the fact that I do not have SLI/Crossfire is bugging me. Does anyone know how it's working?
 
I've recently bought a second GPU and added it to my motherboard. It is now running 2 GPUs without SLI, and surprisingly I can tell the difference when gaming. But the fact that I do not have SLI/Crossfire is bugging me. Does anyone know how it's working?

Unless you are running a specific game that supports DirectX 12 Multi GPU (10 games in this list: https://en.everybodywiki.com/List_of_games_with_DirectX_12_support ), you are not seeing any performance improvement.
 
I used to have a Gigabyte Z390 Designare motherboard and was using 2 EVGA 3090 FTW3 Hybrids in SLI mode with the NVLink bridge. The motherboard did not have the proper 4-slot spacing, so I ended up getting 2 riser cables and had the cards positioned vertically. There were not very many games I played that actually supported SLI, but in the ones that did, the gameplay was slightly smoother in 4K.

Eventually, I upgraded to an MSI X570 Godlike motherboard (which is the only X570 board with the proper 4-slot spacing). After doing so, I could never get SLI to enable again in the Nvidia Control Panel. I never figured out what the problem was & eventually just put the 3090s into different PCs. I am currently switching back & forth between a PC I mainly use in my living room (5950X w/ 1 EVGA 3090 FTW3 Hybrid) and a PC in my bedroom (5950X but with a Red Devil 6900 XT Ultimate).

I will say... between the 3090 and this Ultimate edition 6900 XT, I've noticed that overall gameplay is slightly smoother with the 3090, but the overall visuals appear a bit more crisp/vibrant with the 6900 XT.

Anyway... as someone who has used SLI in the past... I can safely say that unless you are playing at 4K Ultra, you most likely won't notice a difference at all with SLI enabled or disabled. Now if you get an 8K display, SLI-supported games would play smoother & the difference would probably be noticeable.

Since you didn't mention which cards you have... if they are both AMD, you should be able to just enable Crossfire in the AMD software. Although as with Nvidia's SLI, game support is a bit limited.

[EDIT]: If you have 2 3000-series Nvidia cards, only the 3090s support SLI/NVLink, which means you will also need to buy an NVLink bridge adapter to link the 2 cards. You will also most likely need 2 matching cards, as the connector sits in a slightly different place on different models.
 
DX11 was the last DirectX to support SLI/Crossfire. Starting with DX12 (native to Win10) that was changed to multi-GPU (mGPU) instead. The difference is that instead of flip-flop processing like in SLI, where only 1 card is doing ½ the work at any given point, both cards are used simultaneously, not sequentially.

The level of help from the second card depends on the amount of support in the game code, but at a minimum DX12 and the NVCP can assign extras like PhysX, shadow, RTX or lighting effects processing to the second card.

Intel had something similar years ago with Lucid (3rd-gen Intel), using extra iGPU power to smooth out pixelation, and Nvidia has been doing it for years with things like dedicated PhysX cards (which became redundant when GPU VRAM far exceeded games' capacity to use it).

Either way, a second card can help with the look of the picture if it's set up to do so, with performance gains usually coming from lowering detail levels that were strangling a single GPU.
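To give a rough idea of what DX12's explicit multi-GPU looks like on the code side, here's a minimal C++ sketch of the general pattern (my own illustration, not taken from any game mentioned in this thread; assumes the standard Windows 10 D3D12/DXGI headers, link against d3d12.lib and dxgi.lib, error handling trimmed): the engine enumerates every adapter DXGI reports and creates a separate D3D12 device on each one, then decides for itself what work to submit where.

```cpp
#include <d3d12.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    // The DXGI factory lets us walk every adapter (GPU) in the system.
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;

    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;                       // skip the WARP software rasterizer

        // Unlike SLI, there is no driver-linked pair here: each card gets its
        // own independent D3D12 device, mismatched models included.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            wprintf(L"GPU %u: %s (%zu MB dedicated VRAM)\n",
                    i, desc.Description, desc.DedicatedVideoMemory >> 20);
            devices.push_back(device);
        }
    }

    // With two devices, a multi-GPU-aware engine records and submits command
    // lists on both in parallel (e.g. main frame on GPU 0, shadows or post
    // effects on GPU 1) and copies the results across -- no SLI bridge or
    // driver profile required. How much that helps is up to the game code.
    return 0;
}
```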
 
Solution
Unless you are running a specific game that supports DirectX 12 Multi GPU (10 games in this list: https://en.everybodywiki.com/List_of_games_with_DirectX_12_support ), you are not seeing any performance improvement.

I have an RTX 2060 and a GTX 960.
I played Resident Evil Village on the 2060 with RTX enabled in the graphics options, and the gameplay wasn't smooth at all until I plugged in the 960. I can also use more VRAM than the main GPU (RTX 2060) has; I've noticed it in about every game I've played.

CS:GO, Rainbow 6, Valorant and other games that are not listed there have reached higher fps with both cards. It may sound like I'm making this up or that it's just placebo, but I've done benchmarks before and after having both GPUs, and I've certainly noticed the performance improvement. It's weird, and I have no idea how I've achieved this. My motherboard is an ASUS B150 Pro Gaming and I think it only supports Crossfire, which makes the whole thing even weirder.
 
I have an RTX 2060 and a GTX 960.
I played Resident Evil Village on the 2060 with RTX enabled in the graphics options, and the gameplay wasn't smooth at all until I plugged in the 960. I can also use more VRAM than the main GPU (RTX 2060) has; I've noticed it in about every game I've played.

CS:GO, Rainbow 6, Valorant and other games that are not listed there have reached higher fps with both cards. It may sound like I'm making this up or that it's just placebo, but I've done benchmarks before and after having both GPUs, and I've certainly noticed the performance improvement. It's weird, and I have no idea how I've achieved this. My motherboard is an ASUS B150 Pro Gaming and I think it only supports Crossfire, which makes the whole thing even weirder.

You're going to need to post proof because what you are describing is totally and completely impossible.

The ONLY game that might show that effect is RE Village, if it's DX12, which it probably is, just not on the list yet. Rainbow 6, CS:GO, and Valorant are not.
 
I played Resident Evil Village on the 2060 with RTX enabled in the graphics options, and the gameplay wasn't smooth at all until I plugged in the 960.
BS.
I can also use more VRAM than the main GPU (RTX 2060) has; I've noticed it in about every game I've played.
The GTX 960 would have no impact on that.
CS:GO, Rainbow 6, Valorant and other games that are not listed there have reached higher fps with both cards.
BS.
It may sound like I'm making this up or that it's just placebo, but I've done benchmarks before and after having both GPUs, and I've certainly noticed the performance improvement.
BS. Take screenshots and show us (upload to imgur.com and post a link).
 
I used to have a Gigabyte Z390 Designare motherboard and was using 2 EVGA 3090 FTW3 Hybrids in SLI mode with the NVLink bridge. The motherboard did not have the proper 4-slot spacing, so I ended up getting 2 riser cables and had the cards positioned vertically. There were not very many games I played that actually supported SLI, but in the ones that did, the gameplay was slightly smoother in 4K.

Eventually, I upgraded to an MSI X570 Godlike motherboard (which is the only X570 board with the proper 4-slot spacing). After doing so, I could never get SLI to enable again in the Nvidia Control Panel. I never figured out what the problem was & eventually just put the 3090s into different PCs. I am currently switching back & forth between a PC I mainly use in my living room (5950X w/ 1 EVGA 3090 FTW3 Hybrid) and a PC in my bedroom (5950X but with a Red Devil 6900 XT Ultimate).

I will say... between the 3090 and this Ultimate edition 6900 XT, I've noticed that overall gameplay is slightly smoother with the 3090, but the overall visuals appear a bit more crisp/vibrant with the 6900 XT.

Anyway... as someone who has used SLI in the past... I can safely say that unless you are playing at 4K Ultra, you most likely won't notice a difference at all with SLI enabled or disabled. Now if you get an 8K display, SLI-supported games would play smoother & the difference would probably be noticeable.

Since you didn't mention which cards you have... if they are both AMD, you should be able to just enable Crossfire in the AMD software. Although as with Nvidia's SLI, game support is a bit limited.

[EDIT]: If you have 2 3000-series Nvidia cards, only the 3090s support SLI/NVLink, which means you will also need to buy an NVLink bridge adapter to link the 2 cards. You will also most likely need 2 matching cards, as the connector sits in a slightly different place on different models.

I have an RTX 2060 and a GTX 960 connected to an ASUS B150 Pro Gaming without SLI. I don't know how I've achieved this, but game performance, even when RTX is enabled, is greatly improved. Apparently I didn't need SLI to get more performance, though I don't know if it's creating a bottleneck because of that. Despite having one, I would say it's not worth the purchase, and I don't even play at 4K. The whole thing is weird; I plugged both in intending to use one for streaming only, and I noticed the impact on performance after launching Resident Evil 8 and enabling RTX.
 
The only way I can imagine that being possible is if you compared the game using the GTX 960 first, then connected the RTX 2060, plugged your monitor into it, and compared performance again.

I tried playing with only the RTX card and it didn't perform as well as with both plugged in.
The only game where it was complete crap was Warzone, because I could only choose one GPU in the options.
That led me to believe I was using both of them in other games.
 
I have to agree with the rest of the guys here. You won't get better performance using both GPUs in the games you mentioned.

I have tested similar systems and saw no gains, and there are plenty of reviewers who have tried such things and shown it's a lost cause.

I know it shouldn't make sense; I should have SLI and similar GPUs for this to work, but it is working and I have no idea how. I'm just trying to figure it out.

Trust me, I will do the benchmarks again and post pictures of the results tomorrow. I can't do it today because I'm not going to be home until then.
 
I know it shouldn't make sense; I should have SLI and similar GPUs for this to work, but it is working and I have no idea how. I'm just trying to figure it out.

Trust me, I will do the benchmarks again and post pictures of the results tomorrow. I can't do it today because I'm not going to be home until then.

Everything you are telling us is completely impossible. That's not how any of this works. There is no way you will be able to provide real numbers to back up your findings.
 
I am not saying you're lying; I believe it's the placebo effect.
Also, using more VRAM than what's available on the card can only be done by spilling into the MUCH slower system RAM, which dramatically affects performance (making it worse, of course).
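For reference, that spill-over is something a program can actually watch for. Here's a rough C++/DXGI sketch (my own illustration, not from anything in this thread; assumes Windows 10 / DXGI 1.4, link against dxgi.lib) of the per-process query that reports how much can be allocated in dedicated VRAM versus the non-local segment (system RAM reached over PCIe), and how much of each is currently in use:

```cpp
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter1;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter1) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        // IDXGIAdapter3 (Windows 10 / DXGI 1.4) exposes the budget/usage query.
        ComPtr<IDXGIAdapter3> adapter;
        if (FAILED(adapter1.As(&adapter)))
            continue;

        DXGI_ADAPTER_DESC1 desc{};
        adapter1->GetDesc1(&desc);

        // "Local" = dedicated VRAM on the card; "non-local" = system RAM the
        // GPU can reach over PCIe. Usage figures are for the calling process,
        // so a game would poll this from inside its own render loop: once the
        // non-local usage climbs, textures are overflowing out of VRAM.
        DXGI_QUERY_VIDEO_MEMORY_INFO local{}, nonLocal{};
        adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &local);
        adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL, &nonLocal);

        wprintf(L"%s\n  VRAM:       %llu / %llu MB used\n  system RAM: %llu / %llu MB used\n",
                desc.Description,
                local.CurrentUsage >> 20, local.Budget >> 20,
                nonLocal.CurrentUsage >> 20, nonLocal.Budget >> 20);
    }
    return 0;
}
```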

Okay, I somewhat thought I was going to get reported for being a troll.

It could be the placebo effect, I acknowledge that, but I was caught off guard, not even expecting anything like this to happen.
I have real-life witnesses who can confirm this happened, and they all thought it was impossible.

I know I'm claiming in an online forum that I broke the rules of how graphics cards work and maybe some laws of physics, which is not a very convincing scenario. Either way, I will post photos or even an unedited video; if my mind is playing tricks on me, I'll apologise for wasting everyone's time.
 