[SOLVED] 2 GPUs without SLI

May 6, 2021
I've recently bought a second GPU and added it to my motherboard. It is now running 2 GPUs without SLI and surprisingly I can tell the difference when gaming. But the fact that I do not have SLI/Crossfire is bugging me. Does anyone know how it's working?
 
Solution
DX11 was the last DirectX to support SLI/CrossFire. Starting with DX12 (native to Win10), that was changed to multi-GPU (mGPU) instead. The difference is that instead of the flip-flop processing of SLI, where only one card is doing its share of the work at any given point, both cards are used simultaneously rather than sequentially.
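As a rough illustration (a generic sketch of explicit multi-adapter, not how any specific game actually implements it), a DX12 renderer simply enumerates every GPU through DXGI and can create an independent device on each one, then decides which work to queue where:

// Sketch only: list all hardware GPUs and create a D3D12 device on each.
// Link against dxgi.lib and d3d12.lib; error handling omitted for brevity.
#include <dxgi.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory1> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        // Each physical card (e.g. a 2060 and a 960) gets its own device,
        // so the engine can queue independent work on both at the same time.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            wprintf(L"Usable GPU %u: %s\n", i, desc.Description);
        }
    }
    return 0;
}

The engine, not the driver, owns the split of work, which is why how much the second card helps depends entirely on the game code.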

The level of help from the second card depends on the amount of support in the game code, but at a minimum DX12 and NVCP can assign extras like PhysX, shadow, RTX or lighting effects processing to the second card.

Intel had something similar years ago with Lucid (3rd Gen Intel), using extra iGPU power to smooth out pixelation, and Nvidia has been doing it for years with things like dedicated PhysX cards. (became redundant...
May 6, 2021
Everything you are telling us is completely impossible. That's not how any of this works. There is no way you will be able to provide real numbers to back up your findings.

I know I can sound delusional or even like a complete lunatic. I understand how unlikely and impossible this outcome is, but I made this post for a chance to prove myself; that's all I ask.
 
There's a bunch of YouTube videos of people apparently doing CrossFire with 6900 XTs.

Well, the control panel may enable it (which would be in error), but CrossFire support has to be baked into the game and into the drivers. AMD said there would be no more CrossFire updates from the 5xxx series and up because so few people used it and it was of little benefit. In other words, it wasn't worth it for their overstressed/overworked driver team.
 
I have an RTX 2060 and a GTX 960 connected to an ASUS B150 Pro Gaming without SLI. I don't know how I've achieved this, but game performance, even when RTX is enabled, is greatly improved. Apparently I didn't need SLI to get more performance, but I don't know if it's creating a bottleneck; even if it is, I would say it's not worth the purchase, and I don't even play at 4K. The whole thing is weird: I plugged both in to have one card for streaming only, and I noticed the impact on performance after launching Resident Evil 8 and enabling RTX.

HIGHLY doubtful unless those games are using PhysX OR mgpu and you are offloading physics calcs to the 960.
 

InvalidError

Still failing to see how CF is working. Last I knew, there were no bridges on the Radeon PCBs.
AMD phased out bridges around 2014, when PCIe 3.0 became the norm, with 3.0 x8 providing enough bandwidth to handle both CPU-GPU and GPU-GPU traffic, at least for the resolutions, frame rates and available CPU/GPU power of the time.
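To put rough numbers on that (back-of-envelope, not AMD's figures): PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding, so one lane carries about 8 × 128/130 ≈ 7.9 Gb/s, call it ~0.98 GB/s, and an x8 link is roughly 7.9 GB/s in each direction. A finished 1080p frame at 32-bit color is 1920 × 1080 × 4 bytes ≈ 8.3 MB, so even 60 fps worth of frames handed between the cards is only around 0.5 GB/s, a small slice of that link.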

HIGHLY doubtful unless those games are using PhysX OR mgpu and you are offloading physics calcs to the 960.
Another scenario where it could make a difference is if the desktop and other apps land on the 960, leaving more resources available on the 2060. I doubt Windows would do that sort of balancing gymnastics on its own, though; you'd probably need to set the 960 as the default adapter and then force games onto the 2060, or something like that.
 

Karadjgne

OP is running with RTX enabled on a 2060. That's brutal for that card; even Minecraft will easily dump 100-ish fps there.

So yes, I can see the 960 getting laden with the PhysX calcs, which will visibly improve performance as the workload is partly offloaded, raising fps. It's not increasing fps like SLI did; it's just lowering the impact on an overly taxed card's resources.
 
May 6, 2021
OP is running with RTX enabled on a 2060. That's brutal for that card; even Minecraft will easily dump 100-ish fps there.

So yes, I can see the 960 getting laden with the PhysX calcs, which will visibly improve performance as the workload is partly offloaded, raising fps. It's not increasing fps like SLI did; it's just lowering the impact on an overly taxed card's resources.

I also thought about that; sadly, that wasn't the case. Either I messed up the first time I plugged in the 2060, or it was something else.

The first time I inserted the 2060 it really had poor performance, but today I got home and tried the benchmarks again, and I get the same performance whether the 960 is inserted or not. I'm happier with the 2060's performance now than I was before; I literally thought that for an RTX card it was really bad performance. Now everything makes sense.

I tested it with Rainbow Six; the RTX can handle itself pretty well even if the graphics settings use more VRAM than the 2060 has.

My apologies for making it seem like I was able to accomplish something I never did. This never made sense to me; I originally intended to use one card for streaming and another for gaming, and now this whole post explains what happened.
 
Well, at least you fixed the performance...
Btw, using more VRAM than is available is bad for performance, no question about it. What resolution are you playing at? If it's 1080p, then it's very, very difficult for a game to use more than the 6GB available on the 2060.
 