Question: Modern SLI for 3D Render Machine

May 20, 2023
Hello,

I have built a machine specifically to handle multi-GPU rendering with Redshift in Maya. The bridge was installed correctly (it couldn't physically be pushed in any further). Both GPUs are installed properly and are detected in GPU-Z, the PCIe configuration was switched in the BIOS to run two x8 slots, and I even rolled back to the 461.40 and 461.09 GeForce drivers (though Redshift showed some instability on those versions).

I still cannot figure out why SLI isn't showing as enabled in GPU-Z or in the NVIDIA Control Panel.

Specs:
Intel Core i9-13900K 3 GHz 24-Core Processor
MSI MEG Z690 ACE EATX LGA1700 Motherboard (the only one with SLI support)
G.Skill Trident Z5 Neo RGB 64 GB (4 x 32 GB) DDR5-6000 CL30 Memory
Seagate IronWolf Pro 14 TB 3.5"
2x EVGA FTW3 ULTRA GAMING GeForce RTX 3090 24 GB Video Card
Corsair HX1500i (2022) 1500 W 80+ Platinum Certified Fully Modular ATX Power Supply


I understand that SLI for gaming is dead; this build isn't for gaming. When I designed the build, SLI still seemed possible, but it took a while to acquire all the parts.
 
May 20, 2023
SLI on the 30 series is only supported when a game specifically enables it through DX12. Older methods, such as Nvidia's driver-level SLI profiles for DX11 games, aren't supported.

Redshift supports multi-GPU rendering.


So it doesn't even show up in the control panel anymore? Regardless, I know I'll still be able to get multi-GPU rendering to work for Redshift.
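For what it's worth, a quick way to double-check that both 3090s are exposed by the driver as separate devices (regardless of any SLI state) is to query NVML directly. A minimal sketch, assuming the nvidia-ml-py Python package is installed:

# Minimal sketch: list every GPU the NVIDIA driver exposes, independent of SLI.
# Assumes the nvidia-ml-py package is installed (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
try:
    count = pynvml.nvmlDeviceGetCount()
    print(f"Driver reports {count} GPU(s)")
    for i in range(count):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):   # older bindings return bytes
            name = name.decode()
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"  GPU {i}: {name}, {mem.total / 1024**3:.0f} GB VRAM")
finally:
    pynvml.nvmlShutdown()

If both cards are listed there, the driver sees them as independent compute devices, and the missing SLI toggle shouldn't matter for rendering.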
 

Deleted member 2947362

Guest
Doesn't rendering software just automatically use all the CUDA cores?
Or do you have to select the GPUs you want to use in the software's render settings? Is SLI even needed for that?

I wouldn't know myself; I'm just interested.
 

Deleted member 2947362

Guest
Does the software you're using state that you need to enable SLI? If not, what render settings are available for you to choose from?

Look in the settings of the software you're using to render. Look for settings related to Nvidia CUDA; your graphics cards may well be listed there. If anything, it would most likely ask you to disable SLI if it were enabled. I'm not 100% sure, but it doesn't need to access the cards that way.

I'm sure I played with stuff like that back when I ran quad-CrossFired HD 3850s, and I had to disable CrossFire mode. But that was many years ago, and that was AMD, not Nvidia CUDA.
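To illustrate the CUDA point with a rough sketch (assuming Numba, NumPy, and an NVIDIA driver are installed; this is not Redshift's actual code): compute software addresses each card as a separate CUDA device and launches work on each one independently, so SLI never comes into it.

# Rough illustrative sketch: run the same kernel on each GPU separately; no SLI involved.
# Assumes numba, numpy and a CUDA-capable NVIDIA driver are installed.
import numpy as np
from numba import cuda

@cuda.jit
def scale(buf, factor):
    i = cuda.grid(1)
    if i < buf.size:
        buf[i] *= factor

n = 1 << 20                              # one million floats per card
blocks = (n + 255) // 256                # enough 256-thread blocks to cover the buffer

for idx, gpu in enumerate(cuda.gpus):    # e.g. devices 0 and 1 on a dual-3090 box
    with gpu:                            # make this card the active CUDA device
        data = cuda.to_device(np.ones(n, dtype=np.float32))
        scale[blocks, 256](data, 2.0)    # independent launch on this card
        print(f"GPU {idx}: first values", data.copy_to_host()[:3])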
 

Deleted member 2947362

Guest
You're probably not going to like this, but try disabling XMP in the BIOS; if the RAM is on an XMP profile, put all the RAM settings on auto.

If that cures your stability issue, you can start raising the RAM speed again to find the best stable speed the memory controller can sustain during heavy rendering.
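If you want a quick way to put pressure on system RAM while you dial the speed in, here's a rough Python sketch (no substitute for a proper MemTest86 run): it writes a known pattern across a large buffer and checks that it reads back intact.

# Rough RAM soak sketch: fill a large buffer with a known pattern and verify it.
# Not a substitute for MemTest86, just a quick in-OS sanity check.
# Adjust GIB_TO_TEST to fit comfortably inside your free RAM.
import numpy as np

GIB_TO_TEST = 8                      # size of the test buffer in GiB
PASSES = 4

n = (GIB_TO_TEST * 1024**3) // 8     # number of 64-bit elements
buf = np.empty(n, dtype=np.uint64)

for p in range(PASSES):
    pattern = np.uint64(0xA5A5A5A5A5A5A5A5 ^ p)
    buf[:] = pattern                 # write the pattern across the whole buffer
    if not np.all(buf == pattern):   # read it back and compare
        print(f"Pass {p}: mismatch detected - memory may be unstable at this speed")
        break
    print(f"Pass {p}: OK")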

I just noticed you're running four 32 GB sticks at 6000 MT/s. That's a lot of stress on the memory controller in heavy workloads like rendering, if it has to access a lot of system RAM.

I don't know what speed your CPU's memory controller is officially rated for with all four slots populated, but it will be lower than with two slots; the 13900K's official supported speed is DDR5-5600 with two DIMMs.
The reason Intel and AMD only guarantee those speeds is that's what they know the memory controller can sustain without errors. Of course you can overclock, but it can cause issues like the ones you're having now; those are the pros and cons of overclocking.

I'm speculating here:

All this RAM in all these PCs that runs the CPU's memory controller faster than its officially rated speed is making the controller produce errors, but errors that stay within its tolerances.

That is, until they don't, and you get random crashes or blue screens. That's why a system can pass a memory test under one set of conditions and still crash under another; the faster the speed, the narrower the margins the memory controller can probably sustain.

But I could be totally wrong.
 