I just built my PC two months ago with an i7-4790K, an Asus Z97 board, 16 GB RAM, a 1 TB HDD, and Windows 10 and my games installed on a 500 GB Samsung EVO SSD. One of my SLI cards is also only two months old, and the other, which I just added, is brand new. I even upgraded from a Corsair RM850 to an RM1000 to make sure a lack of power wouldn't be an issue for these two monster cards.
But now games such as Rainbow 6: Siege, Fallout 4, and The Witcher 3 all run at higher and more stable FPS when I disable SLI in the Nvidia Control Panel.
For example, Rainbow 6 auto-detects Max Ultra settings at 3840x2160, but when I get into an actual game with SLI enabled it drops to around 20-30 FPS and isn't playable. Disabling SLI brings it back up to around 30-60 FPS, but it's still not smooth enough at 4K even after turning some of the other graphics settings way down. With or without SLI, both numbers seem very low for GTX 980 Ti cards.
What makes this issue even stranger is that when I run the Heaven benchmark at max 4K settings, the FPS scores basically double with SLI enabled in the Nvidia Control Panel, exactly as you'd expect: 45.6 FPS, 1149 score, 28.3 min FPS, 91.9 max FPS. So SLI seems to work while benchmarking, and I can see both cards running at roughly equal load, but once I get into actual games they stop working correctly and SLI actively hurts my performance. My PhysX setting is on Auto, Power Management is set to Prefer Maximum Performance, heat levels seem fine, and I've reinstalled the latest Nvidia drivers multiple times. Nothing helps, and I have no idea what's going on here.
Update: Watching GPU usage in Fallout 4, with SLI disabled my single card runs at around 90% usage, as it should, and holds a nice 55-60 FPS. With SLI enabled, which in theory should let the second card kick in and help maintain a solid 60 FPS, it's like both cards decide to take a break instead: they relax at around 50-70% usage and together can't even hit 50 FPS.
The Witcher 3 has the same issue as Fallout 4: with SLI, Card 1's usage sits at 80-90% while Card 2 sits at 50-70%, and the result is only 30 FPS. Turning off SLI puts Card 1 at 99% and Card 2 at 0%, and that single card at 99% gets better results at 35-40 FPS.
Rainbow 6's GPU usage is even stranger: with SLI enabled, one card maxes out around 99% while the other sits around 40-50%. Despite the higher combined GPU usage, the results are far worse than with a single card, as mentioned above.
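In case anyone wants to reproduce these readings, here's a rough sketch of how the per-card numbers can be logged by polling nvidia-smi once a second (just an illustration, assuming nvidia-smi is on your PATH; the fields are standard nvidia-smi --query-gpu fields):

```python
import subprocess
import time

# Poll nvidia-smi once per second and print each card's utilization,
# temperature, and power draw, so SLI-on and SLI-off runs can be
# compared side by side while a game is running.
FIELDS = "index,utilization.gpu,temperature.gpu,power.draw"

while True:
    snapshot = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        text=True,
    )
    # One line per GPU, e.g. "0, 92 %, 71, 243.55 W"
    print(snapshot.strip())
    time.sleep(1)
```

Leaving something like that running in a second window during a game session makes it easy to compare the SLI-on and SLI-off numbers I listed above.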