[SOLVED] Forgot to disable SLI before removing my second GPU and now my x8 PCIE slot doesn't work anymore and I don't get a picture using only a single card?

MrBaum07

Reputable
Jun 28, 2019
7
1
4,515

Specs:

CPU: Ryzen 5 5600X with a Wraith Prism cooler
RAM: 4x8 GB G.Skill Trident Z RGB 3600 CL16-16-16-36
Mainboard: X570 Aorus Master Rev. 1.0 with BIOS Version F35e, later downgraded to F32
Graphics Cards: 2x KFA2 EXOC 1080 Ti, supposed to be running in SLI
PSU: BeQuiet Straight Power 11 850 Watt 80+ Gold


Main Part:
I have the following issue: In my X570 Aorus Master I have two KFA2 EXOC 1080 Tis and for a few days they worked fine. However, because I wanted to compare performance with and without SLI, I turned off my system, switched off the PSU, disconnected the power cable from the wall, and then removed my second 1080 Ti from the PCIE x8 slot (the motherboard has one x16, one x8 and one x4 slot).

For some reason Windows 10 then disabled my remaining 1080 Ti. While I still had a picture, Device Manager said that the 1080 Ti was disabled because problems were encountered, and I was running on the Microsoft Basic Display Adapter. I then had to do a complete uninstall and clean reinstall of my GPU driver, which made the card work again at normal performance and normal temperatures. I did the tests I wanted and was satisfied.
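
(In case it helps anyone reading along: here is a minimal sketch of how that "disabled because problems were encountered" state can be checked without opening Device Manager. This assumes Windows with PowerShell available and just uses Python as a wrapper around the call; error code 22 is the standard Device Manager code for "device disabled" and 43 is "stopped because it reported problems".)

# Minimal sketch: list display adapters and their Device Manager error codes.
# Assumes Windows with PowerShell; ConfigManagerErrorCode 22 = device disabled,
# 43 = Windows stopped the device because it reported problems.
import subprocess

cmd = [
    "powershell", "-NoProfile", "-Command",
    "Get-CimInstance Win32_VideoController | "
    "Select-Object Name, Status, ConfigManagerErrorCode | Format-List",
]
print(subprocess.run(cmd, capture_output=True, text=True).stdout)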

Then I turned the system and power off again, put the second 1080 Ti back in, plugged the SLI bridge and power cables back in and turned the PC on again. However, now my second 1080 Ti, the one in the PCIE x8 slot, wasn't being recognized at all. Neither NVidia Control Panel, nor GPU-Z, nor even my motherboard's BIOS recognized the GPU, and plugging my HDMI cable into that card gave no picture. I then turned the PC and power off again, swapped the two cards, turned everything on again and still got the same results.

I once again uninstalled and clean installed my graphics drivers, but to no avail.

Both of my cards are working, but only one, the one in the PCIE x16 slot, is detected by any kind of software. Weirdly enough, the one in the PCIE x8 slot still gets power, but its fans are always spinning, while the fans of the card in the PCIE x16 slot stop spinning when idle.

What is even weirder: I don't even get a picture when using only a single card in my confirmed working x16 slot! I am forced to have both cards plugged in to even get a picture. And in GPU-Z my single card runs only at PCIE x8 despite being in the x16 slot, so the system has to know a second card is there.
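
(Side note for anyone wanting to reproduce the check: besides GPU-Z, the negotiated PCIE link width can also be read from the NVIDIA driver itself. Below is a minimal sketch, assuming nvidia-smi is installed along with the driver, as it normally is; Python is only used to wrap the call.)

# Minimal sketch: ask the NVIDIA driver how many GPUs it sees and what PCIE
# link width each one negotiated (current vs. maximum).
# Assumes nvidia-smi ships with the installed driver.
import subprocess

cmd = [
    "nvidia-smi",
    "--query-gpu=index,name,pcie.link.width.current,pcie.link.width.max",
    "--format=csv",
]
print(subprocess.run(cmd, capture_output=True, text=True).stdout)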

I then tried PCIE bifurcation, using a riser extension cable and a PCIE splitter card to split my confirmed working x16 slot into two x8 slots. I enabled bifurcation in my BIOS and it worked: SLI was running. Sadly, that setup didn't fit into my case. Not wanting to have both GPUs hanging outside the case, I scrapped the bifurcation idea, but I did another test first because I noticed something: SLI was still turned on. It never got disabled automatically, and I suspect that is why a single card showed no picture.

So, I intentionally left SLI on, turned power and system off, removed the bifurcation card and the riser extension cable from my system and put both cards back into the motherboard's PCIE slots. And as I predicted, because the system was only detecting one card, I once again got the same problem as in the beginning: my single detected card didn't work properly and I had to clean reinstall the drivers. Then I put the bifurcation setup back in, and, for some mysterious reason, the bifurcation card now "magically" showed the same problems: one of its two PCIE slots didn't work, the card in it had permanently spinning fans, and the system didn't put out a picture with only a single card.

But the PCIE x16 slot still worked. No matter whether the riser extension cable or the GPU itself was plugged in, I still got a picture as long as both cards were present.

After this I also tried installing Windows on a completely separate SSD with my main SSD completely disconnected from the system. I tried reinstalling the graphics drivers multiple times. I tried clearing the BIOS both with the Reset CMOS button and by removing the battery. I even downgraded the BIOS from Version F35e to Version F32. Nothing helped. The enabled, bugged SLI seems to be baked into the motherboard so deeply that nothing can fix it.


Sooo, in conclusion...
Leaving SLI enabled before removing the second GPU apparently has the magical ability to permanently brick PCIE x8 slots not only on the motherboard, but also on a PCIE bifurcation card that wasn't even connected to any PC part at the time.

Did any of you ever encounter something like this? What do you think happened? What I personally believe is that the motherboard or some other part (though I can't really imagine what kind of PC part can store information like that permanently, even after countless resets and reinstalls) still thinks SLI is enabled because I never turned it off. After I turned the system on with only one card, this SLI setting got bugged and somehow "magically" permanently bricked two different PCIE x8 slots. Now some part permanently thinks my SLI is still on, which somehow prevents me from getting a picture with only a single card and makes the card in the x8 slot go completely undetected.

Do you have any solutions? Is there anything I can do short of RMA? I really can't afford to take my PC apart right now because all of my university lessons are online ONLY.

The only thing I can think of right now is to flash a fresh vBIOS onto the detected GPU and put it into the x8 slot. Maybe some weird setting got stuck in the vBIOS. But I doubt that, considering both cards worked on the bifurcation card without a vBIOS reset.

I can't try another CPU or RAM; I sold my old stuff.

If there is anything other than RMA, please let me know.
 
Solution
I don't even get a picture when using only a single card in my confirmed working x16 slot...
I put the bifurcation setup back in, and, for some mysterious reason the bifurcation card now "magically" showed the same problems.
how are you confirming that they still work if you now get no video output in any setup?
the fact that a GPU shows LEDs and its fans spin does not prove that it is actually working.
it could be that the cards have malfunctioned and the damage has been showing up incrementally over this time.
If there is anything other than RMA, please let me know.
my next move would be to try my cards individually in another system.
if still no video output, definitely time for RMA.

and possibly try another card on my motherboard.
though this i would be wary of since there is a possible issue with the motherboard PCIe slot(s) causing damage somehow.
but i keep an old Radeon around just for this type of troubleshooting, doesn't matter if it gets fried.
only thing I can think of right is to flash a fresh vBIOS on the detected GPU
this will void any warranty on the card so you should not be attempting this
unless you want to just forfeit any future option of RMA.
X570 Aorus Master...
this SLI setting got bugged and somehow magically permanently bricked two different PCIE x8 slots
it's always possible with a defective motherboard.
there may have been underlying issues that just hadn't presented themselves until this all took place.

i even had a Gigabyte Z370 that damaged two separate RAM kits with some kind of current malfunction.
through service they determined it was an issue with the board, sent a replacement,
and refunded the money for the RAM kits (after months of sending them information and invoice data over and over).

i wasn't a huge fan of Gigabyte even then, but that definitely took them off of my trusted manufacturer list for good.
 

MrBaum07

Reputable
Jun 28, 2019
7
1
4,515
Okay I got it to work, by ACCIDENT.

I wanted to do some Custom ROM flashing for my mobile (Xiaomi Mi 10T Lite) and needed to Disable Driver Signature Enforcement for that. After I did that, it worked again!

Now I'm very happy but even more confused. Apparently something with the drivers was wrong. Okay. But what kind of driver error can cause what I have described here? How can a driver, installed in Windows and only needed INSIDE WINDOWS, even remotely do this stuff? How can Driver Signature Enforcement, an option given by WINDOWS, cause problems with video output even when I had removed all storage devices?

I don't get it, but yeah, I'm just happy it works again.
 
  • Like
Reactions: JohnBonhamsGhost