Hello.
I have an RX 6750 XT, and I recently got a GTX 1060 6GB from my friend for only 40€.
CPU: Ryzen 7 5800X.
RAM: 32 GB.
Mobo: MSI MAG B550 Tomahawk.
I planned to use the GTX for Lossless Scaling and the RX for game rendering. When I tested the system without even opening Lossless Scaling, I set the RX as the rendering/high-performance card in Windows, and I did this for the game's exe as well (in this case The Last of Us Part II). The problem is that I'm only getting 40 fps (almost everything maxed, playing at 1080p); before putting the GTX in my system I used to get around 85 or 90 fps, not 40.
It seems like Windows is still using the GTX for both output and rendering, even though I selected the RX.
Just for testing, I swapped the DP cable back to the RX, and all the games are still using the GTX...
Maybe there's an option in the NVIDIA Control Panel?
I've installed the drivers for both cards.
What am I doing wrong?
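
In case it helps, this is a small Python sketch of how I'd dump the per-app GPU preferences Windows has saved (assuming the usual Settings > Display > Graphics registry path under HKCU; GpuPreference=2 should correspond to "High performance"):

```python
import winreg

# Per-app GPU preferences written by Settings > Display > Graphics
# (assumed standard path on Windows 10/11).
KEY_PATH = r"Software\Microsoft\DirectX\UserGpuPreferences"

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
    i = 0
    while True:
        try:
            exe_path, value, _ = winreg.EnumValue(key, i)
        except OSError:
            # No more values to enumerate
            break
        print(exe_path, "->", value)  # e.g. "GpuPreference=2;" for High performance
        i += 1
```

If the game's exe shows up there with GpuPreference=2 and it still renders on the GTX, at least I'd know the setting itself was saved correctly.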