Question: Is there a way to force an application to use a specific GPU for encoding/decoding?

Dec 8, 2022
Hi,

I'm noticing my system is getting bottlenecked on encoding/decoding while I'm running a game on my primary monitor and watching YouTube on a second monitor (with ShadowPlay running on the game monitor as well). YouTube drops to rendering at about 1 fps.

I'm using a 3090 for the monitor that's playing the game, while the second monitor is driven by my old 1060. I recently added the 1060 to my system to try to solve this; before that I had been using only the 3090. When I watch YouTube, the browser uses the 1060 for decoding until I switch focus to the game, at which point the browser starts using the 3090. Switching the game between windowed and fullscreen has no effect on getting the browser back onto the 1060. As soon as I switch focus away from the game, decoding goes back to the 1060.
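For anyone wondering how I'm checking which card is doing the decode work: a rough Python sketch that samples the same "GPU Engine" performance counters Task Manager reads (Windows-only; typeperf ships with Windows). The CSV parsing here is a quick-and-dirty assumption about typeperf's output format, not a polished tool:

```python
import subprocess

# Take a single sample (-sc 1) of every GPU engine utilization counter.
out = subprocess.run(
    ["typeperf", r"\GPU Engine(*)\Utilization Percentage", "-sc", "1"],
    capture_output=True, text=True, check=True,
).stdout

# typeperf emits CSV: one quoted header row of instance names, then one
# quoted data row per sample; status lines are unquoted, so skip them.
rows = [line for line in out.splitlines() if line.startswith('"')]
header, sample = rows[0].split('","'), rows[1].split('","')
for name, value in zip(header, sample):
    # Instance names embed the adapter LUID and engine type, e.g.
    # "...luid_0x..._engtype_VideoDecode", so the two cards show up
    # as separate entries.
    if "engtype_VideoDecode" in name:
        print(name.strip('"'), "->", value.strip('"'))
```

Running that while tabbed into the game is how I can see the decode load jump from one adapter's VideoDecode engine to the other's.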

Is there any option to force an application to use a certain GPU for encode/decode? Or even to force it to use that GPU for all of the application's tasks? Most apps seem to have a toggle for hardware acceleration, but they don't offer anything beyond that. I've seen that OBS has built-in support for delegating this work to another GPU. I'd like to keep ShadowPlay on the 3090 while running the rest of my less intensive hardware-accelerated apps on the 1060.
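
One thing I've been experimenting with is Windows' own per-app GPU preference (Settings > System > Display > Graphics), which is stored under HKCU\Software\Microsoft\DirectX\UserGpuPreferences. Here's a minimal sketch that writes it from Python; the browser path is an assumption (adjust to your install), and I'm honestly not sure how Windows classifies "power saving" vs "high performance" when both cards are discrete, so treat the value as something to experiment with:

```python
import winreg

# Assumed browser path -- adjust to whatever executable you want pinned.
APP_PATH = r"C:\Program Files\Google\Chrome\Application\chrome.exe"
# 0 = let Windows decide, 1 = power saving, 2 = high performance.
PREFERENCE = "GpuPreference=1;"

# Same store that Settings > System > Display > Graphics writes to.
key = winreg.CreateKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\DirectX\UserGpuPreferences",
)
winreg.SetValueEx(key, APP_PATH, 0, winreg.REG_SZ, PREFERENCE)
winreg.CloseKey(key)
print(f"Set {APP_PATH} -> {PREFERENCE}")
```

The app has to be fully restarted before the preference takes effect.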

In the NVIDIA Control Panel I've tried setting the browser to use the 1060 for both of these settings: CUDA - GPUs and OpenGL rendering GPU.
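
As far as I can tell, those two toggles only cover CUDA and OpenGL work, not the D3D11/DXVA path browsers use for video decode, which may be why they had no effect. To at least confirm what per-app preferences Windows itself has stored, a companion read-back sketch for the registry key above:

```python
import winreg

key = winreg.OpenKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\DirectX\UserGpuPreferences",
)
i = 0
while True:
    try:
        app, pref, _ = winreg.EnumValue(key, i)  # (name, data, type)
    except OSError:  # raised once we run out of values
        break
    print(app, "->", pref)
    i += 1
winreg.CloseKey(key)
```
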
I'm also aware I could set an FPS cap on the game to help, but I'd really like to avoid that given the poor performance of today's titles.