Encode with AMD, play with Nvidia

Aug 2, 2018
[Moderator Note: Moved post to Graphics Cards.]

I have a system that runs on an Nvidia 1050 Ti GPU. It's a great card, but I recently started streaming with Streamlabs and I lose frames in some games.

I was wondering if I can use an old AMD R7 250 as a dedicated encoder GPU and still game on the Nvidia card.
 
So... I would start by just using the NVIDIA encoder and saving recordings directly to your C: drive (I believe it defaults to Documents -> Videos).
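Local recordings at high bitrates eat disk space quickly, so a quick back-of-the-envelope estimate helps before long capture sessions. A minimal sketch (my own arithmetic, not from this thread; container overhead is ignored):

```python
# Approximate file size of a recording: bitrate in megabits/second times
# duration in seconds, divided by 8 to convert bits to bytes.
# Rough estimate only -- real files vary with container overhead.
def recording_size_mb(bitrate_mbps: float, seconds: float) -> float:
    """Approximate recording size in megabytes."""
    return bitrate_mbps * seconds / 8

print(recording_size_mb(50, 60))   # a 1-minute clip at 50 Mbps is about 375 MB
```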

The settings are slightly confusing at first: open GeForce Experience, turn the In-Game Overlay ON, click Settings inside the overlay, then Video Capture.

Experiment with the bitrate, 30 FPS vs. 60 FPS encoding, etc., and capture a few minutes in several games (even the desktop) to see how it works.
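If you're unsure where to start with bitrate, one common rule of thumb (my assumption, not anything this thread specifies; tune to taste) is roughly 0.1 bits per pixel per frame:

```python
# Suggested starting bitrate using a ~0.1 bits-per-pixel-per-frame rule of
# thumb. This is a heuristic starting point, not an official recommendation.
def suggested_bitrate_mbps(width: int, height: int,
                           fps: int, bpp: float = 0.1) -> float:
    """Return a suggested encoding bitrate in megabits per second."""
    return width * height * fps * bpp / 1_000_000

print(suggested_bitrate_mbps(1920, 1080, 60))  # ~12.4 Mbps for 1080p60
print(suggested_bitrate_mbps(1920, 1080, 30))  # ~6.2 Mbps for 1080p30
```

Halving the frame rate halves the suggested bitrate, which is one reason 30 FPS capture is gentler on upload bandwidth.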

I just test with ALT + F9 to start and then stop recording manually (desktop or game).

Once you confirm that's working fine, you can move on to figuring out how to upload/broadcast in the same program, or switch to another program. Keep in mind that CPU encoding is different: it's very demanding on the CPU, whereas GPU encoding (what NVIDIA does, and what OBS can also do) uses the GPU's dedicated encoder and is not very demanding at all. In fact, I recorded Skyrim SE with VSYNC off (so no FPS cap) and it didn't drop the FPS at all; it stayed right at 78 FPS while I left my character looking in one position.

I've seen very little reason to use the CPU to encode. I've encoded at 2560x1440 with small text and it looks nearly perfect to me. YouTube usually applies its own compression anyway, so even if a CPU encode can do slightly better, nobody can tell the difference once they watch the compressed version.

Anyway, hope this helps.
 
You can, without any issue at all, but it depends on which software you use.
With OBS you need to enable the AMD OpenGL encoder option in OBS (there are tutorials for it). I think you don't even have to install drivers for it; you just select the GPU you want to use for encoding.
 
Uh... I don't think that AMD card is of any use.

AFAIK there are two ways to encode video:

1) CPU (software encoding) - very demanding, so you usually need a good CPU to avoid dropped frames

2) GPU (a dedicated hardware section that encodes) - this samples the output video, so it needs to be on the SAME CARD YOU GAME ON.

SO...
I suspect you are using CPU encoding, since I lost at most 5% FPS on my i7-3770K + GTX 680 setup years ago with NVIDIA ShadowPlay. The impact varies slightly with the FPS you choose, the encoding bitrate, whether you upload to the internet or just save to your HDD, etc., but again, I suspect CPU encoding, and that's not even related to the graphics card.
 