Constant confusion about this.
There are several points:
1) What UPLOAD bandwidth do you have?
Most people have very limited upload speeds, so ENCODING at great quality may be rather pointless. Heck, if I upload (not in real time) a high-quality 1080p video I recorded with NVidia's Shadowplay (or whatever it's called now), YouTube never shows the same quality. Even 4K uploads usually don't look as good as Blu-ray 1080p because of how YouTube compresses the data.
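As a rough sanity check, you can compare a stream's bitrate to your upload speed. The numbers below are illustrative (a commonly quoted 1080p60 streaming bitrate is around 6 Mbps, and the 75% usable-headroom figure is my assumption, not a rule):

```python
# Rough upload-bandwidth sanity check (illustrative numbers only).

def can_stream(bitrate_mbps: float, upload_mbps: float, headroom: float = 0.75) -> bool:
    """True if the stream bitrate fits in the usable share of the uplink.

    Only `headroom` (75% here) of the raw upload speed is counted as usable,
    leaving margin for protocol overhead and other traffic -- an assumption.
    """
    return bitrate_mbps <= upload_mbps * headroom

print(can_stream(6.0, 5.0))   # 5 Mbps uplink, 6 Mbps stream -> False (too slow)
print(can_stream(6.0, 20.0))  # 20 Mbps uplink -> True (plenty of room)
```

The point: if your uplink can't carry a high-bitrate stream in real time, encoding at great quality buys you nothing.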
2) Do you use a HARDWARE encoding method or SOFTWARE encoding method?
A hardware method like Shadowplay, or other tools that use the NVidia or AMD hardware encoder, uses very little CPU. Even on my older i7-3770K it doesn't seem to affect the FPS much, if at all.
Frankly, I can't even tell the difference in video quality when I choose the higher encode settings in Geforce Experience. I was once playing back a 1440p recording of my DESKTOP, paused it, forgot about it, and later came back confused about why my clock showed the wrong TIME... the paused recording was indistinguishable from my real desktop.
The video stream is simply shunted from the GPU over to the hardware encoder (NVENC for NVidia), which compresses it in real time. The CPU then uses a small amount of resources (often on an otherwise idle thread) to move the encoded data off the graphics card and out to the Internet.
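To make the hardware-vs-software split concrete, here's a sketch that builds the two corresponding ffmpeg command lines: `h264_nvenc` (GPU hardware encoder) versus `libx264` (CPU software encoder). Both codec names are real ffmpeg encoders; the 6M bitrate and file names are just examples, and the commands are only printed, not executed:

```python
# Sketch: hardware (NVENC) vs software (x264) H.264 encodes with ffmpeg.
# Commands are built and printed only -- running them requires ffmpeg
# and, for NVENC, a supported NVidia GPU.

def encode_cmd(src: str, dst: str, hardware: bool) -> list[str]:
    # h264_nvenc offloads compression to the GPU; libx264 runs on the CPU.
    codec = "h264_nvenc" if hardware else "libx264"
    return ["ffmpeg", "-i", src, "-c:v", codec, "-b:v", "6M", dst]

print(" ".join(encode_cmd("capture.mp4", "out_hw.mp4", hardware=True)))
print(" ".join(encode_cmd("capture.mp4", "out_sw.mp4", hardware=False)))
```

Same input, same bitrate target; the only difference is which silicon does the heavy lifting, which is why the hardware path barely touches your FPS.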
3) When AMD demonstrated the Ryzen R7-1800X (not debating its merits here), it had to create a scenario where the extra cores would show a benefit. It did so by:
a) using a very CPU-demanding "game" (AotS), and
b) encoding with a SOFTWARE method (very CPU-intensive).
That demo was comparing against the i7-7700K. The i7-8700K has 50% more threads (12 vs 8), which is more than enough headroom to handle the extra software encoding.
4) A separate PC?
Nope.
You'd have to have ALL the following going on to benefit from that:
#1 - YouTube can actually display enough quality that your SOFTWARE encode looks better than a hardware encode (very unlikely),
#2 - You have sufficient bandwidth to upload that high quality software encode in real time, and
#3 - The i7-8700K is insufficient to handle the game plus the encode... extremely unlikely. I can't think of a single situation where you'd need that.