Everything works at 100%, all the time. It doesn't matter if the load is only using 20% or 90% of the cpu, the cpu is still running at 100%. Kinda like a car engine at 3000rpm: doesn't matter if it's in 1st gear doing 30mph or 5th gear doing 90mph, it's still at 3000rpm. So your cpu, no matter what you do, will still pre-render frames as fast as it can and hand those pre-rendered frames to the gpu. Depending on detail settings and resolution, the gpu renders those frames and throws them on screen. The gpu also runs as fast as it should, you can't slow it down like that. Some frames are highly detailed, like explosions or grass, so they take longer to paint, but it's still running at its 3000rpm. The longer it takes to paint the picture, the fewer frames it can paint per second.
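That last bit is just arithmetic: fps is the inverse of how long one frame takes to paint. Here's a tiny sketch of that relationship (the millisecond numbers are made up for illustration, not measurements):

```python
# fps is just 1000ms divided by how long one frame takes to paint.
def fps_from_frame_time(frame_time_ms):
    """A frame that takes longer to paint means fewer frames per second."""
    return 1000.0 / frame_time_ms

print(fps_from_frame_time(10.0))  # a simple frame: 100 fps
print(fps_from_frame_time(25.0))  # a heavy frame (explosions, grass): 40 fps
```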
With your 6300, let's say it can only pre-render 60 pictures every second. That's what it sends to the gpu. The gpu laughs at the cpu because it could paint 100 of those pictures every second.
A modern cpu, with the ability to throw 200 frames of that same picture at the gpu, now laughs at the gpu, which is trying hard to get those 100 pictures painted.
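Either way, whichever part is slower sets the frame rate you actually see. A minimal sketch of that idea, using the illustrative numbers from above:

```python
# The slower of the two chips caps the frame rate you actually get.
def effective_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

# FX-6300 feeding a faster gpu: capped at 60, cpu-bound.
print(effective_fps(60, 100))   # 60
# Modern cpu, same gpu: capped at 100, gpu-bound.
print(effective_fps(200, 100))  # 100
```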
Usage is different. With usage, the cpu might be maxed out at 60 pre-rendered frames per second, but the game engine only needs 50% of the cpu to get that, like it only uses 3 of the cores. Games don't always use every available resource, just the ones the code says to use. Same with the gpu.
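That's why the usage number can mislead you. On a 6-core chip like the 6300, a game that loads 3 cores flat out still reads as roughly 50% overall (a rough sketch; real usage counters average over time and threads):

```python
# Overall usage as a simple average across cores: 3 cores flat out
# on a 6-core cpu reads as 50%, even though the game can't go faster.
def overall_usage(cores_busy, total_cores):
    return 100.0 * cores_busy / total_cores

print(overall_usage(3, 6))  # 50.0 - looks like headroom, but there isn't any
```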
The way to tell if the cpu is maxed is to change detail levels. Say you get 60 fps at medium settings. Increase detail to ultra/max. If you still get ~60 fps, then that's all the cpu can do and the gpu can still do more. If fps drops drastically, then the cpu is giving plenty but the gpu is in trouble.
If you believe the cpu isn't enough for Fortnite, you should be quite able to throw that game on epic settings and play at the same fps as at low, ± a few fps. If fps tanks hard, then you have issues with the gpu, or its drivers, or you have 4K DSR set, etc.
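The test above boils down to a simple comparison. Here's a sketch of it as code; the 5 fps tolerance is my own guess at "± a few fps", not a hard rule:

```python
# Sketch of the detail-level bottleneck test: raise settings from
# medium/low to ultra/epic and compare fps before vs after.
def diagnose(fps_low_detail, fps_high_detail, tolerance=5):
    if abs(fps_low_detail - fps_high_detail) <= tolerance:
        # fps barely moved: the gpu had headroom, the cpu was the cap.
        return "cpu-limited"
    # fps tanked: the gpu (or its drivers, or DSR) is the choke point.
    return "gpu-limited"

print(diagnose(60, 58))  # cpu-limited
print(diagnose(60, 35))  # gpu-limited
```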