No. Those numbers don't represent spare capacity, but the share of resources actually in use. Both the GPU and CPU always run at full speed. If the clock says 4.0GHz, that's what it runs at whether 10% of its resources are being used or 99%.
All those numbers mean is that in that particular game, the CPU can put out its maximum fps, needing 100% of its resources to do so, while the GPU only needs 60% of its resources to render that fps at the settings you've enabled. Change your resolution to 4K DSR and you'll see the CPU drop below 100% while the GPU sits at 99% usage: the GPU can no longer keep up with the maximum fps the CPU allows, so the CPU no longer has to prepare as many frames per second.
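The logic above boils down to "whichever component hits its limit first sets the fps, and the other one idles the rest of the time." Here's a minimal sketch of that reasoning, with made-up fps numbers chosen to match the 100%/60% example (the real numbers depend entirely on your game, settings, and hardware):

```python
# Hypothetical illustration of the CPU/GPU bottleneck logic.
# cpu_max_fps: frames/sec the CPU can prepare per second.
# gpu_max_fps: frames/sec the GPU can render at the current settings.

def bottleneck(cpu_max_fps, gpu_max_fps):
    # Actual fps is capped by the slower of the two.
    fps = min(cpu_max_fps, gpu_max_fps)
    # Each component's reported usage is roughly the fraction of its
    # maximum throughput that the actual fps demands.
    cpu_usage = round(100 * fps / cpu_max_fps)
    gpu_usage = round(100 * fps / gpu_max_fps)
    return fps, cpu_usage, gpu_usage

# 1080p: CPU is the limit -> ~100% CPU, ~60% GPU.
print(bottleneck(100, 167))   # -> (100, 100, 60)
# 4K DSR: GPU becomes the limit -> CPU drops, GPU pegged at 100%.
print(bottleneck(100, 55))    # -> (55, 55, 100)
```

Same hardware, different resolution, and the "busy" component flips, which is exactly why a usage percentage alone tells you nothing about capacity.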
I have enough (140) 2k/4k/8k scripted mods on my Skyrim that with fps left uncapped, the CPU (i7-3770K) runs at 99% and my GPU (GTX 970) at 63%, for about 130fps at 1080p/60Hz. There's no need for that, so I capped it in game to 90fps; now my Skyrim runs at 55% CPU / 63% GPU. All because the CPU no longer has to process extra frames that are never seen and of absolutely no use on a 60Hz monitor.
Tune your settings. You'll get better real-life results than from artificial benchmarks you have no use for.
Stutter mainly happens when you have a 60Hz monitor and are pushing to hit 60fps. Normally each frame is displayed for 1/60th of a second and replaced 1/60th of a second later, but if the game introduces something new that forces the engine to make a correction, the exact same frame stays on screen for 1/30th of a second and the new frame arrives 1/30th later. That's a stutter you can visibly see. Turn off any v-sync, maximize GPU settings, and minimize CPU-bound settings such as grass detail or viewing distance. With a 100% load, the CPU has no headroom for new data, so threads get prioritized and some are put on hold until needed, which creates a holdup your GPU reproduces as stutter.
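The frame-timing arithmetic above can be sketched in a few lines (times in milliseconds, 60Hz assumed; the frame sequences are illustrative, not measured):

```python
# Frame pacing on a 60Hz monitor: each refresh is 1/60th of a second.
REFRESH_MS = 1000 / 60        # ~16.7 ms per refresh

# Smooth run: every frame is ready on time, each shown for one refresh.
smooth = [REFRESH_MS] * 4

# One missed frame: the old frame is repeated for two refreshes
# (2 * 1/60 = 1/30th of a second), which the eye registers as a stutter.
stutter = [REFRESH_MS, 2 * REFRESH_MS, REFRESH_MS]

print([round(t, 1) for t in smooth])   # -> [16.7, 16.7, 16.7, 16.7]
print([round(t, 1) for t in stutter])  # -> [16.7, 33.3, 16.7]
```

A single frame held for 33ms in a stream of 17ms frames is why a stutter is visible even though the average fps barely moves.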