I recently upgraded from a GTX 1070 to an RTX 2080. The 1070 used to run at ~99% usage in most games with CPU usage topping out around 80%, which I take to mean the CPU was feeding the GPU fast enough for it to work as hard as it could. Since upgrading to the 2080, though, GPU usage is usually 40-60% with CPU usage above 90%, and each core hits 100% at times. Has this upgrade pushed my old i5-3570K to the point where it can't keep up with the 2080?

I game on a 1440p 144Hz G-Sync monitor. I usually play PUBG, Rocket League and No Man's Sky, as well as the odd game from my Steam library. Most games aren't pushing above the 100 FPS I'd expect from this GPU. I've been through the Nvidia Control Panel to make sure the card is set to maximum performance, etc.
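For what it's worth, here's how I've been double-checking the in-game overlay numbers: logging GPU utilization once a second with nvidia-smi (ships with the NVIDIA driver; assuming it's on PATH) while a game is running, then looking at the log afterwards:

```shell
# Log GPU core and VRAM-controller utilization once per second to a CSV.
# Run this in the background during a gaming session; Ctrl+C to stop.
nvidia-smi --query-gpu=timestamp,utilization.gpu,utilization.memory \
           --format=csv -l 1 > gpu_usage_log.csv
```

If the logged GPU utilization sits well below ~95% while frame rates are capped, that points at a CPU (or engine) bottleneck rather than the card itself.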
i5-3570K OC'd to 4.4GHz on all cores
Asus P8Z77-V LX2 motherboard
4x 4GB Patriot DDR3-1600 RAM
XFX XXX 850W PSU (overkill, I know)
EVGA RTX 2080 XC Ultra
I was considering upgrading to a Ryzen 5 2600 in the short term to try to get GPU usage higher, then swapping it out for a Ryzen 3000 part if needed. Would this CPU swap likely remove the CPU bottleneck? Also, if I'm planning to get a Ryzen 5 3000-series part eventually, would there be any benefit to getting an X470 board now for the 2600, since it might have better support for a 3600 in the future?
Thanks for your time