Question: Should I upgrade my CPU?

gtr22

Distinguished
May 11, 2012
I have an i7 8700K, delidded, running at 5 GHz, and an RTX 2080 Ti Founders Edition. The GPU rarely reaches 100% usage, while the CPU runs at 50-60% during games. I'm getting decent frame rates, but it's always been a concern. The only way for me to get the GPU to 100% usage is to change the resolution to 4K, but my monitor is native 1080p and the gameplay isn't the smoothest that way. Temperatures are fine; both run at 50-60°C.
In Fallout 76 I used to run two accounts at the same time with the CPU at 50% usage. After a patch, usage is 70%; I'm not sure if that's from the patch or if my CPU can't handle it anymore.

This GPU has always been under 100%, so I'm trying to figure out where the problem is.

ROG Strix Z390-E Gaming motherboard
Corsair 16 GB RAM @ 3200 MHz
Corsair HX750i PSU
Asus G248 144 Hz monitor with a DisplayPort cable
 
Well, that motherboard can only take up to an i9 9900K, if that's enough for you (8 cores, 16 threads). Otherwise, if you wanted to jump to 12th gen, even a 12400F is pretty much a beast compared to your 8700K.
 
Hello there!
Generally at 1080p you are mostly bottlenecked by your CPU, while 4K is more GPU-intensive, so 100% GPU usage at 4K makes sense. What other games do you play, and what are you trying to achieve?
Also, as @Nine Layer Nige mentioned, check that your monitor is set to 144 Hz; you can change it in the display settings.
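A back-of-the-envelope way to see why the CPU caps GPU usage at 1080p: at 144 Hz each frame has about a 6.9 ms budget, and if the CPU needs longer than that to prepare a frame, the GPU sits idle part of the time. A minimal sketch in Python (all per-frame timings are assumed for illustration, not measured from this system):

```python
# Hypothetical per-frame costs in milliseconds (illustrative, not measured).
TARGET_HZ = 144
frame_budget_ms = 1000 / TARGET_HZ  # ~6.9 ms per frame at 144 Hz

cpu_ms = 10.0  # assumed time the CPU needs to prepare one frame
gpu_ms = 6.0   # assumed time the GPU needs to render one frame at 1080p

# The slower of the two stages sets the actual frame rate.
fps = 1000 / max(cpu_ms, gpu_ms)

# The GPU only works for gpu_ms out of each max(cpu_ms, gpu_ms) interval,
# so its utilisation is capped by the CPU whenever cpu_ms > gpu_ms.
gpu_usage = gpu_ms / max(cpu_ms, gpu_ms) * 100

print(f"budget: {frame_budget_ms:.1f} ms, fps: {fps:.0f}, GPU usage: {gpu_usage:.0f}%")
```

With these made-up numbers the GPU sits at 60% even though nothing is wrong with it, which matches the pattern described in the thread.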
 

gtr22

I just ran 3DMark Time Spy (64-bit).
GPU score: 13709
Graphics score: 15554
CPU score: 8199

Which is great according to their posts.

@General Kenobi, as I mentioned, I'm trying to find out why my GPU runs at around 60% usage in most of my games.

I know Battlefield is a CPU-intensive series. BF V, BF 2042, Fallout 76, Red Dead Redemption 2 (story mode), and Hitman 3 run at 100% usage. My monitor is set to 144 Hz.
 

Nighthawk117

Notable
Sep 27, 2021
Well, it sounds like a CPU bottleneck. 144 Hz in Battlefield V, for example, will push a 6-core chip to very high usage; at 1080p/144 Hz it will bottleneck a 2080, and the same is true of BF 2042.
That being said, are you unhappy with the performance you are getting?

Also, on BF V, are you running DirectX 12? I get much better frame rates than on 11.
 

gtr22

I think I ran DX11 on BF V before I uninstalled it. My performance is good, but it would be nice to get the most out of my GPU. I don't overclock it; it doesn't change much.
 

Nighthawk117

Oh OK, it runs much better with DirectX 12. With all the bits they've added over the years, there's quite a significant difference between the two now. In terms of upgrades, I've used both 6- and 10-core CPUs with an Nvidia 3080, and yes, you could get more out of your GPU in some titles at 144 Hz if you had a faster CPU. The Battlefield games at high refresh rates will be smoother and hold the frame rate more reliably. You're kind of on the edge: yes, a new CPU will definitely be better, but most of the time it's not going to wow you.

If you did upgrade, I'm against the 9900K as an easy drop-in; I think it's not a big enough jump to justify the expense. I wouldn't say your GPU is wasted, as some games like Cyberpunk will push a 2080 Ti to its limit even at 1080p/60 Hz. However, if you want the smoothest 144 Hz performance possible in something like Battlefield 2042, then I'd recommend going to a 12700K. If your 8700K really isn't enough, then any upgrade short of a 12700K is probably not going to satisfy you over the next 3 or 4 years.
 
The 12700K/F/KF is pretty much worth the power draw versus the 9900K, and I agree with Nighthawk's opinion. You could grab a 12100F plus a B660 or Z690 motherboard if going straight to an i7 12700K/F/KF doesn't fit your budget yet. It will bottleneck performance, but not by a lot. Since Raptor Lake is coming next, you might want to consider the 12100F as a temporary upgrade and wait until Intel releases it. Alder Lake's single-core performance is a beast compared to Rocket Lake, and even more so compared to your Coffee Lake.
 

Nighthawk117

My two cents, if buying now: go big with a 12700K, or stick with your 8700K and worry about it when your performance reaches a level where you decide you definitely need more. I'm assuming you bought your 8700K when it was first released and have been using it for 4+ years. 144 Hz makes you a premium gamer, so in my mind, if you're going to use a chip for another 4 years, a 12700K is likely to hold up better over time than, say, a 12400F, even though the latter is a great little chip and would still be a noticeable improvement at that frame rate thanks to Alder Lake's very strong IPC.

Salt needed, but from leaks Raptor Lake seems to bring a ~10% IPC gain, a larger cache, and double the efficiency cores, so I wouldn't expect substantial increases in gaming performance over Alder Lake. Meteor Lake, which comes after, is a much bigger improvement, though that is 2 years away.
 

gtr22

That's a new motherboard + CPU + waterblock, and by then maybe a 3080 Ti would be available. I guess I'll wait it out. Do you think running my CPU at a lower clock speed would increase my GPU usage?
 

Nighthawk117

No. If it's a CPU bottleneck, then *increasing* your CPU clock speed would increase GPU usage (not realistic if you're already at 5 GHz); lowering it would only make things worse. The problem is you're trying to run at 144 Hz and the CPU is struggling to prepare frames quickly enough, so the GPU is left waiting, which is why it's not fully utilised. If you want to increase GPU usage in something like Red Dead 2, just increase your graphics settings and get better visuals instead of the super-high frame rate.
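The "raise your graphics settings" point can be sketched numerically: heavier settings or a higher resolution grow the GPU's per-frame cost until the GPU, not the CPU, is the limiting stage. A simple pipeline model in Python (all per-frame timings are assumed for illustration):

```python
def predict(cpu_ms, gpu_ms):
    """Return (fps, gpu_usage_percent) for given per-frame stage costs.

    Simple pipeline model: the slower stage sets the frame rate, and the
    GPU idles whenever the CPU is the slower stage.
    """
    bottleneck_ms = max(cpu_ms, gpu_ms)
    return 1000 / bottleneck_ms, 100 * gpu_ms / bottleneck_ms

cpu_ms = 8.0  # assumed CPU frame-prep time; roughly constant across resolutions

# GPU render cost grows with settings/resolution (assumed values).
for label, gpu_ms in [("1080p low", 4.0), ("1080p ultra", 8.0), ("4K", 16.0)]:
    fps, usage = predict(cpu_ms, gpu_ms)
    print(f"{label:12s} fps={fps:5.1f} gpu_usage={usage:5.1f}%")
```

With these made-up numbers, "1080p low" is CPU-bound (GPU at 50%), while "1080p ultra" and "4K" pin the GPU at 100%, mirroring why only 4K filled the 2080 Ti in the original post.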
 
