[SOLVED] Cpu Bottleneck?

Lushern1309

Reputable
Aug 22, 2019
164
15
4,615
Hey guys, I have a Ryzen 3 1200 paired with a GTX 1650. I've noticed in The Witcher 3 that my CPU usage is really high (70-100% average) while my GPU is only at about 40-45%. Is this a possible CPU bottleneck? The 1650 is only slightly better than a 1050, and the Ryzen 3 1200 supposedly pairs comfortably with up to a GTX 1060, possibly a 1070, with no bottleneck. Am I misinformed, or is something wrong here? I've only been monitoring in The Witcher 3: 900p, high graphics settings, and only anti-aliasing on in post-processing.
 
Solution
The CPU pre-renders frames according to the game code, nothing else. If it can dish out 100fps, it'll do so regardless of detail settings or resolution. It'll always run at 100% of its ability, but usage is not ability: usage is how much of the CPU's resources are needed to deliver that ability. 100% usage means it can't handle the code; the game is too intense. Too much bandwidth use, too much code, too much thread usage.

After the CPU is done with the pre-renders, it ships them to the GPU. That's where details and resolution change things. The higher the details and resolution, the more resources the GPU needs to get that 100fps on screen. At 720p that's chump change; it doesn't take much GPU at all to get all 100fps up. At 4k...

WildCard999

Titan
Moderator
is DSR available on the GTX 1650?
Yes, it's an Nvidia feature; VSR is the AMD version.

What it does is render the game at a higher resolution, then scale it back down to your native resolution, forcing the GPU to be better utilized. However, with a lower-end GPU you're really going to need to adjust the graphics quite a bit to keep the gameplay smooth.

If gameplay is already smooth, then don't worry about it and keep gaming.
 
I would not say the Ryzen 3 1200 can feed a 1070... not sure who told you the 1200 "pairs comfortably" with a 1070. I don't think it's enough CPU even for the 1650 @ 1080p if you want good fps (i.e. consistently above 60). All the benchmarks I've seen of the 1200 indicate it's not capable of pushing most games to 60fps 1% lows @ 1080p. There are exceptions with older titles, but in general it's not recommended for gaming. I believe your CPU is the bottleneck you're looking for; a 2600 would be a much better choice for gaming.
 
Last edited:
I would not say the Ryzen 3 1200 can feed a 1070... not sure who told you the 1200 "pairs comfortably" with a 1070. I don't think it's enough CPU even for the 1650 @ 1080p if you want good fps (i.e. consistently above 60). All the benchmarks I've seen of the 1200 indicate it's not capable of pushing most games to 60fps 1% lows @ 1080p. There are exceptions with older titles, but in general it's not recommended for gaming. I believe your CPU is the bottleneck you're looking for; a 2600 would be a much better choice for gaming.
Witcher 3 is a GPU-heavy game. A Ryzen 3 1200 and a GTX 1070 would get you nearly the same fps as an i5 9600k or an OC'ed i7 2600k, around 80fps average at 1080p max settings. At 4k it would be 35-40fps average. The Ryzen 3 1200 is also about as fast as an OC'ed 2600k, which can already do over 100fps in a lot of games depending on the GPU used.
 
Witcher 3 is a GPU-heavy game. A Ryzen 3 1200 and a GTX 1070 would get you nearly the same fps as an i5 9600k or an OC'ed i7 2600k, around 80fps average at 1080p max settings. At 4k it would be 35-40fps average. The Ryzen 3 1200 is also about as fast as an OC'ed 2600k, which can already do over 100fps in a lot of games depending on the GPU used.
Witcher 3 is notorious for maxing out quad-core/quad-thread CPUs and benefits from higher core/thread counts.
 
My point was that if the OP were to upgrade something to play Witcher 3 at a higher fps, a new GTX 1070/1660ti/RTX2060 would be better now than a new CPU. A better CPU doesn't automatically mean everything will run faster, but I'm not saying don't upgrade it later.
Not convinced; I would not be surprised if the OP is already CPU-bound or close to it. Easy enough to test: if the OP drops the resolution to 720p and the FPS stays similar, then they are CPU-bound.
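The resolution-drop test described above can be sketched as a simple rule of thumb. This is just an illustration; the `likely_cpu_bound` helper, the 10% tolerance, and the fps numbers are all made up for the example:

```python
def likely_cpu_bound(fps_native, fps_720p, tolerance=0.10):
    """Heuristic from the test above: if dropping to 720p barely
    changes the frame rate, the GPU wasn't the limit, so the CPU
    likely is. `tolerance` (10%) is an arbitrary cutoff for 'similar'."""
    return abs(fps_720p - fps_native) / fps_native <= tolerance

# Hypothetical readings: 55 fps at 900p vs 58 fps at 720p
print(likely_cpu_bound(55, 58))   # True  -> CPU-bound
# Hypothetical readings: 40 fps at 900p vs 75 fps at 720p
print(likely_cpu_bound(40, 75))   # False -> GPU was the limit
```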
 
CPU usage is going to be high in almost every game today with any quad core; it's a fact of life.
If you want much higher GPU usage, crank up details, resolution (at least 1080p), texture detail/AA, etc. By the time you are at a semi-'slide show' in frame rates, CPU usage will come down some, if that is any sort of 'solution'!
My advice: find something else to 'investigate' or 'fix'? :) (Or get an R5 1600, minimum....)
 
Not convinced; I would not be surprised if the OP is already CPU-bound or close to it. Easy enough to test: if the OP drops the resolution to 720p and the FPS stays similar, then they are CPU-bound.
In Witcher 3, dropping down to 720p will just keep CPU usage at 100%, and GPU usage will drop even lower. What the OP needs to do is increase the graphics settings and the resolution past 900p to 1080p or higher, and CPU usage will actually drop as GPU usage increases. Enable Nvidia Dynamic Super Resolution (DSR) to get 1440p and 4k resolutions in-game and you will see what I'm saying.
 

boju

Titan
Ambassador
In Witcher 3, dropping down to 720p will just keep CPU usage at 100%, and GPU usage will drop even lower. What the OP needs to do is increase the graphics settings and the resolution past 900p to 1080p or higher, and CPU usage will actually drop as GPU usage increases. Enable Nvidia Dynamic Super Resolution (DSR) to get 1440p and 4k resolutions in-game and you will see what I'm saying.

If increasing resolution/details sees a drop in CPU usage, it isn't solely because the GPU is now working harder; it's because the CPU isn't being called on as much for pre-rendered frames. Depending on how CPU-demanding a game is in other areas (game-code calculations/physics, maps, multiplayer, audio, etc.), a quad core, as an example, 'may' not have enough resources left to satisfy a faster GPU.

This is what I mean about frame pre-rendering and how it affects CPU usage.
 

Lushern1309

Reputable
Aug 22, 2019
164
15
4,615
Hey guys, thank you all for your input. I've since tried the Resident Evil 2 remake, and the game runs well at high settings at 1080p with lower CPU usage and an almost constant 60fps. I'll think about going 2nd-gen Ryzen in a few months.
 

Lushern1309

Reputable
Aug 22, 2019
164
15
4,615
In Witcher 3, dropping down to 720p will just keep CPU usage at 100%, and GPU usage will drop even lower. What the OP needs to do is increase the graphics settings and the resolution past 900p to 1080p or higher, and CPU usage will actually drop as GPU usage increases. Enable Nvidia Dynamic Super Resolution (DSR) to get 1440p and 4k resolutions in-game and you will see what I'm saying.
My little 1650 will die at 1440p, let alone 4k, lol
 
My little 1650 will die at 1440p, let alone 4k, lol
I did some testing this morning on my system running Witcher 3, and I think the reason you were seeing only 40-45% GPU usage is that you have the 30fps cap on in the video options. Using a GTX 1070, I set the 30fps cap and saw about 73% CPU usage and 36% GPU usage with high settings (HairWorks off) at 1600x900. I disabled Hyper-Threading on my i7-2600k for the testing.
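As a rough sanity check of the fps-cap explanation: if the GPU is capped well below what it could render, its usage is roughly the fraction of its uncapped throughput that the cap asks for. The `approx_gpu_usage` helper and the ~83fps uncapped figure are assumptions for illustration, not measurements from the post:

```python
# Rough model: with an fps cap, GPU usage is about the fraction of the
# GPU's uncapped throughput that the cap actually demands.
def approx_gpu_usage(fps_cap, uncapped_fps):
    return min(1.0, fps_cap / uncapped_fps) * 100

# If the GTX 1070 could render ~83 fps uncapped at these settings
# (a guess), a 30 fps cap would show roughly:
print(round(approx_gpu_usage(30, 83)))  # prints 36
```

That lines up with the ~36% GPU usage observed above, which is why a hidden frame cap is a plausible explanation for low GPU usage.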
 

Lushern1309

Reputable
Aug 22, 2019
164
15
4,615
I did some testing this morning on my system running Witcher 3, and I think the reason you were seeing only 40-45% GPU usage is that you have the 30fps cap on in the video options. Using a GTX 1070, I set the 30fps cap and saw about 73% CPU usage and 36% GPU usage with high settings (HairWorks off) at 1600x900. I disabled Hyper-Threading on my i7-2600k for the testing.
Yeah, I suspect that my GPU can push higher resolutions than what I throw at it; I've pretty much stuck to 720p in every game I play. I had an R7 250 prior to the 1650, so I was really overconservative, I guess. I can't even OC my Ryzen; it was a bit dumb getting an A320 motherboard. Do you think an OC-capable board or a higher-thread CPU is the better option?
 
Yeah, I suspect that my GPU can push higher resolutions than what I throw at it; I've pretty much stuck to 720p in every game I play. I had an R7 250 prior to the 1650, so I was really overconservative, I guess. I can't even OC my Ryzen; it was a bit dumb getting an A320 motherboard. Do you think an OC-capable board or a higher-thread CPU is the better option?
More CPU threads will make most games run better, and that's generally the first upgrade most people should make. I did notice stutter when running Witcher 3 with Hyper-Threading off. I would say that if you can find a good deal on a Ryzen 5 2600, get that before getting a new motherboard. A faster GPU will increase your FPS the most in Witcher 3, but an 8+ thread CPU will get you a smoother game at lower fps with higher settings. So, because you were likely running graphics settings too low, I would get a Ryzen 5 2600 or even a 3600 first before upgrading the GPU. The only thing you will need to do is update the BIOS to accept the new CPU, if you haven't already.
 

Karadjgne

Titan
Ambassador
The CPU pre-renders frames according to the game code, nothing else. If it can dish out 100fps, it'll do so regardless of detail settings or resolution. It'll always run at 100% of its ability, but usage is not ability: usage is how much of the CPU's resources are needed to deliver that ability. 100% usage means it can't handle the code; the game is too intense. Too much bandwidth use, too much code, too much thread usage.

After the CPU is done with the pre-renders, it ships them to the GPU. That's where details and resolution change things. The higher the details and resolution, the more resources the GPU needs to get that 100fps on screen. At 720p that's chump change; it doesn't take much GPU at all to get all 100fps up. At 4k it's going to fail: GPU usage will be much higher and it still won't get all 100fps up, probably closer to 30.

DSR is where the GPU receives the 100 pre-rendered frames from the CPU and finish-renders them at a higher resolution, but puts each finished frame on screen at a lower resolution. The GPU works harder and needs more resources, but still only puts a 1920x1080 amount of pixels in each frame, not the full 3840x2160 (4k) amount. So the GPU output is the same workload, the GPU render is harder, and the GPU input doesn't change. Your 1650 can do DSR; you'd get a sharper, clearer, cleaner look per frame, usage would go up, and whether it's still capable of that 100fps on screen would depend on resolution and details.
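The extra render work DSR asks for is easy to put in numbers. A quick sketch of the pixel counts involved (assuming a 1080p native display and a 4x DSR factor, as in the example above):

```python
# Pixel counts involved in DSR: the GPU renders each frame at a higher
# internal resolution, then downscales it to the native output.
native = 1920 * 1080      # 2,073,600 pixels actually sent to the screen
internal = 3840 * 2160    # 8,294,400 pixels the GPU has to render

print(internal / native)  # prints 4.0 -- the render workload quadruples
```

The screen still only ever receives the native pixel count; only the render step gets heavier, which is why DSR drives GPU usage up without changing what the CPU or display has to do.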

Basically, if the GPU is capable of 100fps but only receives 60fps from the CPU, that's what you get: 60fps. The CPU is choked by Witcher 3, and only a more capable CPU (more threads) will set the GPU free to reach its limits. Only by choking the GPU will the CPU catch a break, as it's no longer forced to supply the 100fps demand.
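The point above boils down to a one-line model: on-screen fps is capped by whichever side is slower. The function name and fps figures here are hypothetical, just to illustrate the idea:

```python
# On-screen fps is limited by the slower of the two stages:
# the CPU's frame preparation or the GPU's rendering.
def onscreen_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

# Hypothetical: CPU can prepare 60 fps, GPU could render 100 at low res
print(onscreen_fps(60, 100))  # prints 60 -- GPU sits idle waiting on the CPU
# Same CPU, but the GPU only manages 30 fps at 4k
print(onscreen_fps(60, 30))   # prints 30 -- now the CPU gets a break
```

This is also why raising resolution lowers CPU usage: once the GPU becomes the slower stage, the CPU is only asked for as many frames as the GPU can finish.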
 
Solution

WildCard999

Titan
Moderator
The only thing you will need to do is update the BIOS to accept the new CPU, if you haven't already.
Just wanted to add: if you update the BIOS, make sure you apply the updates in order, and I wouldn't skip any. A user a couple of weeks ago went from an old BIOS on an A320 straight to the newest one and bricked the motherboard, and since it's considered user error, the manufacturer refused the RMA.
 

Lushern1309

Reputable
Aug 22, 2019
164
15
4,615
Just wanted to add: if you update the BIOS, make sure you apply the updates in order, and I wouldn't skip any. A user a couple of weeks ago went from an old BIOS on an A320 straight to the newest one and bricked the motherboard, and since it's considered user error, the manufacturer refused the RMA.
Oh my word, that is exactly what I did a few days ago. Should I be worried now? The system runs fine.
 

Karadjgne

Titan
Ambassador
Agreed. Some of the BIOS updates are cumulative and some aren't. The ones that aren't are little more than addendums to the existing BIOS, not full packages. So if you skip an update, you might miss prior microcode that's needed for full implementation. And once the BIOS is bricked, you can't just start the PC and add what you missed, not unless you got lucky and bought a board with dual BIOS and the second one is factory-virgin.
 
My point was that if the OP were to upgrade something to play Witcher 3 at a higher fps, a new GTX 1070/1660ti/RTX2060 would be better now than a new CPU. A better CPU doesn't automatically mean everything will run faster, but I'm not saying don't upgrade it later.

The part you're missing is the 1% lows I mentioned. All of the benchmarks claiming the 1200 is adequate for gaming generally only include the average FPS. When the 1% lows are looked at, the 1200 underperforms, leading to laggy play and dropped frames, something the 2600 generally has no issue with.