CPU architecture bottleneck (despite low load)?

shohag2018

Hi, I apologize if I'm putting this thread in the wrong section, but I figured it'll get a better response here than in the hardware section.

I am running a 1st-generation i7 860 OCed to 3.5GHz (so basically an 870 - a tier 3 CPU: http://www.tomshardware.com/reviews/cpu-hierarchy,4312.html) and 8GB of 1333MHz RAM, coupled with a GTX 970. I game on a 720p monitor at 60Hz.

Now, the first two games I've run on my recent build are AC IV: Black Flag and Fallout 4 (perhaps not the most optimized games out there, but still...).

In both these games there are places where my frames drop from 60 to the 40s (even 30s), while both my CPU and GPU load remain below 70% (at times the CPU sits at 30% in AC IV and I'll still get low frames!). It's really annoying to have such low frames per second while having unused hardware resources. I mean, it would be understandable to have leftover resources after reaching the 60FPS V-sync cap, since I am gaming at a low resolution. But 30 frames with low loads on both fronts? It feels absurd.



I have tried various things, including OCing my CPU beyond 4GHz, with no difference in FPS whatsoever. No matter what I do, these drops keep happening at certain intense parts of these games.

Now, the popular belief is: high CPU usage with low GPU load denotes a clear CPU bottleneck. But with all the tweaks and tunes I have tried over the past few days (including DSR to vary the load on the GPU), the results lead me to the conclusion that my CPU might somehow still be bottlenecking the GPU despite staying at a very low load. Is it possible that the old architecture of the CPU itself, irrespective of overclock or usage, is causing the limit, or is it just a plain and simple case of bad software coding in these games?
I have cash at hand to upgrade my CPU + motherboard, but the thing is, I've also come across people on this very site who reported little to no gain in gaming after upgrading from i7 Lynnfields to later gens, and some would go as far as to say they'd gladly take the old CPU back if the money spent were returned to them!

I am really confused at this point: will spending on my system achieve any significant improvement, or is it simply going to be a waste of money? After all, there are videos on YouTube of an i7 860 running a GTX 980 Ti at 99% load in several games, including Crysis 3! https://www.youtube.com/watch?v=4HZmJKu7ESU
 
Hi,

1) First of all, a game can rarely use all four cores of your CPU, so it won't necessarily show 100% load even if the CPU is the bottleneck. For example, let's say Starcraft 2 could use only TWO CORES; then you might show a bit over 50% usage (Windows uses some too) even though the CPU is the bottleneck.

A CPU bottleneck holds the GPU back too (hence both show some load, but neither is maxed out).
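
To put a rough number on that (just an illustration, not from any benchmark; it assumes Python with the psutil package installed): a couple of pegged threads on an 8-thread chip barely move the "overall" usage figure that most monitoring tools report.

```python
# Minimal sketch: a game hammering only two of eight logical CPUs
# still averages out to a low "overall" figure, which is what most
# overlays and Task Manager's main graph show.
import psutil

per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # one sample per logical CPU
overall = sum(per_core) / len(per_core)                   # the usual headline number

print("per-core:", per_core)
print("overall : %.0f%%" % overall)
# e.g. [97, 5, 96, 4, 6, 3, 5, 4] averages to roughly 28%, even though
# two cores are pegged and the game is clearly CPU-limited there.
```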

2) CPU OC to 4GHz:
I would have expected this to help, though it's only about a 14% clock increase, so it wouldn't be that obvious in normal gaming.

3) Bad code?
Well, arguably any major stutter is "unoptimized code" depending on your definition, but there's always a bottleneck somewhere. In a game it's normally the CPU or the GPU.

If you dropped to 10FPS and your CPU were 2X as fast (same architecture), the game might still drop to 20FPS at that spot.
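
The arithmetic behind that, purely as an illustration with made-up numbers:

```python
# Toy numbers only: a 10FPS dip means ~100ms of work per frame.
# Halving the CPU side of that work doubles the frame rate at that
# spot, but it's still nowhere near 60FPS.
slow_cpu_ms = 100.0               # 1000ms / 10FPS
fast_cpu_ms = slow_cpu_ms / 2     # a CPU twice as fast, same architecture

print(1000 / slow_cpu_ms)         # 10.0 FPS
print(1000 / fast_cpu_ms)         # 20.0 FPS
```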

4) CPU SCALING:
I'll try to find some benchmarks to show some real-world info; however, it VARIES a lot between games.
 
Black Flag and Fallout 4:
Both games are known to have large frame drops, so they're hard to use as a reference.

TOMB RAIDER:
This is a game with minimal CPU bottleneck. Observe the bottom graph:
http://www.techspot.com/review/645-tomb-raider-performance/page5.html

Now look at F4:
http://www.gamersnexus.net/game-bench/2182-fallout-4-cpu-benchmark-huge-performance-difference

For reference, the FX-8320E should be about the same as your i7-860 at 4GHz.

"Once moving into the power efficient 8370E and 8320E, we've fallen so far in framerate that we've landed below 60FPS. This shows just how substantial an impact the CPU has in Fallout 4, and not because of anything resembling masterful CPU or thread utilization... "



Other:
DSR simply artificially raises the resolution.

Other:
Raising the resolution can shift the CPU bottleneck to the GPU; however, any time you raise the resolution (and change no other settings) your frame rate can only go down. If you're dropping to 30FPS, you aren't going to go higher than 30FPS at the same point by raising the resolution. You'll either stay the same or possibly drop.
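
One way to picture it (a toy model with made-up caps, not a benchmark): at any moment the delivered frame rate is roughly whichever of the CPU-side or GPU-side limits is lower, and raising the resolution only lowers the GPU-side limit.

```python
# Toy model: the delivered frame rate is roughly the lower of the
# CPU-limited and GPU-limited rates at that point in the game.
def delivered_fps(cpu_cap_fps: float, gpu_cap_fps: float) -> float:
    return min(cpu_cap_fps, gpu_cap_fps)

print(delivered_fps(cpu_cap_fps=30, gpu_cap_fps=90))  # 720p, CPU-bound: 30
print(delivered_fps(cpu_cap_fps=30, gpu_cap_fps=45))  # 1440p via DSR: still 30
print(delivered_fps(cpu_cap_fps=30, gpu_cap_fps=25))  # push even higher: now 25
```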

Many games will be CPU-bottlenecked with your rig, some may be GPU-bottlenecked (again, see the Tomb Raider benchmark), and some will be CPU- or GPU-limited at different times in the same game depending on what's going on.

In the end, with the same hardware, all you can do is mess with the resolution, AA and other settings and see what happens.

Other:
Adaptive VSYNC (you may want to try using that sometimes). It automatically turns VSYNC OFF if you can't output enough frames to match the monitor (i.e. below 60FPS for a 60Hz monitor).

This gives screen tearing (since VSYNC is turned off automatically) but not the stutter caused by mixing frame times. I force it on in the NVIDIA Control Panel -> Manage 3D Settings -> add game -> ...
 
Solution


Thanks for replying. You are right in saying a higher resolution should put more stress on the GPU, but that won't improve the frames per second at 1080p if they weren't there at 720p in the first place!
And for gaming at that res, I game on a Bravia TV.
 


Please take a look at the pic I attached in the OP and note the individual core loads; I don't think any of the cores is close to being heavily loaded.
 

The frame rate will be more stable at a higher resolution, with fewer FPS dips, compared to a low resolution such as 720p.
 


I have tried 1080p and 1440p; while the GPU load increases a bit, there's still no real gain in terms of frame rate or stuttering at the intensive places.

 
unreal9400 - you have a strange (& flawed) understanding of what raising the resolution does to CPU/GPU loads.

Shohag2018 - as photonboy has already mentioned, you're running a chip that is seen as an 8-core by Windows & the majority of monitoring software - I doubt you'll ever see more than 60% load apart from maybe 3 or 4 games.

Saying that, I would expect minimums in the mid 40s in those games, in all honesty.
 
Fallout 4 - I'm assuming you're maxing settings as you're running at 720p?

In the launcher, change shadow distance to medium - high is an FPS killer in city areas at any resolution.


Black Flag - it's always been terribly optimised & has never looked fluid for me - disable godrays for starters.
 
@madmatt30 thank you very much, good sir, for your input. It makes a lot more sense now. FO4, being the CPU hog that it is, may be causing a bottleneck, but FO4 never goes under 40 for me, so that's bearable. Whereas AC IV is just plain unjustifiable! I mean, would you be kind enough to have a look at my OP (the pic attached)? Looking at the individual core loads in that scene - even if the game sees only 1 or 2 cores of the CPU, wouldn't at least those get stressed?! Horrendous optimisation, I'd say.
Hopefully DX12 will improve my CPU situation...
 
I would suggest trying this: run Black Flag, alt-tab out, and open Task Manager.
Find the executable, right-click & set core affinity to cores 1/3/5/7 only.

It may or may not make a difference, but it's worth trying.
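
If you'd rather not click through Task Manager every launch, here is a rough scripted equivalent (assumes Python with psutil on Windows; "AC4BFSP.exe" is only a guess at the Black Flag executable name, and the 1/3/5/7 choice simply follows the suggestion above - check your own logical-CPU layout):

```python
# Rough equivalent of Task Manager's "Set affinity" (assumes psutil on
# Windows; the executable name below is a guess - use whatever name
# Task Manager actually shows for Black Flag).
import psutil

TARGET = "AC4BFSP.exe"       # hypothetical executable name
CORES = [1, 3, 5, 7]         # roughly one logical CPU per physical core on an i7-860

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == TARGET.lower():
        proc.cpu_affinity(CORES)
        print("Pinned", TARGET, "to logical CPUs", CORES)
```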

I'm in the UK & at work at the minute; when I'm back home I'll check CPU usage on the 3 systems I own.