Hi, I apologize if I'm putting this thread in the wrong section, but I figured it'd get a better response here than in the hardware section.
I'm running a 1st-generation i7 860 overclocked to 3.5 GHz (so basically an 870, a tier 3 CPU: http://www.tomshardware.com/reviews/cpu-hierarchy,4312.html) with 8GB of 1333 MHz RAM, paired with a GTX 970. I game on a 720p monitor at 60 Hz.
Now, the first two games I've run on my recent build are AC IV: Black Flag and Fallout 4 (perhaps not the most optimized games out there, but still...).
In both games there are places where my frame rate drops from 60 to the 40s (even the 30s), while both my CPU and GPU load stay below 70% (at times the CPU sits at 30% in AC IV and I still get low frames!). It's really annoying to have such low frame rates while hardware resources sit unused. It would be understandable to have headroom left over after hitting the 60 fps V-sync cap, since I'm gaming at a low resolution, but 30 fps with low load on both fronts feels absurd.

I have tried various things, including overclocking my CPU beyond 4 GHz, with no difference in fps whatsoever. No matter what I do, these drops keep happening at certain intense parts of both games.
Now, the popular belief is that high CPU usage with low GPU load indicates a clear CPU bottleneck. But after all the tweaks and tuning I've tried over the past few days (including DSR to vary the load on the GPU), the results lead me to suspect my CPU might somehow still be bottlenecking the GPU despite showing very low usage. Is it possible that the old architecture of the CPU itself, regardless of overclock or reported usage, is the limit, or is it just a plain and simple case of bad software optimization in these games?
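For what it's worth, I wonder if the overall CPU % figure is simply hiding a single saturated thread. The i7 860 is 4 cores / 8 threads, and most monitoring tools report the average across all logical cores, so one maxed-out game thread barely moves the overall number. A quick sketch of the arithmetic (the per-thread loads here are made-up example values, not measurements):

```python
# Illustration only: the "overall" CPU % most monitors show is the
# mean across logical cores, which can mask one pegged thread.

def aggregate_usage(per_thread_loads):
    """Overall CPU % as the average across all logical cores."""
    return sum(per_thread_loads) / len(per_thread_loads)

# Hypothetical i7 860 scenario (8 threads): one saturated main/render
# thread, light work on two helpers, the remaining five idle.
loads = [100, 25, 15, 0, 0, 0, 0, 0]
print(aggregate_usage(loads))  # 17.5 -- looks nowhere near a bottleneck
```

So a 30% overall reading in AC IV wouldn't necessarily rule out a per-thread limit; checking per-core graphs instead of the aggregate would tell the real story.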
I have cash on hand to upgrade my CPU and motherboard, but I've also come across people on this very site who reported little to no gain in gaming after upgrading from Lynnfield i7s to later generations, and some went as far as to say they'd gladly take the old CPU back if the money they spent were returned to them!
I'm really confused at this point: will spending on my system achieve any significant improvement, or is it simply going to be a waste of money? After all, there are videos on YouTube of an i7 860 running a GTX 980 Ti at 99% load in several games, including Crysis 3! https://www.youtube.com/watch?v=4HZmJKu7ESU