Is this 4690k a bottleneck?

kol12

Honorable
Jan 26, 2015
2,109
0
11,810
In Battlefield 1, if I turn off vsync to allow unlimited frames my 4690k will pretty much sit at 100% usage. Because the frame rate in this case holds at 90-100 fps, would this be classed as a minor bottleneck (simply not allowing any more than 100 fps) rather than the major bottleneck you might expect with 100% usage? For example, if I was dropping under 60 fps a lot, that would be a major bottleneck, correct?

I've also heard that 100% usage isn't necessarily a bottleneck at all and that 100% usage can be OK. Can someone explain that further?

 
A bottleneck is when one component is weaker than the others and hampers performance (weakest link). With a pentium and a gtx 1080, the cpu would be the bottleneck; with an oc'd i7 7700k and a gt 730, the gpu would be the bottleneck. Bf1 is fairly cpu intensive and is known to push i5's to 100%; even i7's are pushed to 80-90%+ a lot of the time despite benefiting from hyper threading in that game.
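If it helps to picture the weakest-link idea, here's a rough sketch; the per-frame timings are made up for illustration, not measured from any real system. Whichever part takes longer per frame sets the fps you end up with.

```python
# Toy model of the "weakest link" idea: each frame the cpu prepares the work
# (game logic, draw calls) and the gpu renders it, so the slower of the two
# sets the frame time and therefore the fps you see.
# All timings below are made-up examples, not measurements.

def fps_limit(cpu_ms: float, gpu_ms: float) -> float:
    """Fps ceiling given per-frame cpu and gpu times in milliseconds."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# pentium + gtx 1080: the cpu takes far longer per frame than the gpu
print(fps_limit(cpu_ms=22.0, gpu_ms=6.0))   # ~45 fps, cpu-bound

# oc'd i7 7700k + gt 730: now the gpu is the slow side
print(fps_limit(cpu_ms=5.0, gpu_ms=40.0))   # ~25 fps, gpu-bound
```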

Part of the bottleneck equation is how bad is it? Can you live with the performance, is it over 60fps on a 60hz monitor, is it tanking to 30-40fps? That sort of thing. Another part is which game, in this case bf1 is known for being cpu intensive. I'm not aware of people consistently getting 140fps+ so if paired with a 144hz monitor it may not push all the fps the monitor can handle.

Which gpu is it paired with? Part of that question also relies on which resolution you're playing at. If you're playing at 720p with a gtx 1080 the cpu will almost always be the bottleneck because that's a serious imbalance of gpu horsepower to screen resolution. The 1080 is geared more toward 1440p to 4k; if you're playing at 1080p or under, the 1080 is generally going to be overkill, won't be fully utilized, and the cpu will appear to be the bottleneck.

The idea is to match the cpu to the game and the gpu to the game's graphical demands at the resolution you play at. 100% cpu usage is fine so long as you're not getting any stuttering causing the game not to play smoothly. If you've got a 60hz screen (refreshes the image 60 times per second) the additional fps won't do much for you visually. If you've got a 120hz or 144hz screen the additional fps over 60 will have an impact.
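Just to put numbers on the refresh rate point (simple arithmetic, nothing system-specific):

```python
# Per-frame time budget at common refresh rates: a 60hz panel only shows a
# new image every ~16.7ms, so fps far above 60 mostly goes unseen (or tears),
# while a 120/144hz panel can actually display those extra frames.
for hz in (60, 120, 144):
    print(f"{hz}hz -> new image every {1000 / hz:.1f} ms")
```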

How does the game play? Is it smooth? Are you missing shots because of lagging fps? Can you turn on the spot or pan the battlefield without a bunch of jerking or screen tearing? If everything is playing fine I wouldn't worry too much. If you have a z series motherboard and a decent aftermarket cooler you can try overclocking your cpu if you haven't already; you should be able to get up to around 4.4-4.5ghz or so. It won't make the cpu stop running at 100% but your fps should increase so long as your gpu isn't also pegged at 100% use.

Edit: Correction made, thanks for pointing that out Dragos Manea!
 

kol12

Honorable
Jan 26, 2015
2,109
0
11,810
I think despite 100% usage the 4690k can maintain above 60 fps, so the bottleneck in this case is not being able to do over 100 fps, which seems to be common for this game. The average usage with the latest game patch is about 75-80% (60 fps conquest server, 1920 x 1200); that's about a 10% increase over the previous patch according to my performance monitoring! When the 100% spikes occur the frames may or may not drop, at worst by 5 fps, but strangely with vsync disabled I can get 70-100 fps and not drop below 70 fps, so the frame drops at 60 fps seem to be misleading and possibly caused by something else?

For the most part I feel that the game runs smoothly, a pretty solid 60 fps, and I'm definitely not missing shots to lag. There is the 80-100% Cpu usage which can be concerning, but the more I look into it the more I hear that this is ok if fps and temps are normal... I have the 4690k oc'd at 4.3GHz with about 47C gaming temps, not sure how much performance would be gained with another GHz or two... The high Cpu usage seems to be becoming more frequent with newer games, or maybe it's just the ones I'm playing, which are Bf1, Watch Dogs 2 and Tom Clancy's The Division, but I've seen similar reports of other games and the i5 also.

What's puzzling me here is whether I actually need to think about upgrading the Cpu: are the games so demanding now that they've been optimized for Hyper Threading? It surely is a concern for BF1 in particular if the i7 is seeing such high utilization over 8 threads; there is a lot of debate as to whether the game simply needs further optimization. Does that issue only show up at higher refresh rates with the i7?

 
80-100% usage shouldn't be too concerning. The alternative would be a cpu at 30-40% usage and what good would that be really? Unused cpu % is similar to unused gpu usage, it means the game can't make use of what the hardware has to offer. Some games do make use of hyper threading like bf1 but the vast majority still don't.

I'm sure over time this might change though hard to say how much time. Dx12 was a big deal and everyone was looking forward to it, windows 10 was supposed to change the gaming world with dx12 support that wasn't available in win7 or 8.1. This summer win10 will have been public for 2yrs and there's what, a couple dx12 games and a couple others with partial dx12 support? Some of those play worse than they do in dx11 mode. So despite literally years of hype the reality is that the transition has been a fairly slow one. Just pointing out that games moving from heavily depending on a single core to several cores to requiring 6-8 cores or heavy multithreading may also be a slow process.

Only you can decide if an upgrade to an i7 is worth it for you. Some games are harder to guesstimate because of their online multiplayer platform, it's very difficult to benchmark. Number of players, how many are on screen at once, how they're actually moving means the data continually shifts. Erratic data can't really be benchmarked since performance may seem to be all over the place.

A review done a few months ago showed dx12 in bf1 hurt performance a lot. I'm not aware of any significant changes there. The i7's hyper threading does improve fps, especially min fps, which is important. However even a lowly i5 6400 at 2.7ghz (which makes the older 4690k the faster chip all things considered, even with newer ipc improvements) was shown not dropping below 60fps. That could change during battles though; gamersnexus did a few runs comparing straight runs vs battling and min fps could drop quite a bit during battles.

Techspot did some testing of The Division and it's mostly gpu bound, at least with a gtx 980ti. There are also lots of discussions and proposed fixes for The Division to try and fix poor fps issues. Pcgamer did tests with the game using an oc'd 5930k i7 (6c/12t) and they also appear to be gpu bound. At 1080p ultra settings a single 980ti averaged less than 70fps and dropped to 45-50fps. Once they used 2x 980ti's in sli they were able to improve to a 100fps average with drops down to 55fps.

Tom's did a review on watchdogs 2 and it's also pretty hard on the gpu. Using an i7 5930k 6c/12t cpu it was only reaching 56-70fps with a gtx 1070 and it took a gtx 1080 to keep from dropping below 60fps at 1080p ultra settings. Since you're using a 1070, if you're dropping to around 55fps then that's likely all the gpu can handle. An i7 with more cores/threads won't help as evidenced by using the 5930k. If you're getting only 40fps then it looks like an i7 may help you out some. If you're getting dips to 50fps, is an i7 worth the extra 5fps?

http://www.tomshardware.com/reviews/watch-dogs-2-performance-benchmarks,4844-2.html

It truly varies from game to game, each is a separate program. Some are cpu heavy, others gpu heavy. A stronger cpu may or may not have much impact depending on the game. Some games when they release are just in bad shape and have a lot of tweaking that needs done via updates and patches to fix them. Poorly coded games can cripple any hardware.
 

There is a typo there: the image is refreshed 60 times a second, not once every 60 seconds. As for 144hz, the image is refreshed 144 times every second, and so on. An image refresh of once every 60 seconds would mean you must wait a minute to see the next image xD.

EDIT: OP, as long as both your cpu and gpu are at or close to 100% usage it is fine; if the gpu is at 50% it is not. The cpu can be used at only 20, 40 or 60%, but the gpu must always be at 100%.
 

kol12

Honorable
Jan 26, 2015
2,109
0
11,810
I've tried DX12 with BF1 and performance seems about the same. The big thing to look for with DX12, as far as I know, is lower Cpu usage, is that right? I could probably run some tests to get averages from both DX11 and DX12. I was also told on the forums here that DX11 can only use a maximum of 3 cores, is that right? The games I'm playing see up to 100% on all 4 cores, so how is that explained? Is anyone sure these games aren't actually using all 4 cores?

So what makes the i5 capable of maintaining 60 fps when 90-100% usage occurs? Is it clock speed or something else? I see the i5 6400 at 2.7ghz doesn't drop below 60 fps so I'm curious about that one... What sort of scenario would see the same 100% usage cause significant fps drops?

My fps has been good with The Division, but I entered a large battle with other players for the first time and things possibly became a little bit hairy. Nothing major, but I need more play time to be sure, and I will look into tweaks.

The same story seems to play out with the 4690k in these games, which is an average of 70-80% usage with brief moments of 90-100%; it's like they have something in common. I see you have the 4690k as well, do you experience anything similar from it?

Watch Dogs is a bit stranger: fps will sit nicely at 60 for some time and then seems to randomly start having massive drops into the 40-50s in the same areas and scenarios etc. Maybe multiplayer is linked to some of this behavior? Driving in particular also causes some minor frame drops. I'm not certain, but I think some of these frame drops don't actually coincide with 100% cpu spikes.

It sounds like 6 core mainstream Cpu's are coming from Intel with Coffee Lake; will these be good gaming chips or will 4 core offerings still be better?
 
I don't have the latest AAA games so this may not be relevant, but I don't seem to have issues in skyrim (around 50 mods), cod mw, cod ghosts, farcry 3, crysis 2 etc.

Dx12 is supposed to support more even threading and is also a means for the hardware to talk to each other more directly. Lower driver overhead means less work for weaker cpu's so they can focus more on the game while the gpu handles the graphics. That's why lower power cpu's tend to benefit more from dx12 than, say, an i5 or i7. I3's, pentiums and fx cpu's would generally benefit more.

I've not played bf1 personally, though others' testing of dx11 vs dx12 shows dx12 to have issues. It's not true for all games, so I'd venture to say it's how it was implemented in this specific title rather than a testament to dx12 in general. Any game has the potential to be a rockstar or a flop in terms of performance depending on how it's coded and what it consists of graphically. Updates can also be released which work to correct various flaws. When looking at performance reviews it's important to make note of when it was reviewed, which patches if any were applied, etc. Otherwise it could have been tested upon first release, and revisiting that review several months (and several updates/patches) later could yield different outcomes.

Dx12 performance seems to be a bit all over the place. It could very well vary on a game by game basis; some tests are theoretical but don't pan out in real world cases. Dx11 seems to use more than just 3 cores: all the games I play are dx11 and my cpu shows all 4 cores fully loaded, not just 2 or 3 with 1 core sitting idle. Some tests done by pcworld have shown dx12 performance doesn't increase much past 6 cores, and 6c/12t isn't a huge increase beyond 6 cores (no ht).
http://www.pcworld.com/article/3039552/hardware/tested-how-many-cpu-cores-you-really-need-for-directx-12-gaming.html
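If you want to check the core loading for yourself rather than eyeballing task manager, one quick way is a little script that samples per-core usage while the game runs. This is just a sketch using the third-party psutil package (pip install psutil), nothing bf1-specific:

```python
# Sample per-core cpu usage once a second for ~30 seconds while a game is
# running, to see whether all cores are actually loaded or one core is
# carrying everything. Requires the third-party psutil package.
import psutil

for _ in range(30):
    per_core = psutil.cpu_percent(interval=1, percpu=True)  # one value per logical core
    print("  ".join(f"core{i}: {p:5.1f}%" for i, p in enumerate(per_core)))
```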

Here's an article that deals with clock speed as well as dx11 vs dx12 and amd vs nvidia.
https://www.hardocp.com/article/2016/04/19/dx11_vs_dx12_intel_cpu_scaling_gaming_framerate/5

The higher the clock speed, the less performance difference between dx11 and dx12; the lower the clock speed, the more benefit (throwing back to weaker cpu's gaining more from dx12 due to reduced overhead). At least with nvidia; with amd cards they showed more benefit from dx12. The article is also a year old and doesn't include the latest gpu's or the latest drivers, so take it with a grain of salt. It's hard to get definitive data when the data is perpetually changing.

Multiplayer does have a large impact, performance won't be the same from time to time if gameplay is different. The game effectively becomes different. One game with 12 players coming from the left, 3 from the middle and 7 on the right is going to be different than single player campaigns where you have a more predictable 8 enemies coming from the left every play through. That's why benchmarks are often done with campaign mode or built in benchmark modes, not multiplayer the way many people play.

In order for a test to be confirmed or results to be deemed accurate you need the fewest variables possible. That's the basic scientific approach. Multiplayer is full of variables by nature, so it's difficult to bench.

I can't really answer which scenarios would cause 100% usage on an i5 to be ok and for fps to drop heavily in other scenarios. It's about data that needs to be processed, so long as there's data to be processed the cpu will keep working. If enough data is processed in time, fps should be smooth. If the game contains more data than an i5 can process by the time it's needed it may cause hiccups noticed in fps drops. That gets too far into the details of how each game is coded for me to answer.
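One rough way to picture that "processed in time" point, with completely made-up workload numbers: every frame has a time budget, and the cpu can be near 100% busy and still hold 60fps right up until a single frame's worth of work blows past that budget.

```python
# Toy frame-budget illustration: at 60fps each frame has ~16.7ms of cpu work
# it can absorb. A crowded multiplayer moment that pushes a frame past that
# budget shows up as an fps dip, even if usage was already near 100%.
# The per-frame workloads below are invented numbers, not measurements.

FRAME_BUDGET_MS = 1000.0 / 60.0

quiet_frames = [14.0, 15.5, 16.0, 15.0]    # busy cpu, but inside the budget
battle_frames = [16.5, 19.0, 22.0, 17.5]   # same cpu, more players on screen

for label, frames in (("quiet", quiet_frames), ("battle", battle_frames)):
    worst = max(frames)
    verdict = "inside" if worst <= FRAME_BUDGET_MS else "over"
    print(f"{label:6s}: worst frame {worst:4.1f} ms -> "
          f"about {1000.0 / worst:3.0f} fps ({verdict} budget)")
```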

It's a bit like asking why one webpage loads faster than another that's identical in appearance to it without knowing what's going on in the background. Are the images different sizes? Is the internet connection the same? Is one written in plain html, html+css, javascript, is there flash being used to achieve simplistic things that css could handle? At first glance they might appear identical, behind the scenes the coding could be vastly different. More streamlined efficient coding along with lightweight images or even repetitive images (things already in memory) vs inefficient bloated methods can achieve the same effect at reduced cost in terms of processing power and internet bandwidth. Games are similar.

Speculation is difficult; it's very possible that future games and hardware will move to higher core count options. When is the better question. Obviously cpu's have moved from single core to dual, quad, hex and octa core for various needs. Speculation that 6 and 8 core cpu's would be needed for common applications and gaming was made 4, 5, 6yrs ago with fx cpu's being so inexpensive (comparatively).

Every year people say well, console xyz uses lots of cores so pc games must too, and the logic doesn't exactly play out that way. Consoles have to use multicore cpu's at lower speeds to keep temps down. Imagine the cooling needed for an i5 even using a stock cooler; there just isn't room in a console. They went with slower cores and a wider core count. Pc games operate differently, and even a dual core + ht i3 or 7th gen pentium can outperform an xbox one or ps4. Apples and oranges.

Big moves tend to happen more slowly; dx12 was promised to be game changing years ago and it's still in its infancy. 64 bit processing was out long before programs were able to fully utilize it or even need it, browsers being some of the slowest adopters of 64bit. If you're a game dev you have to consider the desire to move forward along with the user base. If your game runs like crud on anything less than 6 or 8 cores, you're cutting out athlon x4, fx 4xxx, pentium, i3 and i5 users. That's a pretty large percentage of hardware owners to suddenly snub, cutting your customer base by a thick chunk, so it's better to ease into requiring more cores. Even 'cheap' ryzen 8 core cpu's are running $330-550. The ryzen 5 series are 6c/12t and supposed to be $220-250. If that becomes the min requirement, what are budget gamers supposed to do? Many are still buying sub $200 cpu's to game with for budget builds. It's just my personal speculation, nothing more, but making the requirements/cost too high in the pay to play of gaming means excluding folks who can't afford all the luxury. Imagine if every game suddenly became VR only: you had to have a $300+ cpu, $450+ gpu, a couple hundred worth of VR headgear and other components or forget it. A lot of folks would say 'forget it'. Prices are coming down for sure, but 6 core cpu's still aren't 5 for a dollar.
 

kol12

Honorable
Jan 26, 2015
2,109
0
11,810



Would a slower Cpu with the same large data demand get stuck at 100% more frequently, causing more severe frame drops? Maybe that was the kind of scenario I was thinking of. Perhaps IPC would determine this rather than clock speed?

I guess the i5 4690k still has the horsepower to run the latest AAA games but is on the verge or is a minor bottleneck at certain moments. Is that how you would sum it up?



Yes, but aren't some games like BF1 already utilizing, or optimized for, 8 threads? Quite a few games now list an i7 as the recommended cpu, or do required specs not always tell the true story?



About the consoles, do developers basically rewrite a game's code for the PC version?



Yes, I see what you mean about cutting out the customer base, and this really makes me wonder if the games coming out right now could be better optimized for minimum spec cpu's. A lot of people with minimum spec cpu's that should still be able to play the games without issues are complaining. It makes me think the devs are favoring the more powerful spec machines and not providing a fluid experience for the whole of the customer base.
 
Unfortunately I don't know the details of most of those questions. I know very little about game programming, but yes, the two platforms (pc and console) are likely coded a bit differently. Mac is closer to pc than consoles are, and the software isn't compatible for the most part; it requires 2 different versions.

Software for consoles can be streamlined, a developer already knows what cpu, gpu, etc is contained within a ps4 or xbox one. Consoles also use a much slimmer os than windows. You know exactly what is available on a ps4 gamer's system. A game dev won't have any clue what combo you've got on a pc, it could be dual core, quad, hex, hyper threaded or not, more ram, less ram, faster ram, slower ram, which gpu? The potential combinations are endless.

Gaming system requirements can be somewhat accurate or nowhere close. Recommended specs have been calling for i7's for years, even on games where it was difficult to benchmark a 1-2fps difference over an i5. Many times they just slap the highest cpu available on the package. Min requirements are xyz, oh it's 2017? Ok, recommended specs are i7 7700k or ryzen 1800x. When the 4790k was the fastest/highest end mainstream intel cpu it graced the 'recommended' list for most games.

That's why game reviews and benchmarks are more important: if there's a game you have or are interested in, check out a review of it. They will often be benched with various gpu's and cpu's to give an idea of performance. Having a 1070 gpu doesn't guarantee you'll be cranking out 80-90fps on ultra with every game; you might find a game where you're dipping below 50-60fps, and a review can reveal that it's your gpu, or at least the complexity of the game and its effective fps with your gpu.

Some games when they first come out are just complete turds. They run like crud on oc'd i7 7700k's with gtx 1080's and 16gb of ram. That's becoming the case with a lot of games: hurry up and release it even if it runs horribly, then slowly tweak it here and there to fix it along the way (hopefully). It's not a matter of giving preference to high end systems; that's more a case of software being released prematurely. Selling a car with 3 wheels and no door handles: don't worry, you'll get the extra pieces later.

Bf1 does do better with an i7 vs an i5 thanks to hyper threading, but it's more the exception than the rule. If you play it and if your performance is suffering then it becomes a matter of what do YOU feel you need/want and are you willing to pay the price? Bf1 came out in October last year so it's been out for 5mo or so. Tom Clancy's Ghost Recon came out a few weeks ago, so it's a newer AAA title and yet it's less cpu intensive and more gpu intensive.

Ghost Recon is a perfect example: they call for a min of an i5 2400s or fx 4320. There is a significant gaming/processing difference between those two cpu's. For recommended specs they call for an i7 3770 or an fx 8350. Again kind of odd, why such old cpu's for both min and rec specs for a 2017 game? No idea. An i7 again is vastly stronger than an fx 8350; a modern i3 would compete with an fx 8350 in most games. Sometimes I think they pull numbers out of a hat.

Dsogaming did some tests and used similar setups to other review sites: extremely low graphics details and resolution to isolate cpu performance, higher settings and resolutions to test the gpu. They decided to use an i7 4930k 6c/12t cpu and simulate 2 cores, 2c/4t, 4c/4t etc. up to 6c/12t. At extremely low resolution (1280 x 720) and low settings, there was a difference and a benefit to having 6 cores. The minute they went to ultra settings at 1280 x 720 the gtx 980ti they used was the bottleneck and anything past 4c/4t made hardly any difference.

http://www.dsogaming.com/pc-performance-analyses/tom-clancys-ghost-recon-wildlands-pc-performance-analysis/2/

It sounds very confusing and in a way it is. Each game is different, each system setup comes into play, cpu, gpu, resolution, graphics detail settings etc etc. Does a brand new AAA game use more than the 4c/4t of an i5? Technically yes. IF you game at low res with details turned down and a higher powered gpu like a 980ti or 1070. So what happens when you're gaming at a more normal 1920 x 1080? The gpu will become the bottleneck faster than it does at lower resolutions.

Once the gpu begins to show as the limiting factor it makes little difference if you have 4c/4t or 6c/12t. So despite using higher core counts or thread counts than an i5 offers, in reality you may not even notice the difference. So is the i5 really bottlenecking? Not as much as you'd think. At least not with this game and at more typical settings. All these variables can be explored on a single game and give either a cpu bottleneck or gpu bottleneck giving very different results. There's rarely a clear cut answer in terms of one size fits all.
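To tie it back to the earlier sketch, the reason resolution flips the picture is that per-frame cpu time stays roughly the same while gpu time grows with pixel count and detail settings. The numbers here are invented purely for illustration:

```python
# Extending the earlier toy model: per-frame cpu time stays roughly the same
# at any resolution, while gpu time grows roughly with pixel count and detail
# settings, which is why the same i5 can look like the limit at 720p/low but
# not at 1080p or 1440p ultra. All numbers are invented for illustration.

CPU_MS = 9.0                # per-frame cpu cost, roughly resolution-independent
GPU_MS_PER_MPIXEL = 6.0     # per-frame gpu cost per million pixels at ultra-ish settings

resolutions = {"720p": 1280 * 720, "1080p": 1920 * 1080, "1440p": 2560 * 1440}

for name, pixels in resolutions.items():
    gpu_ms = GPU_MS_PER_MPIXEL * pixels / 1_000_000
    limiter = "cpu" if CPU_MS > gpu_ms else "gpu"
    print(f"{name:>5}: cpu {CPU_MS:.1f} ms, gpu {gpu_ms:.1f} ms -> "
          f"~{1000.0 / max(CPU_MS, gpu_ms):3.0f} fps ({limiter}-bound)")
```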

This doesn't even touch on other aspects: are you playing online over internet servers? Are you recording/streaming while gaming? The list is almost endless. In that dsogaming article they reference the game servers, stating that everything relies on the servers: latency from the servers can cause graphical issues, gameplay can hang while the pc syncs with the servers, sometimes hi res textures don't get applied immediately, etc. More variables on this particular title that may differ from one user to another and from one game to another. Otherwise you have a really recent AAA title that will max the gpu at higher settings and will play well on an i3 or i5.

There are far more recent games that don't stress the cpu as badly as crysis did back in the day, hence the joke 'yea but can it play crysis'. A game or two with high requirements doesn't represent all gaming. It can be important if you're dead set on playing those specifically stressful games; then you'll either have to buy better hardware, turn down settings or come to a compromise depending on your budget and what it's worth to you. All of the bits about ghost recon only apply to that game; switch the title out for another and it would require looking at a whole new set of tests and results. What will games coming out next month or next year require? Spin the wheel of mystery, no one really knows.
 
Solution

kol12

Honorable
Jan 26, 2015
2,109
0
11,810
Just looked at some vids of Tom Clancy's Ghost Recon and it looks to be extremely gpu taxing, but cpu usage looks pretty average. They were beta vids so maybe things have changed since then.

Wow a 1070 starting to struggle so soon at 60 fps? Ultra settings must be getting very demanding...

At low resolution the gpu can render frames much faster than the cpu can prepare them, and the cpu has to try to keep up with that; that is the cpu bottleneck, right?

Yes, what I meant in regard to whether multiplayer could be causing frame drop issues with Watch Dogs 2 was possible server latency issues, and I suppose that's relevant for any multiplayer game...

Ultimately I think that's right, there are a select few games out right now that seem to benefit from Hyper Threading. Luckily a modern i5 looks to only be a pretty minor bottleneck in those games. I agree that there are less stressful modern gaming releases, I just got concerned when seeing the 90-100% spikes across three different games that I've been playing more recently but I think it just happens to be those games in particular.

Heck, I've only had the 4690k since early 2015 so I don't think upgrades move that soon. There would need to be at least a 60-70% performance increase over Haswell to make an upgrade necessary, would you say? I believe Haswell was the performance gain that made a lot of people move from Intel Core; it was for me anyway...

In terms of the future, yes, nobody really knows; if more games do start heading towards using more threads it will become apparent. Some people seem to suggest going for an i7 if upgrading for that reason.
 

st3v30

Admirable
With my 4770 and GTX 1060, when I use the Ultra preset the GPU will sit at 100% all the time, but I like higher FPS so I reduced some settings and now I'm around 100-120 FPS most of the time, sometimes even more depending on the map. One thing I have noticed with BF1 is that performance has dropped since game launch, like every new patch shaves off a few FPS. Also, dual channel RAM and RAM speed matter a lot in BF1, and having 16 GB helps a bit; in my case the game uses over 8 GB RAM (total with windows and all other programs) all the time and I have just a few programs running in the background.
If you want to upgrade, the best choice would be to sell the 4690k while you still can, before Ryzen 5 hits the market, and get a 4790k; or, since you have a good motherboard and cooling, push the overclock a bit more.
The 4690k is still a great gaming CPU, but some games just like to have more cores/threads at their disposal when needed and they will benefit from that.