CPU demand for 144hz?

kol12

Honorable
Jan 26, 2015
2,109
0
11,810
I don't really understand what more the CPU is having to do at high refresh rates and how much more demanding it is on the CPU over 60fps for example. Does it also have to render more frames as does the GPU? Is the demand as great on the CPU as the GPU in this situation?

I'm considering a 1440p 144hz monitor but wondered whether my 4690k would produce enough frames for 144hz to be worthwhile? Obviously a monitor is a longer-term upgrade and a machine upgrade may happen while you still have the monitor, but would it be worth it initially at least?

I also wondered whether an upgrade from the GTX 1070 to the GTX 1080 would be better suited to getting the most out of 144hz? Benchmarks I've seen show the 1070 averaging 70-80fps in AAA games at 1440p. What graphics settings would you usually lower to get more out of the 1070?

Over at Tom's bottlenecking thread the 4690k is shown as a semi bottleneck with the GTX 1080. What sort of performance could I expect with that combination?
 

Supahos

Expert
Ambassador
The CPU doesn't care what resolution you're running. It does the calculations telling the GPU what to render. A 4690k would be fine running 4k 60hz, but would fall short in several situations at 1080p 144hz. All a 4k rig needs is a massive GPU; a 1440p 144hz rig needs a 7700k and a 1080ti to come close.
 

kol12

Honorable
I wasn't really referring to resolution but refresh rate. I meant to ask, does the CPU have to work any harder to allow the GPU to render a higher amount of FPS?

Exactly where and why would the 4690k fall short at 144hz? I don't have a 144hz monitor, but if I disable V-Sync in a game the CPU tends to hang around 100% compared to around 70% at 60 fps, so I wondered what extra work the CPU is having to do...
 
In short, yes. Doubling your framerate requires nearly twice as much CPU power, and generally speaking you can't just double your core count. This is why the 7700K is the king of high-Hz gaming: it has the highest IPC and clock speed of any existing CPU.
 

kol12

Honorable


So whatever calculations the CPU does for a game at 60 fps it has to make those calculations a lot faster for more FPS?

Does the 7700K work particularly well here for its extra threads, its IPC, or both?

144hz doesn't mean a full 144 fps has to be maintained with G-Sync... Would the 4690K not be suitable for 144hz G-Sync at all? I figured 90-100 fps mightn't be so strenuous...
 
The number of calculations needed for 120fps is close to double that needed for 60fps. The extra threads of an i7 help in those games that can use them, and not at all in those that don't. Most AAA titles released in the last few years benefit from having more than 4 threads, but per-core performance is still critically important because games have a master thread that all others have to wait on. Ryzen 7's 16 threads often fall short of the 7700K's 8 in gaming for this reason.

Edit: There's no reason you can't use an i5 for higher hz gaming, but be aware that in the more demanding titles, a faster CPU would deliver more frames.
 

kol12

Honorable


So if it means the 4690k has to sit at 100% to produce 90-100 fps, there's nothing to be concerned about? I suppose this would have been common in the early days of 144hz, before games started utilizing more threads?

Is it correct that for more fps the CPU has to do more calculations, not just the same calculations faster?

By faster CPU, are you referring to a CPU with higher IPC rather than more threads? Isn't Kaby Lake about 20% faster than Haswell in the IPC department?



 
Another way to think about it: the CPU has to do x amount of work for each frame. At 120fps the CPU has half the time to complete each frame's work compared to 60fps.

Several modern games are starting to be able to use more threads/cores, and this is why i7s are king in high fps situations, though Ryzen 7 with its slightly weaker cores is not far behind a 6700k or 7700k.
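That time budget can be made concrete with a quick back-of-the-envelope sketch (a Python illustration of the arithmetic, not anything from a benchmark):

```python
# Sketch of the per-frame time budget: the CPU must finish its share of
# the work for each frame within 1000/fps milliseconds.

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to prepare each frame at a given fps."""
    return 1000.0 / fps

for fps in (60, 120, 144):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):.2f} ms per frame")
```

At 60fps the CPU has about 16.7 ms per frame; at 144fps that shrinks to under 7 ms, which is why per-core speed matters so much here.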
 

kol12

Honorable
So it was very common for CPUs to be pushed to the max in the early 144hz days? What would the worst case scenario be with the 4690K and high refresh gaming, limited fps, heat etc?

Regarding high refresh rates, at what point over 60 fps does a high refresh monitor become noticeable, and at what point is the difference no longer noticeable, i.e. 144hz vs 100hz?
 

kol12

Honorable


Is that another way of saying the CPU has double the work to do and has to work much faster to be able to do it?

I plan to upgrade the CPU at some stage, I'm just wondering what to expect from the 4690K with a 144hz G-Sync monitor...
 


I am simplifying; it's not exactly twice the work in reality, but it's significantly more. The best thing you can do is review CPU benchmarks. Here are a couple:

http://www.gamersnexus.net/game-bench/2673-battlefield-1-cpu-benchmark-dx11-vs-dx12-i5-i7-fx/page-2

http://www.gamersnexus.net/game-bench/2182-fallout-4-cpu-benchmark-huge-performance-difference
 

kol12

Honorable


So apart from occasional frame dips, is the only real concern for an i5 at high refresh gaming increased heat output?

It would not be unusual for the i5 to see very high utilization at high frame rates, would it, depending on the game?
 
I have yet to see a benchmark where a modern high-end i5 cannot hold a minimum of 60fps; the benefits of the i7 come in at higher target fps. However, an i5 will run at very high usage in many games at 60fps, but that's not really an issue even with a stock cooler as long as your case has good airflow.
 

Supahos

This all goes back to my part that is quoted in the bottlenecking thread. In every game there is a maximum number of frames a CPU is capable of sending to the GPU to be rendered. There is also a maximum number of frames the GPU is capable of rendering. Whichever number is lower is the number that will be rendered.

So let's say a game will run at 90 fps max on a 4690k, and 130 fps on a 7700k,

and at 85 fps with a 1060 or 140 with a 1080ti.

If you have a 1060, no matter which CPU, you'll end up with 85 fps as that's all it can render. If you have a 1080ti you'll get 90fps with the 4690k and 130 with the 7700k.
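That whichever-is-lower rule can be sketched in a couple of lines of Python (the fps caps below are the hypothetical figures from this post, not measurements):

```python
# The component that can produce fewer frames sets the delivered frame rate.

def delivered_fps(cpu_cap: int, gpu_cap: int) -> int:
    """Frames actually rendered: the lower of the two component caps."""
    return min(cpu_cap, gpu_cap)

print(delivered_fps(cpu_cap=90, gpu_cap=85))    # 4690k + 1060: GPU-limited, 85 fps
print(delivered_fps(cpu_cap=90, gpu_cap=140))   # 4690k + 1080ti: CPU-limited, 90 fps
print(delivered_fps(cpu_cap=130, gpu_cap=140))  # 7700k + 1080ti: CPU-limited, 130 fps
```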

My point earlier was that it takes a better system to run 1080p 144hz than 4k 60hz.


Here's why. The 4690k, as mentioned earlier, as well as any of the Ryzen R5/R7 CPUs, is easily capable of 60 fps in nearly all cases; you would however need a 1080 as a minimum for your GPU. If you want to run 144hz at say 1080p, you're no longer going to be able to use any Ryzen CPU, as in many cases they fall well below 144 fps, and in a lot of modern titles the i5s do as well. So now a 7700k/6700k/6800k/6850k/6900k is really your only option if you want to actually get 144 fps in most situations.
 

kol12

Honorable
Thanks. With a G-Sync monitor you won't have to worry about maintaining 144 fps for the image to look consistent, so that is the upside.

A 4690K with a GTX1070/1080 might not provide the full benefit of 144hz but the benchmarks I've looked at show the 4690K is still capable of producing over 60 fps, so I think there will be some benefit with a 144hz monitor at least until a CPU upgrade is viable...

Also, I understand CPU usage would actually be lower when moving from 1080p to 1440p...
 

Supahos

CPU usage would only go down in a GPU-bound game at 1440p; some games (not many, admittedly) would still be CPU-bound even at 1440p with a strong enough GPU. The reason usage would drop is that if the GPU can't render frames as fast, the CPU will slow down feeding it.
 
CPU usage shouldn't drop when increasing resolution. Instead, what you might be noticing is a bottleneck from the gpu at 1440p. That can appear to make a cpu work 'less hard'. If a cpu like the 4690k were at 100% use and you're getting 60fps with a given gpu, say a 970 at 1080p, then moving to 1440p what you might see is fps drop to 40-45fps, the gpu at 100% and the cpu at 80%. It's not so much that the cpu doesn't have to work as hard as it is that the gpu is now the limiting factor. Fps drops, and so the cpu doesn't have to work as hard to maintain 40-45fps as it did for 60fps. Remove that gpu bottleneck with a gtx 1080 and the cpu usage will bounce right back up as the stronger gpu can handle 60fps or more again.

That's all just a hypothetical with the numbers but meant to convey what's going on. Provided the gpu isn't holding things up the cpu usage shouldn't decrease with higher resolution. Instead it might be a side effect of bottlenecking.
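A rough way to model the usage drop described above, assuming the CPU's work scales linearly with the fps it has to deliver (a simplification; real usage won't track fps exactly, and the numbers are the hypothetical ones from the post):

```python
# Hypothetical model: CPU usage scales with the fps it actually has to
# deliver, relative to the most fps it could deliver (its cap).

def cpu_usage_pct(delivered_fps: float, cpu_cap_fps: float) -> float:
    """Rough CPU usage, assuming constant work per frame."""
    return min(100.0, 100.0 * delivered_fps / cpu_cap_fps)

# 970 at 1080p: the GPU keeps up with the CPU's 60 fps cap, so the CPU is pegged.
print(cpu_usage_pct(60, 60))   # 100.0
# Same CPU at 1440p: the GPU now limits fps to ~45, so the CPU has idle time.
print(cpu_usage_pct(45, 60))   # 75.0
```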

In dumbed down terms, gpu for resolution/eye candy (details high, very high, ultimate etc) and cpu for fps. It's not entirely that simple because the two work in tandem as a team. Bottlenecks occur whenever the game demands more power from one or the other and either component ends up being a significant blockage to achieving higher fps.

i3 + 1080ti, bottleneck on the cpu usually. i7 + 750ti, bottleneck on the gpu most likely. It also depends on the game. Witcher3 will put more stress on the gpu than cs:go. Every time you consider a different game, the requirements for that game sort of reset the whole equation in terms of what's needed to run it. A 1080 gpu will cause the 4690k to bottleneck a lot of the time especially at 1080p. Not because the i5 is slow as much as it is the gpu is too much for 1080p resolution. With 1440p and 4k resolutions out now there's a need for gpu's much stronger than most games can fully utilize at 1080p. It's just too much gpu for that given resolution. A 1060 or 1070 would be more appropriate. If a gtx 1080 could be fully used at 1080p it would tank at higher resolutions.

It also depends on what you do with the game and which game it is. Are you using things like godrays in witcher3 or using heavy duty enb in games like skyrim? Sure, a stronger gpu is needed. Vanilla skyrim doesn't require all that much in terms of hardware. If it's a game where the user can modify it with various extra eye candy that changes the whole dynamic. Same game, a few tweaks and vastly different needs.

Not trying to add confusion but there are also 120hz monitors available. Either 120 or 144hz, so long as you use something like gsync/vsync or freesync depending on which gpu you're using and what features the monitor uses to prevent screen tearing it should be somewhat worthwhile with the 4690k. Even if the cpu only pushes 70-90fps, that's still a fair amount more than 60fps. Provided that's what you're looking for.

Some people enjoy high fps no matter the cost (in terms of reducing graphics quality) for games like cs:go where 200-300fps is achievable. It doesn't matter if you're running the latest 7700k oc'd to 5ghz and a gtx 1080ti at 1080p, you're just not going to hit those kinds of fps in games like witcher3. Some people are happy with 60fps and would rather have 1440p for the higher resolution. Smoothing of graphics and a larger viewable area. There's no right or wrong, it's whatever you prefer.

In some cases you can get away with 1440p without going bonkers on the gpu, something like a 1070. If you get a somewhat smaller end of the spectrum physically sized screen in the 25-27" range at the higher resolutions vs a 32-34" 1440p screen you can disable things like AA. It will reduce the workload on the gpu and won't have as much visual impact. Higher resolutions on physically smaller screens result in a finer image, sharper/smoother edges without the jaggies. At 32" and 1440p the jaggies would be much more noticeable and you'd want to enable some sort of AA which will force the gpu to work harder.

Min fps rates are another thing to consider. That's generally where an i7 pays off. Many times an i7 can maintain higher min fps so the drops aren't as noticeable. If you're playing game xyz and an i7 4790k produces 80fps with a min of 65fps compared to the 4690k which produces 75fps with a min of 48fps, those frame drops are going to really show up. Having a higher refresh rate like 144hz that allows the full benefit of max or average fps will potentially cause the frame dips to be more apparent. The averages between an i5 and i7 may not look like much. At 60hz (60fps max) the drops from 60 to 48fps won't seem as exaggerated (12fps difference) as 75fps to 48fps (27fps difference) thanks to the additional 15fps realized by a faster monitor.

Again just bogus numbers but highlighting a potential pitfall of higher hz monitors. It will be exposing frame rate variations wider than previously viewable on a more limited 60hz screen.
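The arithmetic behind that can be sketched as follows (same illustrative numbers: a 75 fps average with 48 fps minimums):

```python
# A monitor can only display up to its refresh rate, so the visible swing
# runs from the displayed average down to the minimum fps.

def visible_dip(avg_fps: int, min_fps: int, refresh_hz: int) -> int:
    """Size of the frame-rate swing the monitor can actually show."""
    displayed_avg = min(avg_fps, refresh_hz)
    return displayed_avg - min_fps

print(visible_dip(75, 48, refresh_hz=60))   # 12 fps swing on a 60hz screen
print(visible_dip(75, 48, refresh_hz=144))  # 27 fps swing on a 144hz screen
```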
 

Supahos

And that explanation is spot on, and it's why when people ask about bottlenecking there is never an answer unless you're asking about a specific game and a specific cpu, gpu, resolution, and refresh rate. At 1080p someone running a 1080 or 1080ti is almost always cpu bottlenecked even running a 7700k or a 6950k; however, in most cases if you crank the settings to max at 4k, even a 4690k won't bottleneck a 1080 or 1080ti.
 

kol12

Honorable
Great answers synphul. I was thinking more of the 27" 1440p G-Sync monitors rather than the 32-34" 1440p ones, as I would prefer the sharper image. Is the 1070 about the minimum for 1440p? Benchmarks I've seen show the 1070 can do up to about 100 fps at 1440p...

Are frame drops very noticeable on a high refresh monitor with G-Sync? Are you implying frame drops appear worse on high refresh G-Sync than at 60 fps?
 

kol12

Honorable


Well, a 30 fps drop is larger than a 10 fps one to begin with, so it would be more noticeable, but can you explain why an fps drop on a high refresh G-Sync monitor is more noticeable than on a 60hz screen?

Hopefully and ideally frame drops wouldn't be as large as 30 fps.

A lot of people swear by high refresh monitors, does anyone care to add their opinion?
 

Supahos

I don't care for them personally; I'll take a 75hz monitor and save $300 all day long. I build whatever the guy wants me to, but I never recommend 144hz. You're thinking of drops as something G-Sync helps with. G-Sync is just designed to prevent screen tearing when you go above the refresh rate. It does make things slightly smoother at lower fps settings, but 45 fps is 45 fps; it doesn't matter if it's on a 60hz or 500hz monitor.
 
Solution