i5-6600K + GTX 1070: FPS at 1080p/1440p?

Philip Babilonia

Honorable
Oct 10, 2013
Hey. I want to know what FPS/benchmark numbers I can expect at 1080p or 1440p using an overclocked 6600K and an aftermarket 1070 in OC mode. I'm planning to play games like The Witcher 3, Battlefield 1, etc. on Ultra settings, and I want to know how the two resolutions compare. I know I can get better FPS at 1080p, no doubt, but at the moment I'm eyeing a 1440p 144Hz 1ms G-Sync monitor, and if I can get at least 60 fps with this build (or more, since we're talking about overclocked components), I'll be quite happy with it.

Another question: how much higher can I overclock the 6600K without it getting dangerously risky? Some say 4.4GHz without going over 1.3V is its sweet spot. Thanks!

P.S. I don't know if this is the correct section. Please move it if it's in the wrong place.
 

apk24

Reputable
Aug 6, 2015
The i5-6600K shouldn't really hinder performance; you can expect to land within a few fps of all the 1070 benchmarks out there.

95 FPS at 1080p, 40 FPS at 1440p for Witcher 3 on highest settings

 
These benchmarks were done with an i7-6700K, which performs about the same as the i5 in most games:

[benchmark charts: Overwatch, Witcher 3, DOOM, The Division, Primal]

so 60fps shouldn't be a problem.
if you tweak a few settings (disable MSAA, for example, or set shadows and reflections to high instead of ultra), you probably won't notice a difference in-game, but your fps will increase well beyond 60.

 

Philip Babilonia

Honorable
Oct 10, 2013


Can G-Sync help get it to 60 FPS at 1440p?
 

Philip Babilonia

Honorable
Oct 10, 2013


Thanks for these benchmarks! I've no problem with a little tweaking of the graphics for better frame rates. But if I overclock my CPU, would these benchmarks (for the 1070 at least) improve at 1440p for the games that haven't hit 60 FPS? And would G-Sync in the monitor also help?

Anyway, may I know which site you used for these benchmarks? I may need it for future testing and research.
 
the CPU doesn't care about resolution; higher resolutions are solely on the GPU. games where your CPU struggles @1080p (aka none) won't struggle @1440p CPU-wise either.
overclocking your GPU might squeeze out a few fps. overclocking the CPU has no effect unless you're CPU-limited in a game anyway (and to my knowledge there's no game yet where the 6600K struggles to get 80+ fps)

as for G-Sync: G-Sync doesn't get you higher frame rates. G-Sync is just elaborate V-Sync, if you want. what G-Sync does is this: there's a chip in the monitor that constantly measures the fps output of your system and syncs the monitor's refresh rate to it. meaning, if it's a 144Hz monitor but you can only output 80fps, it automatically adjusts the refresh rate to 80Hz, and adjusts it again a moment later should your fps drop or rise further.
this provides a tearing- and lag-free gaming experience at almost any frame rate (even 36 fps is suddenly way more playable).
so it doesn't get you higher fps, but gameplay is smoother when you can't hit the monitor's refresh rate (as it's constantly adjusted), and sub-60 fps is way more playable with it.
whether that's worth ~$120 extra to you, I can't answer. G-Sync is expensive. some people love it, others see it as an overpriced gimmick with little real-life value.
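
if it helps to see that "chip syncing the refresh rate" idea spelled out, here's a tiny Python sketch (the 30Hz floor and the 30-144 range are my assumptions for illustration, not a spec):

    # adaptive sync in a nutshell: the panel follows the GPU's output,
    # clamped to the range the panel supports (30-144 assumed here)
    PANEL_MIN_HZ, PANEL_MAX_HZ = 30, 144

    def panel_refresh(gpu_fps: int) -> int:
        # refresh rate the monitor picks for the current fps output
        return max(PANEL_MIN_HZ, min(gpu_fps, PANEL_MAX_HZ))

    for fps in (144, 80, 60, 36):   # fps dropping as the action heats up
        print(f"GPU outputs {fps} fps -> panel refreshes at {panel_refresh(fps)} Hz")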

I was just looking around for 1440p GTX 1070 benchmarks and used the first site that didn't test with a $1,000 CPU but with something comparable to a 6600K; in this case that was http://www.techspot.com
 

Philip Babilonia

Honorable
Oct 10, 2013
I don't think I understand it 100%, only parts of it, but I get the main idea: overclocking the CPU does not increase fps; overclocking the GPU may; G-Sync does not increase fps; G-Sync matches the monitor's refresh rate to the card's output; G-Sync makes gameplay smoother. Do I have that right?

I don't know if I'd be able to notice the difference with G-Sync at all, since until now I've played at low-medium settings with my R9 380 and an AMD A4-6300 CPU. Maybe I'll pass on that and get a much cheaper monitor. As for the resolution, both are okay for me as both seem bigger than the 15" I use (and in fact, I'm still quite happy with this one even now). Considering all that, maybe I'm better off with a 1080p/1440p 144Hz 1ms monitor without G-Sync?
 
I'll try to break it down:

games need CPU power for their calculations, and that is not related to graphics settings. take Battlefield as an example:

the CPU needs to calculate the movements, the positions of you and the props, the bullet drop, the damage done and so on. none of that depends on graphical settings, as it's the physics of the game. even if you're playing on your phone's screen at 640x480, the CPU still has to calculate the damage, the bullet drop, your movement on the map, etc.
that's why higher resolutions don't really affect the CPU

now for your graphics settings:
the GPU has to render every pixel. 1440p has roughly twice the number of pixels of 1080p, so the GPU has to work almost twice as hard. other settings put additional load on the GPU (AA, reflections, details, lighting, ...). the more pixels the GPU has to render, the fewer resources it has left for all the fancy eye-candy in the game. for example: the explosion is calculated by the CPU, but displaying it in a photorealistic fashion is the GPU's work, and the more realistically it's displayed, the heavier it is on the GPU.
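
the pixel math behind "roughly twice", as a quick Python sketch:

    px_1080p = 1920 * 1080      # 2,073,600 pixels per frame
    px_1440p = 2560 * 1440      # 3,686,400 pixels per frame
    print(px_1440p / px_1080p)  # ~1.78, close to double the work per frame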

so when you get 70fps in BF1 no matter whether you're running minimum or maximum settings, it's a sign that your CPU can't calculate more frames. pair it with two $1500 cards and it will still only display 70fps, as that's the limit of your CPU. overclocking the CPU will get you higher fps in that case, independent of resolution.

however, when you play Battlefield and get 150fps on medium settings @1080p but suddenly 50fps on high settings @1440p, overclocking the CPU won't do anything: your CPU could output 150fps, but your GPU struggles to keep up and can't render more than 50fps.
overclocking the GPU will increase the fps here.
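
a minimal way to think about it, as a Python sketch with made-up numbers: the displayed frame rate is whatever the slower component manages.

    def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
        # frames shown per second: capped by the slower of CPU and GPU
        return min(cpu_fps, gpu_fps)

    # CPU-bound: a second $1500 card changes nothing, a CPU overclock helps
    print(effective_fps(cpu_fps=70, gpu_fps=400))   # -> 70
    # GPU-bound at 1440p/high: the CPU could do 150, the GPU only renders 50
    print(effective_fps(cpu_fps=150, gpu_fps=50))   # -> 50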

now, not every situation in a game is equally demanding.
again using Battlefield as an example: you spawn at your own base, but your team is dominating and attacking the opponent's base while there's literally nothing happening at yours. all the action is a 2-minute in-game run away. let's say you get 144 fps there.
now when you move towards the action, there's suddenly heavy mortar fire, grenades exploding everywhere, tanks moving up and down destroying structures and so on. all this puts a lot of load on your GPU, so you suddenly don't get 144fps anymore; your fps drop to 60.
so your card outputs only 60fps, but the display still refreshes 144 times per second, not 60. that means many refreshes don't show a new frame at all: the display simply repeats the last frame the GPU delivered because it hasn't been supplied with a new one.
therefore you see the same frame twice before switching to the next, which sometimes appears as real lag, sometimes as stutter, or just doesn't feel very smooth.
it also works the other way around: in a less demanding game your monitor's refresh rate might be 144Hz while your GPU outputs 300fps, which causes 'screen tearing'

to eliminate that there's V-Sync, but it only helps with the second case, as it limits your GPU to the refresh rate of your screen: on a 60Hz screen, no matter how many fps your GPU could draw, it will only display 60. this introduces a certain input lag though, and if you fall below your monitor's refresh rate you'll still notice stutter.

G-Sync works the other way around: it doesn't tie your GPU to your monitor's specs but your monitor's to your GPU's.
that makes much more sense, especially when your GPU struggles to deliver enough frames to match the monitor's refresh rate.
so revisiting the Battlefield example: when you drop to 60fps on a 144Hz screen with G-Sync, the monitor's refresh rate is automatically changed to 60Hz, eliminating the stutter and un-smooth gameplay, as the monitor is now exactly as fast as the GPU and doesn't have to wait for it.
in theory that's worth every penny and is much more future-proof than spending $60 extra on a card with a higher factory overclock, as fps drops don't hurt you as much anymore. there are people stating that even at 40fps with G-Sync the game feels very smooth and lag-free.
so yes, it makes gameplay smoother when your card struggles to keep up with your screen.
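
if it helps to see the three behaviours side by side, here's a rough Python sketch of the logic (the numbers are illustrative assumptions, not measurements):

    PANEL_HZ = 144

    def describe(mode: str, gpu_fps: int) -> str:
        if mode == "none":
            # fixed refresh, no sync: stale repeats below the panel
            # rate, tearing above it
            if gpu_fps < PANEL_HZ:
                return f"{PANEL_HZ - gpu_fps} stale refreshes/s -> stutter"
            return "GPU outruns the panel -> screen tearing"
        if mode == "v-sync":
            # GPU capped at the panel's rate; cures tearing, adds input lag
            return f"GPU capped at {min(gpu_fps, PANEL_HZ)} fps"
        if mode == "g-sync":
            # panel follows the GPU: every refresh shows a fresh frame
            return f"panel drops to {min(gpu_fps, PANEL_HZ)} Hz, no repeats, no tearing"
        raise ValueError(mode)

    for fps in (300, 60):  # empty base vs heavy battle from the example above
        for mode in ("none", "v-sync", "g-sync"):
            print(f"{fps:>3} fps, {mode:>6}: {describe(mode, fps)}")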

as for what you should buy: the real question you have to ask yourself is how big a screen you want and need. for example, I'm at 1080p because a 27'' monitor is just too big for me; I wouldn't be able to fit it properly on my desk and keep enough distance to it, so I'm staying with my 23.6'' for now. at this size there are only very few 1440p screens, and most of them are... well, not without issues.
if you do decide to go for 1440p 144Hz, I'd say G-Sync is a nice feature to have, as the 1070 will struggle to reach 144fps at that resolution without dropping to medium settings.
whether it's worth that much money to you, well, depends on your bank account :D
if you decide to go for 1080p 144Hz, I personally don't think it's that important. it remains a nice feature, but that's it. just a nice feature. and paying that much more money for a feature you don't really benefit much from in that case... well... depends on your bank account.

 
Solution

Philip Babilonia

Honorable
Oct 10, 2013
I get it now. Thanks for this comprehensive explanation. Correct me if I'm wrong on this one, but single-player games rely on the GPU and multiplayer games rely on the CPU most of the time, is that correct?

Anyway, I'll give it some time to sink in, and I'll come back to this thread whenever it gets confusing. Right now a 1440p 144Hz G-Sync monitor is my best bet, since as you said, G-Sync isn't really that important if I go 1080p. Maybe I'll save up for another 6 months or so to get that $800 monitor. I need to save anyway since I'm short on budget, so better to save up for the best I can get in that span of time. And maybe prices will drop by then. Thank you so much for all your help!
 
not really. generally, classic e-sports titles (such as LoL, Dota, StarCraft, Counter-Strike) are CPU-heavy,
and everything strategy-related relies heavily on the CPU.

open-world games usually lean on both (Witcher, GTA),

while sports games and shooters are usually heavier on the GPU.
but you know, BF1 being one of the most CPU-heavy games out there kind of defies that classification.

depending on the games you spend the most time with, a 1440p/60-75Hz IPS display might also be the right choice for you, as we haven't really talked about panels yet.
 

Philip Babilonia

Honorable
Oct 10, 2013
I see, thanks for clarifying that.

Unfortunately, I don't think I can find a 60-75Hz 1440p monitor here in my area (the Philippines). Most of what I see listed is either a FreeSync or a G-Sync monitor, though I haven't tried visiting local stores yet; I may have more luck there.

I know the difference between the two panel types, and I'm leaning towards an IPS for its better colors over a TN panel's smoother/faster response.