Does an i7 6700K bottleneck the GTX 1080 Ti?

Snorxy

I'm planning on buying the 1080 Ti and gaming at 3440x1440. Will my CPU bottleneck it?

My rig:
OS: Windows 10 64bit
MB: Asus z170 Pro Gaming
CPU: Intel i7 6700k
GPU: Asus Strix GTX 970
RAM: Kingston HyperX 16GB DDR4 2666MHz
PSU: LC Power LC1050 1050W
 
Solution
Yes, it does.
The amount of bottlenecking depends completely on the game though. There is no such thing as a CPU that isn't, at some point, a bottleneck in some game.

It's LESS likely at resolutions higher than 1920x1080, and even when it DOES bottleneck the amount may not matter: you might get, for example, 70FPS instead of the 78FPS you'd see on an i7-8700K.

There's plenty of proof if you care, but it's mostly a non-issue as there aren't many games that benefit much from a better CPU like the i7-8700K (a few of those benefit from the extra cores more than the extra frequency, even at 2560x1440 or 4K).
Also, if that is a GSYNC monitor then it's much easier to tweak your settings. Just run everything at ULTRA then see what the average FPS ends up being.

If it's above 100FPS (on a 100Hz panel) then maybe set an FPS cap of 95FPS to stay within the G-SYNC range, or if it's too LOW, such as 40FPS when 60FPS would be ideal, then drop a few settings.

(I'm not sure of the ideal way to set a per-game FPS cap though. Maybe NVInspector?)
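For what it's worth, every FPS limiter (RTSS, NVInspector, an in-game cap) boils down to the same idea: pad each frame out to a fixed time budget. Here's a minimal conceptual sketch in Python, just to show the math for a 95FPS cap on a 100Hz G-SYNC panel (render_frame() is a made-up placeholder, not any real API):

```python
import time

TARGET_FPS = 95                  # cap slightly under the 100Hz refresh rate
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~10.5 ms per frame

def render_frame():
    """Stand-in for the game's per-frame work (hypothetical placeholder)."""
    time.sleep(0.004)            # pretend this frame took 4 ms to render

for _ in range(300):             # simulate ~3 seconds of capped output
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    # If the frame finished early, wait out the rest of the budget so
    # frames are never presented faster than the 95FPS cap.
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
```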
 
http://www.tomshardware.com/news/intel-coffee-lake-i7-8700k-gaming-tests,35595.html

Look at the slides and compare the i7-7700K vs the i7-8700K to see how the games stack up. Note that both CPUs are shown at stock and at 4.9GHz. Since your CPU likely sits below the i7-7700K@4.9GHz, any time it doesn't match the i7-8700K you could be doing better in that scenario with the i7-8700K.

But... also notice that it doesn't matter much overall, and if I hunted down 3440x1440 results, many of these situations would show no CPU bottleneck on the i7-6700K at all (for average FPS).
 


So you guys have done extensive Frame Time Analysis, have you?

You are aware that there are still games like AotS (Ashes of the Singularity) that are very well threaded and will use more CPU than an i7-6700K can provide, even at 4K? (It's the simulation calculations during heavy battles that kill it.)

Even Starcraft 2 will get bogged down. It can only use two cores, but that's not the point. It will still drop below 60FPS during really heavy battles on any CPU in existence.

CIV6 is also very CPU intensive and will at times demand more than the i7-6700K can offer.

I think I'll give up on trying to explain how this all works though, because even when I give actual proof people still aren't able to accept it.

As I continue to say, it's mostly a non-issue.
 


You're taking 3 games out of thousands... so yeah, it's not a bottleneck that you should be worried about.

 
... and I said "it's mostly a non-issue"

And those are just three examples. There are other games too, and that's not counting the large number of games that still miss frames due to the CPU (call it inefficient code, but it's still a CPU bottleneck).

That's what Frame Time Analysis tells you (though of course some of that can still be a GPU bottleneck).

Whatever... I guess this isn't the place to be accurate.
 
Yes... Well, the i7 6700K is still not "perfect",
but there really isn't an upgrade option that would be worth it;
even an i9-7980XE would not run badly-optimized games any better.
So if we go with photonboy's logic, every CPU would bottleneck you... and he is right about that.

But you can comfortably run an i7 6700K with a GTX 1080 Ti, and you would get more frames in most games simply by overclocking your GPU than by upgrading to the most expensive CPU there is xP
 


I have two GTX 1070s in SLI and I'm still on Sandy Bridge (Xeon E3 1240). I have not encountered any bottlenecking, and two 1070s in SLI easily beat the pulp out of a 1080 Ti. Just look at the official Tom's Hardware Superposition thread.
 


First of all, a GPU-focused benchmark tends NOT to use the CPU in a very demanding way, since there's a lot of work a visual benchmark doesn't need to do that a GAME would, especially a modern application like Superposition, which is more threaded than older software.

Secondly, short of running a FRAME TIME analysis or some other technical analysis of your CPU and/or GPU load, there's no way to know whether you have a bottleneck or not.

Finally, it's even MORE difficult to know if you have a bottleneck using SLI (specifically AFR, aka Alternate Frame Rendering), since no program will use both GPUs at 100%.

For example, you might see about 60% GPU usage on both GPUs. What does that even tell you?

Normally you'd say there's no "CPU bottleneck" if the GPU was being used at close to 100% at its maximum frequency (e.g. 96% GPU usage at 1980MHz)... but since GPU scaling usually limits both GPUs in SLI to partial usage, how EXACTLY did you determine you have no CPU bottleneck?

As for looking at the CPU itself, that's meaningless unless the CPU is at 100% utilization, because you can be at just 30% overall CPU load and still have a CPU bottleneck depending on how many threads a game can utilize (such as Starcraft 2, which is limited to two threads AFAIK... though threads hopping between cores makes that hard to spot).
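To put a number on that: on a 4-core/8-thread chip like the i7-6700K, one thread pegged at 100% only shows up as roughly 100/8 = 12.5% total CPU usage. Here's a quick sketch (assuming the psutil package is installed) that compares the overall figure with the busiest logical core:

```python
import psutil

# Sample per-core load over one second.
per_core = psutil.cpu_percent(interval=1.0, percpu=True)
overall = sum(per_core) / len(per_core)

print(f"Overall CPU usage: {overall:.0f}%")
print(f"Busiest core:      {max(per_core):.0f}%")

# A game stuck on one heavy thread can show ~100% on a single logical
# core yet only ~12.5% overall on a 4c/8t CPU -- still a CPU bottleneck
# even though the total usage graph looks nearly idle.
```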

*I'm seriously curious... can you give me two EXAMPLES that demonstrate you have no CPU bottleneck?
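And for reference, here's roughly what a frame time analysis boils down to: read a per-frame log and compare the average FPS with the 1% lows, since that's where CPU-side hitching shows up even when the average looks fine. The file name and the MsBetweenPresents column below are assumptions based on PresentMon-style output, so adjust for whatever capture tool you use:

```python
import csv
import statistics

LOG_FILE = "capture.csv"                # hypothetical path to a frame-time log
FRAMETIME_COLUMN = "MsBetweenPresents"  # assumed column name; varies by tool

with open(LOG_FILE, newline="") as f:
    frame_times_ms = [float(row[FRAMETIME_COLUMN]) for row in csv.DictReader(f)]

avg_ms = statistics.mean(frame_times_ms)
# 99th-percentile frame time = the slowest ~1% of frames (stutter territory).
p99_ms = statistics.quantiles(frame_times_ms, n=100)[98]

print(f"Average FPS: {1000 / avg_ms:.1f}")
print(f"99th percentile frame time: {p99_ms:.1f} ms (~{1000 / p99_ms:.0f} FPS '1% low')")

# A big gap between average FPS and the 1% low means spikes the average
# hides -- and those spikes are often CPU-side (game logic, asset streaming)
# rather than raw GPU throughput.
```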
 
Also, here's just one example of how Sandy Bridge bottlenecks one GTX 1070, and it would be even worse with two GTX 1070s in SLI:
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/76333-i7-2600k-vs-i7-8700k-upgrading-worthwhile-7.html

In GTA V at 1080p the i7-8700K gets about 50% higher FPS than the i7-2600K, and an i7-2600 is slightly faster than your Xeon.

Even at 2560x1440 there's a difference in GTA V.

I did say it was a minor issue for the i7-6700K vs a GTX 1080 Ti, so this comment is just in response to your claim of no Xeon bottleneck with two GTX 1070s, which is unlikely.

Whether you would benefit much from a faster CPU is pretty hard to say, considering I'm not sure how you would test that short of running the same games on a much better CPU and comparing against your Xeon.

*And again, there are games like CIV 6 that are going to be WAITING on the CPU, not the graphics card, once the number of units ramps up, and a good i7 could be over 25% faster than your Xeon.
 


The i7 2600 is not faster than my Xeon. The i7 2600's base frequency is 3.4GHz; mine's overclocked to 3.6GHz. The i7 2600's average benchmark score on PassMark is 8200, while my Xeon sometimes surpasses the 9000 mark.

I ran a few games at 1080p and 2K, like TW3, GTA 5, and Dark Souls 3, and I can definitely say for certain that my CPU usage is at around 70-90%, but never slams a single core/thread at 100%. My GPU usage is almost always higher, with the exception being script-heavy games like Skyrim, Fallout 4, Fallout New Vegas, and ESPECIALLY Oblivion, since Oblivion has terrible multicore support.