Will a core i7 7700k bottleneck a gtx 1080ti?


Arnout_2
May 1, 2017
So I'm gonna build my new rig and I need to know: will a Core i7-7700K bottleneck a GTX 1080 Ti at 1080p?
 
Solution
"At 1080p, a gtx 1080ti is going to cause CPU bottlenecks in some games, no matter what CPU is used."

That's what you said; I was just correcting you. There is absolutely no way a GPU can cause a bottleneck on a CPU. I don't care if it's an i7-7700K pushing a GTX 720, the GPU is not bottlenecking the CPU; the CPU will still perform at its rated speed and IPC. The GPU only bottlenecks what's actually downstream of it, in this case the monitor, since the monitor will only receive a fraction of the info being processed by the CPU. There, the GPU is the bottleneck. Reverse it, with a Pentium pushing a 1080 Ti, and the CPU is the bottleneck, since the amount of info reaching the GPU is severely curtailed. But it's all flow from source to monitor; it doesn't go backwards...
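
If it helps to picture that flow argument, here's a tiny toy model (purely illustrative, made-up numbers, in Python): each stage in the chain has its own maximum throughput, and the frames you actually see are capped by the slowest stage.

Code:
# Toy model (illustrative only): frame delivery as a source-to-monitor pipeline.
# The slowest stage caps the whole chain; a downstream stage can't slow an upstream
# one down, it just can't consume everything the upstream stage produces.

# Hypothetical numbers: frames per second each stage could sustain on its own.
stages = {
    "cpu_game_logic_and_draw_calls": 180,  # a fast quad-core at 1080p
    "gpu_render": 240,                     # a big GPU barely stretched at 1080p
    "monitor_refresh": 60,                 # a 60Hz display
}

bottleneck = min(stages, key=stages.get)
print(f"Bottleneck stage: {bottleneck}")
print(f"Frames actually shown per second: {stages[bottleneck]}")

# With these made-up numbers the monitor is the limit. Swap the CPU figure for,
# say, 70 and the CPU becomes the bottleneck instead; the cap always sits at the
# slowest stage and never flows backwards up the chain.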


I generally agree with all your points, except on delidding the CPU. Delidding isn't necessary unless you're chasing very high overclocks, or you're just an enthusiast who wants to play around with it. For the average user it's totally fine even with a £25+ cooler.
 


Thanks :) There have been quite a lot of improvements for a lot of people after delidding, and a normal person won't buy such an expensive CPU if they don't plan to overclock it; that would seem such a waste of resources. Hence my point was made with overclocking in mind, even if the OP isn't going for it.

 


True, but you can overclock pretty well without things getting too hot, assuming you have a decent cooler. For example, a $100 closed-loop water cooler will get you to 5GHz without the temps getting high enough to justify delidding. It's just the instant loss of any warranty when you delid that I don't like; I don't feel it's necessary for a decent overclock.

Still tempted to delid my 6700k... but I'd only get 0.1, MAYBE 0.2 GHz more overclock, which to me isn't worth pushing for given the loss of warranty if you break it in the process!
 


Totally, 6th-gen chips didn't have the overheating issue; they just need a better water loop or a good AIO. No idea about air coolers being used in extreme overclocking.

I only support delidding for the 7700K in particular; other than that I won't advise anyone to void their warranty.

 


Are you referring to this temperature spiking thing? Is that affecting everyone? I've only seen delidded CPUs having the problem, but I haven't gone looking. Presumably they just had a bad batch; it's by no means affecting everyone, right?
 
Sandy Bridge was the last Intel generation to use soldered cores; Ivy Bridge and newer have all had paste under the lid. Consequently, there's been an influx of weird temp variations: you can easily get mid 60s on 3 cores and mid 70s on the last core, where Intel's usual range is ±5°C either or both ways from the average. That one hot core will be the limiting factor, especially in OC considerations, so many will delid just to bring the cores to a tighter average, which when done right usually results in slightly lower temps across the board. i7s are particularly susceptible to this when running HT, as the heat output of that single core goes up considerably when trying to run 2x threads. You'll also find many don't bother looking at per-core temps; they'll use a program that just gives a single CPU temp and freak out because it's higher than expected from the expensive cooler they have.

Because Intel uses on-core sensors that register temps in under a second, when a core is used it gets hot almost instantly, whereas it takes a little longer for that heat to transfer to the cooler. So Intel CPUs will show large temp spikes, instantly and consistently, that don't really do much to overall CPU temps under load. You see this mostly at idle: temps jump from 32 to 45 all the time and drop right back to 32 a second later, but that does nothing to change the liquid temp of the cooler in general, and won't until that 45 becomes the average under a sustained load.
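
If anyone wants to check that per-core spread for themselves rather than trusting a single "CPU temp" readout, here's a rough sketch using the psutil package on Linux; the "coretemp"/"Core N" sensor names vary by board, so treat those as assumptions.

Code:
# Rough sketch: print per-core temps and the hottest-to-coolest spread.
# Assumes Linux with psutil installed; sensor group/label names vary by system.
import psutil

temps = psutil.sensors_temperatures()
cores = [t for t in temps.get("coretemp", []) if t.label.startswith("Core")]

if not cores:
    print("No per-core sensors found on this system.")
else:
    for t in cores:
        print(f"{t.label}: {t.current:.0f} C")
    spread = max(t.current for t in cores) - min(t.current for t in cores)
    print(f"Hottest vs coolest core: {spread:.0f} C apart")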
 
Intel uses some black goop to fasten the lid to the core PCB. Under the lid is paste of varying thickness, depending on how exactly the lid was manufactured. Get a slightly concave lid and the paste between it and the cooler is thicker still, with the edges of the lid stopping the cooler from applying enough pressure. What delidding does is twofold. It gets rid of that black goop, which lets the lid sit lower and consequently allows a thinner application of TIM between the lid and the cores. Done right, this does nothing but allow better transfer of heat to the cooler. But here's the catch: you just dropped the lid height by ½mm, so allowances must be made for that, or you'll end up without enough pressure on the CPU lid to spread the paste out thin, resulting in temps not changing much or even going up as the paste insulates the CPU from the cooler. Corsair AIOs are especially susceptible to this, as the standoffs on their pumps don't account for thinner mobo manufacturing, so they can be relatively loose to begin with.
 


I think you're kinda generalizing here. Most Intel CPU cores will report up to a 10°C difference between each other at equal load, that's definitely true, but that's not to say it's a problem. I understand what you're saying, it's just not that big of a deal. I read a long time ago that the sensors in the CPUs have abysmal accuracy at lower temps as well; not sure if that's still true. Basically they're designed and calibrated to read values up near the max (70°C+ region), and down at 30°C or less they're insanely inaccurate, with up to a 50% error bar!
My 6700k, for example, will supposedly idle in the mid 20s, yet the water in my loop is hotter than that, so it's impossible for that reading to be correct.

Anyway! I think we got sidetracked here. My summary is:
Yes delidding will help an overclock, no it's not something an average user should do.
 
Yes, I was generalizing; there's a pretty broad spectrum across all the i5s and i7s of the last 4 gens of CPUs.
And yes, I agree totally that while delidding or even lapping can help with an OC, it's definitely an advanced procedure and really should not be undertaken by the average user.
 
A 1080 Ti is already a behemoth for 1080p. So no, the 7700K won't bottleneck your rig. If you go with 4K or more, then MAYBE, I said MAYBE, you will get a LITTLE bottleneck. Simple and short answer :)
 
Ahahahahahahah... A 7700k + 1080 ti for 1080p... you are delusional! If you are really serious about it, and not just looking for attention, you MUST buy at least a 2K monitor - let me suggest a 3440*1440.

Anything other than that is just a waste of your money and our time.
 


Perhaps not delusional, rather uninformed about the right application for a 1080 Ti. If I were him I would quietly get a Ryzen 5 1600 and a decent graphics card instead of buying an extremely expensive chip.

 


Looks like you've never heard of the game called Battlefield 1.
 


If his goal is purely gaming at 1080p with a 1080 Ti, then a 7700K will beat a Ryzen 1600 or 1700 every single time. Sometimes not by much, in rare cases by a lot. Now if he was talking 4K gaming and/or a bit of productivity on the side, or simply wanted to save some cash, then hell yeah, Ryzen 1600/1700!
 


He was talking about future proofing, and games are already becoming CPU-bound on 4 cores. He never mentioned what resolution he is planning on running. Future proofing means resolutions above 1080p, where everything becomes GPU-bound. A Ryzen 1600 or 1700 will game much better with more applications running alongside the game. Streaming is a good example of Ryzen's dominance in this scenario, even when put up against a 6900K.

Start at 5:35: "Ryzen is THE BEST CPU for Game Streaming? - $h!t Manufacturers Say Ep. 2", Linus Tech Tips, Apr 6, 2017: https://www.youtube.com/watch?v=jludqTnPpnU
 


Yes he did, you even quoted the OP where it says specifically 1080p! I agree games are becoming better threaded; however, today and for the next year or two, the 7700K will be capable of pushing a higher framerate than any of the current Ryzen chips. Some games will be more or less identical, but most will show a small advantage for the higher-clocked 7700K. If we could just get 4.5GHz Ryzen chips this would be a different story!

I also agree with the streaming thing, Ryzen is better than a 7700k for streaming, that's why I said specifically if the OP is doing gaming only. How many people stream anyway? Surely the average user asking for advice is more likely to not care about streaming. Always good to give the full picture though.

I'd summarise with:

  • If you want the best possible performance for games only (no streaming), today and for the next year or two - go 7700K.
  • If you want the best possible longevity and/or want to stream/edit videos etc. and don't mind a slight reduction in framerates at 1080p for the next couple of years - go Ryzen 1700 and overclock it to 3.8-4.0ish.
I have a 6700k@4.8 and a Ryzen 1700@4.0, so I'd like to think I'm not biased either way here.
 


You also have to consider that AM4 will allow an upgrade to the next generation of Ryzen CPUs. The 7700K's gaming performance advantage is within 10%, with a few exceptions. But let's be realistic: if he is gaming at 60Hz, the higher FPS numbers won't provide a better gaming experience than Ryzen at any resolution. Even if I were gaming at 144Hz at 1080p with a 1080 Ti, Ryzen can hit 144Hz or be just a few FPS shy of it, which means you might have to turn down a setting in the game. In my opinion, the multi-threading benefit of Ryzen completely outweighs the possible FPS gains in this limited scenario. If you can have near-6900K multi-threading performance or better and be just a few FPS shy of 144Hz at max settings in some games, maybe having to lower a setting a notch, then I'm sorry, the i7-7700K, even overclocked to 5.0GHz, isn't worth it to me.

"Is a $160 CPU Enough for Gaming?", Tech YES City, Jun 14, 2017: https://www.youtube.com/watch?v=R173IbAXKX8
The video pits the AMD Ryzen 5 1400 against the Intel i7-7700K with the Radeon and GeForce mid-range champions (the RX 580 and GTX 1060) to see how much of a difference there is, and whether the performance you could gain from a 7700K is worth it compared to the Ryzen 5 1400. Everything in the comparison was overclocked to relatively normal levels for air and water overclocks.

I'm completely biased: I own 3 Intel computers. And I wouldn't buy Intel over Ryzen at this time.
 

I don't disagree with anything you said there, I just come to a different conclusion: I want the best possible gaming performance available to me today, and if there's something better in a year or two, then I'll trade up in a year or two. Most people will want the opposite, of course, and would much rather have the longevity; I just get the upgrade itch a LOT, so I'm happy to pay more over time and step up every two gens. Turning down a setting or two is absolutely the opposite of what I want for games!

I'm a graphics whore, so I want my games to look as pretty as they can and I want them at 144Hz. On that note, yes, if the OP has a 60Hz monitor then definitely get Ryzen! The only times you'll want more CPU are old games that need the best possible single-core experience: strategy games in particular, Sups, CoH etc., and pretty much any MMO. So that was another consideration for me; I play an old MMO which needs the best possible single-thread performance, and an overclocked 7700K is still king there, hands down. I should test exactly that actually; if I get round to it on this lazy Sunday I'll post back some results :)
 


That would be interesting. I'm sure you know all about the Infinity Fabric. Going from 2133MHz to 3200MHz RAM will net around a 30% increase in FPS, and about 15% of that comes from the reduction in latency in CCX-to-CCX communication. That's a 15% bigger boost to FPS than Intel gets from the same increase in RAM frequency. So every little bit helps. The new AGESA 1.0.0.6 added better RAM support and overclocking ability. What video card are you using? What resolution?
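
Just to put rough numbers on those percentages (purely illustrative arithmetic on a made-up baseline, not measured data):

Code:
# Illustrative arithmetic only, using the percentages claimed above -- not a benchmark.
baseline_fps_2133 = 80.0      # made-up baseline FPS at 2133MHz memory

total_gain = 0.30             # claimed ~30% FPS gain going from 2133MHz to 3200MHz
latency_share = 0.15          # claimed ~15 points of that from lower CCX-to-CCX latency

fps_3200 = baseline_fps_2133 * (1 + total_gain)
fps_from_latency = baseline_fps_2133 * latency_share

print(f"2133MHz: {baseline_fps_2133:.0f} FPS")
print(f"3200MHz: {fps_3200:.0f} FPS (+{total_gain:.0%})")
print(f"Of the extra {fps_3200 - baseline_fps_2133:.0f} FPS, roughly {fps_from_latency:.0f} FPS "
      f"would come from the Infinity Fabric latency reduction, on these assumptions")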
 

Well, unfortunately the 3000MHz RAM in the Ryzen will only clock to 2666MHz at the moment, so I'd have to either bring the RAM down on the Intel rig as well, or just ignore it. Tbf, it only clocking to 2666MHz is the mobo's fault, and that's part of getting Ryzen, so it's kinda valid to just compare them as they come.

I can easily use the same resolution, that's no problem, but I don't know if I can be bothered with the effort of swapping one GPU between them to test. I was thinking 1080p with graphics on super low and seeing what the difference is there. I can try max settings too; as long as neither GPU is maxed out, I think it's a pretty safe comparison. The only thing is it's an AMD GPU in the Ryzen build and an Nvidia GPU in the Intel build, so driver overhead could potentially make a difference too, which would bias this as a CPU test.
 


I found some benchmarks with a 1080, not a Ti; the Ti is about 33% faster.
[Benchmark charts: Ashes of the Singularity, Ashes: Escalation, GTA V, MLL, Watch Dogs 2]
http://www.gamersnexus.net/hwreviews/2965-intel-i7-7740x-cpu-review-vs-7700k-not-worth-it/page-4
Ryzen 1700 overclocked to 3.9GHz, RAM at 3466MHz.
 
The short answer is NO. I built that rig two weeks ago: i7-7700K, EVGA 1080 Ti SC2 (I got the SC2 because I'm going to build a custom loop and thought the FTW was unnecessary in that case), and an EVGA T2 Platinum 850W PSU. I've stress tested it, gamed hard on 4K ultra, benchmarked in Cinebench, all the Firestrikes and Time Spy, etc. barf infinitum, with no problems. It's a match made in gamer heaven tbh. It's conceivable that the 7700K could bottleneck the 1080 Ti in some situations, but I haven't run into it yet. Incidentally, I'm running a Maximus IX mobo and OCed my 7700K to a stable 5100 MHz by clicking one button. It's holding at 28 to 30 degrees idle, and the hottest the GPU has gotten (on air) is around 80 degrees when I OCed it to 2010 MHz with 5000+ MHz on the VRAM. I'll probably get my chip delidded, as it seems I did very well in the lottery (on the GPU too, it seems), and then I'll be able to push it further on the new loop. So don't overthink this... JUST DO IT! lol <3
 