Will a core i7 7700k bottleneck a gtx 1080ti?

Status
Not open for further replies.

Arnout_2

Prominent
May 1, 2017
So I'm going to build my new rig, and I need to know: will a Core i7-7700K bottleneck a GTX 1080 Ti at 1080p?
 
Solution
"At 1080p, a GTX 1080 Ti is going to cause CPU bottlenecks in some games, no matter what CPU is used."

That's what you said; I just corrected you. There is absolutely no way a GPU can cause a bottleneck on a CPU. I don't care if it's an i7-7700K pushing a GT 720: the GPU is not bottlenecking the CPU, because the CPU still performs at its rated speed and IPC. The GPU only bottlenecks what's actually downstream of it, in this case the monitor, since the monitor will only receive a fraction of the frames being prepared by the CPU. The GPU is the bottleneck. Reverse it, with a Pentium pushing a 1080 Ti, and the CPU is the bottleneck, since the amount of work sent to the GPU is severely curtailed. But it's all flow from source to monitor; it doesn't go backwards.


I'm not sure you noticed, but he's only gaming at 1080p, not 4K. There is a huge difference in bottleneck potential between 1080p and 4K.

 
There is. At 4K, you'll stress the GPU, limiting it to only what it can handle graphics-wise. The side effect is that the CPU doesn't have to work as hard, because frame rates take a nosedive. At 1080p, that GPU is going to run wide open, which actually raises the CPU's load. It sounds backwards, but it's not. The CPU prepares a certain amount of work per frame, and it doesn't much matter whether the output is 4K or 1080p; the per-frame CPU cost is roughly the same. The GPU, however, is hugely affected, since at 4K it has to render roughly four times the pixels. So at 1080p, in a game like BF1, it's possible to see 100% CPU usage, yet put that same system on a 4K monitor and you'll barely be over 50% CPU usage. This is very common with i5s, with their limited cores/threads, paired with uber-powerful GPUs at 1080p.
In a sense it's a bottleneck; in reality it's just too much power with no place to go.
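The pipeline logic above can be sketched as a toy model: frame rate is capped by the slower stage, GPU cost scales with pixel count while per-frame CPU cost stays roughly fixed. All the numbers here are made-up illustrations, not benchmarks.

```python
# Toy model of the CPU -> GPU -> monitor pipeline described above.
# The throughput figures are invented for illustration only.

def effective_fps(cpu_max_fps, gpu_max_fps_1080p, pixel_ratio):
    """Return (fps, cpu_load): fps is capped by the slower stage,
    and GPU throughput falls as pixel count rises."""
    gpu_max_fps = gpu_max_fps_1080p / pixel_ratio  # GPU work scales with pixels
    fps = min(cpu_max_fps, gpu_max_fps)            # slowest stage wins
    cpu_load = fps / cpu_max_fps                   # share of CPU frame budget used
    return fps, cpu_load

# 4K has roughly 4x the pixels of 1080p
fps_1080p, load_1080p = effective_fps(140, 200, 1.0)  # GPU runs wide open; CPU is the limit
fps_4k, load_4k = effective_fps(140, 200, 4.0)        # GPU becomes the limit; CPU load drops

print(fps_1080p, round(load_1080p, 2))  # 140 1.0  -> CPU-bound at 1080p
print(fps_4k, round(load_4k, 2))        # 50.0 0.36 -> GPU-bound at 4K
```

Same CPU, same game: at 1080p the CPU is pegged, at 4K it sits around a third of its frame budget, which matches the "100% vs barely over 50%" observation above.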
 
Oops! Well, the only valid reason to even get a 1080 Ti is gaming at 4K; any other reason is a waste of money. And I'll stop you right there, Taylor, don't try to out-argue me. That's also the opinion of Kingpin, Linus, Jay, Gamers Nexus, and everyone else who knows more about hardware than the vast majority of people on this site. If you're gaming at 1080p you definitely don't need a Ti, tbh. I got it for 4K and VR and it destroys both. It's the best single-card solution outside of the new Titan 1000-series refresh, which is a waste of money for the FPS gained (like 5 or 6 fps, REALLY Nvidia?).
 
Well, that particular thing can happen if you're streaming while playing demanding games. There are situations where the i7 is not enough, with its relatively weak extra threads, and remember that the 6th and 7th generations of i5/i7 have had bugs where some threads stop working entirely.
After a few years of building PCs for gamers and streamers, but mostly high-demand workstations, I have seen i7s bottlenecking with top-tier graphics cards, including Quadros and AMD Pro cards (the 4C/8T version). Three friends of mine, for whom I built PCs, have just moved to Ryzen: one to a Ryzen 7 and the other two to the Ryzen 5 1600X. All of them stream and play, and from what I've seen and tested as the months go by, a 4C/8T i7 is becoming inadequate for streaming: it sits near 100% load, which sometimes causes hitches and latency. With the 6C/12T and 8C/16T Ryzen parts that doesn't happen. Yes, the clocks are lower, but the AMD CPUs are beasts compared to the overpriced i7, which has very poor performance per price, and poor performance overall if you're doing more than just gaming. Contrary to what many people think, higher clocks don't always mean better; more cores can overcome that, since many professional programs already take advantage of multiple physical cores, and the gaming industry is slowly moving in the same direction.
This happened with DX12 last year in just two or three games; now everything is shipping with DX12, which puts a budget RX 480/580 very close to a 1070 under DX12, not even counting Vulkan, where the AMD card performs better than a 1070 by a very tiny margin. That's compared to a 1070 running DX11 in some games, because using DX12 can make Nvidia cards perform worse than under DX11.
This is a shame, because I have a 980 Ti that was supposed to be DX12-ready, and just before the card hit the two-year mark I stopped seeing improvements in newer drivers... I'm glad RX Vega is coming, so I can finally do an upgrade that will last me more than two years.
 


Huh? Generalizations like that don't help anyone. Just because a game is more CPU-intensive than GPU-intensive does NOT automatically make the CPU a bottleneck, UNLESS the CPU/memory combination is not of a high enough spec or frequency to handle all the instructions being thrown at it, or cannot physically move those instructions through the processing pipeline fast enough. This can happen even if the game is multi-threaded, but it will most frequently happen if the game is poorly optimized or designed for a single core, or, in either of those cases, when you are trying to run the game on a CPU never designed to handle that kind of load. If the affected CPU cores are not being constantly slammed at 100%, and the GPU can handle everything thrown its way by the CPU, then there is no bottleneck.
 
A very simple answer: the CPU determines the maximum frame rate in any game, so the better the CPU, the higher the potential maximum frame rate. Right now, IPC and clock speed are king in the gaming world, so the Core i7-7700K is about the best you can get. There will be exceptions: games that can use many cores will do better with CPUs that have more cores. Generally, I wouldn't worry too much; the 7700K can push very high frame rates.
 