[SOLVED] Will i7-3770k 4.6ghz bottleneck 1080ti?


boriss911

Commendable
Nov 1, 2019
Hello, I recently had questions about i7-3770K cooling, and now I've moved on to a GPU upgrade. I have an i7-3770K paired with a 1070 Strix, and I'm aiming for around 144fps in The Witcher 3, so I'm planning to buy a 1080 Ti. As far as I know my CPU is too weak for that GPU, but will that actually matter in games that don't use your CPU at 100%? For example The Witcher 3.
 
Solution
The 3770k's instructions per clock are a bit better. My total cpu usage sits between 80 and 90% in those games at 1440p, and up to around 95-98% at 1080p.

Don't get me wrong, a 1080Ti would do better in a modern system, I'm well aware of that, but looking at benchmarks of the same games and resolution, it's not far off. It was a heap better than the 1070 anyway.

Irisena

Commendable
Oct 1, 2019
CPU usage didn't get over 80%, so does that mean the CPU still has some headroom to work with?
I don't think that's the case. What you should be looking at is GPU usage. Outside town, the system doesn't experience a major bottleneck, so the 1080ti works at full capacity with the 3770k. Inside town though, the part of the game that is CPU-heavy, the 1080ti only works at 75%-ish, and that's a bottleneck. With enough CPU power to back the 1080ti up, you could get noticeably more performance in town. Also, CPU usage with HT on rarely hits 100%; switch HT off though, and it will shoot up to 100% all the time.
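If you want to put rough numbers on that headroom, here's a tiny sketch. The figures are invented apart from the 75% gpu load from this thread, and it assumes fps scales linearly with gpu load, which real games only approximate:

```python
# Back-of-the-envelope headroom estimate from GPU utilization.
# Assumes fps scales roughly linearly with GPU load (an approximation);
# the 75% load figure is the in-town number discussed in this thread.

def fps_ceiling(current_fps: float, gpu_utilization: float) -> float:
    """Rough fps the same settings could reach if the gpu were fully fed."""
    return current_fps / gpu_utilization

# Hypothetical in-town reading: 60 fps with the 1080ti at 75% load.
print(round(fps_ceiling(60, 0.75), 1))  # -> 80.0 fps with a cpu that keeps the gpu at ~100%
```

Under that linear assumption, 25 points of idle gpu capacity actually works out to roughly a third more fps; either way, the gap between gpu load and 100% is the headroom a faster cpu would unlock.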
 

Karadjgne

Titan
Ambassador
As the owner and current user of an i7-3770K I'm going to say this. So much bs in this thread, far too much disinformation and supposition.

Cpu sets the fps limit according to the game code's instructions. With caps or limitations on thread usage this can be lower than expected, or with simple code it can be far higher than can possibly be used, but it will vary from game to game. The cpu takes the code and pre-renders it into a frame, giving objects places, addresses, dimensions, statistics, movement, shader info etc. The speed at which that happens is a product of IPC and clocks. The faster frames get pre-rendered, the more frames get processed per second, and the higher the fps. It has nothing to do with resolution. After the cpu pre-renders the frames, it ships them to the gpu.

It's there that the gpu will finish rendering the frame according to detail settings and resolution. It'll do one of only two things: either reach the fps limit set by the cpu, or fail. If it reaches the fps limit, then changing from lower settings to higher doesn't change fps; if it fails, then changing settings lower increases fps output, up to the cpu-set limit.

Still according to the game.

In CSGO I get 300fps, so fps over 144 is most definitely attainable. In Skyrim (vanilla) it's over 180fps. Add in the 170 scripted mods I use, which is extremely heavy cpu usage, going from 2 threads to 6 threads, and fps tanks to 60.

The problem here is that the 3770k is getting long in the tooth. Its IPC isn't comparable to newer platforms, even if its clocks are. So regardless of the gpu used, its maximum attainable fps is going to be somewhat lower than new, all according to which game is played. The 1080ti is strong enough that it'll actually reach cpu limits in most games.

That's not a bottleneck, that's just the inability of the cpu to fully utilize the gpu. Bump up the resolution for higher gpu demand and that changes. At 4k a 1080ti is perfectly suited to the old 3rd gen; you'll only need to attempt 60fps, and the gpu will struggle with that at times.
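If it helps, here's a toy model of what I'm describing. The frame times are invented and it's nothing like real engine code, but it shows why the cpu caps fps and why resolution only moves the gpu side:

```python
# Toy model of the pipeline described above: the cpu pre-renders each
# frame, then the gpu finishes it. Whichever stage is slower per frame
# sets the fps. All frame times below are invented for illustration.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 8.0  # set by IPC and clocks plus game code, NOT by resolution

for res, gpu_ms in [("1080p", 5.0), ("1440p", 9.0), ("4k", 20.0)]:
    tag = "cpu-limited" if CPU_MS > gpu_ms else "gpu-limited"
    print(f"{res}: {fps(CPU_MS, gpu_ms):6.1f} fps ({tag})")

# 1080p:  125.0 fps (cpu-limited)  <- gpu idles, waiting on the cpu
# 1440p:  111.1 fps (gpu-limited)
# 4k:      50.0 fps (gpu-limited)  <- now lowering settings raises fps again
```

In this toy model, at 1080p the settings can go up for free until the gpu's cost per frame passes 8 ms; that's the "reach the cpu limit or fail" behavior in miniature.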
 

boriss911

Commendable
Nov 1, 2019
Thanks for the reply. I was thinking about that too. For me a bottleneck is a core duo with a 1080Ti, while a stronger cpu simply doesn't let the gpu reach its limits.
 

Karadjgne

Titan
Ambassador
I'd rather have a gpu too strong for the cpu, so I can play at any setting I wish without changing fps, than a cpu far stronger than the gpu, where I'm limited to whatever settings the gpu allows for playable fps.

60fps is very playable; faster just means slightly smoother and sharper detailing. But when I'm snap-scoping a terrorist, I'm really not all that interested in whether he's got a full beard or 5 o'clock stubble, I just want him dead. Smoother or better details be damned, just die so I can move on to the next target.
 
Conversely, I spend my hard-earned money on high-end graphics cards like the GTX 1080 Ti because I want my system to be gpu-limited, pegged at 100% load, to ensure I'm fully utilizing the card. Otherwise I could have spent less money on a lesser model and achieved the same result. Either way you'll have a component limitation no matter what the component combo is. The important takeaway is to ensure you can come close to matching fps with your monitor's refresh rate at an acceptable graphical detail, based on the budget in question.
 

boriss911

Commendable
Nov 1, 2019
So guys! I wasn't actually expecting such interest from you; I'm very happy you spent your time helping me. I ran some tests in The Witcher 3 with CPU usage displayed, tested at different quality levels from low to ultra, tried different gaming situations, and landed on this: my CPU sits at about 55% usage when I'm standing still, about 70-75% when I'm moving, and about 70-75% when I'm fighting. I tried different quality levels, but that didn't make much difference; CPU loads stayed the same. Sometimes the CPU randomly spikes to 95-100% for a split second, which happened very often in the beginning, while it was raining in-game. The GPU was pinned at 98% usage. At low quality with extras off I had about 144fps, medium with every extra off was about 120fps, high was about 80-105, ultra was about 70-80. It was raining, so that surely affected the fps count negatively. So, what is your conclusion after that?
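For anyone who wants to log the same numbers themselves, here's a minimal sketch. It assumes the psutil package is installed (`pip install psutil`) and that nvidia-smi, which ships with the nvidia driver, is on the PATH; an MSI Afterburner/RTSS overlay does the same job in-game:

```python
# Log total cpu usage and gpu utilization once per second while a game runs.
import subprocess

import psutil

def gpu_load_percent() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"]
    )
    return int(out.decode().strip().splitlines()[0])

while True:
    cpu = psutil.cpu_percent(interval=1.0)  # blocks ~1s and averages over it
    gpu = gpu_load_percent()
    # gpu pinned near 98% with cpu at 70-75% => gpu-limited, like the test above
    print(f"cpu {cpu:5.1f}%  gpu {gpu:3d}%")
```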
 

boriss911

Commendable
Nov 1, 2019
Performance is adequate. You aren't playing a fast-twitch first-person shooter game.
But something I don't understand is why my CPU usage is as high as with a 1080Ti:
https://www.youtube.com/watch?v=R7T4B1nNppA&t=213s
 

Karadjgne

Titan
Ambassador
Gpu isn't going to change things. What will is all the extras like HairWorks and rain that use PhysX, and other stuff that's scripted into the game code. Lowering detail settings broadly by changing from ultra to high or medium may or may not turn some of those off. You'll also have to check other cpu-dependent stuff like Game Bar, Xbox DVR, Cortana etc. and kill it. In nvidia control panel, change pre-rendered frames from the default (3) to 1, lower the grass detail slider, etc.

You'll find there's a reason you can manually change particular settings. Grass, rain and clouds are all cpu-intensive because each element needs to be placed individually, which is why your fps tanked in the rain: there are a few sections that get copied, but within each section every raindrop gets addressed, moved etc. before it even gets to the gpu.


Play around with individual settings in game, some will have more cpu impact than others.
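As a crude illustration of why each raindrop costs cpu time (invented counts and speeds; real engines batch and instance this far more cleverly):

```python
# Why rain tanks fps: every visible drop needs a position update on the
# cpu each frame, before the gpu ever sees the scene.
import random

drops = [[random.uniform(0, 100), random.uniform(0, 50)]  # [x, height]
         for _ in range(50_000)]

def update_rain(drops, dt=1 / 60, fall_speed=9.0):
    # O(n) cpu cost per frame, paid on top of everything else the cpu
    # already pre-renders: more drops means a lower fps cap.
    for d in drops:
        d[1] -= fall_speed * dt
        if d[1] < 0:                       # drop hit the ground:
            d[0] = random.uniform(0, 100)  # respawn at a new x
            d[1] = 50.0                    # back at cloud height

update_rain(drops)  # called once per frame while it rains
```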
 

boriss911

Commendable
Nov 1, 2019
Ok, because there's a really decent deal on a 1080Ti right now, but I was very worried about the bottleneck. Since I'm going to game at 1080p, would a 1080Ti still make a big difference? Because a 3600X only hits 1 frame more than the 3770K.