Question: Will an Intel 11600K bottleneck a 3080 Ti?

Jul 28, 2023
Hi there,

I wanted to ask about my CPU/GPU combo. I have a 3080 Ti and my CPU is an Intel 11600K; however, I do not wish to upgrade my GPU anytime soon if I can avoid it.
I'm also using a B560 mobo and 3200 MHz RAM, so if I went to Intel 12th/13th gen or AMD I would need a new mobo at the very least.

I'm wondering if my CPU is powerful enough to last me about 2 more years without leaving a significant amount of GPU performance on the table.
I play mostly action-adventure RPGs such as Witcher 3 with mods, Skyrim (also with mods), Horizon Zero Dawn, RDR2, Shadow of War, Battlefront 2, etc. I also do some emulation.

I usually game at 4K (with DLSS when it's available) and sometimes 1440p.

I am seeing the impressive performance of many newer CPUs such as the 13600K and can't help but think I may be holding back my 3080 Ti with my 6-core/12-thread CPU, and I would hate to be bottlenecking my GPU for the next 2 years or so.
I plan on upgrading to the Nvidia 5000 series depending on whether they are any good (possibly a 5080), but could just as easily wait another generation if it doesn't turn out to be worth it. I also don't plan on buying any of the new UE5 games on the way unless the CPU performance and shader compilation stutter situation improves dramatically (going off DF's new video, it's not looking good!), so I may be sticking with existing titles for some time anyway.

Anyway, am I crazy? Or is a CPU/mobo/RAM upgrade a good idea to really serve my GPU properly and get the best out of it? I would love to hear from anyone who has experience with similar hardware, has been in a similar situation, or just has the knowledge. Cheers!
 

boju

Titan
Ambassador
Stretching the CPU's FPS cap would be the more accurate interpretation of a bottleneck, and a lot of people miss that point.

What's your CPU capable of in terms of FPS? Hard to say; you'd need to look at benchmarks or run a game at a super low resolution to gauge it yourself, but I'd say your CPU is capable of 200+ FPS before showing signs of being maxed out. Depending on the title and game engine, FPS will vary, and your refresh rate will also determine what is practical. There's no point running higher FPS than your refresh rate; that's just wasted processing for nothing. Having said that, and since you play at high resolution and details, a 3080 Ti won't even come close to saturating your CPU; the graphics card simply isn't powerful enough. And I doubt the 50 series would be a threat either. Even if it was, just turn up the eye candy to lower the FPS demand on the CPU.
 
I'm wondering if my CPU is powerful enough to last me about 2 more years without leaving a significant amount of GPU performance on the table.
I play mostly action-adventure RPGs such as Witcher 3 with mods, Skyrim (also with mods), Horizon Zero Dawn, RDR2, Shadow of War, Battlefront 2, etc. I also do some emulation.
Can I assume you play at 60 Hz?
 
Jul 28, 2023
Yes, I don't play games at low res, so I guess what concerns me is things like stutters and games heavy with NPCs and simulations, but what you have said is somewhat comforting.

I have a 240 Hz monitor and like to play at about 80 FPS minimum, although I do make exceptions. I typically cap my story games at 100 FPS, as anything higher typically makes the frame time graph go crazy and I prefer smooth, consistent frames. I do play some CS:GO or Battlefield 1, but obviously my rig is more than enough for those games…
 
I have a 240 Hz monitor.
Let's say, hypothetically, that you have a GPU that can run every game today at any resolution at a constant 240 FPS. In that situation, then yes, the 11600K will not perform as well as a more recent CPU like a 13700K or a 7800X3D. I don't have a more recent CPU, but I would expect the difference to be quite noticeable. If you were to run a game like CoD Warzone 2, for example, at that speed, it's not going to be a particularly smooth experience.

What I'm unsure about is your game choice; you seem to play mostly RPGs, which tend to be more graphically intensive and more difficult to run at high refresh rates than shooters.

Running RDR2 at 240 Hz at 4K, or even 1440p, doesn't seem achievable even with a 3080 Ti, so any difference between newer CPUs is going to be much more limited.

I typically cap my story games at 100fps as anything higher typically makes the frame time graph go crazy and I prefer smooth and consistent frames
If you've got frame time spikes and stuttering, then from my experience that's usually more CPU related. What I would do is run MSI Afterburner in these games. If the GPU is not maxed out during the scenes where you get the frame pacing issues, then that would suggest it's the CPU causing the problem.
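
If it helps, here's a rough sketch of how you could check that from a log rather than eyeballing the overlay. It assumes you've had Afterburner/RTSS log the monitoring data and exported it to a CSV; the file name and the "Frametime"/"GPU usage" column names are placeholders, so adjust them to whatever your log actually uses:

```python
# Rough sketch: flag frame time spikes and check whether the GPU was busy
# when they happened. Assumes a CSV export with "Frametime" (ms) and
# "GPU usage" (%) columns -- adjust the names to match your own log.
import csv

SPIKE_FACTOR = 2.0       # a frame counts as a spike if it's 2x the average frame time
GPU_BUSY_THRESHOLD = 95  # % usage we treat as "GPU maxed out"

def analyse(path):
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))

    frametimes = [float(r["Frametime"]) for r in rows]
    gpu_usage = [float(r["GPU usage"]) for r in rows]
    avg = sum(frametimes) / len(frametimes)

    spikes = [i for i, ft in enumerate(frametimes) if ft > avg * SPIKE_FACTOR]
    gpu_bound = [i for i in spikes if gpu_usage[i] >= GPU_BUSY_THRESHOLD]

    print(f"Average frame time: {avg:.2f} ms over {len(frametimes)} frames")
    print(f"Spikes: {len(spikes)}, of which the GPU was maxed out in {len(gpu_bound)}")
    if spikes and len(gpu_bound) / len(spikes) < 0.5:
        print("Most spikes happen with GPU headroom left -> likely CPU/engine related")

if __name__ == "__main__":
    analyse("afterburner_log.csv")   # hypothetical file name
```

If most of the spikes coincide with the GPU sitting well below full usage, that points at the CPU side.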
 
Jul 28, 2023
Yes, I will keep a closer eye on when I am CPU bound. I think most of the time when I'm CPU bound it's just because a game is poorly optimised and maxes out one or two threads while the rest is left on the table, or the engine inherently makes poor use of more modern CPUs (I refunded Jedi Survivor for this reason). But I have also been trying to learn about CPUs lately and heard that the superior architecture of newer-gen CPUs can help significantly with reducing stuttering and with CPU-heavy processes like RT and simulations, so I wanted to make this post.
 
Jul 28, 2023
I also should have made clear that I do not expect to play any graphically demanding game at 4K 240 Hz, only that I am finding some games like Witcher 3 next-gen (without RT), maxed out, rather stuttery, and realistically I would need to cap my FPS at about 60 for a somewhat smooth experience. That led me to believe perhaps a newer CPU would improve that frame time graph and allow me to cap my FPS higher… The 240 Hz on my monitor only really comes in handy when I play a shooter, which is fairly rare; I bought the monitor for other reasons, and it just happened to be 240 Hz as well.
 
(I refunded Jedi Survivor for this reason)
I think Jedi Survivor is just s*** in its development. They never addressed the inherent flaws in Fallen Order. What I would say, though, about newer games that run like garbage is that they seem to get released in an unfinished state and everyone becomes a beta tester. So I would expect the game to improve; however, as I understand it, while Jedi Survivor runs much better on the latest CPUs, stuttering is not completely eliminated even on an i9 13900K.

In my own personal gaming, I actually find that it's older games I have performance issues with, not the newer ones. I'm a strategy fan, and unfortunately most of them make very poor use of multiple cores and get bottlenecked on the first thread. You basically have to brute-force them with very high single-thread performance.

But I have also been trying to learn about CPUs lately and heard that the superior architecture of newer-gen CPUs can help significantly with reducing stuttering and with CPU-heavy processes like RT and simulations, so I wanted to make this post.
Faster cores, and more of them, will help with stuttering, yes. If by RT you mean ray tracing, then I can attest to that, because I've tested Cyberpunk in multiple CPU configurations. What I found was that for my CPU it took 8 cores/16 threads to maintain a stable 60 FPS with the graphics and ray tracing effects all set to max. Playing the game without ray tracing is significantly less demanding: it is playable on 4 cores/8 threads with it disabled, but with it on, the performance goes through the floor.
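
For anyone who wants to repeat that kind of test without swapping hardware, one rough way is to pin the game to fewer logical cores via CPU affinity (Task Manager can do the same manually on Windows). A minimal sketch using Python's psutil; the process name and core count are placeholders, not anything specific to my setup:

```python
# Rough sketch: emulate a smaller CPU by restricting a running game to the
# first N logical processors via CPU affinity (requires psutil).
import psutil

GAME_EXE = "Cyberpunk2077.exe"   # assumed process name; adjust to your game
LOGICAL_CORES_TO_USE = 8         # e.g. logical processors 0-7 to mimic a 4c/8t chip

def limit_game_to_cores(exe_name, n_cores):
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == exe_name.lower():
            proc.cpu_affinity(list(range(n_cores)))
            print(f"Pinned {exe_name} (pid {proc.pid}) to logical cores 0-{n_cores - 1}")
            return
    print(f"{exe_name} not found; is the game running?")

if __name__ == "__main__":
    limit_game_to_cores(GAME_EXE, LOGICAL_CORES_TO_USE)
```

It's only an approximation (affinity isn't the same as physically disabling cores in the BIOS), but it's a quick way to see how a game behaves with fewer threads.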
 
I am finding some games like Witcher 3 next-gen (without RT), maxed out, rather stuttery, and realistically I would need to cap my FPS at about 60 for a somewhat smooth experience. That led me to believe perhaps a newer CPU would improve that frame time graph and allow me to cap my FPS higher…
I agree; I would expect that to be CPU related. I've modded games before and it can make them much more demanding. I would still do some analysis with MSI Afterburner before making a purchasing decision, though. Plotting CPU and GPU usage along a frame time graph might give you an insight into what's going on.
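
A minimal plotting sketch along those lines, under the same assumption that you've exported the Afterburner/RTSS log to CSV (file and column names are placeholders; matplotlib required):

```python
# Minimal sketch: plot frame time against CPU and GPU usage from a
# monitoring log exported to CSV. Column names are assumptions.
import csv
import matplotlib.pyplot as plt

with open("afterburner_log.csv", newline="") as f:   # hypothetical file name
    rows = list(csv.DictReader(f))

frametime = [float(r["Frametime"]) for r in rows]    # ms per frame
gpu_usage = [float(r["GPU usage"]) for r in rows]    # %
cpu_usage = [float(r["CPU usage"]) for r in rows]    # %

fig, (top, bottom) = plt.subplots(2, 1, sharex=True)
top.plot(frametime)
top.set_ylabel("frame time (ms)")
bottom.plot(gpu_usage, label="GPU usage %")
bottom.plot(cpu_usage, label="CPU usage %")
bottom.set_ylabel("usage (%)")
bottom.set_xlabel("frame")
bottom.legend()
plt.show()
```

Spikes in the top plot that line up with GPU usage well below 100% in the bottom plot point at the CPU (or the engine) rather than the graphics card.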
 
Jul 28, 2023
I think Jedi Survivor is just s*** in its development. They never addressed the inherent flaws in Fallen Order.
Oh god, tell me about it. No one was more hyped for Survivor than I was, and what's wrong with it is so foundational it will never get fixed; getting rid of shader compilation stutter in UE is a manual and lengthy process, and even if devs make an effort they can still miss stuff. Respawn knows how to make a fun game, but not how to optimise one. Very sad state for AAA gaming right now.
 
Jul 28, 2023
If by RT you mean ray tracing, then I can attest to that, because I've tested Cyberpunk in multiple CPU configurations. What I found was that for my CPU it took 8 cores/16 threads to maintain a stable 60 FPS with the graphics and ray tracing effects all set to max.
Hmm, yes, Cyberpunk is demanding; at least it can justify itself with great visuals and, IMO, very good CPU thread utilisation. Love to see it. I will play it one day when I have a GPU capable of frame generation, haha.
 
Oh god, tell me about it. No one was more hyped for Survivor than I was, and what's wrong with it is so foundational it will never get fixed; getting rid of shader compilation stutter in UE is a manual and lengthy process, and even if devs make an effort they can still miss stuff. Respawn knows how to make a fun game, but not how to optimise one. Very sad state for AAA gaming right now.
I'm not a game developer, but driving through the neon city lights of Cyberpunk at night in the rain is a visual masterpiece. It is also one of the best games at distributing the load relatively evenly among the cores. Then I look at Jedi Survivor and, aside from the fact of needing a 13900K in the first place, I have to ask why it cannot run smoothly given the enormous processing power available. Unless I'm mistaken, there doesn't seem to be anything revolutionary about it. It just seems like a bigger Fallen Order with significantly improved textures and some additional effects.
 
Jul 28, 2023
I agree; I would expect that to be CPU related. I've modded games before and it can make them much more demanding. I would still do some analysis with MSI Afterburner before making a purchasing decision, though. Plotting CPU and GPU usage along a frame time graph might give you an insight into what's going on.
Yes, I am not about to make a hasty decision. However, if I end up with my 3080 Ti for some time, I want to be eating good and not plagued by CPU limitations. Then again, upgrading my mobo/CPU/RAM would be costly and I'm not made of money, haha. I will need to take a better look; it seems more likely that I'm just a graphics whore and should tone down my expectations…
What monitor do you have? Does it have Freesync/Gsync? It might help eliminate the stuttering.
Hi, it's the Samsung Odyssey G7, 1440p 240 Hz.

I thought Freesync/Gsync was all about screen tearing and would not affect frame times? Or perhaps I'm mistaken; I never use either technology as I almost never see any tearing…
I'm not a game developer, but driving through the neon city lights of Cyberpunk at night in the rain is a visual masterpiece. It is also one of the best games at distributing the load relatively evenly among the cores. Then I look at Jedi Survivor and, aside from the fact of needing a 13900K in the first place, I have to ask why it cannot run smoothly given the enormous processing power available. Unless I'm mistaken, there doesn't seem to be anything revolutionary about it. It just seems like a bigger Fallen Order with significantly improved textures and some additional effects.
Yes the rain/wet pavement with all the bounce lighting from the neon lights is beautiful.

As for Survivor, I agree that there is nothing special about it visually. I suppose the textures look higher-res than Fallen Order's, but it looked rather meh for the hour that I played it.

At the time, FSR was broken and so were the resolutions (the game looked like it was 720p to me even though I selected 2160p); you literally couldn't even change resolution. But I could tell that nothing I was seeing on screen could ever justify the absolute garbage performance in front of me.

I later learned more about UE4 and how it was always designed to work on consoles, as PC was a major afterthought at that time, and open-world games were never what it was created for, much less AAA high-detail environments with RT features. So it's no surprise all these UE4 PC ports perform like dog water.
 
Jul 28, 2023
I would have said DLSS 2.0 looks pretty reasonable at higher resolutions like 1440p and 4K. I play at 1440p 60 Hz with a 3080.
It does look great; I'm just not used to playing at sub-60 FPS. I plan on playing it in a couple of years when I build a new PC, and I'm gonna turn on path tracing and DLSS 3. By then frame gen will likely have come a long way as well.
 

punkncat

Polypheme
Ambassador
I own both an 11600K and an 11900K. I have used them both with a 3070, and although there is a significant difference between the two graphics cards, both of those CPUs basically loaf around using it at 1440p/144 Hz. While I could see the higher frame rate you are looking at being a factor, the higher resolution should count more on the GPU delivery, IMO.

Best thing in my mind would be to try it and see what you get. The card is still a significantly larger expense than a CPU/mobo combo, so if you decide to upgrade later it should be far easier on your wallet.
 
Jul 28, 2023
Hi, thanks for the input. So you are saying I should try the mobo/CPU upgrade?

I actually did have a 3070 to start with, but sold it to a friend who needed a GPU, and bought my 3080ti since the price came down a bunch.
 

punkncat

Polypheme
Ambassador

I apparently missed something up there. I thought this was an "I am going to", not an "I already have".

In this case, only your perception of the performance you are getting is important. You are the one who needs to be happy with it and, presumably, the one funding it. Unless you came across a stellar deal, upgrading within 11th gen is likely not going to be worthwhile. The i9 does perform marginally better, but costs are also in line with typical Intel pricing now that the clear-out sales have long since dried up.

In my own world, I have not seen any reason to spend money towards a base system update yet. My 11900K does everything I want it to and could do with a more powerful graphics card, but I just wouldn't really utilize it much with the titles I play.

IMO, Cyberpunk is a terrible title to base any system's performance on. It was poorly optimized when released and remains fundamentally flawed, especially when you consider all the resolution modifiers that came out in order for really powerful hardware to run it "well". With that said, it is a beautiful game so long as your expectations are in line with said hardware.
 
Jul 28, 2023
Nice. It's ironic how performant CDPR's RED Engine was, and now they have switched to UE5, which is probably gonna be the bane of PC gaming for the next decade unless Epic can solve their CPU utilisation and shader compilation problems…