[SOLVED] Will my i7-6700K seriously bottleneck an RTX 3080?

Anonymous Penguin

Honorable
Oct 8, 2014
44
2
10,545
2
I will either be playing on a 4K 60Hz TV or a 1080p 144Hz monitor (maybe upgraded to 1440p down the line).
I plan to upgrade to a modern AMD CPU in a year or so, but will the 6700K hold up with some decent fps until then?
 

Yeldur

Reputable
Jan 28, 2017
100
4
4,695
3
Essentially, in any given scenario, your CPU is always going to be the real bottleneck. What your GPU does is draw the pretty graphics on your screen, and it also controls the maximum possible graphical fidelity. What it DOESN'T do is, well, all of the rest of the work: things such as tracking the positions of players and objects at any point in time, among lots of other calculations.

The logical things are happening inside of the CPU, and this is why they call it the "brain" of the computer.

The bad part about CPU bottlenecks is that there's almost nothing you can do about them short of replacing the CPU in its entirety.

It's not the same as a GPU where you can simply lower graphical settings to fit your needs or your optimal framerate.
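A toy sketch (my own illustration, not from this thread, with hypothetical numbers) of the asymmetry being described: the fps you see is capped by whichever component is slower, and lowering graphics settings only raises the GPU's ceiling, never the CPU's.

```python
def effective_fps(cpu_ceiling, gpu_ceiling):
    """Displayed fps is limited by the slower of the two components."""
    return min(cpu_ceiling, gpu_ceiling)

# GPU-bound case: dropping settings (raising the GPU ceiling) helps.
print(effective_fps(90, 60))    # 60
# CPU-bound case: no graphics setting raises the 90 fps cap.
print(effective_fps(90, 200))   # 90
```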

If you ask me, put your CPU at either an equal or more important status compared to your GPU.

If you're aiming for a 3080 now, why not step down to a 3070 and replace your CPU as well? I'm not sure what your financial situation is like, but this would be a good compromise given that your CPU will limit what you can actually achieve.

In terms of performance, your CPU is OK at its job, but it's not going to be a heavy hitter. As a somewhat close comparison point, my rig currently runs an i7-8700K (which imho is definitely overdue for replacement given the current state of CPUs) paired with an RTX 2070. On RDR 2, with a mixture of mostly high and a few medium graphical settings (and the occasional ultra), I hit around 80 FPS.

Overall, I think your CPU would do the job, but you would be better off upgrading both simultaneously.
 
Reactions: Anonymous Penguin


Anonymous Penguin

Honorable
Oct 8, 2014
44
2
10,545
2
Essentially, in any given scenario, your CPU is always going to be the real bottleneck. What your GPU does is draw the pretty graphics on your screen, and it also controls the maximum possible graphical fidelity. What it DOESN'T do is, well, all of the rest of the work: things such as tracking the positions of players and objects at any point in time, among lots of other calculations.

The logical things are happening inside of the CPU, and this is why they call it the "brain" of the computer.

The bad part about CPU bottlenecks is that there's almost nothing you can do about them short of replacing the CPU in its entirety.

It's not the same as a GPU where you can simply lower graphical settings to fit your needs or your optimal framerate.

If you ask me, put your CPU at either an equal or more important status compared to your GPU.

If you're aiming for a 3080 now, why not step down to a 3070 and replace your CPU as well? I'm not sure what your financial situation is like, but this would be a good compromise given that your CPU will limit what you can actually achieve.

In terms of performance, your CPU is OK at its job, but it's not going to be a heavy hitter. As a somewhat close comparison point, my rig currently runs an i7-8700K (which imho is definitely overdue for replacement given the current state of CPUs) paired with an RTX 2070. On RDR 2, with a mixture of mostly high and a few medium graphical settings (and the occasional ultra), I hit around 80 FPS.

Overall, I think your CPU would do the job, but you would be better off upgrading both simultaneously.
Thank you for the descriptive answer.

I try to get something that'll last a while, so I opted for the 3080 instead of the 3070, as I don't upgrade very often; my current PC was assembled around 7 years ago. (If my mobo hadn't acted up, I'd still be running a 4790K.)

I do plan to buy gen 4 AMD when it arrives, so I'm waiting for that, plus I'll need around 4 to 5 months to save up for it anyway.

So if the 6700K can do the job alongside an RTX 3080 for another 5 months or so, I think I'll get the 3080 right away instead of waiting to upgrade both.

Thanks again!
 
Last edited:
Reactions: Yeldur

Anonymous Penguin

Honorable
Oct 8, 2014
44
2
10,545
2
At 4K 60Hz you might be fine, as long as you're not also attempting live streaming, etc...

but I'd be pricing a 3700 or a 10600K/10700K, or possibly one of AMD's new Zen 3 CPUs when they are reviewed in a month or so...
Yeah, switching to team red this time. I heard higher resolutions cause less CPU bottlenecks anyway.

I don't stream, so no problems there. I do record gameplay sometimes though, but I can live with choppy recordings for 5 months or so until I upgrade to AMD.
 

Anonymous Penguin

Honorable
Oct 8, 2014
44
2
10,545
2
I think for most games it will be fine; for those games that can use more cores, it will be a limiting factor. If you are a low-detail, high-fps twitch gamer, then yes, it will be an issue.
Yeah, I'm not playing anything competitive these days anyway. I can wait 5 months to get Ryzen 4000 series.
I'm not good enough at anything to be streaming on Twitch, no issues there :p
 

Karadjgne

Titan
Ambassador
Essentially, in any given scenario, your CPU is always going to be the real bottleneck. What your GPU does is draw the pretty graphics on your screen, and it also controls the maximum possible graphical fidelity. What it DOESN'T do is, well, all of the rest of the work: things such as tracking the positions of players and objects at any point in time, among lots of other calculations.

The logical things are happening inside of the CPU, and this is why they call it the "brain" of the computer.

The bad part about CPU bottlenecks is that there's almost nothing you can do about them short of replacing the CPU in its entirety.

It's not the same as a GPU where you can simply lower graphical settings to fit your needs or your optimal framerate.

If you ask me, put your CPU at either an equal or more important status compared to your GPU.

If you're aiming for a 3080 now, why not step down to a 3070 and replace your CPU as well? I'm not sure what your financial situation is like, but this would be a good compromise given that your CPU will limit what you can actually achieve.

In terms of performance, your CPU is OK at its job, but it's not going to be a heavy hitter. As a somewhat close comparison point, my rig currently runs an i7-8700K (which imho is definitely overdue for replacement given the current state of CPUs) paired with an RTX 2070. On RDR 2, with a mixture of mostly high and a few medium graphical settings (and the occasional ultra), I hit around 80 FPS.

Overall, I think your CPU would do the job, but you would be better off upgrading both simultaneously.

None of the above. Not even close. At 1080p, maybe, since the gpu won't be doing much, but 4K is entirely different: literally 4x the number of pixels to populate, which has nothing to do with the cpu; that's all gpu bound.
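The 4x figure checks out as plain arithmetic:

```python
# 1080p vs 4K UHD pixel counts
pixels_1080p = 1920 * 1080    # 2,073,600 pixels
pixels_4k = 3840 * 2160       # 8,294,400 pixels
print(pixels_4k // pixels_1080p)  # 4
```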

On top of that, Ampere is a new design that offers direct rendering. Previously, the cpu needed to pre-render all game code, so it set an fps limit by the number of frames pre-rendered per second. With Ampere cards, some of the graphics code is actually direct rendered by the gpu instead, decreasing reliance on the cpu, which gets bypassed and freed up to deal with extras like PhysX and AI code.

All 7th, 8th, 9th, and 10th Gen cpus are basically nothing more than refinements of the 6th Gen Skylake cpus, so figure a 6700k with a 3070 is going to perform roughly comparably to an 8700k with a 2080 Ti. It'll just be a shift in performance.

No such thing as a cpu bottleneck. Even with Ampere cards, the cpu will still SET the fps limit. The gpu either lives up to that limit or fails. The cpu cannot slow the flow of info (the definition of a bottleneck), as it is the source of that info. Depending on the game, detail settings, and resolution, the gpu will either be more capable, balanced, or unable to equal the cpu.
 

Anonymous Penguin

Honorable
Oct 8, 2014
44
2
10,545
2
None of the above. Not even close. At 1080p, maybe, since the gpu won't be doing much, but 4K is entirely different: literally 4x the number of pixels to populate, which has nothing to do with the cpu; that's all gpu bound.

On top of that, Ampere is a new design that offers direct rendering. Previously, the cpu needed to pre-render all game code, so it set an fps limit by the number of frames pre-rendered per second. With Ampere cards, some of the graphics code is actually direct rendered by the gpu instead, decreasing reliance on the cpu, which gets bypassed and freed up to deal with extras like PhysX and AI code.

All 7th, 8th, 9th, and 10th Gen cpus are basically nothing more than refinements of the 6th Gen Skylake cpus, so figure a 6700k with a 3070 is going to perform roughly comparably to an 8700k with a 2080 Ti. It'll just be a shift in performance.

No such thing as a cpu bottleneck. Even with Ampere cards, the cpu will still SET the fps limit. The gpu either lives up to that limit or fails. The cpu cannot slow the flow of info (the definition of a bottleneck), as it is the source of that info. Depending on the game, detail settings, and resolution, the gpu will either be more capable, balanced, or unable to equal the cpu.
Well, that's more good news for me, I can easily make-do with a 6700k until I can upgrade.
And I didn't know Ampere allowed for the actual code to run on the GPU!

Thank you for the information and response!
 

Karadjgne

Titan
Ambassador
To support these developments, NVIDIA has created RTX IO. This delivers GPU-based lossless decompression, and low-level, super-efficient APIs architected specifically for game workloads. Just as we did with RTX and DirectX Raytracing, we are partnering closely with Microsoft to make sure RTX IO works great with their DirectStorage on Windows API. By using DirectStorage, next gen games will be able to take full advantage of RTX IO-enabled hardware to accelerate load times and deliver larger open worlds, all while reducing CPU load.

Anything logical, like PhysX or AI or moving objects etc., will still go through the cpu, but much of the static objects and graphics files will basically bypass the cpu and get decompressed directly on the gpu, which reduces cpu workload, latency, etc.
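A rough simulation of the two load paths being described (my own simplification: zlib stands in for the real compression scheme, and none of these function names are real DirectStorage or RTX IO APIs):

```python
import zlib

ASSET_ON_DISK = zlib.compress(b"texture-data" * 1000)  # a "compressed asset"

def traditional_load(compressed):
    # SSD -> system RAM -> CPU decompresses -> result uploaded to VRAM
    ram_copy = bytes(compressed)        # stand-in for a disk read into RAM
    return zlib.decompress(ram_copy)    # this work lands on the CPU

def rtx_io_style_load(compressed):
    # SSD -> VRAM directly; the GPU decompresses, so the CPU never
    # touches the asset (simulated here with the same zlib call).
    vram_copy = bytes(compressed)       # stand-in for a DMA read into VRAM
    return zlib.decompress(vram_copy)

# Same bytes come out either way; only where the work happens differs.
assert traditional_load(ASSET_ON_DISK) == rtx_io_style_load(ASSET_ON_DISK)
```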
 
Reactions: Anonymous Penguin

Anonymous Penguin

Honorable
Oct 8, 2014
44
2
10,545
2



Anything logical, like physX or AI or moving objects etc will still go through the cpu, but much of the static objects and graphics files will basically bypass the cpu and get decompressed directly on the gpu, which reduces cpu workload, latency etc.
I see, so the CPU and primary storage (system RAM) will have nothing to do with asset loading and the like.

So it's: Secondary Storage --> GPU instruction --> GPU Memory

Neat.
 

Karadjgne

Titan
Ambassador
Pretty much. Not for every file, but for some, as applicable. But it'll decrease the importance of the cpu in the chain, giving many a break. You'll be seeing R5 3600s not having issues with 4K, etc.

If amd doesn't pull a big, white, super rabbit out of a seriously magical top-hat in the next generation, their already suffering gpu sales will tank.
 

mjbn1977

Honorable
Aug 20, 2015
666
47
11,240
53
In terms of performance, your CPU is OK at its job, but it's not going to be a heavy hitter. As a somewhat close comparison point, my rig currently runs an i7-8700K (which imho is definitely overdue for replacement given the current state of CPUs) paired with an RTX 2070. On RDR 2, with a mixture of mostly high and a few medium graphical settings (and the occasional ultra), I hit around 80 FPS.
If you overclock the 8700K to an all-core 4.7 or 4.8GHz, it is still one of the fastest gaming CPUs available... far from obsolete. And if you spend about $50 and a few hours of work, you can delid it, make it run at 5GHz on all cores relatively cool, and basically turn it into one of the top gaming performers. Somewhere I just saw an 8700K revisited review (I think it was TechSpot) where they tested whether it's still up to the task in 2020, and a version overclocked to 5GHz was basically level with the 10900K in most games. Again, we're talking gaming, not content creation... I would go Ryzen in that case.
 
Reactions: Afro_ninja199

Karadjgne

Titan
Ambassador
Content creation depends on the software. For anything Adobe, which doesn't scale well at all past 8 threads, Intel still holds a definite advantage in IPC, clock speeds, etc. That includes other popular software like Maya, which is also heavily single-threaded, so it absolutely favors Intel.

What people tend to miss is that every review is based on single player. Once you get online and start running into multi-player scenes, like high-population server cities and the massive amount of AI involved, things change. A 24-man team in a closed-world setting like CSGO is one thing; a 24-man team in an open-world setting is entirely different, and even quad-core HT i7s are taking a beating, with 6-core cpus as well.
 
