[SOLVED] 3080 bottlenecked by 3700x?

Dawis67_AE

Distinguished
Jul 9, 2014
396
1
18,815
Hi.
With the new Nvidia cards launching in just two weeks, I'm doing the final touches on my build wishlist. I started the build with a 10700K, as it significantly outperforms the 3700X in games (5-20 fps) for not too much more money.

As the build idea matured and I got more acquainted with the new features of the 3000 generation, I changed my mind, and currently I have a 3700X in my wishlist. My turning point was PCIe 4.0 support on Ryzen. From what Moore's Law Is Dead has said recently, 4.0 will allow better performance from the new cards.

My question is: will the 3700X bottleneck the 3080? Is there a better alternative at the moment that is also future-proof? Should I just get the 3700X and then switch to the 4000-series AMD CPUs when they launch, which, from what I have heard, is not far away?

Edit: To clear up the question, my definition of a bottleneck is a loss of roughly 10+ fps.
Edit 2: My monitor is an MSI Optix MAG322CQR (1440p, 165 Hz).
 
Last edited:
Has anybody run any tests with the new GPUs? I really think anybody who tries to answer is just guessing without some test results to refer to. For that I'd look to GamersNexus for the most dependable testing, and especially the most relevant analysis.

But in general, the only significant thing a 3700X offers over a 3600X is 2 extra cores/4 threads. Nothing I've seen or read about the new GPUs changes how games utilize cores/threads, so the same considerations as before should apply.
 
  • Like
Reactions: King_V and Phaaze88

USAFRet

Titan
Moderator
Edit: To clear up the question, my definition of a bottleneck is a loss of roughly 10+ fps.
Edit 2: My monitor is an MSI Optix MAG322CQR (1440p, 165 Hz).
That's not what "bottleneck" does or means.

Basically, the CPU provides the framerate, the GPU provides the eyecandy.

The CPU will output what it does.
The GPU can either handle being served that framerate and output the desired graphics, or it cannot.

The new 3xxx series can accept whatever framerate current CPUs can provide, and give whatever eyecandy the game and your settings dictate.

If you were to pair that 3080 with a much older CPU, the CPU can't provide as many frames per second as the GPU can handle.
The GPU would just be loafing along.
In this instance, the older CPU would be the bottleneck.

If, on the other hand, you were to pair a top shelf CPU with a much older, less capable GPU....the CPU can provide a framerate much more than the GPU can utilize.
There, the older GPU would be the bottleneck. You simply turn down the eyecandy settings to get back to an acceptable framerate.


Adding a better GPU does not reduce framerate.
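If it helps to see the idea in code, here's a toy model of the explanation above: the framerate you actually see is capped by whichever side is slower. The numbers are made up for illustration, not benchmarks.

```python
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Framerate you actually see: the slower component sets the cap."""
    return min(cpu_fps, gpu_fps)

# Older CPU paired with a fast GPU: the CPU is the bottleneck.
print(delivered_fps(cpu_fps=80, gpu_fps=200))   # 80 - the GPU just loafs along

# Fast CPU paired with an older GPU: the GPU is the bottleneck.
print(delivered_fps(cpu_fps=180, gpu_fps=60))   # 60 - turn down the eyecandy

# Swapping in a faster GPU never lowers the framerate.
assert delivered_fps(100, 250) >= delivered_fps(100, 150)
```

In reality the CPU and GPU caps shift per game and per scene, but the min() relationship is the whole "bottleneck" concept in one line.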
 
Solution

Dawis67_AE

Distinguished
Jul 9, 2014
396
1
18,815
That's not what "bottleneck" does or means.

Basically, the CPU provides the framerate, the GPU provides the eyecandy.

The CPU will output what it does.
The GPU can either handle being served that framerate and output the desired graphics, or it cannot.

The new 3xxx series can accept whatever framerate current CPUs can provide, and give whatever eyecandy the game and your settings dictate.

If you were to pair that 3080 with a much older CPU, the CPU can't provide as many frames per second as the GPU can handle.
The GPU would just be loafing along.
In this instance, the older CPU would be the bottleneck.

If, on the other hand, you were to pair a top shelf CPU with a much older, less capable GPU....the CPU can provide a framerate much more than the GPU can utilize.
There, the older GPU would be the bottleneck. You simply turn down the eyecandy settings to get back to an acceptable framerate.


Adding a better GPU does not reduce framerate.
Sorry for the confusion. I know the GPU won't make the CPU worse. What I meant was: I would consider a CPU a bottleneck if it cost 10+ fps of the GPU's maximum headroom. I defined it that way because I have seen people complain that definitions of a bottleneck differ.

But yeah, I understand it can be difficult to judge the performance of the new GPUs without reviews. The reason I asked is that in some benchmarks comparing the 10700K and the 3700X with a 2080 Ti, I have seen the GPU more utilized on the 10700K. So if the 3700X already bottlenecks a 2080 Ti, it would surely bottleneck the 3080... That's my reasoning.
 

logainofhades

Titan
Moderator
If you are considering getting a placeholder CPU until 4th gen comes out, just go with an R5 3600. Gaming-wise, the difference between the 3700X and the 3600 is basically zero.

Personally, I would wait until reviews are out. I know Hardware Unboxed intends to use a 3900X for their reviews, to test whether PCIe 4.0 actually makes a difference with these new cards.
 
  • Like
Reactions: King_V

USAFRet

Titan
Moderator
Sorry for the confusion. I know the GPU won't make the CPU worse. What I meant was: I would consider a CPU a bottleneck if it cost 10+ fps of the GPU's maximum headroom. I defined it that way because I have seen people complain that definitions of a bottleneck differ.

But yeah, I understand it can be difficult to judge the performance of the new GPUs without reviews. The reason I asked is that in some benchmarks comparing the 10700K and the 3700X with a 2080 Ti, I have seen the GPU more utilized on the 10700K. So if the 3700X already bottlenecks a 2080 Ti, it would surely bottleneck the 3080... That's my reasoning.
And again, that's not what it means.

The 3700X will output whatever it can, no matter what GPU it is talking to.
The 2080 Ti will take those frames and render them at whatever settings you and the game have selected.

Swapping in a better GPU does not reduce the framerate the CPU outputs. It just means the GPU is not working as hard to provide that desired eyecandy.
 

USAFRet

Titan
Moderator
Let's pair a 3700x and 2080ti.
At whatever settings you choose, you get 100fps.
Looks good, feels good.

Swap in a 3080.
Same game, same settings.
You would still get 100fps. Just that the 3080 is not working as hard to provide that.
 

Dawis67_AE

Distinguished
Jul 9, 2014
396
1
18,815
And again, that's not what it means.

The 3700X will output whatever it can, no matter what GPU it is talking to.
The 2080 Ti will take those frames and render them at whatever settings you and the game have selected.

Swapping in a better GPU does not reduce the framerate the CPU outputs. It just means the GPU is not working as hard to provide that desired eyecandy.
I must suck at explaining. What I mean is that, for me, a bottleneck is when the CPU costs 10+ fps of the GPU's maximum potential. So, for example, if the 3700X outputs 100 fps max in a given game and the GPU could output 110 fps or more, that to me would be a bottleneck, at the given settings. I wonder if the 3700X is going to hit its ceiling before the 3080 does.
 
I must suck at explaining. What I mean is that, for me, a bottleneck is when the CPU costs 10+ fps of the GPU's maximum potential. So, for example, if the 3700X outputs 100 fps max in a given game and the GPU could output 110 fps or more, that to me would be a bottleneck, at the given settings. I wonder if the 3700X is going to hit its ceiling before the 3080 does.
I get what you are saying; we need to see benchmarks. I don't think it is wild to believe some games might be limited by the CPU before the GPU, even at 1440p. We saw it at 1080p with the 2080 Ti, even with the best CPUs. With such an uplift in performance, we may hit the same limits at 1440p; who knows until the results are in.
 

USAFRet

Titan
Moderator
I must suck at explaining. What I mean is that, for me, a bottleneck is when the CPU costs 10+ fps of the GPU's maximum potential. So, for example, if the 3700X outputs 100 fps max in a given game and the GPU could output 110 fps or more, that to me would be a bottleneck, at the given settings. I wonder if the 3700X is going to hit its ceiling before the 3080 does.
Well, yes.
A new top end GPU, such as the 3xxx series...can probably handle more than current CPUs can provide.

But given that we're already talking top-end CPUs....there's little you can do.
Unless you want to wait for the 4xxx series of AMD, or the next gen Intel.


There is NEVER a perfect pairing. One end or the other will be slightly less capable than the other.
And that 'less capable end' varies between different games. Some games are more CPU intensive, others more GPU intensive.
 
  • Like
Reactions: King_V and Phaaze88
I must suck at explaining. What I mean is that, for me, a bottleneck is when the CPU costs 10+ fps of the GPU's maximum potential. So, for example, if the 3700X outputs 100 fps max in a given game and the GPU could output 110 fps or more, that to me would be a bottleneck, at the given settings. I wonder if the 3700X is going to hit its ceiling before the 3080 does.

I think we all understand exactly what you mean.

Put simply, the 10700K is capable of pushing more fps than the Ryzen 3700X; that's a fact, there's no argument whatsoever.

The issue here is when you state 'for not much more money'.

It's roughly $200 more by the time you're done, when you factor in $100 worth of aftermarket cooler and a much more expensive board capable of running the 10700K without throttling.

Said it before

Ryzen for people with more sense than money.
Intel for people with more money than sense.

Which sounds slightly damning and is meant tongue in cheek, but is in fact absolutely true.
 
  • Like
Reactions: Phaaze88
....
Personally, I would wait until reviews are out. I know Hardware Unboxed intends to use a 3900x, for their reviews, to test if PCI-E 4.0 actually makes a difference, with these new cards.
I have to think they'll also test with a 3600, or at least a 3600X...and Intel equivalents...and maybe even Ryzen 2000 CPUs. It would be important to establish what makes a good pairing, performance-wise, so as not to go crazy on cost.

I mean, even if PCIe 4 DOES make a difference, will it be enough to influence people to dump perfectly good PCIe 3-only systems? That would be a fair thing to clarify in a review.
 
  • Like
Reactions: Phaaze88

Dawis67_AE

Distinguished
Jul 9, 2014
396
1
18,815
I think we all understand exactly what you mean.

Put simply, the 10700K is capable of pushing more fps than the Ryzen 3700X; that's a fact, there's no argument whatsoever.

The issue here is when you state 'for not much more money'.

It's roughly $200 more by the time you're done, when you factor in $100 worth of aftermarket cooler and a much more expensive board capable of running the 10700K without throttling.

Said it before

Ryzen for people with more sense than money.
Intel for people with more money than sense.

Which sounds slightly damning and is meant tongue in cheek, but is in fact absolutely true.
Well, I live in Norway, so the prices here are different anyway. I think I will go with the 3700X as it is cheaper, and then upgrade to a new CPU when they come out. I guess my question now is: will the 4000 series use the same AM4 socket?
 
Well, I live in Norway, so the prices here are different anyway. I think I will go with the 3700X as it is cheaper, and then upgrade to a new CPU when they come out. I guess my question now is: will the 4000 series use the same AM4 socket?
4000 series as in Zen 3? Yes, it will be on AM4. But nobody has offered any clue yet what the numbering scheme is, so at this point "4000 series" only refers to the APUs based on the Zen 2 architecture.

AMD has yet to offer any clues what Zen 4 will sit on.
 

Dawis67_AE

Distinguished
Jul 9, 2014
396
1
18,815
4000 series as in Zen 3? Yes, it will be on AM4. But nobody has offered any clue yet what the numbering scheme is, so at this point "4000 series" only refers to the APUs based on the Zen 2 architecture.

AMD has yet to offer any clues what Zen 4 will sit on.
Well, whatever I'm referring to is what people call the "just around the corner" AMD CPUs. I'm guessing they will boost to 5 GHz, which is really the only thing holding AMD back at the moment. So as long as the new AMD CPUs will fit my planned Asus Prime X570-Pro, I'm fine with getting a 3700X as a placeholder.
 

Turtle Rig

Prominent
BANNED
Jun 23, 2020
772
104
590
Hi.
With the new Nvidia cards launching in just two weeks, I'm doing the final touches on my build wishlist. I started the build with a 10700K, as it significantly outperforms the 3700X in games (5-20 fps) for not too much more money.

As the build idea matured and I got more acquainted with the new features of the 3000 generation, I changed my mind, and currently I have a 3700X in my wishlist. My turning point was PCIe 4.0 support on Ryzen. From what Moore's Law Is Dead has said recently, 4.0 will allow better performance from the new cards.

My question is: will the 3700X bottleneck the 3080? Is there a better alternative at the moment that is also future-proof? Should I just get the 3700X and then switch to the 4000-series AMD CPUs when they launch, which, from what I have heard, is not far away?

Edit: To clear up the question, my definition of a bottleneck is a loss of roughly 10+ fps.
Edit 2: My monitor is an MSI Optix MAG322CQR (1440p, 165 Hz).
I can tell you PCIe 4.0 ain't going to do jack. 😲 Nvidia is not stupid; they know at least 75 percent of their buyers are on a 9900K, 8700K, 9700K or older Intel chip, or 2000-series and older AMD, with only some on the 3000 series since it is barely a year old, and all of those builds except the 3000 series are PCIe 3.0. I can bet you that if I put in a PCIe 3.0 M.2 drive and then a PCIe 4.0 M.2 drive and asked you to boot the machine on each, you would not know which was which unless you timed it. Yes, the specs go up with PCIe 4.0, but real-world performance is the same. TL;DR 🍩

You seem like a true gamer; in this case, go with the 10700K and don't look back. 🚓

This holds true for a PCIe 3.0 2080 Ti vs a PCIe 4.0 3080. This is why the 5700 XT, currently AMD's top-of-the-line consumer card, can barely compete with a 2060 (a PCIe 3.0 card) and is certainly not as fast as a 2070; and now Nvidia has the Super editions to top it off. Yes, the prices of the 3080 and 3070 look nice on paper, but I know from history that this is the lowest MSRP for a standard Founders Edition, which will be rare to get either way. Once it's rare, it will sell for more, especially in these covid times. Supply will be low and demand will be high, so these cards will actually sell for $200 to $400 more on average than the prices Nvidia gives. The 3090 will be close to a two-thousand-dollar card, a 3090 Ti would be $2,500, and the 3080 will be an $1,100 card. 🚔

Also, say there's a 30 percent increase in performance. I can tell you that if you get 100 fps in a certain place on a map, you will get 130 fps with a 3080, and it does not matter whether one is on a PCIe 3.0 platform and the other on a PCIe 4.0 platform. It's like 2.4 vs 5 GHz wireless: there is a difference, in that 2.4 GHz is the more reliable connection whereas 5 GHz is about speed. Well, 6 GHz is out, so does that mean you're suddenly going to get amazing speeds? Nope, it still comes down to your provider; the 6 GHz range will be much better, but nothing 5 GHz can't do.

In closing, what do you want your system for besides gaming? Intel pwns AMD when it comes to machine learning, emulation, and lightly threaded apps like Photoshop, 3D work, a DAW, or Premiere. These apps will still use your cores and threads, but not much. The only way to really use all the cores of a 10700K would be to render; however, with a free app like Format Factory you can decode on the GPU, which is around 600 percent faster than the CPU, and the latest Premiere now allows both encoding and decoding on the GPU. A DAW will use multiple cores but wants fast, responsive ones, no more than 2 to 6 for example, whereas Photoshop would use 2 cores at most doing crazy stuff, and Premiere would be all over the place, using all cores and threads but not heavily; usually you will have a couple of cores pegged with Premiere or any hardcore HT app.

Let's wait for a PCIe 6.0 5080 Ti to really get excited. Still, 4K gaming makes everything too small; better, IMO, is 2K with all AA methods maxed and turned on, all the Nvidia (or AMD) control panel settings maxed, and the game maxed out. The 3080 series and beyond is for 4K gaming, but who wants to game at 4K at 60 fps? 4K is great for the desktop: at 100 percent DPI it's small but very crisp and readable if your monitor is large enough. For gaming, however, 4K just makes things smaller and slower.
Sure, you won't see jaggies, but you wouldn't see them anyhow if you were smart enough to max out your Nvidia panel and the in-game ultra settings. The 10700K, 9700K, 9900K, and 10900K all give pretty much the same FPS, give or take 5 fps. A 3700X will fall behind due to clock speed, and yes, there will be anywhere from a 5 to 35 fps difference, depending on the game, the location, and the action on the map. 👍💯🤷‍♀️
 
Last edited:
Until we see the new GPUs actually run in several games, with the assorted processors, discussing the impact of Intel vs. AMD and PCIe 3.0 vs. 4.0 is purely academic speculation; I'd not speculate that AMD will pull ahead on PCIe 4.0, nor that Intel will maintain its 5-10% advantage at 1080p.

We will all know in a few weeks, I'd guess...(not sure when the NDA is lifted on actual reviews)

I will venture a 'guess' that for a 3070, it (3800X vs. 10700K) won't matter as much as it might with a 3080 or 3080 Super at 1080p...(and it will matter even less at 1440p or above).
 

Dawis67_AE

Distinguished
Jul 9, 2014
396
1
18,815
I can tell you PCIe 4.0 ain't going to do jack. 😲 Nvidia is not stupid; they know at least 75 percent of their buyers are on a 9900K, 8700K, 9700K or older Intel chip, or 2000-series and older AMD, with only some on the 3000 series since it is barely a year old, and all of those builds except the 3000 series are PCIe 3.0. I can bet you that if I put in a PCIe 3.0 M.2 drive and then a PCIe 4.0 M.2 drive and asked you to boot the machine on each, you would not know which was which unless you timed it. Yes, the specs go up with PCIe 4.0, but real-world performance is the same. TL;DR 🍩

You seem like a true gamer; in this case, go with the 10700K and don't look back. 🚓

This holds true for a PCIe 3.0 2080 Ti vs a PCIe 4.0 3080. This is why the 5700 XT, currently AMD's top-of-the-line consumer card, can barely compete with a 2060 (a PCIe 3.0 card) and is certainly not as fast as a 2070; and now Nvidia has the Super editions to top it off. Yes, the prices of the 3080 and 3070 look nice on paper, but I know from history that this is the lowest MSRP for a standard Founders Edition, which will be rare to get either way. Once it's rare, it will sell for more, especially in these covid times. Supply will be low and demand will be high, so these cards will actually sell for $200 to $400 more on average than the prices Nvidia gives. The 3090 will be close to a two-thousand-dollar card, a 3090 Ti would be $2,500, and the 3080 will be an $1,100 card. 🚔

Also, say there's a 30 percent increase in performance. I can tell you that if you get 100 fps in a certain place on a map, you will get 130 fps with a 3080, and it does not matter whether one is on a PCIe 3.0 platform and the other on a PCIe 4.0 platform. It's like 2.4 vs 5 GHz wireless: there is a difference, in that 2.4 GHz is the more reliable connection whereas 5 GHz is about speed. Well, 6 GHz is out, so does that mean you're suddenly going to get amazing speeds? Nope, it still comes down to your provider; the 6 GHz range will be much better, but nothing 5 GHz can't do.

In closing, what do you want your system for besides gaming? Intel pwns AMD when it comes to machine learning, emulation, and lightly threaded apps like Photoshop, 3D work, a DAW, or Premiere. These apps will still use your cores and threads, but not much. The only way to really use all the cores of a 10700K would be to render; however, with a free app like Format Factory you can decode on the GPU, which is around 600 percent faster than the CPU, and the latest Premiere now allows both encoding and decoding on the GPU. A DAW will use multiple cores but wants fast, responsive ones, no more than 2 to 6 for example, whereas Photoshop would use 2 cores at most doing crazy stuff, and Premiere would be all over the place, using all cores and threads but not heavily; usually you will have a couple of cores pegged with Premiere or any hardcore HT app.

Let's wait for a PCIe 6.0 5080 Ti to really get excited. Still, 4K gaming makes everything too small; better, IMO, is 2K with all AA methods maxed and turned on, all the Nvidia (or AMD) control panel settings maxed, and the game maxed out. The 3080 series and beyond is for 4K gaming, but who wants to game at 4K at 60 fps? 4K is great for the desktop: at 100 percent DPI it's small but very crisp and readable if your monitor is large enough. For gaming, however, 4K just makes things smaller and slower.
Sure, you won't see jaggies, but you wouldn't see them anyhow if you were smart enough to max out your Nvidia panel and the in-game ultra settings. The 10700K, 9700K, 9900K, and 10900K all give pretty much the same FPS, give or take 5 fps. A 3700X will fall behind due to clock speed, and yes, there will be anywhere from a 5 to 35 fps difference, depending on the game, the location, and the action on the map. 👍💯🤷‍♀️

I agree that Intel is the better choice NOW for gaming. But hearing that the 4000 series is not far away, I will not need to upgrade my motherboard when it comes out if I go for the 3700X. If I go for Intel, I'm really placing my hopes on an outdated platform being future-proof. IMO the better option is to get on the AM4 socket, use the 3700X as a placeholder until the 4000 series comes out, and get that instead. At that point, if all goes right, AMD should really catch up to Intel on clock speeds, making games run faster, on top of the additional cores and all the new features. God knows when Intel will launch new CPUs, and they will probably use a new socket anyway.

I'm not an AMD fanboy. My current six-year-old PC uses an i7 4790K, which is still a beast for its age, and honestly I was hoping to go with Intel now as well.

I'll still wait for the benchmarks to come out before buying, but I doubt it's wise to go for Intel.