[SOLVED] Bottleneck

Gruesomepc

Mar 3, 2020
Hey, I'm planning on making the jump from a 1080 to a 2080 Ti very shortly. The only problem is I have an i7-7700K, and I don't know if I need to upgrade my CPU as well.
 
Solution
Is this because the lower the res the more the game pushes its resources onto the cpu?
Yeah, more work the cpu has to do. A 2080 Ti isn't going to stretch its legs even at 1080p ultra...
As you go up in resolution and graphics detail, the gpu has to do more of the work and the cpu less - and vice versa.

Some people have 9900K + 2080 Ti combos for 1080p 240Hz. They have to OC and run on low settings just to maintain 240+ fps.
That's pretty bananas. All that fancy, expensive hardware, just to play on low settings in the quest for minimal input lag and image smoothness...
 

Hmm, I was under the impression that the CPU prepares the frames for the GPU to render (hence why some people say a CPU cannot be a bottleneck). So I was thinking that no matter which GPU and res you use, the CPU has to be able to prepare those frames for the GPU to render.
 
They both do their own share of rendering:
-the cpu does the pre-rendering stuff - what all that entails is over my head
-the gpu does the remaining post-processing stuff, according to the resolution and other in-game graphics settings

"Cpu cannot be a bottleneck"... depends on your POV, I guess. I don't agree with it.
 

Yeah, I don't agree with that statement either, but I've seen it said multiple times, so I figured there might be a little merit to it, even though I always disagreed. If you have a great gpu and a bad cpu, the cpu will not be able to keep up with the gpu - HENCE, a bottleneck lol.

But hmm, this is some interesting info to know.

I have an R7 3800X coming in the mail that I plan to use with the 2060 Super. As of right now my gpu takes about 6ms to render a frame, but my cpu takes about 9ms, causing a small bottleneck that I feel is my 2600's fault for why I'm not getting the FPS I want in some titles, which is why I'm upgrading the cpu.

Any tips or advice on this setup? The 2060 Super with the R7 3800X - even worth the upgrade? I'm just looking to pull about 20 more fps in cpu-intensive games, or 20 fps more in general.
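Those 6ms/9ms numbers make for a quick back-of-the-envelope check. A rough sketch (just arithmetic on the frame times from the post, not a profiler - the `bottleneck` helper is made up for illustration):

```python
# Rough bottleneck check from per-frame render times (illustrative only).
# The component with the LONGER frame time caps your fps, because the
# other one ends up waiting on it each frame.

def bottleneck(cpu_ms: float, gpu_ms: float):
    limit_ms = max(cpu_ms, gpu_ms)
    limiter = "cpu" if cpu_ms >= gpu_ms else "gpu"
    return limiter, 1000.0 / limit_ms

# Numbers from the post: GPU ~6 ms, CPU ~9 ms per frame.
limiter, fps = bottleneck(cpu_ms=9.0, gpu_ms=6.0)
print(limiter, round(fps))  # the 9 ms cpu caps things at ~111 fps
```

By that math, shaving the cpu side down toward 6ms is what would unlock the gpu's spare headroom.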
 
Hmm, I was under the impression that the CPU prepares the frames for the GPU to render (hence why some people say a CPU cannot be a bottleneck). So I was thinking that no matter which GPU and res you use, the CPU has to be able to prepare those frames for the GPU to render.
This is what I've heard as well. And from my limited understanding of it, the higher you go in resolution, the longer it takes the gpu to render each frame, putting less stress on the cpu in a sense. The cpu is still working hard, but with the gpu doing all the heavy lifting and taking longer, it doesn't demand as many frames. So if at 1080p it's demanding say 150, then at 1440p maybe it only wants 100 - and obviously the cpu can deliver 100 faster than it can 150.

The 3800X should give you a fairly solid boost, being that it's slightly faster than the 3600, and the 3600 is about 15-20% faster than the 2600. I believe that was in both single- and multi-threaded applications. Whether the boost in performance is worth the cost will be entirely up to you.

While not the best way to check, you can hop onto YouTube and do some comparing. Just search "2600 vs 3800X in xxx game". You'll get a general idea of the performance difference.
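The 150-at-1080p vs 100-at-1440p idea can be put into a toy model - assuming (big assumption) that gpu frame time scales linearly with pixel count while cpu frame time stays flat. The CPU_MS and GPU_MS_PER_MPIX numbers here are invented for illustration, not measurements:

```python
# Toy model of why higher resolution shifts the load onto the GPU.
# Assumes GPU frame time scales linearly with pixel count and CPU
# frame time stays constant - real games are messier than this.

def fps_at(width: int, height: int, cpu_ms: float,
           gpu_ms_per_mpix: float) -> float:
    gpu_ms = gpu_ms_per_mpix * (width * height) / 1e6
    return 1000.0 / max(cpu_ms, gpu_ms)  # slower side sets the fps

CPU_MS = 6.0           # hypothetical cpu cost per frame
GPU_MS_PER_MPIX = 2.5  # hypothetical gpu cost per megapixel

print(fps_at(1920, 1080, CPU_MS, GPU_MS_PER_MPIX))  # cpu-bound at 1080p
print(fps_at(2560, 1440, CPU_MS, GPU_MS_PER_MPIX))  # gpu-bound at 1440p
```

With these made-up costs the cpu is the cap at 1080p, and bumping to 1440p moves the cap onto the gpu - which is the "less stress on the cpu" effect in a nutshell.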
 

Well, my birthday is in a couple weeks and my woman asked me what I wanted, and that's what she got me, so technically it's free for me lol.

I was thinking about just getting the 3600 to save money, but I felt it wouldn't be enough of a boost for what I'm looking for. I know for FPS it's ALWAYS best to upgrade the GPU first, but for one, I don't have enough cash for a 2080 Super atm (and I'd probably need a new PSU as mine's only 550W), and a 2070 Super just seems like it wouldn't be that big of an upgrade for the cost, as in most games I get over 100fps on high settings. In other games like Apex I get anywhere from 130-250fps, but see, that's just the thing - I'm trying to raise that low number up above 140, because I want to make the switch to a 1080p 144Hz G-Sync monitor and I want to be able to maintain that 144Hz.

As of right now I play on a 1080p 75Hz monitor. It was $200, which isn't really cheap to me lol, so it is really good for being only 75Hz. Basically I can't see these fps drops, but I feel them when it comes to FRAME TIMES, not the FPS itself - which is another reason I'm opting to upgrade the cpu, to boost those low fps drops. But oddly, I do sometimes see things get "less smooth" when these fps drops occur, which in my mind is visually seeing the fps drops - which should be impossible as my monitor is only 75Hz lol.
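On the frame-time point: to hold a given refresh rate, every single frame has to finish inside a fixed time budget, which is why one slow frame is feel-able even when the average fps looks fine. A quick sketch of those budgets:

```python
# Frame-time budget per refresh rate: to keep the display fed every
# refresh, each frame must finish within 1000/Hz milliseconds.

def frame_budget_ms(hz: int) -> float:
    return 1000.0 / hz

for hz in (75, 144, 240):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.2f} ms per frame")
```

So a 144Hz target leaves under 7ms per frame - any frame that takes longer (a "1% low") shows up as a stutter regardless of the average.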
 
@Jason H.
Here's another way to look at it:
-Fps min = cpu
-Fps max = gpu
-Average is the combination of the two

You're looking at like a 10-21% improvement on average going from a 2600 to a 3800X... unless that 3800X is on sale, it isn't worth it over a 3700X; the performance is darn near identical. That 40USD difference could be better spent elsewhere.
Some people really underestimate the little ol' 3600. It really isn't that far behind the bigger cpus. It's easily the best value cpu among them. The 3700X/3800X are less than 1% faster than the 3600...
But some people have to throw in that 'futureproof' nonsense - I gave up trying to argue it. To the people who get a 3700X over a 3600 to 'futureproof', only to end up replacing it at around the same time as the latter - LOL~!

Tip: Whether you replace the cpu or not, look into tightening your current memory timings if you don't plan to replace the RAM.
Ryzen 3000: 3800MHz in 1:1 mode (FCLK at 1900MHz) w/custom timings > 3200MHz w/tweaked timings > 3600MHz w/XMP timings = 3200MHz w/XMP timings
 

The CPU is already ordered and arriving tomorrow lol. But tbh that percentage is what I expected and all I really wanted. Also, I plan on getting a better GPU next year, so there's that futureproof babble lol. I'll probably just get a 2080 Super when it drops lower - not sure if the 3800X is ideal for that or not, but as of right now it's just what I need.

I've tried messing with the RAM, and it won't budge to anything that would be a worthwhile OC. I could only get my RAM stable at 3200MHz on XMP, and only with the option that had HIGHER timings than the originals (the RAM is CL15, but the only stable OC over 3000MHz was 3200MHz at CL16-18-18-36), which is virtually not even an overclock lol. I tried higher options; none worked. I also tried doing it manually - same thing.
 
Honestly, if you're thinking of getting a 2080 Super and you're not a competitive gamer, I'd go 1440p over 1080p/144. If you're keeping your current card, you could probably get away with just lowering a few settings to try and maintain that 144. But with G-Sync it won't really matter much, as it should all remain relatively smooth. And if you're not competitive, do you really need that "magic" 144 number? IMO you don't, and you should just focus on smooth gameplay.
 

I like to think of myself as competitive. I play mainly esports games like Rocket League (250fps, the engine's max), Apex Legends (130-250fps), and COD (the new one, where on a mix of ultra and medium settings I get a good 140-160fps). So I'm not doing terrible in the fps department for what I have. I just want the games I play to have higher 1% lows, basically, so when the fps does drop, it remains above 150, or even just 144 lol. And all these games I play, I train in to get better, and I want to be better, so I'm pretty competitive - not go-to-a-tournament competitive, but competitive for my own sake of wanting to do well lol.

However, there is just this one game whose performance I cannot stand, called Rust. It's a terribly optimized game, and on my setup it only runs at like 55-90 fps, mostly staying around 70fps, which is below my refresh rate, and it drives me insane. BUT from some Google searches, it runs much better with better CPUs.

The usage of my GPU in this game is only at like 38% and the CPU is only at like 20%, and tbh I think those numbers are even lower. That's just terrible optimization, but nevertheless, the better the cpu, the better the game runs, as it's more cpu-intensive than gpu-intensive. So I believe this new cpu will let me change that avg of 70 to 90. And I believe the 2600's single-core performance is just not that great compared to the 3800X, and this game runs better with MT off.
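For what it's worth, applying the 10-21% uplift range quoted earlier in the thread to a ~70fps average is simple arithmetic (illustrative only - per-game gains vary, and a badly threaded game like this may gain more from single-core speed than the average suggests):

```python
# Project an fps average forward by a percentage uplift range.
# The 10-21% range is the 2600 -> 3800X estimate from earlier
# in the thread; 70 fps is the Rust average from this post.

def projected_fps(base_fps: float, uplift_pct: float) -> float:
    return base_fps * (1 + uplift_pct / 100)

low = projected_fps(70, 10)   # bottom of the quoted range
high = projected_fps(70, 21)  # top of the quoted range
print(round(low), round(high))
```

By that range, ~70fps lands around 77-85; getting all the way to 90 would take closer to a 28% uplift.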
 
I'm not looking to be a competitive gamer; I just want a new card so I don't have to lower any settings, and to future-proof for later games. I decided just to get an Asus Strix 2080 Super, because I don't think it will bottleneck and it's not as expensive, so I can always upgrade my CPU down the line.