Hey, I'm planning on making the jump from a 1080 to a 2080 Ti very shortly. The only problem is I have an i7 7700K, and I don't know if I need to upgrade my cpu as well.
As long as the monitor resolution and refresh aren't at, or below, 1080p 200Hz, the 7700K should be fine...
Is this because the lower the res, the more the game pushes its resources onto the cpu?
Yeah, more work the cpu has to do. A 2080 Ti isn't going to stretch its legs even at 1080p ultra... As you go up in resolution and graphic detail, the gpu has more work to do and the cpu less; go down, and it's the opposite.
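If it helps to picture it, here's a rough back-of-the-napkin model (all the frame times are made up, Python is just for the arithmetic, and this isn't how any real engine actually schedules work):

```python
# Toy model: each frame the cpu preps work and the gpu renders it;
# whichever takes longer sets the pace. Numbers are invented for illustration.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 5.0  # per-frame cpu cost, roughly resolution-independent

pixels_1080p = 1920 * 1080
for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    # pretend gpu cost scales with pixel count, starting at 4 ms for 1080p
    gpu_ms = 4.0 * (w * h) / pixels_1080p
    bound = "cpu-bound" if cpu_ms >= gpu_ms else "gpu-bound"
    print(f"{name}: ~{fps(cpu_ms, gpu_ms):.0f} fps ({bound})")
```

With those made-up numbers, 1080p ends up cpu-bound around 200 fps while 1440p and 4K are gpu-bound, which is the same story as above.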
Some people have 9900K + 2080 Ti combos for 1080p 240Hz. They have to OC and run on low settings just to maintain 240+ fps.
That's pretty bananas. All that fancy, expensive hardware, just to play on low settings in the quest for minimal input lag and image smoothness...
Hmm, I was under the impression that the CPU creates the frames for the GPU to render (hence why some people say a CPU cannot be a bottleneck). So I was thinking that no matter which GPU and res you use, the CPU has to be able to prepare those frames for the GPU to render.
They both do their own share of rendering:
-the cpu does the pre-rendering stuff - what all that entails is over my head
-the gpu does the remaining post-processing stuff, according to the resolution and other in game graphics settings
"Cpu cannot be a bottleneck"... depends on your POV, I guess. I don't agree with it.
This is what I've heard as well. From my limited understanding of it, the higher you go in resolution, the longer it takes the gpu to render each frame, which puts less stress on the cpu in a sense. The cpu is still working hard, but with the gpu doing the heavy lifting and taking longer per frame, it doesn't demand as many frames. So if at 1080p it's demanding, say, 150, then at 1440p maybe it only wants 100, and obviously the cpu can prepare 100 frames a second more easily than 150.
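Same idea in frame-time terms, using those made-up 150/100 numbers:

```python
# How much time the cpu gets per frame at a given fps demand (made-up targets).
for res, demanded_fps in [("1080p", 150), ("1440p", 100)]:
    budget_ms = 1000 / demanded_fps
    print(f"{res}: {demanded_fps} fps -> {budget_ms:.1f} ms per frame")

# 150 fps leaves ~6.7 ms per frame, 100 fps leaves 10 ms. A cpu that needs,
# say, 8 ms per frame would be the limit at 1080p but not at 1440p, where the
# slower gpu only asks for 100.
```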
The 3800 should give you a fairly solid boost, being that it's slightly faster than the 3600, and the 3600 is about 15-20% faster than the 2600. I believe that was in both single- and multi-threaded applications. Whether the boost in performance is worth the cost will be entirely up to you.
While not the best way to check, you can hop onto YouTube and do some comparing. Just search "2600 vs 3800 in xxx game" and you'll get a general idea of the performance difference.
@Jason H.
Here's another way to look at it:
-Fps min = cpu
-Fps max = gpu
-Average is the combination of the two
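If you log frame times (MSI Afterburner, CapFrameX and the like can dump them), you can check which end moves when you change settings. Something along these lines, with made-up numbers:

```python
# Made-up frame times in ms from one run; swap in a real log to try it.
frame_times = [6.9, 7.1, 7.0, 14.2, 7.3, 6.8, 13.8, 7.0, 7.2, 7.1]

min_fps = 1000 / max(frame_times)                      # worst dip
max_fps = 1000 / min(frame_times)                      # best peak
avg_fps = 1000 * len(frame_times) / sum(frame_times)   # overall average
print(f"min {min_fps:.0f} / avg {avg_fps:.0f} / max {max_fps:.0f} fps")

# Rough read per the rule of thumb above: if a gpu upgrade barely moves the min,
# the dips are probably cpu-side; if the max/average jump, the gpu was the limit.
```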
You're looking at like a 10-21% improvement on average going from a 2600 to a 3800X... unless that 3800X is on sale, it isn't worth it over a 3700X; the performance is darn near identical. That 40 USD difference could be better spent elsewhere.
Some people really underestimate the little ol' 3600. It really isn't that far behind the bigger cpus. It's easily the best value cpu among them. The 3700X/3800X are less than 1% faster than the 3600...
But some people have to throw in that 'futureproof' nonsense; I gave up trying to argue it. To the people who get a 3700X over a 3600 to 'futureproof', only to end up replacing it at around the same time as the 3600 owners anyway: LOL~!
Tip: whether you end up replacing the cpu or not, look into tightening your current memory timings if you don't plan on replacing the RAM itself.
Ryzen 3000: 3800MHz in 1:1 mode (FCLK at 1900MHz) with custom timings > 3200MHz w/ tweaked timings > 3600MHz w/ XMP timings = 3200MHz w/ XMP timings
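In case the 1:1 part is confusing: DDR4 transfers twice per clock, so the real memory clock is half the rated number, and 1:1 just means the Infinity Fabric clock (FCLK) matches that memory clock. Quick sketch of the math:

```python
# Memory clock vs FCLK for 1:1 mode on Ryzen 3000 (DDR4 is double data rate).
for ddr_rating in (3200, 3600, 3800):
    memclk = ddr_rating // 2  # actual memory clock in MHz
    print(f"DDR4-{ddr_rating}: memory clock {memclk} MHz -> FCLK {memclk} MHz for 1:1")
```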
Honestly, if you're thinking of getting a 2080 Super and you're not a competitive gamer, I'd go 1440p over 1080p/144. If you're keeping your current card, you could probably get away with just lowering a few settings to try and maintain that 144. But with G-Sync it won't really matter much, as it should all remain relatively smooth. And if you're not competitive, do you really need that "magic" 144 number? Imo you don't, and should just focus on smooth gameplay.
I'm not looking to be a competitive gamer; I just want a new card so I don't have to lower any settings, and to future-proof for later games. I decided to just get an Asus Strix 2080 Super because I don't think it will bottleneck and it's not as expensive, so I can always upgrade my cpu down the line.