Alucard_ggz

I constantly see people all over the internet talking about how you "should" pair a good GPU with a good CPU, which, after my long years of gaming, I find so hilarious that I'm starting to think they don't even know the fundamentals of what they're talking about.

So here's the thing. There are two cases I saw in a YouTube benchmark, and now I'm confused.

Let's take for example a 2K (1440p) monitor at 144 Hz!

First case: RTX 3080 + Ryzen 5 5600x
Cyberpunk: 80fps
AC Valhalla: 70fps
Red Dead: 96fps
Watch Dogs: 83fps
Horizon Zero: 126fps

Second case: RTX 3080 + Ryzen 7 7800X3D
Cyberpunk: 100fps
AC Valhalla: 110fps
Red Dead: 138fps
Watch Dogs: 113fps
Horizon Zero: 158fps

Now, we can clearly see that a better CPU gives more FPS, but are those frames worth it? Is it worth upgrading the CPU?
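To put some rough numbers on "are those frames worth it", here's a quick sketch (Python, values copied from the lists above) that works out the per-game uplift and whether either result even reaches the 144 Hz refresh rate:

```python
# Rough math on the two example builds, using the FPS numbers quoted above.
# Each entry is (fps_5600x, fps_7800x3d); the target is the 144 Hz monitor.
results = {
    "Cyberpunk":    (80, 100),
    "AC Valhalla":  (70, 110),
    "Red Dead":     (96, 138),
    "Watch Dogs":   (83, 113),
    "Horizon Zero": (126, 158),
}
target = 144

for game, (slow, fast) in results.items():
    uplift = (fast - slow) / slow * 100
    reaches = "yes" if fast >= target else "no"
    print(f"{game:13s} +{uplift:.0f}%   reaches 144 FPS: {reaches}")
```

Going by those numbers, the uplift is roughly 25-57% per game, yet only Horizon Zero Dawn on the 7800X3D actually reaches 144 FPS.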

I am not taking G-Sync into consideration. Somebody correct me if I am wrong, but the optimal setup is to have FPS higher than the refresh rate, and if we then get screen tearing we can easily lock the frame rate to the monitor's refresh rate with V-Sync. The other way around, as in the examples here, the games deliver fewer FPS than the monitor's refresh rate, where V-Sync is not going to help us and we can still get screen tearing.
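To put numbers on the tearing part, here's a simple sketch of the frame-time arithmetic (it deliberately ignores adaptive sync and triple buffering):

```python
# Frame-time arithmetic behind the tearing argument above.
# Simplified model: no G-Sync/FreeSync, no triple buffering.
refresh_hz = 144
refresh_ms = 1000 / refresh_hz  # the panel scans out a new image every ~6.94 ms

for fps in (80, 100, 158):
    frame_ms = 1000 / fps  # how long the GPU takes to finish one frame
    print(f"{fps:3d} FPS -> one frame every {frame_ms:.2f} ms, "
          f"panel refreshes every {refresh_ms:.2f} ms")
```

Whenever a frame completes partway through a scan-out, the panel ends up showing pieces of two different frames at once, which is where the tearing comes from.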

Why in the world would you "pair" a good GPU with a good CPU if, for example, a game gives you 150 FPS with an older CPU and 200 FPS with a newer CPU, while you still have a 144 Hz monitor? To me, that makes no sense.

As far as I understand, you have a target, and that target is the monitor's refresh rate, whether it is 60 Hz or 144 Hz, and with that in mind you make the necessary adjustments to your build so that your in-game FPS reaches that refresh rate.
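The way I picture that "pairing" logic is basically a weakest-link calculation (a very rough mental model; real games don't have one clean CPU or GPU cap, and the caps below are made up purely for illustration):

```python
# Weakest-link model of CPU/GPU pairing: whichever side is slower sets the FPS.
# The FPS caps used here are hypothetical, only to illustrate the idea.
def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    return min(cpu_fps_cap, gpu_fps_cap)

target_hz = 144
for cpu_cap, gpu_cap in [(100, 160), (150, 160)]:
    fps = effective_fps(cpu_cap, gpu_cap)
    verdict = "hits the target" if fps >= target_hz else "falls short"
    print(f"CPU ~{cpu_cap} / GPU ~{gpu_cap} -> ~{fps} FPS, {verdict} of {target_hz} Hz")
```

In that picture, a faster CPU only matters if the CPU is the side keeping you under your monitor's refresh rate.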

What's your opinion on all this?
 
That's entirely up to you and how satisfied you are with those FPS. For some people 50-60 FPS is fine, while others wouldn't be satisfied even with 300. If you were to build either of those two examples, the newer generation would give you a longer useful life and more upgrade options.