Question CPU for 1080p gaming. What do you really need?


Just a thought, forgetting benchmarks and talking real world, where I like the graphics turned up as far as I can get them while still achieving 60 FPS.
From games like Cyberpunk 2077 it seems relatively clear that the new 3060 Ti is a good choice for 1080p if you want ray tracing - and with pretty much everything else it'll walk it.

Looking at CPU capability, what I'm seeing is that even in today's most demanding games, even a 10100F will far exceed 60 fps with plenty of headroom, unless the GPU limits it to less.

It seems ridiculous, but... is there logic in going for a low-end CPU and a higher-end GPU?

As much as a 3070 or even a 3080 seems ridiculous overkill for 1080p 60Hz, I'm also conscious that GPU requirements for newer games increase far quicker than CPU requirements... with most CPUs seeing 2 or 3 generations of GPU.

Maybe that's the argument for a faster CPU..


No, there is no real logic in going with a low-powered CPU when pairing it with a higher-end GPU.

When it comes to what we call "bottlenecking", you always want a GPU bottleneck.
That means the GPU is the part holding back the potential of the other hardware, such as the RAM and CPU, which is what you want: it means you are getting the most out of the GPU that you can. It also means that when the time comes that you want to step up from a 3060 Ti to something more powerful down the line, you will have the CPU headroom to accommodate it.
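To make the bottleneck idea concrete, here's a toy sketch (my own illustration, with made-up frame times - not benchmarks of any real CPU or GPU): each frame has to go through both a CPU stage and a GPU stage, and the slower of the two sets the frame rate.

```python
# Toy model of CPU/GPU "bottlenecking". All frame times below are
# invented numbers purely for illustration, not measurements.

def effective_fps(cpu_frame_ms: float, gpu_frame_ms: float) -> float:
    """The slower stage paces the pipeline, so fps is capped by the max."""
    return 1000.0 / max(cpu_frame_ms, gpu_frame_ms)

# GPU-bound (the good case): CPU needs 6 ms, GPU needs 12 ms per frame.
# The GPU is the limit, so a GPU upgrade (12 ms -> 8 ms) raises fps.
print(round(effective_fps(6.0, 12.0)))  # ~83 fps
print(round(effective_fps(6.0, 8.0)))   # ~125 fps - CPU headroom pays off

# CPU-bound (the case to avoid): CPU needs 14 ms per frame.
# A faster GPU changes nothing, because the CPU sets the pace.
print(round(effective_fps(14.0, 8.0)))  # ~71 fps
print(round(effective_fps(14.0, 4.0)))  # still ~71 fps
```

The point of the last two lines is the whole argument: with a weak CPU, money spent on a bigger GPU is wasted.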

As for GPU requirements getting higher the further we go with games, the answer to that is... sort of.
In terms of graphics alone, we have pretty much hit the peak.
As time goes on it won't get so much better that it becomes harder to run; if anything, games will get easier to run as the new graphics technologies are optimized.
The biggest issue when it comes to graphics getting harder to run right now is ray tracing, but with time that too will become less of an issue.

For a target of 1920x1080 resolution and a target of an average or minimum of 60fps, with the latest AAA titles, a 3060 Ti will suffice for most any game.
The only games where you may have to drop from the max setting to the high setting would be ones that push boundaries a little too far and aren't well optimized, such as CP2077.

So to finalize the answer: a 3060 Ti will be perfectly fine for 1080p60 with realistic, sensible graphics settings in games.

As for your CPU, I currently use a 6700K overclocked to 4.5 GHz and still haven't hit a bottleneck with it when paired with my 1080 Ti, running games at 1440p with the types of settings I mentioned (a mix of max and high).

So anything that outshines my 6700K should be plenty, I believe, especially when you purposely make yourself GPU-bound in a game by cranking the graphics.

So a Ryzen 5 3600 or better from AMD, or whatever is equivalent on the Intel side of things, will do great (I honestly haven't been keeping close tabs on Intel performance these days, since I'm going to switch to Ryzen when it comes time to upgrade).


Below is my rant that would've gone in the middle of this post, but I decided to put it in a spoiler in case you'd rather not read it.
To summarize: it explains what is actually realistic when it comes to "max" settings, and points out that in some games the "max" option for certain graphics settings does absolutely nothing for you visually (especially at 1080p) but will drastically reduce your frame rate.

While I understand that you want to run games at "MAX GRAPHICS" because it sounds nice and sounds like you are getting the absolute BEST visual experience ever, it's entirely unrealistic.
When it comes to a lot of these super high-end AAA titles, some settings (not all, and usually things like lighting and shadows) show no real-world visual difference between the max option and the option right below it.
Since you mentioned "real world" yourself I felt like I needed to also say that for this.
What I mean by real world is that with a lot of these types of settings, there is a difference between max and not max.
However, the only way you would see such differences would be to do the following all in conjunction with one another:

#1. Play at resolutions higher than 1080p, since at 1080p, the resolution will be low enough that you won't be able to discern said differences.
#2. Stand perfectly still, looking directly at the small details those graphics options affect the quality of.
#3. Take screen shots to be able to compare.
#4. Stare at the screen shots and spot the insignificantly minor difference.

In a "real world" scenario, you are playing at 1080p, and you are not standing perfectly still in game trying to study these things; you are engrossed in the experience.

As long as the graphics aren't low enough quality that you notice something looking funky, jarring you and pulling you out of the immersion, that's all there is to it. You don't NEED max graphics settings when it comes to certain games and certain settings. You only need the settings that look the best before the quality difference stops being noticeable, and not go higher than that; otherwise, you are only hurting your gaming performance for no additional visual benefit that makes the frame rate drop worth it.

And since you plan on playing with a 3060 Ti, you cannot afford to be pushing boundaries with unnecessary graphic settings if you want to have a playable frame rate and enjoyable experience.

So you will have to opt out of "max graphics" on certain settings in certain games, but as I explained above, that doesn't mean you will be getting a visually lesser experience, especially at 1080p.
[/spoiler]