Hello!
I'd like some clarification about the practical implications of CPU bottlenecking. I did some research to better understand the phenomenon, and from what I understand, a CPU bottleneck occurs when the graphics card can render frames faster than the CPU can prepare them (game logic, physics, draw calls), so the GPU is held back and cannot produce its maximum FPS. In other words, if the CPU needs 20 ms to prepare each frame, the game tops out around 50 FPS even if the GPU could render each frame much faster.
It's with this understanding that in 2019 I bought an RTX 2060, which was a good match for a Full HD, Ultra, 60 FPS experience. Back then, buying an RTX 2080 for that resolution was widely considered overkill because it would create a CPU bottleneck.
Now, in 2023, my RTX 2060 still holds up well in roughly 90-95% of my games, and I can still play with optimal performance. However, some newer games are starting to be too much for it, like A Plague Tale: Requiem. Even when I lower the settings to medium, I only get roughly 40-45 FPS, and the recommended specs on Steam advise an RTX 3070 for a Full HD Ultra experience. This makes me realize that while an RTX 2080 would have been overkill for a few years, it would have been the more future-proof choice, since I could still be playing A Plague Tale on that card today.
This experience makes me wonder: what are the actual practical impacts when a CPU bottleneck happens? For people who disable V-Sync, the concern makes sense to me, since they want the maximum FPS their card can produce, and pairing an 80-class card with Full HD gaming would hold it back (at least right now in 2023). But in my case, V-Sync caps my frame rate at 60 FPS all the time anyway, and my goal is to buy cards that ideally last 5-6 years before I have to upgrade again, so I don't really get what the bottlenecking issue is all about. I understand that everyone has different goals and gaming contexts, which is why I'm asking for your input, so I can broaden my view and better understand the issue.
Thanks!