There are three things to consider (apart from game limitations):
- CPU. It sets the fps: it prepares every frame the game can produce.
- GPU. It renders those frames according to your resolution and detail settings.
- Monitor. Its resolution and refresh rate cap what you actually see.
All three need to be somewhat balanced. It's kinda pointless to have a 9900K and a 2080 Ti on a 1080p/60Hz monitor, or a 3570 and a 1650S on a 4K/120Hz monitor, etc.
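A rough way to picture that balance: the fps you actually see is set by the slowest stage in the chain. Here's a minimal sketch of that idea (the function name and the sample numbers are just illustrative, not benchmarks of any real hardware):

```python
def effective_fps(cpu_fps: float, gpu_fps: float, refresh_hz: float, vsync: bool = True) -> float:
    """Illustrative model: the slowest stage sets the pace.

    cpu_fps    - frames/sec the CPU can prepare for this game
    gpu_fps    - frames/sec the GPU can render at your settings
    refresh_hz - monitor refresh rate (caps displayed fps when vsync is on)
    """
    fps = min(cpu_fps, gpu_fps)          # whichever of CPU/GPU is slower limits fps
    return min(fps, refresh_hz) if vsync else fps

# A high-end CPU+GPU combo pushing ~200 fps is wasted on a 60 Hz panel:
print(effective_fps(220, 200, 60))               # -> 60
# Same rig, vsync off: now the GPU is the limit, not the monitor:
print(effective_fps(220, 200, 60, vsync=False))  # -> 200
```

It's only a mental model, but it shows why upgrading one component past the others buys you nothing until the weakest link moves.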
You ask about bottlenecks, but there's always an imbalance somewhere, and it's largely down to the game. Some games, like GTA5 or Witcher 3, are hard on the CPU but not so much on the GPU, so a stronger, higher-core-count CPU is a must. Some games, like CSGO or LoL, are easy on the CPU, so even a dual core works well if it has high IPC/clock speeds.
So there's going to be a lot of variance, all game dependent.
Most newer games will hurt fps on any quad core; it doesn't matter if it's 3rd or 7th gen, it comes down to how the game code uses higher thread counts. In most newer games you'll be CPU-limited for fps, so the only difference between a 1650S and a 1660 Ti will be the detail levels: how high they can be pushed before fps suffers. In many games you'll run ultra without issue on either, so the actual card won't make a difference. Only in the CPU-easy, GPU-hard games will details become a factor, like using 4K DSR etc.