In any given PC, there are three primary limiting factors: the CPU, the GPU, and the monitor.
The CPU is responsible for fps; it prepares every frame. The GPU takes the frame data from the CPU and renders it into a picture, and the monitor displays the frames the GPU sends, up to its refresh rate.
So let's say the CPU can put out 200fps in a game, the GPU can put out 150fps at ultra, and you have a 60Hz monitor. You see 60fps, period. You could upgrade the GPU to a 4090 with a possible 500fps output, but you're still stuck at 60Hz. A GPU upgrade there will not improve anything you can actually see.
Now say the CPU can only put out 50fps in a game, the GPU is capable of 200fps, and the monitor is 60Hz. Change the GPU, you get no bonus. Change the monitor, no bonus. Money wasted on both counts, because the CPU is the major limiting factor.
Now say the CPU can put out 400fps, the GPU 50fps at ultra, and the monitor is 60Hz. A GPU upgrade works best here because the GPU is now the major limiting factor, so moving to a GPU that puts out more than 60fps at ultra will be visible.
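If it helps to see the same reasoning as plain arithmetic, here's a minimal sketch: what you actually see is just the minimum of the three numbers. The figures are the made-up ones from the examples above; real games vary frame to frame, and this ignores things like variable refresh rate.

```python
# Rough sketch: the visible frame rate is capped by the slowest link in the chain.
# Numbers below are the illustrative ones from the three scenarios above.

def visible_fps(cpu_fps, gpu_fps, refresh_hz):
    """What you actually see is the minimum of CPU output, GPU output, and refresh."""
    return min(cpu_fps, gpu_fps, refresh_hz)

scenarios = [
    ("GPU upgrade pointless (monitor is the wall)", 200, 150, 60),
    ("CPU is the bottleneck",                        50, 200, 60),
    ("GPU upgrade pays off (GPU is the wall)",      400,  50, 60),
]

for name, cpu, gpu, hz in scenarios:
    print(f"{name}: visible ~{visible_fps(cpu, gpu, hz)}fps")
```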
A GPU upgrade is only of benefit when the GPU is the major limiting factor. If it can already push more frames to the monitor at ultra than the monitor can show, the upgrade is pointless. This is why many people use a higher refresh monitor, so the monitor isn't such a big limiting factor, and it's also the reason to use a bigger/faster CPU, so it isn't the limiting factor in fps production.
Play your most graphically demanding game at ultra. What's the fps? Now drop it to medium. If you see a big jump in fps, that's a GPU limitation. If the fps barely changes, that's a CPU limitation. If either number is already over the monitor's refresh rate, that's a monitor limitation, and CPU/GPU upgrades won't change what you see.
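Here's that test written out as simple logic, if that's easier to follow. The 15% cutoff and the example numbers are purely my own illustration, not a hard rule; use your judgment on real results.

```python
# Sketch of the ultra-vs-medium test described above.
# Assumption: a "big jump" means medium fps beats ultra fps by more than ~15%.

def likely_bottleneck(fps_ultra, fps_medium, refresh_hz):
    if min(fps_ultra, fps_medium) >= refresh_hz:
        return "monitor (both results already exceed the refresh rate)"
    if fps_medium > fps_ultra * 1.15:   # big jump when settings drop
        return "GPU"
    return "CPU (fps barely moves when settings drop)"

print(likely_bottleneck(fps_ultra=48,  fps_medium=110, refresh_hz=60))  # GPU
print(likely_bottleneck(fps_ultra=55,  fps_medium=58,  refresh_hz=60))  # CPU
print(likely_bottleneck(fps_ultra=140, fps_medium=220, refresh_hz=60))  # monitor
```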
A GPU upgrade makes sense if you can't play at ultra without a sizable drop in fps. How much of a GPU upgrade comes down to three factors: budget, need, and PSU.