If you are only interested in playing at 1440p at sane framerates, there is absolutely no need to upgrade beyond a 20-series card (and there won't be for a long, long time). If you can maintain your target framerate and resolution without any form of V-Sync/G-Sync active, there's nothing to gain - and if you can't, you're still far more likely to benefit from a display upgrade, an overclock, or some well-thought-out driver-level tweaking. That's the short version; if you're interested in some GPU food for thought, continue reading as you please.
ENGINEERED OBSOLESCENCE:
There are no obsolete premium (i.e. non-mobile, non-budget) graphics cards still for sale within the current premium market. It's important to understand that NVIDIA has gone the way of Apple lately, particularly with its RTX line: it launches new product lines with the intention of retiring old (still perfectly viable) ones MUCH sooner than the market would naturally push them out, thus keeping the median prices (and its profit margins) for its GPUs as high as possible. This is a terrible anti-consumer trend that, if left unchecked, will eventually price most of us out of affordable gaming as other component manufacturers and vendors follow suit. Don't encourage them by buying a card you don't need!
IT STILL WORKS!:
The current (and next) generation of PC games can be maxed out on properly installed, moderately overclocked 10-series (and even 900-series!) GTX GPUs with framebuffers of at least 4 GB, at 1080p to 1440p, at 60 Hz and higher, with no issues - by a competent user. That last part about competence is pretty important, though: a significant amount of "free" performance will cost a modicum of time to figure out how to attain. Most GTX cards since at least the Fermi era, for instance, have been capable of a 20-30% overclock (OVER their preconfigured vendor overclocks!) out of the box, without even touching voltage. Better-binned cards can almost always manage 40-50% while remaining well below voltage and thermal ceilings for added peace of mind, and the really good ones can go higher than that if you know what you're doing (yes, even on air). Before you even consider a GPU upgrade, first see how much free value you can get out of the card you already have!
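To put the overclocking claim in concrete terms, here's a quick sketch of the arithmetic. The clock figures are illustrative assumptions (the 1178 MHz reference boost clock of a GTX 970 and a commonly reported sustained clock), not measured results for any particular card:

```python
# Illustrative overclock arithmetic - numbers are example values, not
# guarantees for any specific card or bin.

def overclock_gain(base_mhz: float, target_mhz: float) -> float:
    """Return the percentage gain of target_mhz over base_mhz."""
    return (target_mhz / base_mhz - 1) * 100

base = 1178    # GTX 970 reference boost clock, MHz
target = 1500  # an example sustained clock on stock voltage

print(f"{overclock_gain(base, target):.1f}% over reference")  # 27.3% over reference
```

Run your own numbers against your card's reference clocks before assuming headroom - silicon lottery still applies.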
OVERKILL AND BETTER ALTERNATIVES:
Core and memory speed (i.e. render throughput) have both been ahead of the computational render curve in games for quite some time now. It is only when you combine drastic increases in pixel density (resolution), egregious levels of supersampling-style AA, unnecessary resource-intensive render synchronization schemes, and massively overkill framerates that your framebuffer requirements begin to skyrocket - and your own considered version of that formula is what should drive your GPU upgrade strategy. Consider the alternatives:
- Instead of V-Sync/G-Sync, use scanline synchronization - the same or better result as a sync scheme, with none of the performance cost.
- Instead of AA, get a better display that performs scaling operations locally and with better accuracy - the same or better result as higher levels of AA, with none of the performance cost.
- Instead of hiking up your framerate, take the time to learn how to optimize your drivers (and, for online FPS games, your network) for low latency - the same or better result as higher framerates, with none of the performance cost.
Most of the "features" you think you need a new GPU to attain or maintain are really just the lazy layperson's options; if you can easily change your behavior and save a ton of money, why not do it?
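To put rough numbers on the latency point above: here's a back-of-the-envelope sketch of how much latency the driver's pre-rendered frame queue contributes, under the simplifying assumption that queue latency is just frame time times queue depth (it ignores input sampling, scanout, and engine-specific pipelining):

```python
# Simplified render-queue latency model - an illustration of why trimming
# the driver's frame queue can beat brute-force refresh rate increases.

def queue_latency_ms(fps: float, prerendered_frames: int) -> float:
    """Approximate latency (ms) contributed by the pre-rendered frame queue."""
    return (1000 / fps) * prerendered_frames

print(queue_latency_ms(60, 3))   # ~50 ms: a deep queue at 60 fps
print(queue_latency_ms(60, 1))   # ~16.7 ms: tuned driver profile, same 60 fps
print(queue_latency_ms(144, 3))  # ~20.8 ms: high refresh, untuned queue
```

Under these assumptions, a tuned 60 fps setup (one queued frame) actually undercuts an untuned 144 fps one - which is the gist of the "optimize your drivers instead" argument.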
FRAMEBUFFER = BOTTOM LINE:
When your combined preference for higher resolution (biggest impact), higher anti-aliasing (almost equal impact), higher framerate (next biggest), and max detail levels (least impact) is causing your GPU to fully utilize its RAM and churn the framebuffer (and assuming you've already done a decent overclock), that's when it makes sense to upgrade. Until then, your CPU is always going to be your (render) bottleneck - and that will increasingly be true as time goes on, given that most of the "shiny" advancements in render technology are, and will continue to be, in the realm of real-time post-processing, which will remain heavily tied to your CPU for at least as long as games primarily utilize only one or a few cores (which, in turn, is going to be a long time).
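A rough sketch of why resolution and AA multiply memory pressure on the framebuffer. The buffer count and pixel format here are assumptions for illustration (32-bit color, triple buffering, MSAA resolve targets), and this deliberately excludes textures and geometry, which usually dominate total VRAM:

```python
# Rough render-target arithmetic - illustrative assumptions only:
# 4 bytes per pixel (32-bit color), 3 buffers (triple buffering),
# and MSAA multiplying the sample storage. Textures/geometry excluded.

def framebuffer_mib(width: int, height: int, bytes_per_pixel: int = 4,
                    msaa: int = 1, buffers: int = 3) -> float:
    """Approximate MiB consumed by color render targets alone."""
    return width * height * bytes_per_pixel * msaa * buffers / 2**20

print(framebuffer_mib(2560, 1440))          # ~42 MiB:  1440p, no AA
print(framebuffer_mib(2560, 1440, msaa=4))  # ~169 MiB: 1440p, 4x MSAA
print(framebuffer_mib(3840, 2160, msaa=4))  # ~380 MiB: 2160p, 4x MSAA
```

The point is the multiplication: each of resolution, AA level, and buffering scales the requirement, and high-resolution assets loaded to match pile on top of that - which is how a 4 GB card ends up churning.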
For reference, I'm personally still running a GTX 970 (technically only a 3.5 GB card) with a very moderate overclock (1500 core, 4500 mem), and it can easily run any high-fidelity AAA current title at 1440p, capped at 60 Hz, with everything but AA maxed (I usually run 2x - and even that is overkill for most titles at 1440p on my monitor). Because I don't need 2160p, and because I prefer to remove latency from my games by correctly configuring GPU driver profiles for those specific games rather than by simply throwing 120/144 Hz at the problem until I can't feel the difference, the only upgrade worth considering in my situation will eventually be an 8 GB card - but it's just not necessary yet. There ARE a few games that can max out my framebuffer on ultra at 1440p... but as long as I'm still overshooting my target framerate, that doesn't matter. When I do upgrade, I'll just try to snag a cheap used 1070 or 1080, because I know they'll both do 4K at 60 Hz with no issues well into the next several generations of games (and I don't care about ray-tracing gimmicks, which I won't get into).
Just some food for thought. People have been spending way, WAY more money on GPUs than they need to for a long time now.