My kids made it happily through the crypto-induced GPU winter on the various GTX 980 Tis and the GTX 1070 I had replaced when I made the mistake of switching to 4K on my KVM'd workstations.
Actually, CUDA and machine learning were the primary motivators for the GTX 1080 Ti, RTX 2080 Ti, and RTX 3090 that followed, but to this day all of them disappoint at (after-hours) 4K gaming on my 8-to-18-core machines, in games that run perfectly fine on their Ivy Bridge to Kaby Lake quad-cores at 1920x1080.
Unless you run one of the less popular resolutions in between, you either don't need more power or it's simply never enough.
Consumers don't just buy hardware to fill a certain price point; they buy it to get a satisfying gaming experience.