The 2080 Ti may have been overpriced, but at least it had more than 8GB of VRAM. In the 30 series, you'd have had to pay $699 for a 3080 to get a high-end GPU with more than 8GB. (I know the 3060 12GB exists, which is why I said high-end.)
Sure, but back then, there was no question about only having 8GB of VRAM. At the time, 8GB was more than enough, even at 4K. The only time I had second thoughts about having just 8GB of VRAM was when I was playing Far Cry 6 on my RX 5700 XT. I couldn't use the HD texture pack because it required a really odd 11GB of VRAM. One of the things that made me laugh about the Ampere cards was that an RTX 3080 couldn't use the FC6 HD texture pack despite being a brand-new enthusiast-class card!
You see, nVidia realised that a lot of the people willing to pay extra for GeForce cards weren't knowledgeable enough to see that it was a bit of a scam on anything that wasn't high-end. The RTX 2060, for example, was a total scam because its RT performance was similar to that of an RX 6600. RT performance that poor made the RTX designation irrelevant, but the noobs bought them in droves. Since noobs buy by brand and experts buy by spec, nVidia knew that most people buying their cards wouldn't question the absurdly low 10GB frame buffer on the RTX 3080 and would just accept it because it was more than 8GB.
This is why the vast majority of people with Radeon cards are enthusiasts and/or experts. We understand what we're reading when we look at video card specifications, and Radeons tend to be spec monsters compared to GeForce cards. Now, nVidia is a very smart company, far smarter than AMD ever was. What nVidia did was build CUDA into their cards so that content creators (and influencers) would use them to make their videos. That's why so many YouTubers use them, and their audiences notice. nVidia also makes sure to have the fastest single card so that TechTubers will use it for CPU tests. That gets the name out; it's basically free advertising. For the noobs, they just keep saying "RT" and "DLSS" over and over until it sticks.
When someone buys a high-end GeForce card because of DLSS, they're clearly a noob, because by the time they actually need upscaling (maybe 4-5 years from now), DLSS as we currently know it won't exist, and neither will FSR. After all, neither of those technologies is anything like what it was only 3 years ago, and the tech improves at an exponential pace. In 4-5 years, DLSS and FSR (and XeSS, for that matter) will probably all be perfected. The same is true for someone who buys a low-end GeForce for RT.
Now, don't get me wrong, I think that ray-tracing is a really cool technology, but for the most part, our hardware isn't ready for it yet. Hell, the top level of RT in CP2077 can bring a $1600 RTX 4090 to its knees! That tells me all I really need to know: we're not even close yet, so I don't factor RT into my GPU purchases.