From a consumer's perspective, the only logical way to compare value is to pick a price point and compare what you get now for that money versus what you got in the past. Whether you're shopping for a $100 card or a $5,000 card, the same process applies. It makes no sense to try to compare based on product names arbitrarily picked by a company.
Yep, what you can get for your money compared to recent years is what people should be comparing, not arbitrary model numbers. As I pointed out before, if you compare the size of the graphics chips, the dies in the 2070 and 3070 are much larger than the one used for even the 1080, so if anything, those cards are filling that product range, and the pricing reflects that. And the 3060 Ti uses a cut-down version of the 3070's chip, making it comparable to what the 1070 was in that generation. The current successors to the 1060-class cards are the various 1660 models, and those should soon be replaced by whatever new cards Nvidia launches in that $200-$300 price range, even if the naming scheme changes to call them "3050" cards or whatever. Now certainly, I imagine part of why Nvidia's marketing department did this was to upsell people to the next-higher tier of card than they would normally buy, but it's not like they raised prices significantly for any given tier; they mostly just shifted around the model names.
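For anyone who wants to sanity-check that die-size argument, here's a minimal sketch using the approximate die sizes published for those chips. The figures are ballpark numbers from public spec listings, so verify them before leaning on the exact ratios:

```python
# Approximate die sizes (mm^2) for the Nvidia chips discussed above.
# These are ballpark figures from public spec listings, not measurements.
die_sizes_mm2 = {
    "GP106 (GTX 1060)": 200,
    "TU116 (GTX 1660 series)": 284,
    "GP104 (GTX 1070/1080)": 314,
    "GA104 (RTX 3060 Ti/3070)": 392,
    "TU106 (RTX 2060/2070)": 445,
}

# Compare each die against the 1080's GP104 to see which tier it really fills.
baseline = die_sizes_mm2["GP104 (GTX 1070/1080)"]
for chip, size in die_sizes_mm2.items():
    print(f"{chip}: {size} mm^2 ({size / baseline:.2f}x the 1080's die)")
```

By that measure, both the 2070's and 3070's dies land well above the GP104 that powered the 1070/1080, which is the point about them filling the old x70/x80 bracket rather than being overpriced midrange chips.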
I don't like this situation where the GPU is the most expensive component in a computer; the CPU used to be the most expensive part. It's also hard to keep supporting the PC gaming ecosystem when a decent GPU costs more than a console delivering an equivalent performance and experience. So many games are anti-modding or anti-user-content these days that PC gaming has lost its edge.
That's been the case at the launch of new console generations for a long time; after a while, though, PC hardware surpasses the consoles again. And ignoring the availability issues, a "decent" graphics card certainly doesn't cost more than a console of comparable graphics performance. The $400 3060 Ti should outperform the graphics hardware in a $500 Xbox Series X, and the 3060 (non-Ti) probably will as well, especially once performance with RT effects enabled is factored in. The PS5's graphics hardware is less powerful still. In terms of actual graphics performance, the new consoles will likely be competing more with upcoming sub-$300 graphics cards.
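As a rough point of reference for those comparisons, here's a sketch using the advertised peak FP32 numbers. Big caveat: Ampere (the 3060 Ti) counts FP32 throughput differently than the consoles' RDNA 2 GPUs, so raw TFLOPS aren't a reliable cross-architecture metric, and real game performance depends on much more:

```python
# Advertised peak FP32 throughput (TFLOPS) for the hardware mentioned above.
# CAVEAT: Ampere (3060 Ti) counts FP32 differently than the consoles' RDNA 2
# GPUs, so these paper numbers are NOT directly comparable across vendors.
peak_fp32_tflops = {
    "RTX 3060 Ti ($400)": 16.2,
    "Xbox Series X ($500)": 12.15,
    "PS5": 10.28,
}

for hw, tflops in sorted(peak_fp32_tflops.items(), key=lambda kv: -kv[1]):
    print(f"{hw}: {tflops} TFLOPS (paper spec)")
```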
There's also the point that not everyone cares about running games at ever-higher resolutions, as we start running into diminishing returns in terms of apparent sharpness at typical viewing distances. A 3060 Ti running at 1440p gets slightly higher frame rates than a 3090 running at native 4K, and most would be hard-pressed to notice much difference between those resolutions while actually playing a game. And the new cards capable of pushing those kinds of frame rates at 1080p will likely be priced in the $200-$250 range. The existing Radeon 5600 XT and 2060 already deliver performance at 1080p similar to what a 3080 gets at 4K, and those were cards that could be found for around $300 or less the better part of a year ago.
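The pixel math is the reasoning behind that: native 4K pushes 2.25x the pixels of 1440p and 4x the pixels of 1080p, so a midrange card at the lower resolution has a similar per-frame workload to a flagship at 4K. A quick sketch:

```python
# Pixel counts per frame for common gaming resolutions. The ratios show why
# a midrange card at 1440p can match a flagship's frame rates at native 4K.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

pixels_4k = 3840 * 2160  # 8,294,400
for name, (width, height) in resolutions.items():
    count = width * height
    print(f"{name}: {count:,} pixels per frame (4K is {pixels_4k / count:.2f}x this)")
```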