Remember when the 60-series cards were $250 and $300 (the 960 and 1060)?
$400 isn't exactly mainstream value. Work 'em over, AMD!
Except the 1660, 1660 SUPER, and 1660 Ti were all well within that price range. It ultimately just comes down to whatever arbitrary model number they decide to market a given card with. The 2060 was arguably more of a successor to the 1070 than the 1060; Nvidia just shifted model numbers to obscure the limited performance gains of the 20-series outside of things like raytracing.

As far as the size of the GPU chips goes, the one used in a 2060 or 2070 is over 40% larger than what was used in a 1070 or 1080, only a little over 5% smaller than what was used in a 1080 Ti or Titan Xp, and well over double the size of what was used in the 1060. Even the 1660 and 1660 Ti's chip is over 40% larger than the 1060's, and only around 10% smaller than the 1070's and 1080's.

Being stuck on what was essentially the same process node as the 10-series meant Nvidia couldn't increase performance significantly without making the chips substantially larger. That greatly limited performance gains, which they tried to disguise by numbering their higher-end cards differently from the previous generation.
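For reference, those percentages line up with the commonly published die sizes for each chip. A quick sketch checking the math, assuming the usual chip-to-card mapping (GP106 in the 1060, GP104 in the 1070/1080, GP102 in the 1080 Ti/Titan Xp, TU106 in the 2060/2070, TU116 in the 1660/1660 Ti) and approximate die-size figures from public spec sheets:

```python
# Approximate published die sizes in mm^2 (assumed from public spec sheets).
die_mm2 = {
    "GP106": 200,  # GTX 1060
    "GP104": 314,  # GTX 1070 / 1080
    "GP102": 471,  # GTX 1080 Ti / Titan Xp
    "TU106": 445,  # RTX 2060 / 2070
    "TU116": 284,  # GTX 1660 / 1660 SUPER / 1660 Ti
}

def pct_larger(a, b):
    """How much larger chip a is than chip b, as a percentage (negative = smaller)."""
    return (die_mm2[a] / die_mm2[b] - 1) * 100

print(f"TU106 vs GP104: {pct_larger('TU106', 'GP104'):+.1f}%")  # over 40% larger
print(f"TU106 vs GP102: {pct_larger('TU106', 'GP102'):+.1f}%")  # a bit over 5% smaller
print(f"TU106 vs GP106: {pct_larger('TU106', 'GP106'):+.1f}%")  # well over double
print(f"TU116 vs GP106: {pct_larger('TU116', 'GP106'):+.1f}%")  # over 40% larger
print(f"TU116 vs GP104: {pct_larger('TU116', 'GP104'):+.1f}%")  # around 10% smaller
```

So even the mid-range Turing chips are in a different size class than the Pascal parts they nominally replaced, which is the point about the model-number shuffle.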
But they ARE NOT on par with 30 series due to DLSS and RT.
We don't actually know that though, as not much information is out there about AMD's raytracing implementation yet. For all we know, it could perform better than Nvidia's hardware, or perhaps be better at some effects and not as good at others. And AMD already has their own non-proprietary upscaling and sharpening tech that was in most ways better than the original implementation of DLSS, and is arguably not far behind DLSS 2.0 as far as the end-result is concerned.
If they can deliver a 6900XT that runs as fast as a 6900 and offer it at 6800 Prices ($700)
I think you may have used some wrong model numbers there. : 3
Personally, I think we're going to see relatively similar pricing for a given performance level compared to Nvidia's 30-series MSRPs. It seems to me that Nvidia likely obtained information about AMD's 6000-series plans many months ago, found them too competitive at the high end for their liking, and launched their cards at lower price points and with higher clocks and power draw than originally planned. That may also be why the VRAM gains are modest to nonexistent: they likely shifted GPUs to lower price points, and any planned VRAM increases had to be cut to make that happen.