Can’t help but wonder where graphics card prices would be without the ray tracing crap Nvidia has been pushing as the next great thing. It’s a costly, low-value tech most will never benefit from.
I really don't think it's all that costly. From what I can tell, ray tracing only takes up a relatively small portion of the graphics processor in the current 20-series cards. Judging by the die sizes of the full 16 and 20-series chips compared to their relative performance, I estimate that the RT portion of the 20-series cards likely doesn't take up more than 10% of the die. And that chip is only one part of the card, with other components like VRAM making up a decent portion of a card's cost as well. So I wouldn't be surprised if ray tracing added no more than about 5% to the manufacturing cost of those cards.
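To put some rough numbers on that, here's the kind of back-of-envelope math I'm doing, sketched out in Python. The die sizes and shader counts are the public figures for the Turing chips; the linear scaling by shader count and the 50% die-share-of-card-cost figure are purely my own assumptions, so treat this as a sketch, not real BOM accounting:

```python
# Back-of-envelope behind the "no more than ~10% of the die" guess.
# Die sizes and shader counts are public Turing figures; everything
# else here is an assumption.

tu106_mm2 = 445.0      # RTX 2060/2070 die, with RT and tensor cores
tu116_mm2 = 284.0      # GTX 1660/1660 Ti die, no RT or tensor cores

tu106_shaders = 2304   # full TU106 (RTX 2070)
tu116_shaders = 1536   # full TU116 (GTX 1660 Ti)

# Naively scale TU116 up to TU106's shader count, as if the whole die
# grew linearly with shaders. It doesn't -- memory controllers, display
# logic, etc. don't scale -- so this somewhat overstates the scaled area
# and understates the RT/tensor share.
tu116_scaled_mm2 = tu116_mm2 * (tu106_shaders / tu116_shaders)

rt_tensor_mm2 = tu106_mm2 - tu116_scaled_mm2
rt_tensor_share = rt_tensor_mm2 / tu106_mm2
print(f"RT + tensor area: ~{rt_tensor_mm2:.0f} mm^2 "
      f"({rt_tensor_share:.0%} of the die)")

# Even taking 10% of the die as an upper bound, the GPU die is only part
# of a card's manufacturing cost. The 50% figure is purely my guess.
die_share_of_card_cost = 0.5
print(f"RT share of card cost: ~{0.10 * die_share_of_card_cost:.0%}")
```

Run as-is, this lands around 4% of the die for the RT and tensor hardware, so 10% looks like a comfortable upper bound, and the cost impact on the whole card comes out around that 5% mark.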
The mediocre pricing has more to do with Nvidia sticking with much the same manufacturing node for their graphics chips as the previous generation of cards. To add substantially more performance, they would need to make the chips substantially larger, which not only increases the cost of the chips but also drives up power draw and heat output. So they weren't likely to add substantially more performance for the money whether RT was there or not. Just look at the 16-series cards: even without RT, they didn't add a large amount of performance relative to their predecessors. The 1660 wasn't much more than 15% faster than a 1060 6GB on average, while launching at only a slightly lower price, the better part of three years after that card came out.
And of course, the lack of competition from AMD didn't help 20-series pricing either. Without any really viable competition, Nvidia was able to price those cards as they pleased. That's not related to RT; it's just them having no pressure to offer significantly more value than they did. Once AMD finally arrived at the party close to a year later, Nvidia cut into their margins and started offering more value for the money in the form of their SUPER cards. The current pricing of their lineup is arguably closer to what the 16 and 20-series should have launched at, and they probably would have, had AMD's new generation of cards been launching around that time.
Of course, AMD needed to wait for 7nm production to be ready, and even now, they seem to be limited by production capacity for those chips. I suspect AMD makes more money per wafer off their CPUs than their GPUs, and they're also responsible for a lot of the chips for the upcoming consoles, which is likely why the pricing of the 5000-series cards has been kind of mediocre as well.
RT itself is a fine technology. Performance is rather poor with these first-generation cards, but that will undoubtedly improve with future hardware. With Nvidia finally moving to a new process node for their next cards, they should have more headroom to improve RT performance, and my guess is that the 30-series will handle these effects a lot better. With the new consoles supporting the feature in some capacity, and AMD likely to add it in future cards as well, I suspect it will become the new standard for ultra graphics settings in the next generation of games.