"Life is short. How many months or years do you want to wait to enjoy a new experience? You can sit around twiddling your thumbs and hoping that an RTX 2080 gets cheaper, or you can enter the world of ray-tracing and high-speed, 4K gaming today and never look back. When you die and your whole life flashes before your eyes, how much of it do you want to not have ray tracing?
Video card companies know that people are willing to pay a premium price for RTX cards.
The 15-inch Apple Studio display, one of the first flat panel monitors, cost $1,999 when it came out . . . in 1998. Today, you can get a used one on eBay for under $50 or a new 24-inch monitor for under $150, but if you bought one at the time, you had the opportunity to use a fantastic new technology when others didn't."
Really? SMH
I've done that a couple of times. In the mid-'90s, I bought a $2K graphics card and regretted it: it was obsolete, literally, within months of buying it. Then, when LCDs first came out, I bought a 15" LCD for something like $1.8K. The picture quality was not much better than that of the legendary 13" Sony tube monitors of the time.
IMO, there is absolutely no value in having the latest and greatest stuff unless you want bragging rights, and most people I know would simply shake their heads and walk away.
Besides that, ATM there is little RTX support in present-day games. I get that it looks better, but will that really give the average gamer an advantage that makes a significant difference? Maybe a pro gamer will get an edge from these cards, and maybe that makes it worth it for them, but at that level they have the spare cash sitting around to throw at a card that likely offers a marginal improvement in performance.
If people refuse to pay for the RTX hardware features, nVidia will take notice. Those who buy, to me anyway, are more like the sIntel or iCrap sheep willingly going to the slaughter.
Besides, it sounds like nVidia is trying to lock game code into running at high performance only on their cards. IMO, game developers would get significant performance boosts if they figured out how to use Conformal Geometric Algebra in their code - without hardware enhancements.
https://en.wikipedia.org/wiki/Conformal_geometric_algebra
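For anyone curious what that buys you: rotors in plain 3D geometric algebra (the simpler cousin of CGA) are isomorphic to unit quaternions, and the same "sandwich product" pattern v' = R v ~R is how full CGA motors handle rotation and translation in one object. A toy Python sketch of just the rotor part - not full CGA, and all the names and numbers below are mine for illustration, not from any real engine:

```python
# Toy sketch: a 3D geometric algebra rotor, represented through its
# quaternion isomorphism. Full CGA extends this same sandwich-product
# pattern (v' = R v ~R) to "motors" that combine rotation AND
# translation in one multiply. Illustrative only.
import math

def gp(q1, q2):
    """Geometric product of two rotors (== quaternion product)."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def rotor(axis, angle):
    """Build a unit rotor rotating by `angle` (radians) around `axis`."""
    ax, ay, az = axis
    n = math.sqrt(ax*ax + ay*ay + az*az)
    s = math.sin(angle / 2) / n
    return (math.cos(angle / 2), ax*s, ay*s, az*s)

def rotate(r, v):
    """Sandwich product v' = R v ~R; for a unit rotor, the reverse ~R
    is just the conjugate."""
    rev = (r[0], -r[1], -r[2], -r[3])
    w, x, y, z = gp(gp(r, (0.0, *v)), rev)
    return (x, y, z)

# Rotate (1, 0, 0) by 90 degrees about the z-axis -> roughly (0, 1, 0).
print(rotate(rotor((0, 0, 1), math.pi / 2), (1.0, 0.0, 0.0)))
```

Whether that actually beats the hand-tuned SIMD matrix code already in real engines is another question; the win, if there is one, comes from composing long chains of transforms with fewer operations and none of the gimbal/renormalization headaches.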
As I see it, articles like this encourage the less knowledgeable to enter into nVidia's indentured servitude.