While the 1080 Ti has held up reasonably well since it came out 3 years ago, I'm not sure it will hold up for 5+ years unless one is okay with running new games at "medium" settings. We have already seen a number of games where the card can't handle "true" ultra settings, i.e. with ray-traced lighting effects enabled, even at 1080p. With games designed for the next generation of consoles, raytracing will likely become the norm, not the exception. With raytracing enabled, a 1080 Ti drops below the performance of a $300 RTX 2060, in some cases delivering as little as half that card's performance. Of course, it's also questionable how well first-generation RTX cards will handle raytracing in future games, particularly at resolutions above 1080p.
🤷‍♂️
The 20 series was intended for RT and equipped with the necessary hardware and drivers to run it.
The 1080 Ti was neither intended nor equipped for that, has only patched drivers to go on, and yet it's expected to run just as smoothly as the RTX series? Real nice comparison you made there...
RT is a niche area currently, and one I sure don't care about right now. A 1440p non-RT image looks better to me than a 1080p RT one.
I, at least, am not missing anything by skipping that niche feature right now, nor do I have to concern myself with the performance impact or with the cost of RT software development in time and money.
Until software development fully supports RT, it's going to stay niche for a while yet... Wow, I've said "niche" 3 times already.
Console progression has often lagged behind PC hardware, yet PCs end up held back by consoles because the pacing of software development appears to revolve around them.