1. This is a game. One of thousands that will be released this year. If you’re unable to run it, don’t buy it.
2. Ray tracing hardware is now in way over 50% of all gamers' systems on Steam (RTX alone accounts for over 60% of Steam gamers in the December survey), and I don't know how old a computer would have to be to have shipped without it.
3. Systems with 16+ GB RAM are around 7% on that same Steam survey. And y'all are complaining about ray tracing?
Yeah, the RAM requirements are the only thing I find concerning, too, especially at 1440p and 4K, where it's 32 GB. Most regular gamers should have at least 16 GB by now; it's basically considered standard nowadays. And RAM isn't exactly hard to upgrade and comes cheap, so there is that.
Btw, the last gaming-grade GPU without RT, the 1650, released in 2019. Even that one was weak enough that you had to turn down settings in many games. It was the only x050 card of its generation, though - there was no 2050 back then (one didn't show up until 2022, and even that isn't an actual 2050 but a cut-down 3050, and laptop-only at that; see the article in this link) - so the 1650 was rather popular, even leading the Steam charts for a while iirc.
I don't think this is a healthy take.
Even relatively little RT can increase image quality in various ways; things like RT shadows and lighting are relatively cheap and are already used plenty. Even the 20 series can handle these fine, let alone the 30 series, which is already much better at RT.
You don't need to go full tech fad mode, like we had with bloom back in the 2000s, when it was "discovered" and suddenly everyone and their mother slammed bloom into everything, whether it was needed or not. There is no need to run CP77 in full balls-to-the-wall RT Overdrive or have every game look like you're in a mirror museum.
But even a small-to-moderate amount of RT goes a long way, and games have finally reached the point where it is a requirement. Why bother with lightmaps when pretty much every GPU from the last 4-6 years, and the consoles, has enough juice to use RT-based lighting instead?
It's really about time people move on, and it's not even anything new - remember all the DirectX transitions we had back in the day? At some point a game came out that required DirectX 9, or DirectX 11, and it was GG for cards that didn't have hardware support for it.
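For context on the "hardware support or GG" point: here's a minimal sketch (not from any particular game, just assuming plain D3D12/DXR on Windows) of the kind of startup check a title that hard-requires ray tracing can do. If the driver doesn't report a DXR tier, the card simply can't run it - same idea as the old DX9/DX11 cutoffs.

```cpp
// Minimal DXR capability check - link against d3d12.lib, needs a recent Windows SDK.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

bool DeviceSupportsRaytracing()
{
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter at the DX12 baseline feature level.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return false;

    // OPTIONS5 carries the RaytracingTier field.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;

    // Anything at TIER_1_0 or above can run DXR at all; a game that requires RT
    // can gate on this (or on a newer tier) and refuse to start otherwise.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

int main()
{
    std::printf(DeviceSupportsRaytracing()
                    ? "Hardware ray tracing: supported\n"
                    : "Hardware ray tracing: not supported\n");
    return 0;
}
```

Real engines obviously do more than this (pick a specific adapter, check for newer tiers like inline ray tracing, look at VRAM), but the pass/fail gate is basically that one query.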
Couldn't have put it better. I don't get the issue people have with RT either; to me it feels like that one grandpa complaining about all those new things nowadays that didn't exist in ye good olde times and totally ruin everything. RT is here to stay; Nvidia knows it, and so does AMD, considering they improve their RT performance with every single generation. Fact is, more and more games release with it as a requirement instead of a feature, and the trend will continue rather than die. If you don't like it, you will have to stick to rasterization-only games going forward, and might have to stick to retro games altogether at some point. It's as simple as that. All this complaining leads nowhere, and the claims that it's just a gimmick are ridiculous. The trend has been there for years; going all surprised Pikachu now is just silly.