anti-aliasing took years to develop only to end up offering 16 fps for $2k+ graphics cards?
Metro Exodus runs on my $800 RTX 3080 at 45 FPS in 4k and max settings, so I don't really know what you're talking about, and I suspect neither do you.
Maybe you're too young to remember SSAA. It was one of the first anti-aliasing methods ever released, and it worked by rendering the game at a higher internal resolution. Kind of like rendering a 1080p game at 4k, then downscaling the frame back to 1080p to eliminate jaggies. SSAA came in different flavors, such as 2x, 4x, and 8x. I've seen 16x, but that was extremely rare, considering the improvement over 8x was almost non-existent.
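For anyone curious what "render high, then downscale" actually looks like, here's a toy sketch of my own (not from any real engine) of the 4x case, where each output pixel is just the average of a 2x2 block of samples:

```python
# Minimal SSAA-style downscale sketch, assuming a 2x2 ("4x") supersample:
# render at double the width and height, then average each 2x2 block of
# samples into one output pixel.
import numpy as np

def downsample_2x2(supersampled: np.ndarray) -> np.ndarray:
    """Box-filter a (2H, 2W, 3) image down to (H, W, 3)."""
    h, w = supersampled.shape[0] // 2, supersampled.shape[1] // 2
    blocks = supersampled.reshape(h, 2, w, 2, 3)
    return blocks.mean(axis=(1, 3))  # average the 4 samples per pixel

# Hypothetical usage with a random stand-in for a rendered frame:
frame_4k = np.random.rand(2160, 3840, 3)    # rendered internally at 4k
frame_1080p = downsample_2x2(frame_4k)      # displayed at 1080p, jaggies softened
print(frame_1080p.shape)                    # (1080, 1920, 3)
```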
Needless to say, the performance impact was (and still is) massive. Most people stayed with 2x, while 8x was often reserved for older games running on newer hardware that could handle rendering those simpler scenes at 8 times the native resolution. For the record, 4k is 4 times the resolution of 1080p. Over the decades, people have tried getting anti-aliasing without the performance penalty. We finally achieved it in 2009 with FXAA. Nowadays there are many games that simply enable anti-aliasing without giving players the option to turn it off, simply because the performance impact has become negligible.
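To put rough numbers on that (just my own back-of-the-envelope arithmetic, using the standard 1080p and 4k resolutions):

```python
# Pixel-count arithmetic behind the claims above.
w1080, h1080 = 1920, 1080
w4k, h4k = 3840, 2160

print(w1080 * h1080)        # 2,073,600 pixels at 1080p
print(w4k * h4k)            # 8,294,400 pixels at 4k -> exactly 4x as many
# In this thread's usage, "Nx SSAA" shades roughly N times as many samples,
# so 8x SSAA at 1080p is about as much shading work as two full 4k frames.
print(8 * w1080 * h1080)    # 16,588,800 samples per frame
```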
and what do you have to say about technologies such as chromatic aberration, which the majority don't like yet are imposed upon them?
Chromatic aberration has little to no impact on performance and can be turned off in a lot of games, even on consoles. What else is there to say, and how is it relevant to this discussion?
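For context on why the performance cost is basically zero: chromatic aberration is typically just a post-process that samples the color channels with a slight offset. Here's a crude sketch (a straight horizontal shift; real implementations usually offset radially from the screen center, but the per-pixel work is about this trivial):

```python
# Rough sketch of chromatic aberration as a cheap post-process:
# shift the red channel one way and the blue channel the other.
import numpy as np

def cheap_chromatic_aberration(frame: np.ndarray, shift: int = 2) -> np.ndarray:
    """Offset the red and blue channels of an (H, W, 3) frame by `shift` pixels."""
    out = frame.copy()
    out[:, :, 0] = np.roll(frame[:, :, 0], -shift, axis=1)  # red
    out[:, :, 2] = np.roll(frame[:, :, 2],  shift, axis=1)  # blue
    return out

frame = np.random.rand(1080, 1920, 3)   # stand-in for a finished frame
aberrated = cheap_chromatic_aberration(frame)
```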
and how are you using the word "RAPID" to describe the advancement rate of RT?
I'm not. I used the word "rapid" to describe advancements in computer graphics in general. Ray tracing is just one of many new technologies in use nowadays that have been introduced relatively recently. Real-time ray tracing in games was first introduced in 2018 by Nvidia, in response to Microsoft adding the DXR (DirectX Raytracing) feature to DirectX 12.
And ray tracing isn't even that new in and of itself. The idea dates back to the 16th century, long before computers existed, and it was first used in computer graphics in 1968. Since then it's been used extensively for movies and even video games. Except ray tracing is very computationally expensive, so until DXR/RTX our only experience with ray tracing in games came from
lightmaps made by using ray tracing to precompute lights and shadows in a statically illuminated scene, a process known as "baking".
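If you've never seen what "baking" means in practice, here's a toy sketch of the idea: trace shadow rays offline for every lightmap texel and store the result, so at runtime the game just reads a texture. The one-sphere-over-a-floor scene and all the numbers here are made up purely for illustration:

```python
# Toy lightmap baking: precompute which floor texels are shadowed by a sphere.
import numpy as np

LIGHT = np.array([0.0, 5.0, 0.0])       # point light above the floor
SPHERE_C = np.array([0.0, 1.0, 0.0])    # occluding sphere
SPHERE_R = 0.75

def ray_hits_sphere(origin, direction):
    """Return True if the ray origin + t*direction (t > 0) hits the sphere."""
    oc = origin - SPHERE_C
    b = 2.0 * np.dot(oc, direction)
    c = np.dot(oc, oc) - SPHERE_R ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        return False
    t = (-b - np.sqrt(disc)) / 2.0
    return t > 1e-4

def bake_floor_lightmap(size=64):
    """Precompute a size x size shadow mask for the floor patch [-2, 2]^2 at y=0."""
    lightmap = np.zeros((size, size))
    for i in range(size):
        for j in range(size):
            p = np.array([-2 + 4 * i / (size - 1), 0.0, -2 + 4 * j / (size - 1)])
            to_light = LIGHT - p
            d = to_light / np.linalg.norm(to_light)
            lightmap[i, j] = 0.0 if ray_hits_sphere(p, d) else 1.0
    return lightmap  # stored on disk and simply sampled at runtime

baked = bake_floor_lightmap()
print(baked.mean())  # fraction of the floor that ends up lit
```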
So when people say "RTX ON doesn't look any better than RTX OFF", what they're really saying is: "dynamic ray tracing doesn't look better than static ray tracing", which is kind of dumb. That's like saying a frame rendered at 60 FPS doesn't look any better than a frame that was rendered over the weekend. Of course it doesn't. But who would play a game where each frame takes several days to render?
RT should've stayed in the labs until common sense decided it was ready. Yet it exists NOW in the industry, offering very little, asking for a lot (money and power), benefiting basically no one but the virtual needs of the rich, morons, and corporate slaves.
And what makes you think ray tracing isn't ready? Because one game decided to crank all parameters beyond the current limits?
The neat thing about ray tracing is that you can scale its settings to infinity if you want. 5 bounces and 3 rays per pixel isn't pretty enough? How about 1024 rays per pixel and 65,000 bounces per ray? I guarantee you won't find a GPU that can run those settings at over 1 FPS for many, many years.
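Just to illustrate how fast that blows up, assuming cost scales roughly with rays per pixel times bounces per ray (a simplification; real tracers terminate rays early, denoise, and so on):

```python
# Back-of-the-envelope scaling of ray tracing workload per pixel.
def relative_cost(rays_per_pixel, bounces_per_ray):
    return rays_per_pixel * bounces_per_ray

modest = relative_cost(3, 5)            # the "3 rays, 5 bounces" case above
absurd = relative_cost(1024, 65_000)    # the deliberately absurd case
print(absurd / modest)                  # ~4.4 million times more work per pixel
```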
You sound like some kid complaining that games should never have transitioned to 3D because someone decided to render a game at 16k and needed a $2000 GPU to do so and still only achieved 13 FPS.