Perhaps he's right, but I'll be honest: this is the first time I've heard someone say that (image quality being worse because it's on a Radeon GPU). As mentioned many times before, though, I simply fail to follow the current RT hype, given the performance hit and the barely noticeable results.
It's not a universal problem, but there are certainly individual games where the DXR effects on a Radeon GPU (and sometimes an Intel GPU as well) don't look the same as on an Nvidia GPU. I don't know if it's a drivers issue (probably) or some other root cause. In major games, this usually gets fixed pretty quickly, or at least mostly fixed. Bright Memory Infinite Benchmark, however, has some clearly incorrect or missing rendering on AMD. In the past I've seen some differences in the reflections in Watch Dogs Legion as well, but I haven't checked that recently, so it may no longer happen.
What's particularly irritating to me is how DXR has become political, in that AMD fans always try to trash it, often just because they know it's a weak spot on AMD GPUs. To me, ray tracing is simply one more aspect of a GPU that everything should now support. I test the performance of the RT hardware because it can potentially matter. It doesn't always matter, but used properly, there should be a noticeable difference between RT on and RT off.
Related to DXR being politicized, look at most AMD-promoted games that use DXR. I can't name one off the top of my head where I would say, "Wow, ray tracing makes this look really different and a lot better." Star Wars Jedi: Outcast has some areas where DXR looks better than non-RT, but also areas where it looks worse. Many other AMD-promoted games have DXR implementations where I can only think, "Why bother?" Far Cry 6 is one of those. And it's not because the non-RT graphics look amazing; it's that the RT graphics have been "optimized" to avoid much of a performance hit... by not really doing much RT! That's my feeling, at least. Deathloop was like that as well, and so were Dirt 5 and several others. Nvidia did the same with the first generation of RTX cards and games, so you'd only get (weak) RT shadows in Shadow of the Tomb Raider, (better but not often that important) RT reflections in Battlefield V, or (meh) RT global illumination in Metro Exodus.
Once the RTX 30-series came out, Nvidia began helping and encouraging developers to use more than just a single effect. Actually, it even happened before the 30-series with Control, but that's the only game I can name offhand, and it also seems to do a few odd things. It just doesn't look quite as crisp as I'd expect, with or without DLSS, almost as if it's always rendering certain effects at half resolution. Anyway, it's very much a chicken-and-egg situation: we need better hardware to encourage better and more complex RT implementations, and better RT implementations will encourage more investment in RT hardware. The PS5 and Xbox consoles using midrange AMD RDNA 2 hardware definitely didn't spur an increase in the use of complex ray tracing, and we might not see a big change until their successors arrive in four or five years.
Regarding DLSS, you have to understand that there are cases where the default TAA is so blurry that DLSS looks better. Unreal Engine games often fall into this category. I don't know why the TAA is so bad in some games, but I actually prefer DLSS upscaling (in quality mode), or DLAA if it's available, over TAA. DLAA in particular is demonstrably better. But even without DLAA, the performance uplift, latency reduction, and extremely minimal loss in image quality (in virtually every game I've checked) make DLSS an excellent feature. The worst thing about DLSS is that it won't work on AMD or Intel GPUs, but unlike certain other features (PhysX), that's not just because Nvidia forced it to be that way. DLSS uses Nvidia's tensor cores and ties into the drivers, and without those there's no way to make it work on non-Nvidia hardware.
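To put the performance side of that tradeoff in rough numbers, here's a minimal sketch (my own illustration, not something from the post) of the approximate internal render resolutions behind the common DLSS presets, using the widely cited per-axis scale factors; exact values can vary by game and SDK version.

```python
# Sketch: approximate internal render resolution for common DLSS presets.
# Assumed per-axis scale factors (not from the original post): Quality ~2/3,
# Balanced ~0.58, Performance 0.5, Ultra Performance ~1/3, DLAA 1.0.

DLSS_SCALE = {
    "DLAA": 1.0,               # native-resolution anti-aliasing, no upscaling
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(output_w: int, output_h: int, mode: str) -> tuple[int, int]:
    """Return the approximate resolution the game renders at before upscaling."""
    scale = DLSS_SCALE[mode]
    return round(output_w * scale), round(output_h * scale)

if __name__ == "__main__":
    # Example: 4K output. Quality mode shades well under half the output pixels,
    # which is where most of the performance uplift comes from.
    for mode in DLSS_SCALE:
        w, h = internal_resolution(3840, 2160, mode)
        pixel_pct = (w * h) / (3840 * 2160) * 100
        print(f"{mode:17s} -> {w}x{h} ({pixel_pct:.0f}% of the output pixels)")
```

Running that for a 4K output shows, for example, Quality mode rendering at roughly 2560x1440, which is why the image-quality loss is small while the frame-rate gain is large.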