And still, the 4090 is the worst GPU ever made. The amount of issues that SKU has is mind-blowing.
Cracking PCBs, GPU solder failures, the 12VHPWR connector...
I'd take an XTX over a 4080 or a 4090 any day in the current gen... and that's before even talking about pricing...
CUDA pays for my GPUs these days, so there wasn't much choice.
What turned out rather mind-blowing was the progress DLSS brought to gaming.
I have an RTX 4090 and an RTX 4070 side-by-side in two 16-core Ryzen 9 systems, the 4070 on a 5950X and the 4090 now on a 7950X3D. Those CPUs are much closer in performance than I'd like at the moment, but what's been most impressive is how competent the 4070 has become at 4K gaming.
My kids love ARK so that's what we play, Evolved first and Ascend now.
Evolved runs on a very early Unreal 4 engine and suffers from terrible load times (on Windows) with hundreds of thousands of files and is 100% raster-only.
Ascend runs on Unreal 5, collects its assets into a few large files, and supports DLSS up to v3, but none of the Intel or AMD alternatives.
My screen is a 42" 4K panel at below arm's length, and the lower resolutions just look awful on it.
Both ARKs were initially unplayable on the 4070 and far from smooth on the 4090.
But after giving the 5950X some PBO attention, I did another round of ARK Ascend on the 4070 at 4K, full Epic effects, but with DLSS on auto and frame generation enabled, and the result was rather impressive: there was simply no need to pay for the (GPU) monster to kill or tame the ones in ARK. A 4070 with DLSS 3 is just as good as a "GTX" 4090.
My first ATI was a Mach8, the very first graphics accelerator I owned. I stuck with them until the R9 290X and for some years ran the Nvidia equivalent in parallel for comparison. But once CUDA became important, work would no longer sponsor AMD/ATI hardware.
The only dGPU in my stables not from team green is an Arc A770M, which only made it into my home lab because it came basically for free in an Intel NUC I bought as a CPU-only µ-server.
ARK Ascend is unplayable on the Arc, and much of that may be the lack of XeSS support, judging from some Hogwarts Legacy tests I've been doing; that game supports every tensor-based upscaling variant currently on the market.
I love AMD for the FreeSync impact on monitor prices, and I appreciate Intel keeping everyone on their toes with XeSS.
But they are no meaningful competition for my use cases today.
As for hardware issues, I must just have been lucky. Nothing has cracked or burned, but I've been extra careful inserting cables, supporting heavy boards, and never transporting systems with the boards plugged in.
The last real failure was an MSI GTX 780, I believe, which was factory-overclocked a bit too much and never really worked, until I learned how to downclock it a notch, by which time it was too late to matter.