I haven't bought an nVidia discrete GPU since 2001, IIRC. In 2002, ATi stole the show for a couple of years with the R300 and its descendants; nVidia couldn't make anything that came close during that period. It's been a steady stream of ATi/AMD GPUs for me ever since, and I have zero complaints. My last Intel CPU was an original Pentium, circa 1998. The A64 from AMD (Win2k and 64-bit Vista, as I recall) was quite nice while Intel was sticking to 32 bits and running ad campaigns along the lines of:
"You don't need 64 bits on the desktop" (believe it or not!). That was only because they weren't ready with a 64-bit x86 CPU and were still trying to move Itanium with RDRAM. Eventually, Intel threw in the Itanium towel, licensed x86-64 from AMD, went back to SDRAM (selling off its $1b stake in Rambus), and then proceeded to beat AMD at its own performance game for several years, once A64/Opteron finally ran out of steam and AMD had nothing in the pipeline (most unlike the current AMD). But I found AMD adequate, as I've said, and a better buy, and there were no games I couldn't play. So I've been all AMD ever since.
No problems with gaming along the way, either. I'm not overly impressed by benchmarks, because companies (looking mainly at you, nVidia, cough) have been known to cheat on widely used benchmarks, especially synthetics. My angst over nVidia's tactics goes all the way back to the 3dfx days, and I still remember the dirty tricks nVidia pulled back then; the 3DMark cheating was a big deal at the time, and I recall it vividly. My bottom line is value, and whether I can play games smoothly without stutter, and AMD has been fine over the years in that regard. Of course, today there are no concerns about GPU frame rates or CPU performance with AMD.
But...what other people buy doesn't bother me. I want them to enjoy whatever they buy! Just like I do. We owe it to ourselves...