"In an attempt to hammer its point home, AMD supplied some of its own benchmark comparisons versus Nvidia chips, pitting its RX 6800 XT against Nvidia's RTX 3070 Ti, the RX 6950 XT with the RTX 3080, the 7900 XT versus the RTX 4070 Ti, and the RX 7900 XTX taking on the RTX 4080."
I had initially thought that comparing the RX 6800 XT to the RTX 3070 Ti, and the RX 6950 XT to the RTX 3080, was another bone-headed move by AMD's marketing team. We all know that the real rival of the RTX 3070 Ti is the RX 6800, and the real rival of the RTX 3080 is the RX 6800 XT. I wondered why AMD would do this; it's like comparing the RX 6800 XT to the RTX 3090. It seemed insane...
Then I remembered just how overpriced the RTX 3070 Ti is. In fact, right now, it's more expensive than the RX 6950 XT! (EDIT: Newegg now has a PNY Verto 3070 Ti that's only $600, but that's still an insane price when the 4070 is also $600.) Meanwhile, the RX 6950 XT outclasses the RTX 3070 Ti in every way: it beats the RTX 3070 Ti with RT on and absolutely demolishes it with RT off. This means that anyone who buys an RTX 3070 Ti at this point is fit for the funny farm:
The RTX 3070 Ti is actually $10 more expensive than the RX 6950 XT but $30 more expensive if you claim the $20 MIR from ASRock:
Gigabyte GeForce RTX 3070 Ti 8GB - $640
ASRock Radeon RX 6950 XT 16GB - $630 (-$20 MIR = $610)
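The price gap above is simple enough to sanity-check. Here's a quick sketch of the arithmetic, using the Newegg listings quoted in the post (the `effective_price` helper is just mine for illustration, not anything from a retailer API):

```python
# Prices are the Newegg listings cited above; the MIR
# (mail-in rebate) only applies to the ASRock card.
listings = {
    "Gigabyte GeForce RTX 3070 Ti 8GB": {"price": 640, "mir": 0},
    "ASRock Radeon RX 6950 XT 16GB": {"price": 630, "mir": 20},
}

def effective_price(entry: dict) -> int:
    """Sticker price minus any mail-in rebate."""
    return entry["price"] - entry["mir"]

rtx = listings["Gigabyte GeForce RTX 3070 Ti 8GB"]
rx = listings["ASRock Radeon RX 6950 XT 16GB"]

print(rtx["price"] - rx["price"])          # sticker gap: $10
print(rtx["price"] - effective_price(rx))  # gap after MIR: $30
```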
I think the reason AMD didn't compare the RTX 3070 Ti with the RX 6950 XT is that the pricing situation in the USA might not be the same everywhere else in the world.
No matter which company you prefer, there's no question that AMD is right. We've seen from all reviewers that nVidia hasn't been putting enough VRAM on cards that weren't made specifically for mining (aka anything not named RTX 3060). The stuttering that I've seen in modern AAA titles whenever a reviewer was using an 8GB card, even at 1080p, is not a surprise to me.
People paid way more for these cards than for their Radeon rivals (because they didn't realise just how unimpressive RT actually is), and that is a travesty, but it's a travesty of their own making. This is not a case of nVidia being dishonest or misleading people, because the amount of VRAM was clearly stated in all listings and on all retail boxes.
Jensen Huang knows that a lot of people who buy nVidia will only buy nVidia because they have all the tech-savvy of a clueless noob and have never owned anything else. He knew that those sheep would buy his cards even if they had 2GB of VRAM (I'm exaggerating, of course, but you get the point). Anyone who is being forced to drop graphics settings on their RTX 3060 Ti, RTX 3070, RTX 3070 Ti or RTX 3080 has only themselves to blame.
I'd like to say that this is another example of nVidia screwing consumers, but in this case, nVidia only had to let consumers screw themselves with their own stupidity. I don't think that nVidia really did anything wrong (for once), because they actually weren't deceptive with their marketing (like they were with the RTX 3060 8GB).
This whole situation is a result of people being lazy and stupid. From 2020 to 2022, we all know that prices were astronomical. Wouldn't it have behooved people to actually understand just what the hell they were throwing $1,000+ at? Of course it would, but people who are intellectually lazy have a tendency to be brand-whores, and they often have to learn the hard way.
From the beginning, I had said that 10GB was a ridiculously small frame buffer for a card with the GPU power of the RTX 3080. Of course, lots of gamers (especially the younger ones) tend to be brain-dead egomaniacs who think that they know everything (walking Dunning-Kruger case studies). They're not brave enough to admit when they're wrong, but in this case, reality has kicked them square in the nads.
I only hope that, going forward, people remember this harsh lesson because while it didn't happen to me (RX 6800 XT), I don't like the idea of people getting fleeced, especially today.