I realize that the details of the card's architecture are irrelevant to me as a consumer; it's just something the review on this site mentioned, that the chiplet design may actually be working against the card.
And in general, I've never liked the premise behind the whole chiplets + Infinity Fabric thing ever since the advent of Zen.
It always seemed like a cost-saving measure for AMD at the price of higher latency / worse performance, not a measure to improve performance, and early Zen CPUs had all kinds of issues associated with this architecture.
Even today, on the 4th generation of this technology, AMD achieves the best performance on CPUs that have all their cores inside one chiplet.
It's just something one of the reviews mentioned: this card loses more relative performance as the resolution increases.
It's easier to explain this with an example. Let's say we're comparing the Nvidia RTX 4080 with the 7900 XTX and establish a baseline at 1080p. For the sake of the example, let's say they have the exact same FPS at that resolution.
Then we increase the resolution to 1440p, and let's say both cards lose 25% of their FPS. They're still even so far.
Then we increase the resolution to 4K, and the RTX card loses 50% of its FPS, while the XTX card loses 65%.
The numbers are made up; this is just to explain what I meant.
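Here's a quick back-of-the-envelope sketch of that made-up example (the baseline FPS and the loss percentages are hypothetical, not benchmark data), just to show how an equal loss at 1440p but an unequal loss at 4K turns into a growing relative gap:

```python
# Sketch of the made-up example above (hypothetical numbers, not real benchmarks):
# both cards start from the same 1080p FPS, lose a given fraction of it at each
# higher resolution, and we look at the resulting relative gap.

baseline_fps = 100  # hypothetical, identical for both cards at 1080p

# fraction of the 1080p frame rate lost at each resolution (made-up numbers)
loss = {
    "1080p": {"RTX 4080": 0.00, "7900 XTX": 0.00},
    "1440p": {"RTX 4080": 0.25, "7900 XTX": 0.25},
    "4K":    {"RTX 4080": 0.50, "7900 XTX": 0.65},
}

for res, cards in loss.items():
    rtx = baseline_fps * (1 - cards["RTX 4080"])
    xtx = baseline_fps * (1 - cards["7900 XTX"])
    gap = (rtx / xtx - 1) * 100  # how much faster the RTX card is, in percent
    print(f"{res}: RTX {rtx:.0f} fps, XTX {xtx:.0f} fps -> RTX ahead by {gap:.0f}%")

# Output:
# 1080p: RTX 100 fps, XTX 100 fps -> RTX ahead by 0%
# 1440p: RTX 75 fps, XTX 75 fps -> RTX ahead by 0%
# 4K: RTX 50 fps, XTX 35 fps -> RTX ahead by 43%
```

So two cards that are dead even at 1080p and 1440p end up roughly 40% apart at 4K in this made-up scenario.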
EDIT: Found real numbers: "While the RTX 4090 does technically take first place at 1080p ultra, it's the 1440p and especially 4K numbers that impress. It's only 3% faster than the next closest RX 7900 XTX at 1080p ultra, but that increases to 8% at 1440p and then 23% at 4K."
Basically, for me the choice was either to go with an $800 Nvidia card, or a "more powerful" $1000 AMD card with better raw rasterization performance and a much larger amount of VRAM, so it could last a really long time. But the poor scaling at higher resolutions is an argument against this card's longevity.
The card is great for 1440p, which is what I'm using, and I don't really plan to upgrade to 4K in the foreseeable future (I actually think 4K gaming is a bit of a marketing scam), but I would consider using an ultrawide, and I don't want to feel my GPU carriage turning into a pumpkin.
Other issues I had with the AMD card are the seemingly still somewhat raw chiplet design (it's the first GPU series to feature chiplets at all, so this is Zen 1 all over again), the lack of Reflex-like tech to reduce latency and offset the added latency caused by frame generation, and worse upscaling / ray tracing performance.
Plus, I generally try to avoid AMD tech if possible; they always seem to be playing catch-up in terms of tech and performance. The X3D chips in gaming applications and the high core counts on early Zen CPUs, which forced Intel to lift their metaphorical ass for the first time in a decade, are practically the only examples of the opposite in recent memory. I still might upgrade my 10600KF to a 7800X3D this year, though, but for the GPU the 4070 Ti Super is pretty much perfect for me.