Nvidia's "Lossless Image Compression" (LIC) technology allows for more efficient use of VRAM by compressing graphics data.I read and heard somewhere (I can't recall exactly where) that Nvidia can get away with lower VRAM with their GPUs compared to their AMD counterparts since they have mastered something in relation to "Lossless Image Compression". If I'm not mistaken and from how I understand it, it works just like video encoding, their GPU can compress graphics and artifacts in a very lossless manner that the VRAM isn't used all too much. I've noticed this with the 6800XT and the RTX 3080, both cards are very capable and very similar in terms of performance. But the RTX 3080 most of the time is "faster" (Depends on how the benchmark was done really). Is this true? If this were the case, then the "longevity" that AMD offers with their GPUs because of their high VRAM isn't really a true "selling point" then, because both the 6800XT and RTX 3080 have been tested this year yet again in some video I watched, and their performance is relatively close with each other. But of course this argument will be different for people who play at 1440p and 4K, but for yours truly, I only game at 1080p, so the longevity with AMD GPUs is not really for "longevity" since at 1080p, those cards perform the same and the RTX 3080 with less VRAM pretty much performs the same, or in most cases, is faster, and comes with all the marvels like RT and DLSS. So what's the point of having more VRAM on a GPU if there's another one that pretty much performs the same but with less VRAM. I heard AMD is pretty much a million miles away from mastering this so-called "Lossless Image Compression" that Nvidia has already mastered, or so they say. Correct me if I'm wrong here guys, willing to learn as much as I can. Apologies for the long read.
GPU performance is influenced by various factors, including architecture, clock speeds, memory bandwidth, and software optimization, not just VRAM capacity.
Benchmark results can vary depending on test conditions, settings, and games used.
For gaming at 1080p, the benefits of higher VRAM capacity are generally smaller than at higher resolutions like 1440p or 4K, where framebuffers and higher-resolution textures consume more memory (see the rough numbers below).
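To put rough numbers on why resolution matters, here is a back-of-the-envelope sketch in Python. The 4-bytes-per-pixel RGBA8 format and the per-render-target framing are simplifying assumptions on my part; real games allocate many buffers plus textures, so treat this as an illustration, not a VRAM calculator.

```python
# Back-of-the-envelope: per-frame render-target memory scales with
# resolution, which is one reason VRAM pressure rises at 1440p/4K.
# Assumes a simple RGBA8 target (4 bytes per pixel); real engines use
# many buffers in varying formats, so these are rough illustrations.

def target_mib(width, height, bytes_per_pixel=4):
    """Size of one render target of the given resolution, in MiB."""
    return width * height * bytes_per_pixel / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: {target_mib(w, h):.1f} MiB per RGBA8 target")

# Prints roughly 7.9 MiB, 14.1 MiB, and 31.6 MiB: a 4K target holds
# exactly four times the pixels of a 1080p one.
```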
Both Nvidia and AMD have shipped lossless memory compression (often called delta color compression) for several GPU generations now; it mainly saves memory bandwidth rather than shrinking the VRAM footprint, because lossless compression can't guarantee a fixed ratio, so allocations still have to be sized for the worst case. Both vendors continue to make advancements in GPU technologies and optimizations.
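To illustrate the core idea, here is a toy delta-encoding sketch in Python. It is not Nvidia's or AMD's actual hardware scheme, just a minimal illustration of why such compression can be lossless and why it works best on smooth, predictable image data:

```python
# Toy sketch of delta encoding, the idea behind GPU "delta color
# compression". NOT any vendor's real hardware scheme; it just shows
# why the technique is lossless: store one anchor value plus the
# differences to its neighbors, and decoding reproduces the exact
# original values bit for bit.

def delta_encode(pixels):
    """Store the first pixel, then the difference to each neighbor."""
    if not pixels:
        return []
    out = [pixels[0]]
    for prev, cur in zip(pixels, pixels[1:]):
        out.append(cur - prev)  # smooth gradients -> tiny deltas
    return out

def delta_decode(encoded):
    """Reverse the encoding; the round trip is exact, hence lossless."""
    if not encoded:
        return []
    out = [encoded[0]]
    for d in encoded[1:]:
        out.append(out[-1] + d)
    return out

# A smooth gradient (common in rendered images) yields mostly tiny
# deltas, which the hardware can pack into fewer bits; noisy data
# would not compress, which is why the savings vary per scene.
scanline = [100, 101, 102, 102, 103, 104]
encoded = delta_encode(scanline)          # [100, 1, 1, 0, 1, 1]
assert delta_decode(encoded) == scanline  # round trip is exact
```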
VRAM capacity is just one factor to consider, and it's important to assess benchmark results across different games and scenarios to understand overall performance capabilities.