Why infer when we have data? The proof of the pudding is in the eating.
I was talking about it down to the MB level. Unlike system memory, we can't know exactly how much VRAM is actively being used, but the allocated amount is often not far from actual use, and the signs of a VRAM shortage are clear: once you dip into system memory, the performance hit is extreme. In the case of the GTX 970, it has aged poorly compared to equivalent AMD cards, and it is entirely due to the VRAM. Newer games wanting 9+GB is a separate issue, but for a few years we were in a period where games wanted 4-7GB; the GTX 970 had the compute power to run them, yet the lack of VRAM meant those games would perform poorly as soon as the shared memory usage started to climb. In those cases, the AMD cards remained very playable while the Nvidia card would drop to the unplayable range.
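For anyone who wants to see that allocated number directly, here is a minimal sketch using NVML via the pynvml package (this assumes an NVIDIA GPU with the driver installed, and it is just one way to read it). Note that NVML reports memory *allocated* on the device, not what is actively touched each frame, which is exactly why allocation is only a proxy for real use:

```python
# Minimal sketch: read allocated VRAM via NVML (requires the pynvml package
# and an NVIDIA driver). Reports allocations, not actively-used memory.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total:     {mem.total / 2**20:.0f} MB")
print(f"allocated: {mem.used / 2**20:.0f} MB")
print(f"free:      {mem.free / 2**20:.0f} MB")

# Per-process allocations for graphics workloads (roughly what per-app
# GPU memory columns in monitoring tools show). usedGpuMemory can be
# None when the driver won't expose it to the current user.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    if proc.usedGpuMemory is not None:
        print(f"pid {proc.pid}: {proc.usedGpuMemory / 2**20:.0f} MB")

pynvml.nvmlShutdown()
```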
In the case of system memory, we get a little more detail.
On a side note, the overarching issue with all of this when it comes to newer video cards is that we are seeing games where the GPU is capable of handling higher settings but the VRAM is holding it back. As more and more PS5-level games get released on PC, we will see more cases of cards like the RTX 3070 Ti requiring the user to run lower settings than the 12GB RTX 3060, and cards like the RX 6800 and 6800 XT opening up increasingly large performance leads.