I remember this video demonstrating the problems with 8GB vs 16GB cards.
Oh you can absolutely play with settings or mods to require more VRAM. But from a fresh vanilla install those are the results.
As for that video, people really do not understand how modern graphics frameworks operate. I cannot stress this enough: we now treat VRAM as a cache. Just because a program loads 10GB of data into system RAM does not mean we need 10GB of L3 cache. Having 10GB of L3 cache and fitting the entire data set into it would be wonderful for performance (see the AMD 9800X3D), but it is not a requirement.
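To make the cache analogy concrete, here is a toy sketch (all names and sizes are invented for illustration; real drivers are far more sophisticated). The game may stream many gigabytes of assets through system RAM, but only the subset touched by recent frames has to be resident in VRAM, managed here as a simple LRU cache:

```python
from collections import OrderedDict

class VramLru:
    """Toy model: VRAM holds the hot subset of assets, evicted LRU-style."""

    def __init__(self, capacity_mb):
        self.capacity_mb = capacity_mb
        self.resident = OrderedDict()  # asset name -> size_mb, in LRU order

    def touch(self, asset, size_mb):
        """Mark an asset as used this frame; return True on a cache hit."""
        hit = asset in self.resident
        if hit:
            self.resident.move_to_end(asset)  # refresh its LRU position
        else:
            # Evict the coldest assets until the new one fits.
            while sum(self.resident.values()) + size_mb > self.capacity_mb:
                self.resident.popitem(last=False)
            self.resident[asset] = size_mb
        return hit

vram = VramLru(8192)                        # hypothetical 8GB card
for name in ["level_a", "level_b"] * 50:    # small working set, reused often
    vram.touch(name, 1024)
# After warm-up, everything the frames actually use stays resident,
# no matter how much total data sits in system RAM.
```

The point is that hit rate depends on the per-frame working set, not the total data loaded, which is exactly why total allocation numbers in overlays overstate what a game "needs".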
Textures not rendering has nothing to do with VRAM; that is a graphics engine bug, likely the result of a developer trying to tune something to prevent a potential performance hit. It appears to be an issue with async shader compilation interacting with a delayed render while the pipeline is paused: instead of waiting, the game just seems to skip the whole thing and move on.
Resources are always loaded first into a region of system memory called Display Memory or Shared Memory, depending on who you ask. At that point the game engine has done its job, and it is up to the framework (DX11/12/Vulkan) to preload that resource into VRAM. If the framework does its job, the assets are always there and every frame renders properly. If it misses something, it has to pause the rendering pipeline, transfer that resource from system memory to VRAM over the memory and PCIe buses, and only then resume rendering the frame. From a user's point of view this is observable as a stutter or a spike in frame latency. Every now and then it's not a big deal, but if the framework is constantly having to do this it becomes very noticeable.
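That pause-transfer-resume sequence is why misses show up as stutter. A minimal sketch, with entirely assumed numbers: a frame that finds every asset resident costs only its render time, while a frame that has to pull an asset over PCIe mid-frame pays a large one-off penalty:

```python
# Hypothetical timings, purely for illustration.
RENDER_MS = 8.0         # base frame time when everything is resident
PCIE_PENALTY_MS = 25.0  # assumed stall to fetch one asset mid-frame

def frame_time(assets_needed, resident):
    """Render one frame; uploading any missing asset stalls the pipeline."""
    misses = [a for a in assets_needed if a not in resident]
    resident.update(misses)  # the framework uploads what was missing
    return RENDER_MS + PCIE_PENALTY_MS * len(misses)

resident = set()
times = [frame_time({"hud", "terrain"}, resident) for _ in range(5)]
# The first frame stutters while both assets upload; once resident,
# the following frames come in at the smooth base render time.
```

One expensive frame among many cheap ones is exactly the frame-latency spike described above; when VRAM is too small for the working set, that expensive case happens every few frames instead of once.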
Here is a German website that does a good job of illustrating this. They used high-resolution texture pack mods and settings (DLSS/FG/RT) to demonstrate the difference between the cards.

RTX 4060 Ti 16GB vs. RTX 4060 Ti 8GB: Where 16 GiB is unquestionably worth it - test with 11 games
Does a GeForce RTX 4060 Ti benefit from 16 GiB or not? To find out, we tested eleven games with a big VRAM appetite at three settings levels.

Diablo 4 really highlights this happening.

You can see those massive stutters as the VRAM fills up and the graphics framework has to evict assets and reload them from system memory.
This all centers on the word "need". You absolutely do not "need" more than 8GB of VRAM for mid-tier 1080p or possibly even 1440p gaming. Having more than 8GB of VRAM does allow someone to use DLSS/FG/RT along with 4K (high-resolution) textures, which is the stuff you would normally associate with a 70-class or higher card. People may want more than 8GB to use the new shiny stuff, and that is perfectly fine.
Now the 50 series cards might change this: the memory bandwidth increase is pretty large, and DLSS/MFG/RT might not be the performance destroyers they were on older cards.