-Fran-
Glorious said:
"This, again, is not theoretical: it has been proven to be the case. Not even on extreme edge cases, but normal usage within games."

That's not accurate; the value is only how much VRAM has been allocated, not what is currently in use. As I've stated, modern engines do not evict old resources until they absolutely have to. Actually, it's not really the game engine but WDDM that manages this. You have a lot more graphics memory than you do VRAM; run dxdiag and check Display 1 if you want to see how much you have available. In my case I have 32GB of system memory and a 3080 Hydro (12GB) card, and WDDM reports 27.7GB of available graphics memory, meaning there can be roughly 28GB of graphics resources loaded at once.
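If you want to pull those same numbers programmatically instead of eyeballing dxdiag, here's a minimal C++ sketch using DXGI (which sits on top of WDDM). The figures and adapter index will obviously differ per system, and I've trimmed the error handling:

```cpp
// Minimal sketch (error handling trimmed): ask DXGI, which sits on top of WDDM,
// what the adapter's memory pools look like. DedicatedVideoMemory plus
// SharedSystemMemory is essentially the "available graphics memory" figure
// dxdiag reports. Link against dxgi.lib.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter);            // adapter 0 is usually "Display 1"

    DXGI_ADAPTER_DESC1 desc{};
    adapter->GetDesc1(&desc);
    std::printf("Dedicated VRAM:       %.1f GB\n", desc.DedicatedVideoMemory / 1e9);
    std::printf("Shared system memory: %.1f GB\n", desc.SharedSystemMemory / 1e9);

    // Per-process view of the two pools WDDM juggles: LOCAL = VRAM,
    // NON_LOCAL = the shared system memory segment. Budget is how much WDDM
    // will currently let this process use; CurrentUsage is what it has committed.
    ComPtr<IDXGIAdapter3> adapter3;
    adapter.As(&adapter3);

    DXGI_QUERY_VIDEO_MEMORY_INFO local{}, nonLocal{};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL,     &local);
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL, &nonLocal);
    std::printf("VRAM budget %.1f GB, this process using %.1f GB\n",
                local.Budget / 1e9, local.CurrentUsage / 1e9);
    std::printf("Shared budget %.1f GB, this process using %.1f GB\n",
                nonLocal.Budget / 1e9, nonLocal.CurrentUsage / 1e9);
}
```

Those LOCAL and NON_LOCAL groups are the two pools the rest of this post talks about.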
Now, that region is obviously split in two: 11.8GB of display memory (VRAM) and 15.9GB of shared system memory, and it's WDDM's job to manage which of the two places each graphics object lives in. I'll use a simple four-room game level that uses dynamic loading (since nobody likes loading screens) to illustrate.
Game starts - loads 1GB of common assets.
Player enters Room 1, 4GB of graphics assets are loaded.
Player enters Room 2, 2GB of new graphics assets are loaded that didn't exist in Room 1.
Player enters Room 3, 2GB of new graphics assets are loaded that didn't exist in either of the previous rooms.
Player enters Room 4, 2GB of new graphics assets are loaded.
How much total memory is "allocated" vs "needed" in this scenario? GPU-Z and other tools will show 11~12GB of total VRAM "in use" or "allocated", because there really is 11~12GB of graphics resources loaded by WDDM. How much is "needed" at any one time is only 4~6GB. Players inside Room 4 do not need the assets from Room 2 that aren't present in Room 4, and when the player is moving between rooms the engine asks WDDM to pre-load the new assets, which it then moves into VRAM. WDDM won't unload pre-existing assets unless it runs out of VRAM, because they may be referenced again. Think of unused VRAM as a rudimentary graphics cache.
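To make the allocated-vs-needed distinction concrete, here's a little toy model in C++ of that cache-like behaviour. It is emphatically not how WDDM is actually implemented, just the "keep everything resident until something new won't fit" policy applied to the room sizes above:

```cpp
// Toy model of the cache-like behaviour described above: assets stay resident
// after they stop being needed and are only evicted (oldest first) when a new
// load would exceed the budget. Not WDDM's real algorithm -- just an
// illustration of "allocated" vs. "needed". Sizes in GB.
#include <cstdio>
#include <deque>
#include <string>

struct AssetSet { std::string name; double sizeGB; };

struct ResidencyManager {
    double budgetGB;                 // e.g. 12 GB of VRAM on a 3080
    double residentGB = 0;           // what tools report as "allocated"
    std::deque<AssetSet> resident;   // oldest sets at the front

    void load(const AssetSet& a) {
        // Evict only when the new set would not fit -- never earlier.
        while (residentGB + a.sizeGB > budgetGB && !resident.empty()) {
            residentGB -= resident.front().sizeGB;
            std::printf("  evicting %s\n", resident.front().name.c_str());
            resident.pop_front();
        }
        resident.push_back(a);
        residentGB += a.sizeGB;
        std::printf("loaded %-7s resident (allocated) = %.0f GB\n",
                    a.name.c_str(), residentGB);
    }
};

int main() {
    ResidencyManager vram{12.0};
    vram.load({"common", 1});
    vram.load({"room1",  4});
    vram.load({"room2",  2});
    vram.load({"room3",  2});
    vram.load({"room4",  2});   // allocated = 11 GB, needed right now = ~3 GB
}
```

Run it and the "allocated" number climbs to 11GB even though, by Room 4, only the common assets plus the current room are actually being used each frame - exactly the gap that GPU-Z's reading hides.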
Now that we understand that, what resource consumes the most VRAM? Textures. This is where graphics presets become very important. The ultra preset almost always means unreasonably large texture sizes, as in textures larger than the display resolution. On a large 4K display with a high-powered card these could create slightly better detail than textures half their size, but on a 1080~1440p display with a low-powered card they are utterly useless. Reducing just the texture size alone will dramatically reduce the VRAM needed at any one moment in time.
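Some rough numbers to back that up, assuming uncompressed RGBA8 textures with full mip chains (real games use block compression, which shrinks everything but scales the same way):

```cpp
// Back-of-the-envelope footprint of one texture: width * height * bytes per
// pixel, plus roughly one third extra for the mip chain. Assumes uncompressed
// RGBA8 (4 bytes per pixel); block-compressed formats are smaller but the
// 4x-per-step scaling is identical.
#include <cstdio>

double textureMB(int width, int height, double bytesPerPixel = 4.0) {
    double base = width * height * bytesPerPixel;
    return base * (4.0 / 3.0) / (1024.0 * 1024.0);   // + mips, in MiB
}

int main() {
    std::printf("4096x4096 ('ultra'):  %.0f MB\n", textureMB(4096, 4096)); // ~85 MB
    std::printf("2048x2048 ('high'):   %.0f MB\n", textureMB(2048, 2048)); // ~21 MB
    std::printf("1024x1024 ('medium'): %.0f MB\n", textureMB(1024, 1024)); // ~5 MB
}
```

Each step down in resolution cuts a texture's footprint to roughly a quarter, so a level with hundreds of "ultra" textures sheds gigabytes the moment you drop one notch.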
More than 8GB of VRAM is only necessary on larger, more expensive cards driving higher resolutions and ultra texture sizes. Nobody buying a 4060 / 4060Ti is going to be playing on a high-resolution display with "ultra" settings.
Sorry, but you're wrong on this one. How the technicality works behind the scenes is quite moot.
Regards.