Bingo! And even scaling with resolution does not heavily impact memory requirements. Textures are textures: you will be loading the same textures regardless of render resolution at the same game settings (and if you scale back texture resolution at higher render resolutions - if that's even an option the game exposes - you actually reduce the memory footprint). The same goes for geometry. It's the buffers that scale with render resolution. Take 1080p to 4k for example: buffer size quadruples, but even if we assume a good 10 buffers (for depth, normals, Z, diffuse, and whatever other buffers your render pipeline involves) at 32bpp each, that goes from ~79MB at 1080p to ~316MB at 4k. Not a huge impact to total vRAM usage.
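To make the arithmetic explicit, here's a quick back-of-the-envelope sketch in Python. The 10 buffers at 32 bits per pixel are my illustrative assumptions, not what any particular engine actually allocates:

```python
# Rough estimate of render-target memory vs. resolution.
# Assumes 10 buffers at 32 bits (4 bytes) per pixel -- illustrative only.
BUFFERS = 10
BYTES_PER_PIXEL = 4

def buffer_mem_mib(width, height, buffers=BUFFERS, bpp=BYTES_PER_PIXEL):
    """Total render-target memory in MiB for one frame's worth of buffers."""
    return width * height * bpp * buffers / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f"{name}: ~{buffer_mem_mib(w, h):.0f} MiB")
# 1080p: ~79 MiB, 1440p: ~141 MiB, 4K: ~316 MiB
```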
The vast majority of vRAM is not taken up by buffers or active-use textures and geometry, but by opportunistically cached textures and geometry from the rest of the level, crammed into any spare vRAM and overwritten (with zero performance impact) if/when actual live data needs that space. That opportunistically cached data may never make its way on screen before being overwritten, but any good engine should be trying to cache it anyway whenever the PCIe bus is not otherwise occupied and there is spare vRAM: there is zero penalty for doing so, and it has a small chance of avoiding a cache miss (and the resulting system-memory or drive read) later. As DirectStorage moves from something individual developers implement themselves to a commonly available API, even that will become less of a necessity, since the access overhead for out-of-vRAM data is reduced.
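If it helps, here's a toy sketch of that policy in Python (purely illustrative, no real engine allocator works at this level): live data always wins, speculative data fills whatever space is left and is discarded for free.

```python
# Toy model of spare vRAM used as a speculative cache: live allocations
# always succeed by evicting speculative entries; prefetches only use
# leftover space. Sizes are in MiB. Not any real engine's allocator.
class VramModel:
    def __init__(self, capacity_mib):
        self.capacity = capacity_mib
        self.live = {}         # data actually needed for rendering right now
        self.speculative = {}  # opportunistically cached data, free to discard

    def used(self):
        return sum(self.live.values()) + sum(self.speculative.values())

    def allocate_live(self, name, size_mib):
        # Evict speculative entries until the live allocation fits.
        # Dropping cached-ahead data costs nothing: it was never required.
        while self.used() + size_mib > self.capacity and self.speculative:
            self.speculative.popitem()
        if self.used() + size_mib <= self.capacity:
            self.live[name] = size_mib
            return True
        return False  # genuinely out of vRAM

    def prefetch(self, name, size_mib):
        # Only cache ahead if there is spare room; never displace live data.
        if self.used() + size_mib <= self.capacity:
            self.speculative[name] = size_mib
            return True
        return False

vram = VramModel(capacity_mib=8192)
vram.allocate_live("current_chunk_assets", 4000)
vram.prefetch("next_chunk_assets", 3500)        # monitoring tools now report ~7.3 GiB "used"
vram.allocate_live("boss_arena_assets", 1000)   # silently evicts the prefetched data
```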
When you see a game 'use' large quantities of vRAM, the figure is almost always the point where the game ran out of data to cache for the loaded level/chunk, not the amount of data it actually needs for rendering.
I'm not basing the VRAM statements off of what utilities claim is used, but on actual real-world testing. Most games at 1080p are fine, a few at 1440p can exceed 8GB actual use, and a growing number are exceeding 8GB at 4K. From what I've seen, due to the number of buffers used, a game that uses perhaps 4GB of VRAM at 1080p and max settings will need just over 6GB of VRAM at 4K (Red Dead Redemption 2 and several other games that show approximate memory use follow this pattern). And a game that needs 6GB of VRAM at 1080p will need just over 8GB at 4K.
While modern games can opportunistically cache data into VRAM before it's needed, all you have to do is look at performance comparisons between cards with 8GB, 12GB, and 16GB at different settings to determine if the game is truly using more than 8GB. If the 16GB and 8GB cards perform roughly the same at 1080p but the 16GB card is twice as fast at 4K, that's a good indication the game is using more than 8GB VRAM.
The reality is that many games now have 4K textures and even 8K textures. The 8K mip levels almost never get used, but even storing the 4K and lower mip levels can use a lot of VRAM. For example, ONE 4K texture (uncompressed, 32 bits per pixel) in VRAM needs:
4K x 4K: 64MiB
2K x 2K: 16MiB
1K x 1K: 4MiB
512 x 512: 1MiB
256 x 256: 256KiB
128 x 128: 64KiB
64 x 64: 16KiB
That's ~85MiB for a single 4K texture. Games can have literally hundreds and even thousands of textures, though not all of them are used in every scene. If you look at a game's install size and subtract any video and audio files, a 100GiB game often ends up with 50–75GiB of texture data as the primary reason for the large install size. And if you have an HD texture pack, you can get upward of 50GiB of additional storage space used.
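For what it's worth, the full chain is easy to compute; here's a quick Python sketch assuming the same uncompressed 4-bytes-per-pixel format as the list above (block-compressed formats like BC1/BC7 would cut these numbers by roughly 4-8x):

```python
# Memory for a full mipmap chain of a square texture, assuming an
# uncompressed 4-bytes-per-pixel format (compressed formats are smaller).
def mip_chain_mib(base_size, bytes_per_pixel=4):
    total = 0
    size = base_size
    while size >= 1:
        total += size * size * bytes_per_pixel
        size //= 2
    return total / (1024 ** 2)

print(f"4K texture chain: ~{mip_chain_mib(4096):.1f} MiB")  # ~85.3 MiB
print(f"8K texture chain: ~{mip_chain_mib(8192):.1f} MiB")  # ~341.3 MiB
```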