I currently have a GTX 1060 3GB and I'm worried that I won't be able to run future AAA games as efficiently. I plan on gaming at 60 FPS 1080p with one monitor and I also have an i5 7600k CPU and 16GB of RAM.
Perhaps less than you think.
VRAM has become a marketing issue.
My understanding is that VRAM is more of a performance issue than a functional one.
A game needs to keep most of the data it uses most of the time in VRAM, somewhat like regular RAM.
If a game needs something that isn't in VRAM, it has to fetch it across the PCIe bus,
hopefully from system RAM and hopefully not from a hard drive.
Knowing how full the available VRAM is isn't very informative.
Much of what sits there may not actually be needed.
What you don't see is the rate of exchange, how often data has to be swapped in and out of VRAM.
VRAM is managed by the graphics card driver and by the game, so there may be differences in effectiveness between AMD and Nvidia cards, and between games.
Here is an older performance test comparing 2GB with 4GB of VRAM.
Spoiler... not a significant difference.
A more current set of tests shows the same results: http://www.techspot.com/review/1114-vram-comparison-test/page5.html
And... no game maker wants to limit their market by requiring huge amounts of VRAM. The VRAM usage you see will be appropriate to the particular card.
If you use mods, as many Skyrim players do, there may be a real limitation, since heavily modded setups (high-res texture packs in particular) can demand more VRAM than the card has.
For example, take the GTX 1060 3GB vs the GTX 1050 Ti 4GB. Besides the smaller memory, the 1060 3GB also has slightly fewer CUDA cores than the 1060 6GB.
The 1060 is the more capable card overall, but the 3GB version falls behind the 1050 Ti in a few games because of the memory limitation, even at 1920x1080.
Doom and Hitman definitely perform better on the 1050 Ti than on the 3GB 1060. The 1060 still wins in Rise of the Tomb Raider, but only barely, and its minimum frame rate is worse than the 1050 Ti's.
Yet in the other games, the 1060 3GB crushes the 1050Ti.
I got this from looking at the graphs for the review of the AMD RX 570.
The vast majority of VRAM is used to hold textures. When the card needs to place a texture on something it's drawing, it can then immediately read the texture from VRAM instead of having to pull it from system RAM or (heaven forbid) the hard drive.
A texture is actually stored in multiple sizes in VRAM (mip mapping), so the card can grab the appropriately sized version for however large the object appears on screen, instead of having to scale the texture down every time. It used to be that the mip maps were generated when the texture was read off disk, but nowadays most artists create the texture along with its mip maps for maximum artistic quality.
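To make the mip-chain overhead concrete, here's a quick Python sketch (the function name is mine, not from any graphics API) that lists the levels of a chain for a 1024x1024 texture and totals their size, assuming uncompressed 4-byte RGBA pixels:

```python
def mip_chain(width, height, bytes_per_pixel=4):
    """Yield (width, height, size in bytes) for each mip level down to 1x1."""
    w, h = width, height
    while True:
        yield (w, h, w * h * bytes_per_pixel)
        if w == 1 and h == 1:
            break
        # each level halves both dimensions, clamped at 1 pixel
        w, h = max(1, w // 2), max(1, h // 2)

levels = list(mip_chain(1024, 1024))
total = sum(size for _, _, size in levels)
print(f"{len(levels)} levels, {total / (1024 * 1024):.1f} MB total")
```

The total comes out to about 5.3 MB for a 4 MB base texture, i.e. the chain of smaller versions adds roughly a third on top (the geometric series 1 + 1/4 + 1/16 + ...).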
So a high-res (1024x1024) texture stored uncompressed takes (1024^2 pixels)*(4 color bytes)*(4/3 for the mip chain) = roughly 5.3 MB. 2k textures take 4x as much, about 21 MB. 4k textures take 4x as much again, about 85 MB. In practice games store most textures in compressed formats (DXT/BC), which cuts those numbers by a factor of 4 to 8, so even a thousand 1k textures loaded at once would land around 1 GB rather than the 5 GB or so the uncompressed math suggests. And typically, a game only keeps a few hundred textures loaded at once (the loading screen between levels is mostly spent reading new textures off disk). So unless you're running lots of 2k or 4k textures, there isn't much danger of running out of VRAM. Given that you're highly unlikely to run higher than 1080p on a GTX 1060, you should be fine. If you ever do run out of VRAM, simply drop the texture quality down one notch.
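A hedged back-of-the-envelope calculator for those per-texture figures, assuming uncompressed RGBA and a 4/3 mip-chain factor (the function and variable names are mine; real games compress textures, so treat these as upper bounds):

```python
def texture_vram_mb(size_px, bytes_per_pixel=4, mip_factor=4 / 3):
    """Approximate VRAM in MB for one square texture with a full mip chain."""
    return size_px * size_px * bytes_per_pixel * mip_factor / (1024 * 1024)

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: {texture_vram_mb(size):.1f} MB")

# A thousand uncompressed 1k textures resident at once, in GB:
budget_gb = 1000 * texture_vram_mb(1024) / 1024
print(f"1000 x 1k textures: {budget_gb:.1f} GB (before compression)")
```

Each step up in resolution quadruples the cost, which is why the texture-quality slider is the first thing to lower on a 3GB card.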
Anisotropic filtering also has a cost, though on modern cards it's mostly memory bandwidth rather than VRAM: instead of storing pre-skewed copies of the texture (the old "ripmap" approach did that), the GPU takes extra samples along the viewing angle to reduce blur on surfaces seen nearly edge-on, like a road. You can turn it down too if you're short on resources.
The other parts of the game don't take up much VRAM. Most models are a few hundred kB to a few MB in size. A 1080p framebuffer is only about 8 MB, and you'll have three of those if Vsync with triple buffering is enabled. 4x multisample anti-aliasing roughly quadruples the framebuffer footprint, since the card stores four samples per pixel before resolving them down to one display pixel. But nowadays other anti-aliasing methods that are less intensive (in both processor time and VRAM use) give nearly as good results.
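The framebuffer arithmetic is easy to sanity-check as well; a small sketch (names mine), assuming 4 bytes per pixel:

```python
def framebuffer_mb(width=1920, height=1080, bytes_per_pixel=4,
                   buffers=1, samples=1):
    """Approximate color-buffer VRAM in MB; samples > 1 models MSAA storage."""
    return width * height * bytes_per_pixel * buffers * samples / (1024 * 1024)

single = framebuffer_mb()                      # one 1080p buffer, ~7.9 MB
triple = framebuffer_mb(buffers=3)             # triple buffering, ~24 MB
msaa4x = framebuffer_mb(buffers=3, samples=4)  # 4x MSAA on top, ~95 MB
```

Even in the worst case here, the buffers are small change compared with what textures consume.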