It's strictly a resolution issue.
The 3 GB version has fewer shaders (1152 vs 1280) and, obviously, less RAM. On average the 3 GB is about 7% slower than the 6 GB version in most games. At 1080p, there's really no need for more than 3 GB, and at 1440p, more than 4 GB isn't doing anything for ya. So whether it's "worth it" can best be weighed by the difference in cost, and that varies daily.
The 3 GB version can sometimes be found as low as $195. The MSI Gaming X is $210 for the 3 GB and $240 for the 6 GB, both after $20 MIRs. So you'd be spending 14% more to get a 7% increase in performance ... not exactly a good ROI.
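To put numbers on that ROI point, here's a quick sketch using the after-rebate prices quoted above and the ~7% average performance delta; both figures vary daily, so treat the output as illustrative, not definitive:

```python
# Cost vs. performance check, 3 GB vs 6 GB GTX 1060.
# Prices are the after-MIR figures mentioned above; perf delta is the
# rough 7% average across games. Both change daily.

price_3gb = 210.0   # MSI Gaming X 3 GB, after $20 MIR
price_6gb = 240.0   # MSI Gaming X 6 GB, after $20 MIR
perf_gain = 0.07    # 6 GB is ~7% faster on average

# How much more money for how much more performance?
extra_cost = (price_6gb - price_3gb) / price_3gb
print(f"Extra cost: {extra_cost:.1%}, extra performance: {perf_gain:.1%}")

# Performance-per-dollar ratio of the 6 GB relative to the 3 GB.
# Below 1.0 means the 6 GB delivers less performance per dollar.
ratio = (1 + perf_gain) / (price_6gb / price_3gb)
print(f"Perf-per-dollar, 6 GB relative to 3 GB: {ratio:.3f}")
```

At these particular prices the 6 GB card comes out behind on performance per dollar; a sale price changes the math, which is the whole point.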
The MSI 3 GB model has about the same overall performance as the RX 480, which is much more expensive, and is about 6% slower than the reference 1060:
https://www.techpowerup.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/26.html
At 1440p, I'd get the 6 GB. At 1080p, that $30 could be better spent elsewhere; the 3 GB will do just fine.
Most folks have a misconception about VRAM usage ... I can't count the number of posts claiming that they did this or that, or read this or that, and under some condition the card **used more than x GB**. The fact that no tool exists which can measure VRAM usage doesn't seem to sway anybody. What GPU-Z and other utilities measure is VRAM allocation. This is very different from usage. People put in a 4 GB card and an 8 GB card, see 4.5 GB in GPU-Z, and have that big "aha" moment, not realizing that GPU-Z isn't saying what they think it's saying.
GPU-Z is like the credit reporting agencies. If you have a $500 balance on your VISA card and that card has a $5,000 limit, which number do the credit agencies report? $5,000. The bank you went to for your car loan wants to know your maximum liabilities, and VISA has already authorized you to spend $5,000, so $5k is what gets reported.
Graphics cards do the same thing. When you install a game, the install routine sees how much VRAM is available ... if it sees 3 GB, it might allocate 2 GB; if it sees 6 GB, it might allocate 4 GB.
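A toy sketch of that allocation behavior, with made-up numbers (the 2/3 share and the "actually needed" figure are invented for illustration, not from any real engine):

```python
def planned_allocation_mb(available_vram_mb: int) -> int:
    """Toy model of a game sizing its VRAM pool: it grabs a share of
    whatever the card reports, even beyond what it truly touches.
    The 2/3 fraction and 2000 MB figure are invented for illustration."""
    actually_needed_mb = 2000                        # what the game truly uses
    opportunistic_pool = available_vram_mb * 2 // 3  # extra, "because it's there"
    return max(actually_needed_mb, opportunistic_pool)

# Same game, two cards: a tool like GPU-Z reports the *allocation*,
# which scales with card size even though real usage is identical.
for card_mb in (3072, 6144):
    print(f"{card_mb} MB card -> {planned_allocation_mb(card_mb)} MB allocated")
```

The 3 GB card shows roughly 2 GB allocated and the 6 GB card roughly 4 GB, for the exact same workload; that gap is the monitoring artifact, not extra "usage."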
http://www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x
From that article: "GPU-Z claims to report how much VRAM the GPU actually uses, but there's a significant caveat to this metric. GPU-Z doesn't actually report how much VRAM the GPU is actually using — instead, it reports the amount of VRAM that a game has requested. We spoke to Nvidia's Brandon Bell on this topic, who told us the following: 'None of the GPU tools on the market report memory usage correctly, whether it's GPU-Z, Afterburner, Precision, etc. They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn't mean that they actually use it. They simply request it because the memory is available.'"
Alienbabeltech did the most exhaustive test on this topic, running 40+ games on 2 GB and 4 GB GTX 770s at 5760 x 1080. Only 5 games showed a significant difference, and that was at 5760 x 1080 when the card was so overloaded that both the 2 GB and 4 GB cards could not deliver 30 fps. The one game that showed an oddity at playable frame rates was Max Payne: they could not install the game at that resolution with the 2 GB card ... so they installed it with the 4 GB card, then swapped in the 2 GB card ... same frame rates, same visual quality, same everything.
You can see the same thing here and here:
http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,12.html
https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/
There are games that have issues:
a) Poor console ports like Assassin's Creed: Unity
b) Games with hi-res texture packs intended for higher resolutions ... if you try to load the high-res textures, the game may not let you ... but if you're at 1080p, why would you do that?
c) Some DX12 titles (Rise of the Tomb Raider and Hitman) take a hit with 3 GB. Not sure why as yet, since some titles with similar frame rates and textures don't; haven't seen an explanation for it.