The problem is you don't use more than 3.5 GB in anything at 1080p and 1440p when doing anything "normal" .... by the time you push the settings to the point where VRAM becomes an issue, your fps is already at an unsatisfactory level. So what if you run out of VRAM at settings where your GPU is delivering only 23 fps?
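If you'd rather verify that 3.5 GB figure on your own system than take anyone's word for it, here's a minimal sketch that polls VRAM usage through nvidia-smi (which ships with the NVIDIA driver) while you play. It assumes nvidia-smi is on your PATH; the one-second interval and GPU index 0 are arbitrary choices for illustration.

```python
# Minimal sketch: log current and peak VRAM usage while a game runs.
# Assumes an NVIDIA card with nvidia-smi available on the PATH.
import subprocess
import time

def vram_used_mib() -> int:
    """Return VRAM currently in use on GPU 0, in MiB, as reported by nvidia-smi."""
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used",
        "--format=csv,noheader,nounits",
    ])
    return int(out.decode().splitlines()[0])

if __name__ == "__main__":
    peak = 0
    try:
        # Alt-tab into the game and play; Ctrl+C here when done.
        while True:
            used = vram_used_mib()
            peak = max(peak, used)
            print(f"current: {used} MiB, peak: {peak} MiB")
            time.sleep(1)
    except KeyboardInterrupt:
        print(f"peak VRAM observed: {peak} MiB")
```

Keep in mind this reports memory the driver has allocated, not memory the game actively needs every frame; games will happily cache extra when it's available, which is one reason a big number in a monitoring tool doesn't automatically mean a bottleneck.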
From the link below; the take-home message is quoted after it.
http://www.guru3d.com/news-story/middle-earth-shadow-of-mordor-geforce-gtx-970-vram-stress-test.html
Thing is, the quantifying fact is that nobody really has massive issues; dozens and dozens of media outlets have tested the card with in-depth reviews like the ones here on my site. Replicating the stutters and stuff you see in some of the videos, well, to date I have not been able to reproduce them unless you do crazy stuff, and I've been on this all weekend.
At 2560x1440 I tried filling that graphics memory, but most games simply do not use more than 1.5 to 3 GB at that resolution combined with the very best image quality settings. This includes MSAA levels of up to 8x. At the best settings and WQHD we tried Alien Isolation, Alan Wake, BioShock Infinite, Hitman: Absolution, Metro: Last Light, Thief, Tomb Raider, and Assassin’s Creed: Black Flag.
Can you create a condition where this occurs? Of course you can ... like those YouTube videos where people have Windows autostart 30 programs to show what a big wow their SSD is .... or opening their browser with 25 tabs ... it just doesn't affect what many of us like to call "real life". As the author goes on to say:
So if you start looking at that resolution and zoom in, then of course you are bound to run into performance issues, but so does the GTX 980. These cards are still too weak for such a resolution combined with proper image quality settings. Remember, Ultra HD = 4x 1080P. Let me quote myself from my GTX 970 conclusions: “it is a little beast for Full HD and WQHD gaming combined with the best image quality settings”, and within that context I really think it is valid to stick to a maximum of 2560x1440, as 1080P and 1440P are the real domain for these cards. Face it, if you planned to game at Ultra HD, you would not buy a GeForce GTX 970.
Well the "so does the 980" part sure blows the hell outta the 3.5 GB argument.
So the two titles that do pass 3.5 GB (without any tricks) are Call of Duty: Advanced Warfare and, of course, the one most reported to stutter, Middle Earth: Shadow of Mordor. We measured, played and fragged with COD, and there is just NOTHING to detect with the graphics memory fully loaded and in use.
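As a quick sanity check on the “Ultra HD = 4x 1080P” line above, and on why resolution by itself doesn't come close to filling a 4 GB card, here's some back-of-the-envelope math (a rough sketch assuming plain RGBA8 color targets; depth and intermediate buffers are ignored):

```python
# Pixel counts and raw color-buffer sizes at common gaming resolutions.
# Assumes 4 bytes per pixel (RGBA8); real renderers add depth and other targets.
RESOLUTIONS = {
    "1080p":    (1920, 1080),
    "1440p":    (2560, 1440),
    "Ultra HD": (3840, 2160),
}
BYTES_PER_PIXEL = 4

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    mib = pixels * BYTES_PER_PIXEL / 2**20
    print(f"{name:9s} {pixels:>9,d} px  color buffer ~{mib:5.1f} MiB")

# Ultra HD really is exactly 4x the pixels of 1080p:
print(3840 * 2160 / (1920 * 1080))  # -> 4.0

# Even 8x MSAA at 1440p only multiplies the render targets, not the textures:
print(2560 * 1440 * BYTES_PER_PIXEL * 8 / 2**20)  # ~112.5 MiB
```

Even with 8x MSAA, the 1440p render targets are on the order of a hundred MiB; the gigabytes in the measurements above come from textures and geometry, which scale with quality settings rather than resolution. That's consistent with games sitting at 1.5 to 3 GB at WQHD.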
As for GamersNexus, I'm not gonna click it, but guaranteed it refers to an Assassin's Creed title. Unity had so many problems and was so "broken" that they offered all users free DLC as a way of apology.
Read up on the implications behind the problems with the AC series. And I always suspect performance tests in games whose engine has a built-in performance cap.
http://www.pcinvasion.com/assassins-creed-unity-pc-port-impressions-minimum-specs-edition
Syndicate has gotten similar complaints. And, more importantly, while the 2GB / 4GB testing was applicable to the games being played when the 7xx series was out, no one has suggested that 2GB is enough for the 9xx series at ultra settings in the games that have come out since its release. Trying to use a 960 on a demanding game at ultra settings is asking a bit too much.
Again, as I said above ... well, ExtremeTech said it better in the link above, and they have 5 pages of data explaining why this question has been put to bed.
We began this article with a simple question: “Is 4GB of RAM enough for a high-end GPU?” The answer, after all, applies to more than just the Fury X — Nvidia’s GTX 970 and 980 both sell with 4GB of RAM, as do multiple AMD cards and the cheaper R9 Fury. Based on our results, I would say that the answer is yes — but the situation is more complex than we first envisioned.
First, there’s the fact that out of the fifteen games we tested, only four could be forced to consume more than 4GB of RAM. In every case, we had to use high-end settings at 4K to accomplish this.
While we do see some evidence of a 4GB barrier on AMD cards that the NV hardware does not experience, provoking this problem in current-generation titles required us to use settings that rendered the games unplayable on any current GPU.
The most we can say of a specific 4GB issue at 4K is that gamers who want to play at 4K will have to do some fine-tuning to keep frame rates and resolutions balanced, but that’s not unique to any vendor. If you’re a gamer who wants 4K and ultra-high quality visual settings, none of the current GPUs on the market are going to suit you. HBM2 and 14/16nm GPUs may change that, but for now playing in 4K is intrinsically a balancing act. The Fury X may require a bit more fine-tuning than the 980 Ti or Titan X, but that’s not grounds for declaring 4GB an unsuitable amount of VRAM in today’s games.
The games used, by the way, were Assassin’s Creed Unity, Battlefield 4, BioShock Infinite, Civilization: Beyond Earth, Company of Heroes 2, Crysis 3, Dragon Age: Inquisition, The Evil Within, Far Cry 4, Grand Theft Auto V, Metro: Last Light (original), Total War: Rome II, Shadow of Mordor, Tomb Raider, and The Witcher 3: Wild Hunt.
If that isn't enough, as indicated above, we are experiencing the same problems with 3GB 780s, 4GB 970s, and 6GB 980 Tis. Obviously, VRAM is not the issue with this title.