And again, it does matter whether the card is fast enough, because that's what you need to actually make use of the extra VRAM.
Huh? Did you read any of those articles? In every case the authors stated that the only time they could create a problem by exceeding the 4 GB barrier was when they were already at unplayable fps because the GPU couldn't keep up.
From the ExtremeTech link above:
Some games won’t use much VRAM, no matter how much you offer them, while others are more opportunistic. This is critically important for our purposes, because there’s not an automatic link between the amount of VRAM a game is using and the amount of VRAM it actually requires to run. Our first article on the Fury X showed how Shadow of Mordor actually used dramatically more VRAM on the GTX Titan X as compared with the GTX 980 Ti, without offering a higher frame rate. Until we hit 8K, there was no performance advantage to the huge memory buffer in the GTX Titan X — and the game ran so slowly at that resolution, it was impossible to play on any card.
So what can you take away from that ?
1. There was no performance, quality or other improvement gained from doubling VRAM from 6 to 12 GB when the game was playable.
2. There was a performance advantage once they got to 8K, but at the settings required to see that VRAM advantage, the game was not playable.
Every site I quoted says the same thing ... by the time you push the settings high enough that the extra VRAM matters, the GPU is overburdened by those settings and cannot deliver a satisfactory experience.
At 4K, [in FC4] there’s evidence of a repetitive pattern in AMD’s results that doesn’t appear in Nvidia’s, and that may well be evidence of a 4GB RAM hit — but once again, we have to come back to the fact that none of the GPUs in this comparison are delivering playable frame rates at the usage levels that make it an issue in the first place
We began this article with a simple question: “Is 4GB of RAM enough for a high-end GPU?” The answer, after all, applies to more than just the Fury X — Nvidia’s GTX 970 and 980 both sell with 4GB of RAM, as do multiple AMD cards and the cheaper R9 Fury. Based on our results, I would say that the answer is yes — but the situation is more complex than we first envisioned.
Again, once you raise the settings to the point where having more than 4 GB matters, the game is unplayable at those settings.
While we do see some evidence of a 4GB barrier on AMD cards that the NV hardware does not experience, provoking this problem in current-generation titles required us to use settings that rendered the games unplayable on any current GPU.
So in every game tested, to see any advantage from more than 4 GB at any resolution, you have to use settings at which no GPU made today can deliver acceptable frame rates. I just don't see myself sitting at a PC with a smug smile on my face knowing that I can create a situation where my superior card doesn't have a VRAM issue ... I can't play any games because the fps is unsuitable for a 60 Hz monitor, let alone a 144 Hz one, at these settings, but I sure can stand tall knowing I don't have a VRAM issue.
As for the stuttering, I've got you on one side, and Guru3D and a dozen other sites on the other saying they can't duplicate your stuttering problem without "doing something freaky".
Though it's off topic, since what we are talking about is your assertion that 4 GB isn't enough for 1080p, you haven't given even a single example showing a 4 GB card doing something that the same card with 2 GB can't do. None of the links you have provided show 4 GB being inadequate at 1080p. In fact, your latest link argues exactly the opposite in this respect.
When you need to stoop to rephrasing my position, you weaken yours ... I never said less VRAM is "better". I just said it has never been shown that, outside of poor console ports and "freaky stuff", having more than 4 GB provides any substantial benefit at 1080p (or even 1440p). To get a benefit from more than 4 GB, you need a bigger GPU; otherwise the extra RAM is curtailed by the GPU's performance.
As for your new references:
1. You chose an Xbox port as your example. These often don't do well, but I won't make an issue of it ... well, why would I, when they support everything I said.
2. Your quote is taken from the methodology section, where they explain how they test all cards ... it is not about the performance of the cards in this game.
3. Your quote is not about the amount of VRAM available; it's about having enough system RAM and pagefile to support the VRAM, and the entire article contradicts your position ... read it again:
VRAM bottleneck is seen when system RAM usage as well as the pagefile start spiking when at or near the graphic card’s VRAM limitation. Using MSI Afterburner OSD or an equivalent program you can see the Vram, system memory as well as pagefile resources. You should feel a slight stutter when the latter start increasing
The use of the word "latter" in a list of three things, when those three things are VRAM, system memory and pagefile resources, means the "latter" is "pagefile resources". This is not about having too little VRAM; it's clearly about not having enough system RAM or pagefile to back the VRAM.
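If anyone wants to check this themselves, below is a minimal sketch of that monitoring approach in Python, assuming an NVIDIA card with nvidia-smi on the PATH and the psutil package installed; the function name and the 95% / 50% thresholds are my own illustration values, not anything from the article.

    import subprocess
    import time
    import psutil  # assumption: installed separately (pip install psutil)

    def gpu_vram_mb():
        # assumption: an NVIDIA card with nvidia-smi available on the PATH
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"]
        ).decode()
        used, total = (int(v) for v in out.splitlines()[0].split(","))
        return used, total

    while True:
        vram_used, vram_total = gpu_vram_mb()
        ram = psutil.virtual_memory()    # system RAM
        page = psutil.swap_memory()      # pagefile / swap
        print(f"VRAM {vram_used}/{vram_total} MB | "
              f"RAM {ram.percent}% | pagefile {page.percent}%")
        # The article's point: the warning sign is VRAM sitting at its limit
        # *while* system RAM and especially pagefile usage start climbing.
        if vram_used >= 0.95 * vram_total and page.percent > 50:  # illustrative thresholds
            print("-> possible VRAM bottleneck (spilling into RAM / pagefile)")
        time.sleep(1)

The idea is the same as the Afterburner OSD method the quote describes: VRAM pegged at its limit on its own isn't the red flag, VRAM at the limit while RAM and pagefile usage climb is.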
Again, your new reference goes on to strongly contradict your insufficient-VRAM conclusion:
Even the older [2GB] GTX680 Kepler seems to run the game fine at 1080p with normal and high running just under 60fps. The R9-290 has a clear advantage on all the settings against the newer Maxwell GTX970. We see no strange frametime variations throughout the full hd resolution.
At 1440p our GTX680’s VRAM fills * completely from the low setting (our reference Asus is the 2GB version). Looking at frametimes, even though the game is utilizing all the 680s video memory, there are no major hiccups. For our strict standards the game isn’t playable on the older Kepler, even though the scene we benchmarked isn’t entirely taxing and any minor dip below 30 FPS in other scenarios would be visible lag. The GTX970 and R9-290 seem to handle even high pretty well at just below 60 FPS and again we see the R9-290 in the lead.
* By "fills" we now know, he means "allocates".
So your new reference shows the 2 GB card doing just fine at 1080p (seems 4 GB should be just fine then, no?), and the author goes on to state quite clearly that 2 GB is fine even at 1440p ... the VRAM clearly is not the problem, but playability at 30 fps is ... exactly what I have been saying. In this game, you run out of GPU before the VRAM can matter.
At the 4K resolution none of our GPUs can handle even the lowest of settings. The R9-290 has 20-25% higher frame rates and lower frametimes than the GTX970. Strangely though we didn’t feel any excessive stutter other than in the GTX680’s case with the limited VRAM .
Here again, at 4K there was plenty of VRAM to support the game on the 4 GB 290 / 970, yet the frame rates remained unacceptable. The 290 did better, but the two cards have the same amount of VRAM, so what's responsible? Again, they ran out of GPU before VRAM. The only way they could get stutter was with the 2 GB card at 4K. Their observations are exactly the opposite of what you are saying; I couldn't ask for a better link to support what I have been saying. Their testing showed:
No stuttering at all at any resolution up to 4K with 4 GB of VRAM
No stuttering or other major hiccups at 1080p with 2 GB
In each instance, the game was limited by the GPU, not the VRAM.
Your dips in frame rate are a GPU limitation, not a VRAM limitation ... there was no "stuttering".
Consider the following:
Then there is a matter of textures. By default the game automatically determines the texture quality based on the available VRAM on your GPU. In the preview build, settings ranged from low to high but the final code has changed with settings ranging from low through to very high. GPUs sporting 3GB of memory or more default to the very high preset while 2GB cards are limited to high, 1.5GB cards limited to medium and 1GB cards access poverty-spec low-quality art. The game automatically selects the appropriate option and, by default, does not allow the user to adjust this setting.
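Just to make the tiering in that quote concrete, here's a tiny sketch of the threshold logic it describes; the function name and structure are my own illustration, and only the VRAM cutoffs and preset names come from the quoted text.

    def default_texture_preset(vram_gb: float) -> str:
        # cutoffs as described in the quote: 3 GB+ -> very high, 2 GB -> high,
        # 1.5 GB -> medium, 1 GB -> low (the game picks this automatically)
        if vram_gb >= 3.0:
            return "very high"
        if vram_gb >= 2.0:
            return "high"
        if vram_gb >= 1.5:
            return "medium"
        return "low"

    # A 4 GB, 6 GB or 12 GB card all land on the same "very high" default as a 3 GB card.
    for vram in (1.0, 1.5, 2.0, 3.0, 4.0, 12.0):
        print(vram, "GB ->", default_texture_preset(vram))

Note that nothing above the 3 GB cutoff changes the default, which is exactly the point of item 1 below.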
Please explain why these two links help rather than hurt the "we need > 4 GB for 1080p" position when:
1. The game sets the texture quality to the highest available preset once the card has 3 GB. Apparently the game's developers don't think anything beyond 3 GB, let alone more than 4 GB, is necessary at any resolution.
2. Why is it that the 4 GB cards had no stuttering at any resolution, even 4K?
3. Why were there no performance issues with 2 GB at 1080p?
4. Why was the only problem with the game related to frame rates and running out of GPU rather than running out of VRAM?
5. If you don't understand what impact a poor console port has on PC performance, then you need to do more research before this conversation can be continued.
http://www.tomshardware.com/forum/111766-13-console-ports
http://www.pcgamesn.com/assassins-creed-unity/port-review-assassins-creed-unity
http://www.pcper.com/news/Graphics-Cards/Ubisoft-Responds-Low-Frame-Rates-Assassins-Creed-Unity
http://www.extremetech.com/gaming/194123-assassins-creed-unity-for-the-pc-benchmarks-and-analysis-of-its-poor-performance
Again, can you create problems? Of course. I saw a YouTube video about a guy who, if I'm remembering correctly, set one resolution, then scaled to another resolution, set max detail and distance, and then zoomed in to look at a leaf. So do we put that in the realm of typical for how most people start their gaming sessions ... or do we put that in the "freaky stuff" category?
Reminds me of when an employee wanted me to add an SSD to his computer and, to "prove his point", he showed me a side-by-side comparison of Windows booting and auto-starting 25 programs.
1. He basically used three programs a day (spreadsheet 90%, word processor 5%, e-mail 5%).
2. He usually left his machine on when he went home.
3. When he did shut it down, his morning routine was: take off his jacket, start the PC, make coffee, chit-chat until it was done, and then sit down.
So yes, he was able to create a situation or a "case" for an SSD ... but a) it was not representative of his normal activities, and b) even if it was, the payback period was in excess of 10 years based upon the boot time saved.
Same thing here ... you can create situations where more than 4 GB does something for you, but as all the web authors have stated, ya gotta create some freaky scenarios.