mechan :
The amount of graphics memory, in general, does not affect frames per second. It affects the resolution of textures you can load and the level of antialiasing you can run. In short, it affects image quality. See Part 1 of the Myths series for details.
Yes, I read the whole article. So that much is clear, except that VRAM will affect FPS if there's not enough of it (e.g. a hypothetical GTX 980 with only 512 MB of VRAM would drop frames like crazy with high textures and AA, even at 1080p). But that's beside the point: I was only talking about high-VRAM-usage scenarios, with high textures and AA at high resolutions. Below is a graph from Digital Storm showing that
BF4 on Ultra at 4K with 4x AA can use more than 3 GB of VRAM when it's running on a Titan (see the blue bar labeled GTX Titan 4x AA), and that it maxes out the VRAM on a 780 Ti (see the blue bar labeled GTX 780ti 4x AA):
At the same time, a single 780 Ti gets better FPS than a single Titan at the same settings (Ultra, 4K, 4x AA):
So the additional VRAM usage was evidently unnecessary. That shows that, even though a program
can use more VRAM than a graphics card has, some of the VRAM it uses isn't VRAM it actually needs in order to perform without sacrificing textures, resolution, or AA.
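The "uses more than it needs" behavior is what you'd expect from opportunistic caching: given a bigger memory budget, an engine keeps more data resident, but if the per-frame working set already fits in a smaller budget, the extra residency doesn't buy any FPS. Here's a toy Python sketch of that idea (an LRU cache; the texture IDs, sizes, and frame loop are all made up for illustration and have nothing to do with BF4's actual streaming code):

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU cache standing in for a GPU's texture pool.

    With a bigger capacity it ends up holding more data ("higher
    VRAM usage"), but if the hot working set fits either way, the
    hit/miss counts -- the stand-in for performance -- are identical.
    """
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()
        self.hits = 0
        self.misses = 0

    def fetch(self, texture_id):
        if texture_id in self.store:
            self.store.move_to_end(texture_id)   # mark as recently used
            self.hits += 1
        else:
            self.misses += 1                     # simulated upload to VRAM
            self.store[texture_id] = object()
            if len(self.store) > self.capacity:
                self.store.popitem(last=False)   # evict least recently used

def run_frame_loop(cache, frames=100, working_set=8):
    # Every frame touches the same hot textures, plus one unique
    # "streamed scenery" texture that is never reused.
    for frame in range(frames):
        for tex in range(working_set):
            cache.fetch(tex)
        cache.fetch(10_000 + frame)
    return cache.hits, cache.misses, len(cache.store)

big = TextureCache(capacity=32)    # lots of headroom: one-offs pile up
small = TextureCache(capacity=9)   # just the hot set plus one slot

print(run_frame_loop(big))    # same hits/misses, larger final cache
print(run_frame_loop(small))  # same hits/misses, smaller final cache
```

Both caches miss exactly the same fetches (the cold start plus the one-off streams), so the "frame rate" is identical; the big cache just ends up fuller because it can afford to be. That's one plausible reading of the Titan-vs-780 Ti graphs above.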
--->M-Y-S-T-E-R-Y<---
mechan :
Some algorithms can make use of more memory, but aren't always noticeably faster. Typically, until you end up page-swapping to disk (a massive slowdown), Photoshop-style tasks depend exclusively on CPU (or GPU, if accelerated) speed.
Photoshop will page-swap without enough RAM. The point was that there's a middle range of RAM: more than the amount below which page swapping kicks in, but less than the amount Photoshop actually uses, and anywhere in that range performance doesn't drop. It seems analogous to what happened in BF4 above.
mechan :
We try
Filippo
And you do a good job, which is why I ask these nuanced questions. Where else could I go besides here?