Also, I wanna know: is it VRAM that matters for game performance, or the GPU model?
The GPU model is not the same thing as VRAM. Performance gains are going to be far more noticeable from a higher-end GPU model than from a low-end GPU that happens to have more VRAM than the high-end one. (The card model is a separate thing again: different AIB partners sell custom models of the same GPU that differ in board quality, cooling, and factory clocks, but the underlying chip is what sets the performance class.)
I will say this once again: VRAM is not the only criterion when upgrading a GPU. It also depends on your screen resolution (1080p, 1440p, 4K, etc.). The higher the resolution, the more VRAM you may require, but more VRAM won't double your FPS. Having more VRAM doesn't make a GPU faster, either. Other technical specs matter just as much: the number of CUDA cores / stream processors, the ROP/TMU counts, the memory bus width, memory bandwidth, and so on.
A GTX 960 2GB is slightly faster than a GTX 1050 2GB despite both having the same amount of VRAM. So you may ask: why? Because the GTX 960 packs more CUDA cores/shading units than the 1050: 1024 versus 640 on the GTX 1050.
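For a rough sanity check, here's a minimal sketch of theoretical FP32 throughput. The clock figures are NVIDIA reference boost clocks (an assumption; AIB cards vary), and raw TFLOPS is only one ingredient of real performance, since architecture, bandwidth, and drivers matter too:

```python
# Rough theoretical FP32 throughput: 2 FLOPs per CUDA core per clock.
# Clocks below are NVIDIA reference boost clocks (an assumption; AIB
# cards vary), and raw TFLOPS ignores architecture and bandwidth.
def fp32_tflops(cuda_cores: int, boost_mhz: float) -> float:
    return 2 * cuda_cores * boost_mhz * 1e6 / 1e12

print(f"GTX 960:  {fp32_tflops(1024, 1178):.2f} TFLOPS")  # ~2.41
print(f"GTX 1050: {fp32_tflops(640, 1455):.2f} TFLOPS")   # ~1.86
```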
So VRAM doesn't really matter if the GPU itself is crap?
It does matter. But if the GPU itself is crap, that's due to the other specs, not just the VRAM.
This is a myth: "GPUs with 3 GB of memory are faster than those with 1 GB." Nope. The memory capacity a graphics card ships with has no impact on its performance, so long as the settings you game at don't consume all of it.
So what does having more video memory actually help with, then? To answer that, we need to know what graphics memory is used for:
- Loading textures
- Holding the frame buffer
- Holding the depth buffer ("Z Buffer")
- Holding other assets that are required to render a frame (shadow maps, etc.)
Of course, the
size of the textures getting loaded into memory depends on the game you're playing and its quality preset. As an example, the
Skyrim high-resolution texture pack includes 3 GB of textures. Most applications dynamically load and unload textures as they're needed, though, so not all textures need to reside in graphics memory. The textures required to render a particular scene do need to be in memory, however.
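For a sense of scale, here's a back-of-the-envelope sketch of what a single uncompressed texture costs. The sizes and the 4-bytes-per-texel RGBA8 format are illustrative assumptions; real engines mostly use compressed formats (BC1/BC7) that shrink this by 4-8x:

```python
# Back-of-the-envelope VRAM cost of one uncompressed RGBA8 texture
# (4 bytes per texel). The 4/3 factor approximates a full mipmap chain.
def texture_mib(width: int, height: int, bytes_per_texel: int = 4) -> float:
    return width * height * bytes_per_texel * (4 / 3) / 2**20

for side in (1024, 2048, 4096):
    print(f"{side}x{side}: {texture_mib(side, side):.1f} MiB")
# 1024x1024: ~5.3 MiB, 2048x2048: ~21.3 MiB, 4096x4096: ~85.3 MiB
```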
The frame buffer is used to store the image as it is rendered, before or while it is sent to the display. Its memory footprint therefore depends on the output resolution (a 1920x1080 image at 32 bpp is ~8.3 MB; a 4K image at 3840x2160x32 is ~33.2 MB) and on the number of buffers (at least two; rarely three or more).
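A quick sketch reproducing those numbers (assuming 4 bytes per pixel and decimal megabytes):

```python
# Frame buffer footprint: width x height x bytes-per-pixel, times the
# number of buffers in the swap chain (decimal MB, as in the text above).
def framebuffer_mb(width: int, height: int, buffers: int = 1) -> float:
    return width * height * 4 * buffers / 1e6

print(f"1080p, one image:   {framebuffer_mb(1920, 1080):.1f} MB")     # ~8.3
print(f"4K, one image:      {framebuffer_mb(3840, 2160):.1f} MB")     # ~33.2
print(f"1080p, two buffers: {framebuffer_mb(1920, 1080, 2):.1f} MB")  # ~16.6
```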
The most important factor affecting the amount of graphics memory you need is the resolution you game at. Naturally, higher resolutions require more memory. The second most important factor is whether you're using one of the anti-aliasing technologies. Assuming a constant quality preset in your favorite game, other factors are less influential.
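To see why anti-aliasing matters, here's a hypothetical sketch for MSAA: an N-sample render target stores roughly N color and N depth samples per pixel, so the footprint grows with the sample count (actual drivers may compress this, so treat it as an illustrative upper bound):

```python
# MSAA stores N color and N depth samples per pixel, so the render
# target footprint grows roughly linearly with the sample count
# (real drivers may compress; this is an illustrative upper bound).
def msaa_target_mb(width: int, height: int, samples: int) -> float:
    color = width * height * 4 * samples  # RGBA8 color samples
    depth = width * height * 4 * samples  # 32-bit depth samples
    return (color + depth) / 1e6

for s in (1, 2, 4, 8):
    print(f"1080p at {s}x MSAA: {msaa_target_mb(1920, 1080, s):.0f} MB")
# 1x: ~17 MB, 2x: ~33 MB, 4x: ~66 MB, 8x: ~133 MB
```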
So VRAM does not transform trash into a Bugatti, right?
Exactly. VRAM is not the only deciding factor.