
Question: How much/far can higher GPU VRAM "save" a card?

Gamefreaknet

Essentially, the higher the resolution and quality settings you use in games and other software, the more VRAM will generally be used.
GPUs also tend to follow a similar trend, with lower-end GPUs generally having less VRAM (using the RTX 30 series desktop cards as an example here, although the mobile chips followed a similar "trend"):
RTX 3050 6GB/8GB
RTX 3060 8GB/12GB
RTX 3060 Ti 8GB (I think the Intel Arc A770 16GB was considered around the same performance as the 3060 Ti/4060 Ti)
RTX 3070 8GB
RTX 3080 10GB/12GB
RTX 3080 Ti 12GB
RTX 3090 (Ti) 24GB

The RTX 40 series also followed a similar trend:
RTX 4060 8GB
RTX 4060 Ti 8GB/16GB
RTX 4070 / 4070 Super / 4070 Ti 12GB
RTX 4070 Ti Super 16GB
etc... (I won't fill out the rest, as all the specs are available on the Nvidia site)
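As a rough back-of-the-envelope for why resolution and texture quality drive VRAM use, here is a small sketch. Real games stream and compress assets heavily, so these are order-of-magnitude illustrations only; all the sizing assumptions (RGBA8 buffers, uncompressed textures, triple buffering) are mine, not from any specific game:

```python
# Rough, illustrative estimate of how resolution drives VRAM use.
# Real engines compress and stream assets, so treat these as
# order-of-magnitude figures only; all parameters are assumptions.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Memory for colour render targets (e.g. triple-buffered RGBA8)."""
    return width * height * bytes_per_pixel * buffers / 1024**2

def texture_mb(width, height, bytes_per_pixel=4, mipmaps=True):
    """One uncompressed texture; a full mip chain adds roughly 1/3."""
    base = width * height * bytes_per_pixel
    return base * (4 / 3 if mipmaps else 1) / 1024**2

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: framebuffers ~{framebuffer_mb(w, h):.0f} MB")

print(f"one uncompressed 4096x4096 texture ~{texture_mb(4096, 4096):.0f} MB")
```

The point is that render targets scale with screen resolution while textures scale with asset quality settings, and the latter is usually the bigger consumer, which is why "Ultra" texture packs are what push cards past 8GB.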

So what separates the performance when choosing between (for example) a 4070 Ti vs a 4060 Ti 16GB vs an Arc A770 16GB?
From this link I can get an Arc A770 or 4060 Ti 16GB for around £100 less than a 4070 (since the 4070 Ti Super is the only 4070-class card with 16GB of VRAM). A short list of pros and cons:
Pros (for a 4060 Ti 16GB / Arc A770 16GB):
- More VRAM
- Cheaper (generally)
- Lower power draw (generally)
- Larger 256-bit bus (Arc A770 16GB) vs the 192-bit bus used on everything up until the 4070 Ti Super
- tbc...
Cons:
- GDDR6 (not the GDDR6X more commonly found on 4070 cards)
- Fewer cores (not sure the performance difference would be that noticeable unless against a 4070 Ti Super or a more powerful card)
- Slightly held back by the smaller 128-bit bus width (4060 Ti)
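The bus-width points above can be put into numbers: peak memory bandwidth is roughly the per-pin data rate times the bus width in bytes. A quick sketch; the data rates below are the commonly quoted specs for these cards, but treat them as assumptions and check the vendor pages for your exact model:

```python
# Peak memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8.
# Data rates are commonly quoted specs; verify against vendor pages.

def bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Theoretical peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

cards = {
    "Arc A770 16GB (GDDR6, 256-bit)":    (17.5, 256),
    "RTX 4060 Ti 16GB (GDDR6, 128-bit)": (18.0, 128),
    "RTX 4070 (GDDR6X, 192-bit)":        (21.0, 192),
}
for name, (rate, bus) in cards.items():
    print(f"{name}: {bandwidth_gbs(rate, bus):.0f} GB/s")
```

This is why the A770's wide bus partly offsets its plain GDDR6: a 256-bit bus at a lower data rate can still out-bandwidth a 128-bit bus with faster memory.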

But since higher VRAM is generally very useful for storing textures, shaders, etc., and even GDDR5X can still (barely) stay relevant (so GDDR6 is likely to remain sufficient for a while), is it really worth going for a 4070 (/Ti/Super)? To be fair, the 4070 Ti Super does close the gap, but it costs a fair bit more for similar specs...

(I have been considering swapping my current 4070 Ti for an Arc A770 16GB for a while now...)
 
One way it might not help is if someone thinks up some new rendering technique that the card's architecture can't run. Lots of VRAM won't save it then.

Since my GPU isn't made by either, I haven't looked into it. I don't expect to run out of VRAM on my card any time soon outside of games with memory leaks (Diablo 4 once used 19GB). I have a 7900 XT.

How good is Intel's ray tracing compared to Nvidia's? I don't use it, so I haven't looked into that either.
 
The 4060 Ti 16GB is effectively a marketing tool to get people to splurge on extra VRAM instead of saving for the next tier of graphics card. You have to look at benchmarks, for the games you play, that have numbers for the cards in your price range. Extra VRAM will sometimes save you from crippling framerates, but it's useless to have more than necessary.
 
VRAM resides on the graphics card and has become a marketing issue.
My understanding is that VRAM is more of a performance issue than a functional one.
A game needs most of the data it uses most of the time to be resident in VRAM, somewhat like real RAM.
If a game needs something that is not in VRAM, it has to fetch it across the PCIe boundary, hopefully from real RAM and hopefully not from a hard drive.
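To put rough numbers on that PCIe penalty, here is a small sketch comparing how long a hypothetical 256 MB asset takes to read from VRAM, from system RAM over PCIe, or from an SSD. All the bandwidth figures are assumed ballpark values, not measurements:

```python
# Why a VRAM miss hurts: the same asset fetched from progressively
# slower tiers. Bandwidths are assumed round ballpark numbers.

def fetch_ms(size_mb, bandwidth_gbs):
    """Time in milliseconds to move size_mb at the given GB/s."""
    return size_mb / 1024 / bandwidth_gbs * 1000

asset_mb = 256  # hypothetical texture set for one scene
tiers = {
    "VRAM (GDDR6, ~500 GB/s)":            500,
    "System RAM over PCIe 4.0 x16 (~32 GB/s)": 32,
    "NVMe SSD (~7 GB/s)":                   7,
}
for source, gbs in tiers.items():
    print(f"{source}: ~{fetch_ms(asset_mb, gbs):.1f} ms")
```

At 60 fps a frame budget is ~16.7 ms, so a fetch that crosses PCIe (let alone hits storage) can easily cause the stutter people blame on "running out of VRAM".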
It is not informative to know how full the available VRAM is: much of what is there may not be needed. What is not known is the rate of VRAM exchange.
VRAM is managed by the graphics card driver and by the game, so there may be differences in effectiveness between AMD and Nvidia cards.
And differences between games.
Here is an older performance test comparing 2GB with 4GB of VRAM.
Spoiler... not a significant difference.
A more current set of tests shows the same results:
http://www.techspot.com/review/1114-vram-comparison-test/page5.html

And... no game maker wants to limit their market by requiring huge amounts of VRAM. The VRAM you see will be appropriate to the particular card.
 
I don't think you are training AI models, but if you were, the VRAM would matter a lot. As far as GDDR5X versus GDDR6 goes, "technically" the extra GDDR6 speed always helps, but whether it actually changes anything noticeable is questionable. The rate at which the VRAM can be filled may be limited by your PCIe standard, and even with fast PCIe it may be limited by how quickly the CPU can output data. If the GPU itself can't produce work as fast as the data is filled, then 5X versus 6 won't matter.

If you are not working with the high end, I would ignore GDDR5X versus GDDR6.
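The "limited by the slowest link" argument above can be made concrete with a tiny sketch. The rates for the CPU staging, PCIe transfer, and VRAM write steps are purely hypothetical round numbers:

```python
# The VRAM fill path is a chain: CPU stages data -> PCIe moves it ->
# VRAM accepts the write. Throughput is capped by the slowest link.
# All rates below are hypothetical round numbers, not measurements.

def effective_fill_rate_gbs(cpu_gbs, pcie_gbs, vram_gbs):
    """Effective rate of a staged copy is the minimum link rate."""
    return min(cpu_gbs, pcie_gbs, vram_gbs)

# Say the CPU stages 10 GB/s, PCIe 4.0 x16 moves ~32 GB/s,
# and GDDR6 writes at ~500 GB/s:
print(effective_fill_rate_gbs(10, 32, 500))  # CPU-bound

# Swapping in faster memory (say GDDR6X-class ~700 GB/s) changes nothing:
print(effective_fill_rate_gbs(10, 32, 700))  # still CPU-bound
```

Which is the point: upgrading one link (GDDR5X to GDDR6, or 6 to 6X) only matters if that link was the bottleneck in the first place.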

Some games do now genuinely benefit from more VRAM. I would advise gamers to get at least 11 or 12 GB of VRAM.

I also think that Nvidia and AMD GPUs have more mature drivers than Intel's newer Arc cards. Drivers often matter a lot.