Question Any programs that show VRAM usage? (Not allocation!) [VRAM allocation is NOT VRAM usage]

kuba2002.11

Commendable
Apr 3, 2018
79
1
1,535
0
First of all, my question is: do you guys know any programs that show VRAM usage? I'm not talking about programs like MSI Afterburner, AORUS Engine, or even the Task Manager, because they show the allocation.

And on a side note, a lot of people are worried that the 8GB or 10GB of VRAM that the 3070 and 3080 come with are not enough for 1440p or 4K, because of people constantly talking about how "8GB of VRAM is not enough even at 1440p, Doom Eternal uses 9GB of VRAM at 1440p and 11 at 4K", which is just crazy. They scare people with VRAM allocation, maybe not even knowing themselves that VRAM allocation is NOT VRAM usage: a game can allocate 8GB of VRAM on startup but use around 5 to 6GB while playing.
People need to chill out.
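The allocation-vs-usage distinction above can be illustrated with a toy model. Everything here is hypothetical (the `VramPool` class and all numbers are illustrative, not a real driver API): a game reserves a big pool up front, but monitoring tools only see the reservation, not how much of it the frame loop actually touches.

```python
# Hypothetical sketch: why "allocation" overstates "usage".
# A game often reserves a large VRAM pool at startup, then touches
# only part of it each frame. Tools like Task Manager report the
# reservation, not the touched set. All numbers are illustrative.

class VramPool:
    def __init__(self, reserved_mb):
        self.reserved_mb = reserved_mb      # what monitoring tools report
        self.touched = set()                # pages actually read/written

    def touch(self, page_id):
        self.touched.add(page_id)

    @property
    def used_mb(self):
        # assume 1 MB pages for simplicity
        return len(self.touched)

pool = VramPool(reserved_mb=8192)           # game allocates 8 GB at startup
for page in range(5500):                    # frame loop touches ~5.5 GB
    pool.touch(page)

print(f"allocated: {pool.reserved_mb} MB")  # 8192
print(f"actually used: {pool.used_mb} MB")  # 5500
```

The gap between the two numbers is exactly what monitoring tools cannot see from the outside.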
 

InvalidError

Titan
Moderator
There is no way for external software to know how much of the memory that applications request actually gets used short of running a debug environment to spy on the GPU's internal activity.

As for panicking about the amount of VRAM some games use, there are many benchmarks where GPUs that perform similarly in many games end up performing very differently in others with the main variable between the two being the GPU that holds up better has 2-4GB more VRAM. Then again, this mostly happens when using 8k texture packs on 8GB GPUs and a simple fix is to use 4k textures instead.
 

kuba2002.11

Commendable
Apr 3, 2018
79
1
1,535
0
There is no way for external software to know how much of the memory that applications request actually gets used short of running a debug environment to spy on the GPU's internal activity.

As for panicking about the amount of VRAM some games use, there are many benchmarks where GPUs that perform similarly in many games end up performing very differently in others with the main variable between the two being the GPU that holds up better has 2-4GB more VRAM. Then again, this mostly happens when using 8k texture packs on 8GB GPUs and a simple fix is to use 4k textures instead.
Thanks for the answer. What do you think about the VRAM recommendations for 1440p? I would say 6GB is still solid for 1440p, and 8GB will hold on for the next 2-3 years.
 
Thanks for the answer. What do you think about the VRAM recommendations for 1440p? I would say 6GB is still solid for 1440p, and 8GB will hold on for the next 2-3 years.
I don’t think it’s that straightforward. Let’s say you are using a 2060 6GB: at 1440p in GPU-demanding games you are unlikely to be running max settings to achieve 60+ FPS, probably medium settings with no RT. If that GPU had 8GB it probably could not take advantage of it, as it can’t run high enough settings at the desired FPS. However, a 3070 8GB could run max settings with RT on and may need that extra 2GB. Is the 3070 likely to be running the latest GPU-heavy games at max settings in 2-3 years? I’d doubt it. If you want the same performance as in today’s games, you will likely need settings equivalent to today’s games.
 

kuba2002.11

Commendable
Apr 3, 2018
79
1
1,535
0
I don’t think it’s that straightforward. Let’s say you are using a 2060 6GB: at 1440p in GPU-demanding games you are unlikely to be running max settings to achieve 60+ FPS, probably medium settings with no RT. If that GPU had 8GB it probably could not take advantage of it, as it can’t run high enough settings at the desired FPS. However, a 3070 8GB could run max settings with RT on and may need that extra 2GB. Is the 3070 likely to be running the latest GPU-heavy games at max settings in 2-3 years? I’d doubt it. If you want the same performance as in today’s games, you will likely need settings equivalent to today’s games.
Good point, that's why the 3070 is more of a 1440p GPU than a 4K one. It doesn't need more than 8GB of VRAM for 4K BECAUSE it can't run 4K 60fps in most games anyway.
But at 1440p that amount of VRAM and pure horsepower is a very good combination, don't you think?
 
You ask an important question for which the true answer is a bit murky.
What you are really asking is how limiting VRAM is for a particular game.

VRAM has become a marketing issue.
My understanding is that VRAM is more of a performance issue than a functional one.
A game needs to have most of the data it uses most of the time resident in VRAM,
somewhat like real RAM.
If a game needs something not in VRAM, it must fetch it across the PCIe bus,
hopefully from system RAM and hopefully not from a hard drive.
It is not informative to know to what level the available VRAM is filled;
possibly much of what is there is not needed.
What is not known is the rate of VRAM exchange.
VRAM is managed by the graphics card driver and by the game. There may be differences in effectiveness between AMD and Nvidia cards,
and differences between games.
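The cache-like behavior described above can be sketched with a toy LRU model (the `VramCache` class and all sizes are illustrative assumptions, not how any real driver works). The point it shows: if the working set fits in VRAM, PCIe traffic stops after the first frame; if it doesn't, every frame forces transfers.

```python
# Illustrative sketch of VRAM as an LRU cache over PCIe.
# Assumed model, not an actual driver API; all sizes are made up.

from collections import OrderedDict

class VramCache:
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.cache = OrderedDict()              # asset_id -> size_mb, LRU order
        self.pcie_transfers_mb = 0

    def request(self, asset_id, size_mb):
        if asset_id in self.cache:
            self.cache.move_to_end(asset_id)    # hit: no transfer needed
            return
        self.pcie_transfers_mb += size_mb       # miss: copy across PCIe
        self.cache[asset_id] = size_mb
        while sum(self.cache.values()) > self.capacity:
            self.cache.popitem(last=False)      # evict least recently used

# A 6 GB working set (12 assets x 512 MB) on a 4 GB vs an 8 GB card:
results = {}
for capacity in (4096, 8192):
    vram = VramCache(capacity)
    for frame in range(10):
        for asset in range(12):
            vram.request(asset, 512)
    results[capacity] = vram.pcie_transfers_mb

print(results)  # {4096: 61440, 8192: 6144}
```

With 8GB, only the first frame transfers anything (6,144 MB total); with 4GB, the cyclic access pattern thrashes the LRU cache and every single request crosses the bus (61,440 MB over ten frames), which is why the exchange rate matters more than the fill level.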
Here is an older performance test comparing 2GB with 4GB VRAM.
Spoiler... not a significant difference.
A more current set of tests shows the same results:
http://www.techspot.com/review/1114-vram-comparison-test/page5.html

And... no game maker wants to limit their market by
requiring huge amounts of VRAM. The VRAM usage you see will be appropriate to the particular card.

The GTX1060 does come in 3GB and 6GB variants, but the 3GB version has fewer CUDA cores, so that is not a valid comparison.

The RX5500 4GB does apparently suffer in comparison to the RX5500 8GB.
I have not studied why.
 

InvalidError

Titan
Moderator
Thanks for the answer. What do you think about the VRAM recommendations for 1440p? I would say 6GB is still solid for 1440p, and 8GB will hold on for the next 2-3 years.
Good point, that's why the 3070 is more of a 1440p GPU than a 4K one. It doesn't need more than 8GB of VRAM for 4K BECAUSE it can't run 4K 60fps in most games anyway.
If you are willing to nudge details down a bit, VRAM usage can go down a lot and much weaker GPUs become viable at 4k. Even my GTX1050 (2GB) is ok-ish at 4k on low/lowest details in what few new-ish games I have.
 
This topic can be a tricky one, because AMD and Nvidia have different "tolerance" to VRAM pressure. I suggest you look at TPU's tests of several of the more recent games (Horizon Zero Dawn, Watch Dogs Legion, Assassin's Creed Valhalla, and Godfall, to name a few) and compare the results at 1080p and 4K. Look at how cards with small memory end up behaving at 4K, where VRAM/bandwidth stress is much greater.
 

kuba2002.11

Commendable
Apr 3, 2018
79
1
1,535
0
This topic can be a tricky one, because AMD and Nvidia have different "tolerance" to VRAM pressure. I suggest you look at TPU's tests of several of the more recent games (Horizon Zero Dawn, Watch Dogs Legion, Assassin's Creed Valhalla, and Godfall, to name a few) and compare the results at 1080p and 4K. Look at how cards with small memory end up behaving at 4K, where VRAM/bandwidth stress is much greater.
Link us a video if you have any, it could be helpful.
 
Link us a video if you have any, it could be helpful.
To be honest, I'm not sure it would be helpful even if there were a video about this. Even if we can give people evidence that, for example, the 3080 with its 10GB should do just fine in many games right now, we still cannot guarantee with 100% certainty that it won't be a limiting factor in the future. That uncertainty alone makes a strong case for going with cards with more memory, just to be on the safe side. Even Nvidia should understand this, so they either have to stay stubborn about it or bow to market demand. If an RTX3080Ti with 20GB of VRAM ends up being a reality, that means Nvidia simply bowed to market demand.

And as I said before, Nvidia and AMD have different tolerance towards VRAM and bandwidth pressure. So, for example, having only 4GB of VRAM might be fine for Nvidia but not for AMD, and because of this you cannot make the same assumption for both AMD and Nvidia cards. Look at this video:

View: https://www.youtube.com/watch?v=693NBRwk8z4


Even if the 1050Ti is slower, the card is still able to render more detail than the AMD card! And no one can say that Nvidia sabotaged the AMD GPU in this game, since it is an AMD-sponsored title. Obviously the 4GB on the AMD card is becoming an issue, but not on Nvidia's. To look further into this, we can look at the TechPowerUp tests done on the game.



From the TPU test, even at 1080p VRAM usage can exceed 6GB. Let's look at 1080p performance:



So despite the game being able to use more than 6GB at 1080p, we see more or less all the GPUs tested performing just fine. Even 6GB cards like the RTX2060 and AMD RX5600 XT are still able to outperform 8GB cards like the GTX1080. Now let's look at the 4K results:



At 4K the game will use around 8GB of VRAM. Here we start seeing AMD's 4GB cards simply fail to run the game. At this resolution even the Nvidia GTX1060 3GB does better than AMD 4GB cards like the RX570 and the new RX5500. 6GB cards like the RTX2060 and RX5600 are still doing quite well, even though the game demands around 8GB at this point.

Now let's look at another game, Watch Dogs Legion. Again, data from TPU:




VRAM usage is a bit less than what we see in Horizon Zero Dawn, although at 1080p the game also asks for 6GB of VRAM. Let's see the performance at 1080p:



Here at 1080p even AMD's 4GB cards start being hammered hard, and the game actually demands a bit less VRAM than Horizon Zero Dawn. It becomes more absurd when we look at 4K.




Forget about the Nvidia cards; look at where the RX580 and RX5600 XT sit performance-wise. The game requires more than 7GB of VRAM (a bit less than Horizon uses at 4K), but it hits the RX5600 XT real hard in performance. The RX580 ends up with almost double the performance just because the game's VRAM usage is less than what the card has!

So when we have data like this, we cannot make one definite conclusion that applies to all cards, like "what we see is VRAM allocation, not actual VRAM being used by the game", since VRAM pressure affects AMD and Nvidia differently. At this point we can only say that Nvidia GPUs handle memory and bandwidth pressure much better than AMD's. A game can actually ask for less VRAM, but when its required VRAM is not met it can hit performance real hard; or we can have something like Horizon, which asks for more but, when that is not met, does not tank performance as hard as we would think it should.
 

InvalidError

Titan
Moderator
So when we have data like this, we cannot make one definite conclusion that applies to all cards, like "what we see is VRAM allocation, not actual VRAM being used by the game", since VRAM pressure affects AMD and Nvidia differently.
I suspect the reason Nvidia is less adversely affected by low VRAM than AMD is that Nvidia's drivers and GPUs are more agile at swapping things between system RAM and VRAM as asset usage dictates - no point in wasting VRAM on assets that haven't been used in a while when there are assets being streamed from system RAM that would make better use of the VRAM space. Also, where the RX5500 results are concerned, if those benchmarks were done on a 3.0x16 platform, then Nvidia has the additional advantage of having twice the PCIe bandwidth to do those swaps with.

Another thing to keep in mind with the RX5500 is that it only has an x8 interface and 4.0x8 vs 3.0x8 benchmarks have shown that the 4GB model performs 50-80% better on 4.0x8 in heavily VRAM-constrained scenarios, letting it almost catch up with its 8GB counterpart in some cases. AMD screwed the pooch on this one unless its plan was to cripple the RX5500 such that only the 8GB models would be viable for people on a PCIe 3.0 platform.
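For reference, the rough per-direction PCIe bandwidth behind that comparison can be worked out quickly (figures are approximate, after 128b/130b encoding overhead; the helper function is just illustrative):

```python
# Back-of-envelope PCIe bandwidth per direction, after encoding overhead:
# PCIe 3.0 is 8 GT/s per lane (~0.985 GB/s usable with 128b/130b encoding),
# PCIe 4.0 is 16 GT/s per lane (~1.969 GB/s usable).

GBPS_PER_LANE = {"3.0": 0.985, "4.0": 1.969}

def pcie_bandwidth(gen, lanes):
    """Approximate usable bandwidth in GB/s for a given generation and lane count."""
    return GBPS_PER_LANE[gen] * lanes

print(f"3.0 x8 : {pcie_bandwidth('3.0', 8):.1f} GB/s")   # ~7.9
print(f"3.0 x16: {pcie_bandwidth('3.0', 16):.1f} GB/s")  # ~15.8
print(f"4.0 x8 : {pcie_bandwidth('4.0', 8):.1f} GB/s")   # ~15.8
```

So an x8 card on a PCIe 3.0 board really does have half the swap bandwidth it would on 4.0, which lines up with the 4GB RX5500 results above.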

Once you push details to the point that even faster GPUs with more VRAM can't break 60fps though, you are much better off lowering your details expectations. By the time you have lowered details enough to be back within comfortably playable range, the "VRAM pressure" is rarely still an issue.
 
