Discussion: Why do GPUs (especially older high-end ones) have more VRAM than they will realistically be able to utilize?

Order 66

Grand Moff
Apr 13, 2023
For example, I thought the Radeon VII with its 16GB of VRAM would be good in 2023, given that games are using more and more VRAM, but the GPU itself can't keep up with its 16GB of VRAM. My real question is why older GPUs have more VRAM than they need, when by the time it's needed, the GPU won't be able to keep up.
 
AMD used VRAM capacity as an advantage in their marketing.
The idea is: "our GPUs have more VRAM. They are better than the competition because of this."
Which is 100% fine, the more VRAM the better, but I feel like if a GPU is going to be released with more VRAM than it can currently use, the raw performance of the GPU needs to be as fast as possible at the time of release. I think if the Radeon VII had the GPU horsepower of the 2080 Ti, it would be much better off nowadays. (I am aware of the massive price difference between the 2080 Ti and Radeon VII at launch, but without the need for RT or tensor cores on the VII, I think it could have been feasible to offer that much GPU horsepower with 16GB of VRAM for a bit more than the MSRP of the VII.)
 
At the time AMD was essentially taking their workstation GPU and pushing it as a high end gaming card. It had a lot of capabilities that didn't translate well to real time gaming applications.

Vega was poorly received because it had a max of 8GB of memory at a time when Nvidia was selling 11GB/12GB cards. 1080Ti was a pretty good deal at the time.

For a response the only way to go was up, so Radeon VII had 16GB of HBM. It was comparable in price to the RTX2080 and traded blows with it in games. And the 2080Ti had its huge launch price. I think the real problem was the margins. 16GB of HBM memory was a lot of money, so AMD didn't make as much on a Radeon VII as it would have liked.

AMD had to start RDNA from scratch to course-correct and make something competitive, while the Vega line lives on as CDNA.
 
It's hardly an AMD-only thing.

Take the Titan X, with its 12 GB of VRAM.

KitGuru: "12GB of memory is not only excessive for today's video games, but is also very expensive...the new GeForce GTX Titan X should demonstrate unbelievable performance in ultra-high-definition resolutions, such as 3840*2160. The ultra-expensive GPU and a lot of memory will naturally make the new Titan X a very expensive graphics card."

Example benchmark from the time: Tomb Raider at 4K with no AA ran at 44 fps.
 
For example, I thought the Radeon VII with its 16GB of VRAM would be good in 2023, given that games are using more and more VRAM, but the GPU itself can't keep up with its 16GB of VRAM. My real question is why older GPUs have more VRAM than they need, when by the time it's needed, the GPU won't be able to keep up.
The counter-argument is Nvidia still releasing new cards in 2023 with 8GB, relying on features to sell the cards. It's all marketing, really. One side has too much, the other only just enough... one side just needs cards with legs long enough to last until the cards with just enough start to struggle... and here we are.
I would prefer to have too much than not enough... of pretty much anything.

The amount of RAM on those cards wasn't matched by their ability; some of the more recent ones are. AMD was making missteps all over the place (Bulldozer says hi), but they have corrected their path.
 
The counter-argument is Nvidia still releasing new cards in 2023 with 8GB, relying on features to sell the cards. It's all marketing, really. One side has too much, the other only just enough... one side just needs cards with legs long enough to last until the cards with just enough start to struggle... and here we are.
I would prefer to have too much than not enough... of pretty much anything.

The amount of RAM on those cards wasn't matched by their ability; some of the more recent ones are. AMD was making missteps all over the place (Bulldozer says hi), but they have corrected their path.
That’s more what I was talking about; I feel that my 6800 will be able to utilize its 16GB of VRAM even years down the line. I definitely feel like Nvidia is skimping on the VRAM, because as I’m sure you remember, there was an article posted here about someone who doubled the VRAM on a 3070, leading to massive performance gains. Although, I’m not sure 16GB of VRAM makes sense for a low-end Nvidia GPU (like the 4060 Ti); as tests have shown, the 16GB of VRAM makes little difference on such a low-end card. I also think the 4080 should have been a 3090 replacement with 24GB of VRAM if it was going to cost $1200, and the 4090 should have been called the Titan, with 32 or even 48GB of VRAM. I am aware that Nvidia would never consider this, but I think that is what they should have done.
 
VRAM can be used as storage, so graphics cards can later access that data for faster rendering. This is where we are now.

The difference back then was that people had the idea that VRAM was solely used to render data in real time, and that you needed the GPU processing power to make use of all that data at once. Game development has changed, and those older graphics cards with their "useless" X amount of VRAM wouldn't be so useless now. It just so happens that they're slow anyway by today's standards.
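To put rough numbers on how fast textures alone fill VRAM, here is a back-of-envelope sketch; the 4K texture size and the per-texel byte counts (uncompressed RGBA8 vs. a typical block-compressed format) are illustrative assumptions on my part, not figures from any particular game:

```python
# Rough VRAM footprint of a single texture, as a back-of-envelope sketch.
# A full mip chain adds about 1/3 on top of the base level
# (1 + 1/4 + 1/16 + ... converges to 4/3 of the base size).

def texture_bytes(width, height, bytes_per_texel, mips=True):
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mips else base

# Uncompressed RGBA8 is 4 bytes/texel; BC7 block compression is 1 byte/texel.
uncompressed_4k = texture_bytes(4096, 4096, 4)  # ~85 MiB with mips
bc7_4k = texture_bytes(4096, 4096, 1)           # ~21 MiB with mips

print(f"4K RGBA8 with mips: {uncompressed_4k / 2**20:.0f} MiB")
print(f"4K BC7 with mips:   {bc7_4k / 2**20:.0f} MiB")
```

A scene streaming a few hundred textures at this scale chews through 8GB quickly, which is why caching assets in spare VRAM stopped being "useless".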
 
You may recall the two models of the 960 with differing VRAM totals: one had too little even for its power and the other too much. A marketing thing, pretty much the same as it is now. The big thing now is that if you don't have a card with "X" amount of RAM (greater than 8GB), then your rig is trash and you need to burn it down because you can't even play solitaire with it today... (not the exact quote from Hardware Unboxed, but same spirit)

Market jargon and buyer manipulation, plain and simple.
 
Although, I’m not sure 16GB of VRAM makes sense for a low-end Nvidia GPU (like the 4060 Ti); as tests have shown, the 16GB of VRAM makes little difference on such a low-end card
The 4060 Ti 16GB is a bit of a misnomer. The problem with it is not so much the VRAM amount, but the low-end memory bus. At 128-bit, that GPU is just hobbled; the core count gives it its performance. Naff above 1080p.
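The bus-width point can be made concrete: peak memory bandwidth is bus width × per-pin data rate ÷ 8. The 18 Gbps and 14 Gbps GDDR6 speeds below are the commonly quoted figures for these cards, used here as assumptions:

```python
# Peak memory bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8 bits-per-byte.

def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

# 4060 Ti: 128-bit bus, 18 Gbps GDDR6
# 3060 Ti: 256-bit bus, 14 Gbps GDDR6
print(bandwidth_gbs(128, 18))  # 288.0 GB/s
print(bandwidth_gbs(256, 14))  # 448.0 GB/s
```

Despite the newer, faster memory chips, the halved bus leaves the 4060 Ti with notably less raw bandwidth than its predecessor, which is exactly what shows up above 1080p.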
 
Radeon VII was mostly intended to be a professional workstation card which is why it had 16GB of HBM2. The VRAM didn't have much (if any) effect on gaming workloads.

Cards with excess VRAM aren't a problem but cards with insufficient VRAM are.
 
Nvidia didn't want to put huge amounts of VRAM on the consumer cards, as professionals would just buy those if they were cheaper. It's only now that the RTX 6000 has 48GB of VRAM that they're likely to offer more at the top end. https://www.nvidia.com/en-au/design-visualization/rtx-6000/
The RTX 6000 costs $15k here; I know a 4090 is cheaper. I'd guess the 5090 will be somewhere between them in price, so I can't see it having that much VRAM, or it will be cut down in some way so it's not a good option.
I think AMD already offered higher VRAM amounts on their Pro cards, so they could offer 24GB on one card without disturbing their lineup.
 
Nvidia didn't want to put huge amounts of VRAM on the consumer cards, as professionals would just buy those if they were cheaper. It's only now that the RTX 6000 has 48GB of VRAM that they're likely to offer more at the top end. https://www.nvidia.com/en-au/design-visualization/rtx-6000/
The RTX 6000 costs $15k here; I know a 4090 is cheaper. I'd guess the 5090 will be somewhere between them in price, so I can't see it having that much VRAM, or it will be cut down in some way so it's not a good option.
I think AMD already offered higher VRAM amounts on their Pro cards, so they could offer 24GB on one card without disturbing their lineup.
That’s a fair point. You changed your profile picture, why?
 
For example, I thought the Radeon VII with its 16GB of VRAM would be good in 2023, given that games are using more and more VRAM, but the GPU itself can't keep up with its 16GB of VRAM. My real question is why older GPUs have more VRAM than they need, when by the time it's needed, the GPU won't be able to keep up.
The issue with PC hardware is the sheer number of variations, so game developers will aim for a middle-ground GPU utilization.

Most cards back in 2019 were 6-8GB in the lower price segment, which is where most of the PC user base is, so developers are going to develop games with that as their bar. Game development takes years, so most games from that period aren't going to demand more.


The Radeon VII was $699; most people probably didn't fork out for that in 2019.
However, my 2060 Super, which was around $330-400 at the time, is handling games better than the VII, because those games were probably in development back in 2019.


The good thing about cards like the 3060 12GB, in the £250-300 range, is that they will potentially be more widely adopted. I reckon 12GB will be the norm in newer games because of cards in that £200-300 category, which means developers will be refining for those cards, cramming in as much as they can.

This problem is non-existent on consoles: with the PS5 and Xbox each having two models, developers are only programming for four devices. Consoles are helping with GPU memory usage, though, as developers are actually developing with 12GB or 16GB in mind. Though in truth, with consoles it's probably more like 8-12GB, since consoles have to share RAM.
 
IDK; in relation to both of the last comments about too much being better, let us not forget that the card manufacturer is charging you a premium amount of money so that you can look at your VRAM and think you have something that is really better. Bigger number = better product because I paid more for it?

Nope.

As I mentioned before, it is a trick, a scam pulled on people who want to think the manufacturer has their desires in mind, or that it's "more future proof". It doesn't matter how much extra you tack on if there isn't enough horsepower to utilize it.
 
As I mentioned before, it is a trick, a scam pulled on people who want to think the manufacturer has their desires in mind, or that it's "more future proof". It doesn't matter how much extra you tack on if there isn't enough horsepower to utilize it.
Exactly the point I was trying to make: you could put 256GB of VRAM on a budget GPU, but it wouldn't matter, since the GPU won't have the horsepower to utilize it.
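One rough way to see why capacity without horsepower is wasted: compare how long the GPU would take just to read its whole VRAM once against a 60 fps frame budget. The 288 GB/s figure below is a typical 128-bit GDDR6 bandwidth, used purely as an assumption for the sketch:

```python
# How long would a GPU take just to read its entire VRAM once?
# VRAM the GPU can't even touch within a few frames is effectively dead weight.

def seconds_to_read(vram_gb, bandwidth_gbs):
    return vram_gb / bandwidth_gbs

frame_budget = 1 / 60  # ~0.0167 s per frame at 60 fps

# Budget-class card (~288 GB/s) with a sane pool vs. an absurd one:
print(seconds_to_read(16, 288))   # ~0.056 s: a few frames' worth
print(seconds_to_read(256, 288))  # ~0.889 s: over 50 frame budgets
```

At that bandwidth, a 256GB pool could never be cycled through fast enough to matter in real-time rendering, no matter what the spec sheet says.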
 
IDK, probably a bit of a tangent, but this situation makes me think of all the "I have the same parts as such and such YouTuber and not getting the performance". As you go along this journey of building and tuning systems, you learn things along the way. First off, never trust (most of) the tubers. And otherwise, even outside of the "silicon lottery", we see a LOT of variation in performance. This selection of RAM might get you 2 more FPS than that, and this motherboard will give you 100MHz, and so on and so forth. It is this constant pressure telling PC gamers that their system isn't good enough.

The top three GPUs in the Steam survey right now are the 3060/1650/1060.

31% of all Steam survey participants are on 8GB of VRAM. Everything above that only accounts for 24.6% of users polled. 44% of those users are on less than 8GB, and some of those lower results are a bit surprising: nearly 4% of these folks are gaming on 1GB or LESS of VRAM. 4GB is ~11% by itself (and still climbing), 6GB is ~16% and falling. The 11/16/24GB variants are very slowly increasing, but that whole group is barely 6% of the market (by this survey). Of note, 12GB is just over 14% by itself, but falling.

It is also of note that (if I'm deciphering this correctly) just over 60% of the players polled are still on 1920x1080.

And in the meantime game producers just want people to buy and be able to play their titles. If there is one group of people paying close attention to the market realities, it is these developers...

.02
 
IDK, probably a bit of a tangent, but this situation makes me think of all the "I have the same parts as such and such YouTuber and not getting the performance". As you go along this journey of building and tuning systems, you learn things along the way. First off, never trust (most of) the tubers. And otherwise, even outside of the "silicon lottery", we see a LOT of variation in performance. This selection of RAM might get you 2 more FPS than that, and this motherboard will give you 100MHz, and so on and so forth. It is this constant pressure telling PC gamers that their system isn't good enough.

The top three GPUs in the Steam survey right now are the 3060/1650/1060.

31% of all Steam survey participants are on 8GB of VRAM. Everything above that only accounts for 24.6% of users polled. 44% of those users are on less than 8GB, and some of those lower results are a bit surprising: nearly 4% of these folks are gaming on 1GB or LESS of VRAM. 4GB is ~11% by itself (and still climbing), 6GB is ~16% and falling. The 11/16/24GB variants are very slowly increasing, but that whole group is barely 6% of the market (by this survey). Of note, 12GB is just over 14% by itself, but falling.

It is also of note that (if I'm deciphering this correctly) just over 60% of the players polled are still on 1920x1080.

And in the meantime game producers just want people to buy and be able to play their titles. If there is one group of people paying close attention to the market realities, it is these developers...

.02
Yes, but if studios want people to buy their games (which they do, because they want to make a profit), then why do they (and this is my assumption, which could be wrong) impose strict deadlines on devs that don't give them enough time to finish the game? All it does is hurt their bottom line in the long term, after all the pre-orders have gone through. People are going to start waiting for games to become more optimized, or they're not going to buy them at all.
 
I have often wondered this myself. I would think that the answer is probably some murky details based on the sheer amount of different hardware these games are going to see once released into the wild?

It isn't that I think it is a new thing, but my memory span is short... so when I think of having purchased CP77 and waiting nearly a year for it not to be (crap)... I just figured that the NPCs floating a few feet off the pavement were an indication they were probably vegans doing yoga. (Thinking of a movie title that won't come to me: the kid trying to date the girl and had to fight all the previous boyfriends.)
 
Yes, but if studios want people to buy their games (which they do, because they want to make a profit), then why do they (and this is my assumption, which could be wrong) impose strict deadlines on devs that don't give them enough time to finish the game? All it does is hurt their bottom line in the long term, after all the pre-orders have gone through. People are going to start waiting for games to become more optimized, or they're not going to buy them at all.

The recent example of a successful launch was Baldur's Gate. Other developers/executives might start taking the hint that a quality product at launch is more desirable than meeting deadlines.

It would be interesting to know the number of game launches where sales were poor due to initial impressions and bugs, but the game carried on and gained popularity later. But then there will also be many games that failed due to those problems. Something Steam might be able to produce.

It seems the only ones we hear about are the ones where the marketing worked, everyone got excited, and then was let down. And then we hear about those ones when they get fixed, but there will be many times more where development is abandoned immediately on a sales flop.
 
I have often wondered this myself. I would think that the answer is probably some murky details based on the sheer amount of different hardware these games are going to see once released into the wild?

It isn't that I think it is a new thing, but my memory span is short... so when I think of having purchased CP77 and waiting nearly a year for it not to be (crap)... I just figured that the NPCs floating a few feet off the pavement were an indication they were probably vegans doing yoga. (Thinking of a movie title that won't come to me: the kid trying to date the girl and had to fight all the previous boyfriends.)

Scott Pilgrim vs. the World
 
IDK, probably a bit of a tangent, but this situation makes me think of all the "I have the same parts as such and such YouTuber and not getting the performance". As you go along this journey of building and tuning systems, you learn things along the way. First off, never trust (most of) the tubers. And otherwise, even outside of the "silicon lottery", we see a LOT of variation in performance. This selection of RAM might get you 2 more FPS than that, and this motherboard will give you 100MHz, and so on and so forth. It is this constant pressure telling PC gamers that their system isn't good enough.

The top three GPUs in the Steam survey right now are the 3060/1650/1060.

31% of all Steam survey participants are on 8GB of VRAM. Everything above that only accounts for 24.6% of users polled. 44% of those users are on less than 8GB, and some of those lower results are a bit surprising: nearly 4% of these folks are gaming on 1GB or LESS of VRAM. 4GB is ~11% by itself (and still climbing), 6GB is ~16% and falling. The 11/16/24GB variants are very slowly increasing, but that whole group is barely 6% of the market (by this survey). Of note, 12GB is just over 14% by itself, but falling.

It is also of note that (if I'm deciphering this correctly) just over 60% of the players polled are still on 1920x1080.

And in the meantime game producers just want people to buy and be able to play their titles. If there is one group of people paying close attention to the market realities, it is these developers...

.02
Exactly my point on why most games aren't using more than 8GB: most of the time, developers will develop within a scope where they know most of the audience is, hardware-wise.
 
Yes, but if studios want people to buy their games (which they do, because they want to make a profit), then why do they (and this is my assumption, which could be wrong) impose strict deadlines on devs that don't give them enough time to finish the game? All it does is hurt their bottom line in the long term, after all the pre-orders have gone through. People are going to start waiting for games to become more optimized, or they're not going to buy them at all.

Optimizing 100 percent isn't really possible on PC, hence why there's a minimum requirement and a recommended one on most Steam pages.
There are far too many PC parts to optimize for them all. Most optimizing will be done on the previous gen's hardware when a game launches, depending on when development started. A good example would be Hogwarts Legacy: it played like arse on people's 3060/3070/3080 etc., but my 2060 Super had barely any issues apart from a bit of slowdown. High-end 4000 series cards had not long been released.

Deadlines are mostly pushed by suits, not the studios themselves; for example, EA will have a contract with a certain developer to get a game done and dusted.
 
I am in the market for an older-model GPU. I've settled on the RTX 3060 or 3060 Ti. The Ti has 8GB and the former 12GB. Given what I'm reading in your discussion, it seems the Ti would be fine, but the straight 3060 might be better in the near term.

I considered a mid-range card, but this is an upgrade to a system that is used as a streamer in our LR. My son is just dipping his toes into gaming so I thought I'd upgrade the gpu to a newer model. The PC tested well and is capable of running the games he plays. Any more insight into the benefits or lack of them regarding the memory difference in these two cards would be appreciated.