Question: Why can't we add our own VRAM to GPUs?

ch33r
Distinguished
Jun 13, 2010
We choose RAM modules for motherboards, so why can't they put two RAM slots on a GPU where we can choose the modules that would serve as VRAM? Why is this not a thing?
 
There are a few reasons: some technical, some commercial, some economic.
  • GPU manufacturers would rather you bought a new GPU than kept upgrading the old one yourself.
  • A GPU is designed around a fixed memory bus width, so building a card that uses only a portion of that bus just to leave headroom for expansion would add a lot of development cost.
  • You would need sockets and traces on the card for the expansion, which would significantly increase development and production costs and make an already large card even bigger.
  • By the time you actually need more VRAM, you are probably better off buying newer GPU technology anyway. The GPU core is likely already well matched to the VRAM it ships with.
  • Power requirements stipulated by manufacturers become harder to specify when each person could be drawing different amounts of power in different ways. Not only power requirements, but cooling requirements too - all of a sudden, maintaining the card's life becomes much more difficult.
  • You would also be increasing the number of things that can go wrong: warranties could suffer, compatibility could suffer, and maintenance could get worse. There are enough issues with DDR RAM compatibility (such as mixing modules) as it is, so imagine adding those problems to GPUs too.
And you'd need to be able to do all of this without taking the card apart - the costs and inefficiencies just add up.
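To see why the bus width matters so much, here's a rough sketch (the numbers are illustrative GDDR6-class figures, not tied to any specific card) of how a GPU's peak memory bandwidth follows directly from bus width and per-pin data rate - which is why the bus is fixed at design time and every channel ships populated:

```python
def gpu_memory_bandwidth_gbps(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: every pin on the bus moves
    data_rate gigabits per second; divide by 8 to convert bits to bytes."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

# Illustrative: a 256-bit bus with 16 Gbps-per-pin memory
print(gpu_memory_bandwidth_gbps(256, 16.0))  # 512.0 GB/s
```

Populating only half the bus (say, filling two of four hypothetical sockets) would halve that figure, which is one reason cards ship with every memory channel soldered and filled.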

So ultimately, it massively saves the manufacturer cost, helps standardise practice and usage, and likely helps you in the long run too!
I believe a select few cards out there have a kind of expandable memory, but they're not really aimed at general users.
 

On motherboards, we choose our own RAM and CPU. Why can't they make it so we can choose our own RAM/GPU on a graphics card, like we do on a mainboard? The memory could use small chips like laptop memory that take up very little room, and of course you would have a socket and a cooler, just like on a mainboard. Why can they do this on a mainboard but not on a GPU?
 
For the same reasons I've stated above.

It's not as simple as just adding a couple of RAM slots. It's easier on a full motherboard because of its size; redesigning a GPU's PCB to take sockets would be much harder. GPUs already have coolers, and those might not be capable of dealing with the extra heat from added VRAM - so that's even more development cost.

Laptop and desktop RAM also differ technically; it's not just a matter of size. The larger DIMM form factor allows for more features and better ICs.

And again, it suddenly opens a can of technical worms when it comes to compatibility and power requirements.
Besides, by the time you need more VRAM with that GPU, you'll probably need a newer-generation GPU anyway, so it doesn't make sense. It's the same fallacy everyone follows today of "more RAM = more speed", when that's only true if your software and hardware actually NEED more RAM.
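If you want to check whether your workload is actually VRAM-limited before assuming more memory would help, here's a minimal sketch. It assumes an NVIDIA card with the real `nvidia-smi` tool installed (the query flags shown are standard `nvidia-smi` options); the parsing helper is our own illustration:

```python
import subprocess

def parse_vram_csv(line: str) -> tuple[int, int]:
    """Parse one 'used, total' line of nvidia-smi CSV output (values in MiB)."""
    used, total = (int(v.strip()) for v in line.split(","))
    return used, total

def vram_usage() -> tuple[int, int]:
    """Return (used MiB, total MiB) for the first GPU."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_vram_csv(out.splitlines()[0])

# e.g. parse_vram_csv("3210, 8192") -> (3210, 8192)
```

If used stays well below total while you game, more VRAM wouldn't have sped anything up - which is the point above.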