Discussion "AI" GPU with lots of VRAM and mid-tier specs

MCH170

Oct 13, 2024
Would there be a market for an entry-level "AI" GPU from either Nvidia or AMD with lots of VRAM (24-48GB, not necessarily the fastest/newest kind) and mid-tier processing power, with all other non-essential stuff like encoders/decoders, display outputs, etc. removed?

What do you think?
 
Would there be a market?

Start by posting what you think.

What, why, etc. and cite some references.

Provide vetted links from reputable sources/writers and reviews.

Seems to be a homework-like question, and per forum rules we do not do homework.

Or research projects.
 
It's not homework or research. It is an idea I had yesterday and I wanted to hear some opinions. From my personal experience running local AI models, VRAM is the limiting factor, and performance deteriorates significantly when the model starts overflowing into system RAM.
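To put the VRAM-as-limiting-factor point in rough numbers: the weights alone dominate the footprint, at roughly one GB per billion parameters per byte of quantization. A minimal sketch (the fixed overhead for activations/KV cache is a loose assumption, not a measured figure):

```python
# Back-of-the-envelope VRAM estimate for running an LLM locally.
# Illustrative only -- real usage depends on context length, framework, etc.
def estimate_vram_gb(params_billions: float, bytes_per_weight: float,
                     overhead_gb: float = 2.0) -> float:
    """Approximate VRAM needed: weights plus a fixed overhead (assumed)."""
    weights_gb = params_billions * bytes_per_weight  # 1B params * 1 byte ~= 1 GB
    return weights_gb + overhead_gb

# A 13B model at 8-bit quantization needs roughly 15 GB -- already past a
# 12 GB mid-range card, so part of it spills into much slower system RAM.
print(estimate_vram_gb(13, 1.0))   # → 15.0
print(estimate_vram_gb(70, 0.5))   # 70B at 4-bit: roughly 37 GB
```

This is why a hypothetical 24-48GB card matters even with mid-tier compute: once the model fits entirely in VRAM, token generation stays GPU-bound instead of stalling on PCIe transfers.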
 
Yes, but they won't build it. It's easier to sell the workstation cards at four times the cost to those who want them.

It is one of the reasons a lot of professionals buy the 3090/4090/5090 in the first place. They have their customers exactly where they want them.

The 4060 Ti 16GB is about as much of a bargain as you can get as well.

Nothing is stopping you from getting out a soldering station and swapping in bigger memory chips. It is happening all over Asia as a way to run AI models on the cheap. Of course, they are targeting older cards like the 3090, where it makes more sense.
 