GDDR6 VRAM Prices Plummet: 8GB of Memory Now Costs $27

Apr 1, 2020
Before GPU prices skyrocketed, it was the norm for a card with double the VRAM to carry about a $50 price premium, which is why TH and other publications would always advise in their reviews not to cheap out and go with the smaller-capacity card.
 

cypeq

Distinguished
Nov 26, 2009
Current generation cards only use 2GB ICs.
The entire article hangs on the thread that 2GB ICs have a strong price relation to 1GB ones.
That is purely a guess, because demand for 2GB chips is high while demand for 1GB ones is low,
and market prices are strongly driven by demand, not only production cost.
 

bit_user

Polypheme
Ambassador
I would just add that AMD and Nvidia have to be sure that they're spec'ing products which remain profitable to make, even when component prices recover towards long-term trends. So, I don't think we're going to see a wave of mismatched cards with low-end GPUs and huge amounts of RAM. That just doesn't make a lot of sense.

If anything, we might see more examples like Intel's A770 and the RTX 4060 Ti, with both an 8 GB and a 16 GB version of the card. If GDDR6 prices rebound too much, the 16 GB version can go out of production and their board partners can just focus on the 8 GB version.

In the meantime, maybe they can take advantage of the low memory pricing by providing discounts on existing models to end consumers.
 

bit_user

Polypheme
Ambassador
Current generation cards only use 2GB ICs.
The entire article hangs on the thread that 2GB ICs have a strong price relation to 1GB ones.
That is purely a guess, because demand for 2GB chips is high while demand for 1GB ones is low,
and market prices are strongly driven by demand, not only production cost.
Not only that, but if the article is advocating for even higher-memory GPUs, then maybe it should include pricing data on 4 GB chips, since that won't add board costs like making double-sided PCBs would.
 

Geef

Distinguished
I think I'd pay an extra 100 dollars to buy a 32GB video card. There should be options, like at a fast food restaurant.

SMALL = 8GB
MEDIUM = 16GB
LARGE = 32GB

You pick the card type... "Yeah I want a 4090... Medium size, with high clock speeds. Can I get that with Sweet and Sour sauce?"
 
So what is Nvidia's excuse for being so stingy with VRAM on the 4000-series cards? In theory, the price difference between the announced 16GB RTX 4060 Ti and the already-launched 8GB version should be under $50, right?
 
So what is Nvidia's excuse for being so stingy with VRAM on the 4000-series cards? In theory, the price difference between the announced 16GB RTX 4060 Ti and the already-launched 8GB version should be under $50, right?
Their narrow bus width means that they'd have to clamshell memory like they did with the 3090. I'd be surprised if the extra engineering cost of doing so isn't the same as, or higher than, the cost of the memory itself.

That's not to say their current pricing is good, or that there shouldn't be higher capacities available; it's just not as simple as "just add some VRAM".
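The bus-width constraint being described can be sketched numerically. The 32-bit chip interface is standard GDDR6; the helper function and the example cards are just an illustration:

```python
# Why a narrow memory bus limits VRAM options: GDDR6 chips each present a
# 32-bit interface, so a GPU's bus width fixes how many chips it can drive.
GDDR6_CHIP_WIDTH_BITS = 32

def vram_options_gb(bus_width_bits: int, chip_capacity_gb: int = 2):
    """Capacities reachable normally vs. in clamshell (chips on both PCB sides)."""
    chips = bus_width_bits // GDDR6_CHIP_WIDTH_BITS
    return chips * chip_capacity_gb, 2 * chips * chip_capacity_gb

# RTX 4060 Ti (128-bit bus, 2 GB chips): 4 chips -> 8 GB, or 16 GB only via clamshell.
print(vram_options_gb(128))     # (8, 16)
# RTX 3090 (384-bit bus, 1 GB chips): 12 GB, or 24 GB via clamshell.
print(vram_options_gb(384, 1))  # (12, 24)
```

With only 2GB chips available, the 128-bit bus leaves nothing between 8 GB and a full clamshell 16 GB board.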
 

RedBear87

Commendable
Dec 1, 2021
Basically you asked your own question and then answered it, nice. Here in Italy there's an old journalist (Gigi Marzullo) who hosts a late-night program where he asks his guests to do exactly that.

The RTX 4060 Ti 16GB might be, to a large extent, just Nvidia's answer to that silly boy who runs AMD's marketing, who pointed out that you can get a previous-generation RDNA2 card with 16GB for $500; it will probably sell for exactly $500.

Neither Nvidia nor AMD will sell $300-400 GPUs with 16GB VRAM, in part because they planned their lineup when prices were much higher. We'll have to wait for more refreshes or possibly even the next generation, assuming that there won't be yet another crisis that will raise memory prices in the nearish future...
 

sherhi

Distinguished
Apr 17, 2015
The whole market, until recently, was a duopoly, and a third player doesn't change much. It almost feels like they collude to avoid disturbing each other into an all-out price war. They know very well that 8GB cards are entry-level 1080p cards (which is by no means "midrange"; I don't know why journalists eat up that marketing BS), and they artificially force you to pay a premium for a standard feature (they can get away with it because every producer does the same). It's actually brilliant: either you spend more now (they win), or you spend more money much sooner than you'd want to in the next 2-3 years (they win again). Basically, they are shortening the lifespan of their products, which is ecologically shady at best.
 

virgult

Distinguished
Jan 20, 2013
To me the answer is obvious: market segmentation. They don't want to repeat a situation where professionals and AI data centres buy the RTX 2080 rather than the ex-Quadro RTX 6000. Crippled VRAM (and bus width) is even more effective than the ban on blower cards at telling professionals that those cards are not for them.
 

hannibal

Distinguished
Not only that, but if the article is advocating for even higher-memory GPUs, then maybe it should include pricing data on 4 GB chips, since that won't add board costs like making double-sided PCBs would.

The problem is that 2GB (16Gb) chips are the biggest on the market now. We are waiting for 3GB chips to enter production somewhere in the future, and 4GB chips are far, far off...

So even if 8GB of VRAM costs under $30, the clamshell design alone pushes that to $60 or more... so near $100 including the profit margin.
We need bigger memory chips for cheaper memory upgrades, and we just don't have them yet...
When the 3060 was released, it was supposed to use 1GB chips, i.e. be a 6GB GPU, but Nvidia backed down and used the big 2GB chips to get 12GB of memory. The 4060 already uses 2GB chips, so there is no easy upgrade path...
Also, 3GB chips will most likely be rather expensive when they are first released! Prices will drop eventually, but because of that, the 5060 will most likely also ship with 8GB of VRAM; a 12GB version using 3GB chips is possible, but substantially more expensive...
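As a rough illustration of that arithmetic: the ~$27-per-8GB spot price comes from the article, but the clamshell overhead and retail markup below are assumed figures for the sketch, not anything the vendors have disclosed.

```python
# Back-of-envelope cost of shipping an extra 8 GB of GDDR6.
SPOT_PRICE_PER_8GB = 27.0   # $ (spot price from the article)
CLAMSHELL_OVERHEAD = 30.0   # $ assumed: double-sided PCB, routing, validation
MARGIN_MULTIPLIER = 1.5     # assumed markup applied to the added BOM cost

def retail_delta_for_extra_8gb(clamshell_needed: bool) -> float:
    """Estimated retail price bump for 8 GB more VRAM under the assumptions above."""
    bom = SPOT_PRICE_PER_8GB + (CLAMSHELL_OVERHEAD if clamshell_needed else 0.0)
    return bom * MARGIN_MULTIPLIER

# Bigger chips (no board changes) vs. a clamshell board:
print(f"bigger chips: ${retail_delta_for_extra_8gb(False):.2f}")  # $40.50
print(f"clamshell:    ${retail_delta_for_extra_8gb(True):.2f}")   # $85.50
```

Under these assumptions, cheap memory alone gets you close to a $50 price gap, but needing a clamshell board pushes the delta toward the $100 mark described above.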
 

Kamen Rider Blade

Distinguished
Dec 2, 2013
The problem is that 2GB (16Gb) chips are the biggest on the market now. We are waiting for 3GB chips to enter production somewhere in the future, and 4GB chips are far, far off...
GDDR6 is going to have to live with 1 GiB / 8 Gib or 2 GiB / 16 Gib.

GDDR7 is rumored to be getting 3 GiB / 24 Gib Chip support.

Rumor has it that JEDEC won't bother moving support for 24 Gib to GDDR6.
 

InvalidError

Titan
Moderator
Rumor has it that JEDEC won't bother moving support for 24 Gib to GDDR6.
JEDEC is only a baseline standard for shops that need to build stuff that can accommodate whatever random memory they can source at a given moment.

For large manufacturers like AMD, Intel, and Nvidia, who bake tweaked details for every memory chip they officially support into their GPU silicon: if they want 24Gb GDDR6, they just have to agree on where to put the extra address bit, and memory manufacturers will be happy to oblige for a small premium and a sufficiently large volume commitment.
 

bit_user

Polypheme
Ambassador
JEDEC is only a baseline standard for shops that need to build stuff that can accommodate whatever random memory they can source at a given moment.

For large manufacturers like AMD, Intel, and Nvidia, who bake tweaked details for every memory chip they officially support into their GPU silicon: if they want 24Gb GDDR6, they just have to agree on where to put the extra address bit, and memory manufacturers will be happy to oblige for a small premium and a sufficiently large volume commitment.
Yeah, like I think GDDR5X wasn't an official standard, but rather something Micron did specifically for Nvidia.
 

domih

Reputable
Jan 31, 2020
At this price, I'd love to see an OEM making main PC memory using GDDR6 with some additional circuitry adapting it to the memory controller on the CPU.
 

bit_user

Polypheme
Ambassador
At this price, I'd love to see an OEM making main PC memory using GDDR6 with some additional circuitry adapting it to the memory controller on the CPU.
There are reasons besides cost why (almost) nobody ever did this. GDDR memory needs to be soldered down. Even if you're willing to accept that limitation, it has higher latency as well.

You can see the impact of trying to use GDDR memory for a CPU, here:

It's not a perfect comparison, but about as good as we're going to get.

Also, it's clear that the SoC wasn't architected to provide the CPU cores with the max theoretical memory bandwidth, but it still manages more than 2x the performance of the DDR4-based system on memory copies.

[Chart: memory copy bandwidth comparison]


At the time, it already outclassed the memory performance of DDR4-based mainstream desktops. Yet, even that wasn't enough to offset the impact of the increased latency.

[Chart: memory latency comparison]

 

InvalidError

Titan
Moderator
At this price, I'd love to see an OEM making main PC memory using GDDR6 with some additional circuitry adapting it to the memory controller on the CPU.
GDDR6 has higher latency than good DDR5, and a complete PHY-MAC converter would add a heap more latency on top. At the absolute bare minimum, you are adding at least two cycles to clock commands and data in and out, at least two cycles to gather both halves of each command and rearrange all of the address bits, and possibly another cycle or two of slack to accommodate command/data alignment conflicts between the two standards. You also still wouldn't be able to run the DDR5 bus much faster than the best DDR5 memory can natively run without all of the conversion overhead and LN2 cooling on the CPU.

It can probably be done, but the net outcome might be something equivalent to DDR5-8000-50-70-70-120 if you have a CPU and motherboard combination that can clock the memory that high in the first place.
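To put that hypothetical DDR5-8000-50 figure in perspective, first-word CAS latency can be converted to nanoseconds. This is only a sketch; the DDR5-6000 CL30 comparison point is my own assumption of a typical native kit, not something from the post above.

```python
# Convert CAS latency from cycles to nanoseconds for DDR-style memory,
# where the I/O clock runs at half the transfer rate (two transfers/clock).
def cas_ns(transfer_rate_mtps: float, cas_cycles: int) -> float:
    io_clock_mhz = transfer_rate_mtps / 2
    return cas_cycles / io_clock_mhz * 1000  # MHz -> ns conversion

print(cas_ns(8000, 50))  # hypothetical GDDR6-behind-a-converter: 12.5 ns
print(cas_ns(6000, 30))  # assumed typical native DDR5-6000 CL30: 10.0 ns
```

So even in the optimistic estimate, the converted setup would trade a few extra nanoseconds of latency for bandwidth that native DDR5 can already approach.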
 
GDDR6 is going to have to live with 1 GiB / 8 Gib or 2 GiB / 16 Gib.

GDDR7 is rumored to be getting 3 GiB / 24 Gib Chip support.

Rumor has it that JEDEC won't bother moving support for 24 Gib to GDDR6.
The original spec allowed for up to 32 Gb (you can see this in some Micron documentation), but the fact that we haven't seen it yet given the datacenter need for VRAM leads me to believe manufacturing is the problem.