News Next-gen GPUs likely arriving in late 2024 with GDDR7 memory — Samsung and SK hynix showed chips at GTC

I'm sure Nvidia will have some "5090"-class card targeted toward AI with 24-32 GB of VRAM and a $2k+ price at some point later this year, and they'll have much more expensive workstation equivalents with more VRAM and unlocked CUDA...

But the question is: why would Nvidia ever release anything other than the highest-end AI/workstation graphics?
They know what they have, and they aim to make all the money that has ever existed in the history of mankind before the AI bubble pops. Nvidia would be losing billions in opportunity cost if they released new gaming cards this year, unless those cards were made on an older, non-competing node and were completely incapable of AI - which is not that likely.
Maybe they'll release a severely overpriced 4090 Ti or a 4090 GDDR7 to cash in on inflated aftermarket 4090 prices. Otherwise, Nvidia has said they want to be an AI company, not a graphics company, and their investors seem to agree.
 
The xx90 series doesn't compete with their AI-class cards - it opens the door to them. In my experience, people experiment with consumer cards before spending money on bigger systems. College kids can get a gaming card and 'play' with AI, and eventually bring that CUDA knowledge (and preference) to a full-time job. This is THE primary reason CUDA dominates the market.

I do agree with you about the missed opportunity cost if the consumer cards are stealing capacity on leading-edge wafers.
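
To make the "play with AI" step concrete, here's a minimal sketch of the kind of first experiment that builds that CUDA habit. It assumes PyTorch installed with a CUDA build; it's just an illustration of the on-ramp, nothing Nvidia-specific:

```python
# A toy "does my gaming card do AI?" check - the typical first step
# before anyone spends money on bigger systems. Assumes PyTorch was
# installed with CUDA support.
import torch

if not torch.cuda.is_available():
    print("No CUDA device found - CPU only.")
else:
    device = torch.device("cuda")
    print(f"Found: {torch.cuda.get_device_name(device)}")

    # One forward pass through a small linear layer on the GPU -
    # exactly the sort of tinkering that turns into CUDA preference.
    model = torch.nn.Linear(1024, 1024).to(device)
    x = torch.randn(64, 1024, device=device)
    y = model(x)
    print(f"Output shape: {tuple(y.shape)} on {y.device}")
```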
 
"NVIDIA did confirm that there will be less next gen gaming cards on the release aka there is very limited supply = profit!"

Guaranteed the scalpers out there will be back in full force.
 
A lot of this will be down-binning, I think. They won't give up much wafer space, if any, to make these cards when that space/production capacity could go to something bigger. However, if all of these are just down-binned parts, then they're using chips that don't meet spec for the big boy - one die per card, fewer memory channels enabled, things like that.

They could even come from entirely different production lines that couldn't handle the big boys. In short, I highly doubt they will sell anything below the full chip if it takes away any production capacity from it, since demand far exceeds their supply.
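
For intuition on why down-binning costs them almost nothing, here's a rough sketch using the classic Poisson yield model, yield = e^(-D*A). The defect density and die area below are made-up illustrative numbers, not Nvidia or TSMC data:

```python
# Why selling cut-down dies is nearly free: dies that fail full spec
# can still ship with the defective block disabled.
# Poisson yield model: P(k defects) = (D*A)^k * exp(-D*A) / k!
# D and A below are illustrative assumptions only.
import math

defect_density = 0.1   # defects per cm^2 (assumed)
die_area = 6.0         # cm^2, a flagship-class die (assumed)

lam = defect_density * die_area
full_spec = math.exp(-lam)            # zero defects
salvageable = lam * math.exp(-lam)    # exactly one defect

print(f"Dies passing full spec:        {full_spec:.1%}")    # ~54.9%
print(f"Dies salvageable as cut-downs: {salvageable:.1%}")  # ~32.9%
# With these numbers ~45% of dies miss full spec, and most of those
# have a single defect that can be fenced off (an SM cluster or a
# memory channel) - the binned cards come from wafers already paid for.
```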
 
Ooph. 3 GB-density chips only arriving in 2025?
It sounds like there will be no interesting GPUs for a year at minimum.

That is a long drought for interesting GPUs. Flop after flop.
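
For context on why that 3 GB (24 Gb) density matters, here's a quick back-of-the-envelope sketch: each GDDR device sits on a 32-bit slice of the memory bus, so capacity is bus width divided by 32, times per-chip density. The bus widths below are illustrative tiers, not confirmed next-gen specs:

```python
# VRAM capacity from bus width and per-chip density.
# Each GDDR chip occupies a 32-bit slice of the memory bus.
BITS_PER_CHIP = 32

def vram_gb(bus_width_bits: int, chip_gb: int) -> int:
    chips = bus_width_bits // BITS_PER_CHIP
    return chips * chip_gb

# Illustrative bus widths (roughly today's xx60/xx80/xx90 tiers).
for bus in (128, 256, 384):
    print(f"{bus}-bit bus: {vram_gb(bus, 2):2d} GB with 2 GB chips, "
          f"{vram_gb(bus, 3):2d} GB with 3 GB chips")
# 128-bit:  8 GB -> 12 GB
# 256-bit: 16 GB -> 24 GB
# 384-bit: 24 GB -> 36 GB
# Until the 3 GB modules ship in 2025, capacities are stuck at the
# 2 GB-per-chip ceiling, hence the expected drought.
```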