Nvidia Korea CEO Offers Further Explanation on RTX 4080 12GB Cancellation


Mr. Smith
If there's a 12 GB version, some people won't buy the 16 GB one. They made this decision out of fear of cannibalizing their own sales, and whatever else you read is just eyewash for consumers.
 

SethNW
Nvidia's official reason, that the name was confusing, is pretty much just an excuse to sound like the good guy. The real reason is the huge amount of Ampere stock and the price pressure the 4080 12GB would put on top-end Ampere. With a bad reputation and bad reviews pretty much guaranteed, they decided it wasn't worth the trouble when they can slightly cut the die and re-release it as a 4070/4070 Ti, so they can keep pretending it was never the 4080 12GB. Unless RDNA3 forces them to do the full die, as was intended for the 4080 12GB. Rumor-wise, they barely made any 4080 12GB cards anyway, since the original plan was supply shenanigans: let it run out of stock periodically so people who wanted a 4080 12GB consider Ampere instead. The upcoming holiday season was definitely planned around low availability, unless RDNA3 forces them to do it differently.
 
Who in nVIDIA marketing thought this "4080 16 GB" & "4080 12 GB" naming scheme was a good idea?

Jensen Huang should fire them for this debacle.

Most likely the same people/marketing team that came up with the 1060 3GB and 1060 6GB. Back then they got away with it; just not this time. Rather than firing them, Jensen is probably going to ask them to come up with an even cleverer twist in the future, since it no longer works like it did with those 1060s.

The worst part is, if nVIDIA had let this confusing naming scheme go on, it could've led to a long-running class-action lawsuit for deceptive business practices.

Something nVIDIA wants to avoid.

They probably want to avoid those suits, but that doesn't mean they want to avoid those "deceptive" tactics completely. They will push it to the very limit; this time Nvidia decided not to press its luck like it did with the 1060.

We all remember the 3.5 GB VRAM controversy from back in the day, and how there was a class-action lawsuit over that 0.5 GB of useless VRAM.

But despite all the controversy, Jensen outright said he was proud of what his team did with the GTX 970 memory configuration.
 
This is true, which is exactly why it's a BUYERS' market. There's an excess of inventory that is going down in value. The AIBs may have paid high prices for it, but Nvidia's own statements are that it expects supply to exceed demand through Q4 of its fiscal year. Nvidia's fiscal year is offset from the calendar year by about a month, however, which means late January is when it expects demand to catch up to supply.

Furthermore, if Nvidia's partners ALREADY paid for the inventory, it's no sweat to Nvidia if prices drop. That's PRECISELY WHY EVGA has "taken its ball and gone home." But the AIBs made massive profits on GPUs throughout 2020 and 2021, and even into the start of 2022, so they can certainly absorb some losses.

Finally, $22,000 per 4N wafer is probably what a small startup would pay to run some chips on that node. Nvidia, which will be ordering tens of thousands of wafers, is not going to pay anything close to that; I'd estimate Nvidia is paying $15,000 at most. If this were the upcoming N3 node, maybe $22,000 would be closer, but it's not: 4N is a revision of TSMC's N5 5nm tech that has been out for two years now. Apple's A14 Bionic has been shipping in quantity since October 2020.

Even at $22,000 per wafer, though, what does that mean? For the roughly 608 mm² AD102, it would mean about 90 chips per wafer, or a cost per chip of $244. For the smaller AD103 chips, it would be about 148 chips per wafer and just $150 per chip. Nvidia could easily sell the RTX 4090 at $999 and still make a profit, or the RTX 4080 at $700 and make a profit. It won't, because it can get much more than that for the GPUs, but it could, unless AMD comes out with something so competitive that it forces prices down.
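
For anyone who wants to check that arithmetic, here's a minimal back-of-the-envelope sketch, assuming the published die sizes (roughly 608.5 mm² for AD102 and 378.6 mm² for AD103, neither stated above) and the standard edge-loss approximation for gross dies per wafer:

```python
import math

WAFER_DIAMETER_MM = 300   # standard 300 mm wafer
WAFER_COST_USD = 22_000   # the per-wafer figure discussed above

def gross_dies_per_wafer(die_area_mm2: float) -> int:
    """Common approximation: wafer area divided by die area, minus
    an edge-loss term for partial dies around the wafer rim."""
    radius = WAFER_DIAMETER_MM / 2
    return int(
        math.pi * radius**2 / die_area_mm2
        - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2)
    )

# Die areas are published figures, treated here as approximate.
for name, area_mm2 in (("AD102", 608.5), ("AD103", 378.6)):
    dies = gross_dies_per_wafer(area_mm2)
    print(f"{name}: ~{dies} dies/wafer, ~${WAFER_COST_USD / dies:.0f} per die")
```

That prints roughly 89 dies at about $247 each for AD102 and roughly 152 dies at about $145 each for AD103, close enough to the 90/$244 and 148/$150 figures above that the remaining gap comes down to scribe-line and yield assumptions.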
I think you are forgetting R&D costs, which are very high in the graphics card business, since they only have one season to sell before the next generation comes out. Most companies outside of graphics have years or even a decade to spread out R&D costs and lower MSRPs.
 
Oh, I didn't "forget" them, though I didn't explicitly mention them here. I've talked about R&D before, like in this piece: https://www.tomshardware.com/news/why-nvidias-4080-4090-cost-so-damn-much

People can wish for less expensive enthusiast-tier GPUs all they want, but if wishes were fishes, then I'd need a much bigger boat! Look at the prices people are paying on eBay, and it's pretty obvious that $1,600 was not actually too high a launch price. The cards may not sustain that over the long haul, but for the first month or two it was well within "reason" for recovering costs, R&D, etc.