> ACCOUNTANTS obviously.

Very unlikely accountants are responsible for branding, model naming, and corporate strategy. They usually report the numbers and create forecasts based on the decisions others make, but they don't make these types of decisions.
Who in nVIDIA marketing thought this "4080 16 GB" & "4080 12 GB" naming scheme was a good idea?
Jensen Huang should fire them for this debacle.
The worst part is, if nVIDIA had let this confusing naming scheme go on, it could've led to a long-term class-action lawsuit for deceptive business practices.
Something nVIDIA wants to avoid.
We all remember the 3.5 GB VRAM controversy from back in the day, and how there was a class-action lawsuit over that 0.5 GB of useless VRAM.
> I think you are forgetting R&D costs which are very high in the graphics card business since they only have one season to sell before the next gen comes out. Most companies outside of graphics have years or even a decade to spread R&D costs to lower MSRP.

This is true, which is exactly why it's a BUYERS' market. There's an excess of inventory that is going down in value. The AIBs may have paid high prices for it, but Nvidia's own statements are that it expects supply to be higher than demand through Q4 of the fiscal year. Nvidia's fiscal year is basically aligned with the calendar year, however, which means January-ish is when it expects demand to catch up to supply.
Furthermore, if Nvidia's partners ALREADY paid for the inventory, it's no sweat to Nvidia if prices drop. That's PRECISELY WHY EVGA has "taken its ball and gone home." But the AIBs made massive profits on GPUs throughout 2020 and 2021, and even into the start of 2022, so they can certainly absorb some losses.
Finally, $22,000 per 4N wafer is probably what a small startup would pay if it wanted to do some chips on that node. Nvidia, which will be ordering tens of thousands of wafers, is not going to pay such prices, not even close. I'd estimate Nvidia is paying at most $15,000. If this were the upcoming N3 node, maybe $22,000 would be closer, but 4N is a revision of the N5 5nm tech that has been out for two years now. Apple's A14 Bionic chip had been shipping in quantity since October 2020.
Even at $22,000 per wafer, though, what does that mean? For the AD102, that works out to about 90 chips per wafer, or a cost per chip of roughly $244. For the AD103 chips, it would be about 148 chips per wafer and just $150 per chip. Nvidia could easily sell the RTX 4090 at $999 and still make a profit, or the RTX 4080 at $700 and make a profit. It won't, because it can get much more than that for the GPUs, but it could. Unless AMD can come out with something so competitive that it forces prices down.
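The per-chip arithmetic above can be sketched as a toy calculation, using only the wafer price and chips-per-wafer counts quoted in this comment (it deliberately ignores yield losses, binning, and packaging costs, which would raise the effective per-chip cost):

```python
# Rough per-chip cost from wafer price and good dies per wafer.
# The 90 (AD102) and 148 (AD103) chip counts are the ones cited above.
def cost_per_chip(wafer_price_usd: float, chips_per_wafer: int) -> float:
    return wafer_price_usd / chips_per_wafer

ad102 = cost_per_chip(22_000, 90)   # AD102 (RTX 4090)
ad103 = cost_per_chip(22_000, 148)  # AD103 (RTX 4080 16 GB)

print(f"AD102: ~${ad102:.0f} per chip")  # ~$244
print(f"AD103: ~${ad103:.0f} per chip")  # ~$149
```

At a negotiated $15,000 per wafer, the same division drops those figures to roughly $167 and $101 per chip, which is why the silicon itself is a small fraction of a $1,599 card's price.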
> I think you are forgetting R&D costs which are very high in the graphics card business since they only have one season to sell before the next gen comes out. Most companies outside of graphics have years or even a decade to spread R&D costs to lower MSRP.

Oh, I didn't "forget" them, though I didn't explicitly mention them here. I've talked about R&D before, like in this piece: https://www.tomshardware.com/news/why-nvidias-4080-4090-cost-so-damn-much
> Very unlikely accountants are responsible for branding, model naming and corporate strategy. They usually report the numbers and create forecasts based on the decisions others make but don't make these types of decision.

The joke... -----------------------------> You