News Spitballing Nvidia's GB202 GPU die manufacturing costs — die could cost as little as $290 to make

Remember that's TSMC's cost per wafer. It doesn't include dicing the wafer into dies, testing, binning, packaging, testing again, etc., or the rest of the card (more components, more packaging, more testing, HSF, etc.).
It's sort of like estimating car manufacturing cost from engine block casting cost - there's a positive correlation, but not much more.
 
Yeah, I probably should have mentioned all the additional cost points much sooner in the article, then gone in-depth on the die cost calculations later.

That said, those big dies probably aren't quite as expensive as some of us were assuming; I had a number more like $400-$450 in my head. It really is hard to say, since Nvidia's contract pricing on 4NP (and most nodes) is unknown thanks to NDA/proprietary contract protections.
 
So that's $290, plus roughly $100-200 million in research and development. Two grand seems like a steal. *sarcasm*
All that R&D, plus throwing everything at it including the kitchen sink to barely get ~30% generational gains, means the margins are even higher than usual. They've already recouped the R&D cost a thousand times over!
How do you guarantee the public pays such an atypical margin? You create FUD around scarcity and throttle stock to maximize profit per piece of silicon sold. Nvidia tactics 101!
 
The thing is, you can calculate the difference in cost at TSMC between the 4090 and 5090. Just put both ICs into the costing calculator and subtract the smaller from the larger. Then add a markup for the extra power delivery, the additional 8GB of memory, the move to GDDR7, and the profit margins for Nvidia (61%, as of 2021), AIBs (12%), and retailers (5%) to get the resulting price increase over the 4090. The cost of dicing and wiring up the chip should be included in that.
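
Purely as an illustration of how those margin figures compound (the percentages are just the ones quoted above, and the $100 cost delta is a made-up input, not a real BOM number), the margin stacking looks something like this:

```python
# Sketch of how successive gross margins turn a cost delta into a retail
# price delta. Margin figures are the ones quoted above (Nvidia ~61% gross,
# AIBs ~12%, retailers ~5%); the $100 cost delta is purely illustrative.

def stacked_price(cost_delta: float, margins: list[float]) -> float:
    """Apply each gross margin in turn: price = cost / (1 - margin)."""
    price = cost_delta
    for m in margins:
        price /= (1.0 - m)
    return price

if __name__ == "__main__":
    delta = 100.0               # hypothetical extra silicon + BOM cost, USD
    chain = [0.61, 0.12, 0.05]  # Nvidia, AIB, retailer
    print(f"~${stacked_price(delta, chain):.0f} at retail")  # roughly $307
```

In other words, every extra $100 of cost at the top of the chain shows up as roughly $300 at the register once everyone takes their cut.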

I did the same calculation with the 9070xt and figured it should cost about $100 more (at retail) than a 7800xt, if yields are 90% for both dies.
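
For anyone who wants to plug in their own numbers, the per-die cost side of both comparisons is just the wafer price spread over good dies. The wafer price, die areas, and yields below are stand-ins, not anything TSMC or the vendors have disclosed, and the dies-per-wafer formula is the usual rough approximation that ignores scribe lines and edge losses:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Common approximation for gross die count on a round wafer."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_price_usd: float, die_area_mm2: float, yield_rate: float) -> float:
    """Spread the wafer price over the dies that actually work."""
    return wafer_price_usd / (dies_per_wafer(die_area_mm2) * yield_rate)

# Stand-in numbers only -- actual contract pricing and yields are not public.
wafer = 18_000.0
print(f"Big die (~750 mm^2, 70% yield): ${cost_per_good_die(wafer, 750.0, 0.70):.0f}")
print(f"Smaller die (~600 mm^2, 80% yield): ${cost_per_good_die(wafer, 600.0, 0.80):.0f}")
```

Swap in whatever wafer price, die size, and yield you believe and the per-die delta falls out; the margin stacking above then tells you what that delta should do to the shelf price.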
 
Partner card MSRPs tell us everything we need to know about the margins Nvidia is making on these products. The lowest known MSRP on any partner card so far is $2,200 (that may change at the official launch, but with limited availability I assume stores are going to happily take a premium). There's no doubt that Nvidia's FE cooling and board design cost more than the partner ones, yet the FE still carries a $2,000 MSRP.

It also sounds like GDDR7 availability isn't great right now, so AIBs are getting the memory bundled with the GPUs from Nvidia.

It's extremely unlikely that any material cost explains the $400 difference between the 4090 and 5090. The board partner MSRP gap is also larger than we've seen in the past. All of this really seems to indicate that Nvidia wants even higher margins than it got with the 40 series. They're able to do this thanks to their market dominance, of course, but it seems very unhealthy for the industry as a whole.

For as much as people love to complain about the ~8 years of Intel quad cores, Intel never really raised the price point (i7-920/860 ~$284, i7-7700K ~$339). In the graphics space everyone is being told to pay more (and sometimes to get less), which is just not sustainable, and something I'm not sure any of the modern companies understand.