News: Why Nvidia's RTX 4080, 4090 Cost So Much

Great analysis. Thanks a lot.

You have to wonder... how long will nVidia blindly believe their most hardcore fans will keep buying whatever they feed them?

Well, regardless, as long as people keep calling them out on their (ahem) incoherence/dissonance, most people can consider themselves warned... I think?

Such great products, but such scummy behavior. Pity.

Regards.
 
There's no doubt the new RTX 4xxx series will deliver lots of performance, but the cost to build, the yields, the power needed to run them, the once-again-new connector, and the asking price don't match the post-pandemic world we are living in today.

Don't get me wrong, I understand the "it costs more and needs more power, but it finishes the work faster" argument. But not everyone can go from a 500~600 watt PSU to a 750 to 1000 watt unit at the same time they need to pay for the new GPU.
 
Last edited:

King_V

Illustrious
Ambassador
And now, @JarredWaltonGPU, because of the wording in the article's title... even though it's not exactly the same, I no longer picture you as what you really look like.

I now picture you like this:
(meme image: 6ugfya.jpg)
 

nimbulan

Distinguished
Apr 12, 2016
40
38
18,560
While silicon manufacturing costs are increasing rapidly, those costs are being greatly exaggerated. Even with the massive price increase of TSMC's 5nm node, that shouldn't add up to more than $100 extra for AD102 vs. GA102. The much smaller AD103 shouldn't add more than about $45 in cost compared to GA102 (4080 vs. 3080). AMD's chiplet approach won't save them more than $60 at the outside compared to nVidia's flagship cards; I expect other design considerations (lower TDP reducing power delivery and cooling costs, and cheaper VRAM) will bring more of the cost savings.
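As a sanity check on those deltas, here's a minimal sketch using the standard dies-per-wafer approximation. The wafer prices, yield, and die sizes are all assumptions for illustration (roughly the rumored figures: Samsung 8nm ~$5k/wafer, TSMC 5nm-class ~$13k), not confirmed contract pricing:

```python
import math

WAFER_DIAMETER_MM = 300  # standard 300 mm wafer

def gross_dies_per_wafer(die_area_mm2: float) -> int:
    """Standard approximation: wafer area over die area, minus edge losses."""
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_price: float, die_area_mm2: float,
                      yield_rate: float = 0.8) -> float:
    """Wafer price spread over the dies that actually work."""
    return wafer_price / (gross_dies_per_wafer(die_area_mm2) * yield_rate)

# Assumed wafer prices and published die sizes (illustrative only):
ga102 = cost_per_good_die(5_000, 628)    # Ampere flagship, Samsung 8nm
ad102 = cost_per_good_die(13_000, 608)   # Ada flagship, TSMC 5nm-class
ad103 = cost_per_good_die(13_000, 379)   # 4080 16GB die

print(f"GA102 ~${ga102:.0f}, AD102 ~${ad102:.0f}, delta ~${ad102 - ga102:.0f}")
print(f"AD103 ~${ad103:.0f}, delta vs GA102 ~${ad103 - ga102:.0f}")
```

Under those assumptions the deltas come out near $110 and $35, in the same ballpark as the figures above; real contract pricing could move them substantially.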
 
  • Like
Reactions: artk2219

blacknemesist

Distinguished
Oct 18, 2012
490
98
18,890
Honestly, to the end consumer it doesn't matter that costs are higher when you're putting more and more tech into something that won't even be backwards compatible.
Just give us what you can at attractive price points, don't give two entirely different cards the same name, and improve backwards compatibility so a card can still be "fresh" for another 1-2 years, which, given the price, is not asking for much.
Also, the USA is not the only country in the world, but apparently it's the only one where you can buy these at the announced prices. For the EU we already have pricing reports indicating that someone is tacking on more than MSRP plus VAT; what gives?
 
  • Like
Reactions: artk2219

elforeign

Distinguished
Oct 11, 2009
101
142
18,770
It will be interesting to see how this plays out over the next two quarters as supply ramps up, AMD and Intel come into the channel, and reviews and benchmarks come out. Nvidia gets <Mod Edit> on a lot for performance with DLSS, etc., but the truth is that the new consoles, AMD, and Intel have all followed suit with their own versions of upscaling tech because it's genuinely great when applied correctly. Game engines will also catch up to better utilize these new methods, and ray tracing, while expensive, is definitely the future.

I agree the 4000 series is overpriced at the moment. I'm glad I bought a 3080 12GB at $750, and since I only game at 2K, I'm sure it will hold up for a refresh cycle or two. I do not plan to buy anything from Nvidia this go-round. I will wait until Nvidia comes out with a new architecture, with a stack that's meant to compete with AMD and Intel and is non-monolithic.

There is no way they can continue down the monolithic-die path, given the complexity and pricing mechanics on the fabrication end and the competition from AMD and Intel.
 
Last edited by a moderator:

PiranhaTech

Reputable
Mar 20, 2021
136
86
4,660
I didn't even think of the non-compute part of the chip. I think AMD made the I/O die on Ryzen 5000 at 12nm. Even if AMD used a single compute module, that could still be an advantage. How much of a processor is actual compute? Quite a bit of it is cache, and it looks like AMD is going to put the cache on an older node for Ryzen 7000 (no idea about the GPU).
 
  • Like
Reactions: artk2219

InvalidError

Titan
Moderator
AMD and Nvidia got funny ideas from how much crypto miners were willing to pay for GPUs, designed their next-gen GPUs around the crazy sales they got from crypto, and now they are expecting normal gamers to open their wallets as wide as crypto miners once did. It also doesn't help that the market is still sitting on a large pile of unsold inventory.

Looks like AMD and Nvidia are about to get a stiff reality check.
 
Last edited:

TR909

Distinguished
Oct 7, 2009
18
9
18,515
So, since you have the die size and the wafer size, a simple calculation gives you the chips per wafer. Then assume, say, 80% working chips per wafer (20% defects), and divide the wafer cost by the working chips to get the cost per chip. Add the GDDR6X cost, the PCB, and the cooling, and there you have it. Does that add up to $1,599 (more like $2K at retail)? I really doubt it.

PS: I'm sure Tom's Hardware can do a better calculation and use its resources to get current component prices.
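For the curious, a minimal sketch of that recipe. Every input here is a placeholder assumption for illustration, not a sourced price:

```python
def card_bom(wafer_cost: float, gross_dies: int, yield_rate: float,
             memory: float, pcb_cooler: float) -> float:
    """Cost per working die, plus the other major components."""
    cost_per_good_die = wafer_cost / (gross_dies * yield_rate)
    return cost_per_good_die + memory + pcb_cooler

# E.g. a $13k wafer, ~90 gross AD102-sized dies, 80% yield,
# 24 GB of GDDR6X at ~$6/GB, and ~$150 for PCB + VRM + cooler:
bom = card_bom(13_000, 90, 0.80, memory=24 * 6, pcb_cooler=150)
print(f"Rough BoM: ${bom:.0f} against a $1,599 MSRP")  # ~$475
```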
 

elforeign

Distinguished
Oct 11, 2009
101
142
18,770
So, since you have the die size and the wafer size, a simple calculation gives you the chips per wafer. Then assume, say, 80% working chips per wafer (20% defects), and divide the wafer cost by the working chips to get the cost per chip. Add the GDDR6X cost, the PCB, and the cooling, and there you have it. Does that add up to $1,599 (more like $2K at retail)? I really doubt it.

PS: I'm sure Tom's Hardware can do a better calculation and use its resources to get current component prices.
The BoM could be logically deduced, but it's not that simple. Nvidia and TSMC enter into agreements whose details obscure the actual manufacturing price, so it's pointless to guess the BoM at this stage. Also, the MSRP accounts for more than just the BoM: R&D, yield, marketing, etc.

I think the most interesting way to gauge how much more expensive these are to make will be to see the Jan-March quarterly report and compare it year over year.
 
  • Like
Reactions: JarredWaltonGPU

lmcnabney

Prominent
Aug 5, 2022
192
190
760
PS: I'm sure Tom's Hardware can do a better calculation and use its resources to get current component prices.

I kinda did the math. TSMC wants around $4k per wafer, ~80 GPUs per wafer, ~$6/GB for GDDR6X, and probably $50 for the rest of the card and cooler. Their COGS delivered is going to be well under $300 for each 4090. How much margin do they need to cover all of the costs in the business?
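Summed out, those figures (all of them this post's assumptions, several of which are disputed further down the thread) look like this:

```python
die    = 4_000 / 80   # assumed $4k wafer, ~80 GPUs per wafer -> $50
memory = 24 * 6       # 24 GB of GDDR6X at ~$6/GB             -> $144
rest   = 50           # PCB, cooler, everything else
cogs   = die + memory + rest
print(f"COGS ~${cogs:.0f}; gross margin on a $1,599 4090 ~${1_599 - cogs:.0f}")
```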
 

JTWrenn

Distinguished
Aug 5, 2008
330
234
19,170
Moore's law is only dead if you don't innovate. Nvidia just isn't thinking in next-gen ideas; they're kind of going backwards. DLSS really is a great idea, but by keeping it closed source they're turning PCs into a version of a console, where game makers have to optimize for specific hardware. There's a layer in between, but then add-ons mean people with different cards get different performance depending on optimization. It's not good for the industry, and it's a very Nvidia idea.

I can't wait until AMD comes out and crushes them on performance per dollar. Nvidia is going to have big issues next gen when they run out of room on this completely.

One other thing: I don't believe for a second that this is only about cost concerns... or even mostly. This feels like typical corporate BS: there's inflation talk, and they're taking advantage of it to chase growth in a down investment market. I hope Intel and AMD do the opposite and rip a ton of market share away from Nvidia for this.
 

kal326

Distinguished
Dec 31, 2007
1,230
109
20,120
Intel mocked chiplets almost right up until they announced their own. Seems Nvidia figured they could ride out one or more generations of monolithic dies and let AMD test the waters first. They have the market and felt they could sell whatever at basically any price. Time will tell who was right. It certainly didn't work out so great for Intel: not a catastrophic failure, but enough bad press that it finally clicked that they had to right the ship. The takeaway I got from this is to just wait for November 3rd.
 

Co BIY

Splendid
Seems Nvidia figured they could ride out one or more generations of monolithic dies and let AMD test the waters first. They have the market and felt they could sell whatever at basically any price.

I don't see any reason why this is not right.

Any lower price and they'd only steal sales from their own 30XX series.
 

InvalidError

Titan
Moderator
I kinda did the math. TSMC wants around $4k per wafer, ~80 GPUs per wafer, ~$6/GB for GDDR6X, and probably $50 for the rest of the card and cooler.
The numbers I've seen, from a couple of TSMC price hikes ago, were $10k per wafer for N7P and $13k for N5, so we're probably talking close to $200 of wafer cost per 600 mm² monster.

Don't forget the VRM: 20+ phases at 70A each is easily above $50 in components.
 
While silicon manufacturing costs are increasing rapidly, those costs are being greatly exaggerated. Even with the massive price increase of TSMC's 5nm node, that shouldn't add up to more than $100 extra for AD102 vs. GA102. The much smaller AD103 shouldn't add more than about $45 in cost compared to GA102 (4080 vs. 3080). AMD's chiplet approach won't save them more than $60 at the outside compared to nVidia's flagship cards; I expect other design considerations (lower TDP reducing power delivery and cooling costs, and cheaper VRAM) will bring more of the cost savings.
I don't think anything that goes into a graphics card costs less today than it did two years ago at the Ampere launch. Have GDDR6 prices gone down? Substrate, PMIC, resistor, capacitor, PCB, VRM, shipping, etc. prices have all gone up. Best case, if everything else stays the same and Nvidia goes from paying $100 per GPU chip to $200 per chip, I'd expect that to result in a final product that costs $200 extra compared to the previous generation. Add in everything else and it's not really surprising to see Nvidia trying to get $900+ for the 4080 series. Sure, prices could be lower, but unless Nvidia has so many Ada GPUs that it doesn't sell out at launch, there's no reason for them to actually be lower just yet.

Now when RDNA 3 comes out, we'll see if Nvidia reacts with price cuts and maybe an RTX 4070.
I kinda did the math. TSMC wants around $4k per wafer, ~80 GPUs per wafer, ~$6/GB for GDDR6X, and probably $50 for the rest of the card and cooler. Their COGS delivered is going to be well under $300 for each 4090. How much margin do they need to cover all of the costs in the business?
I strongly doubt those prices. I suspect it's more like $8K per 4N wafer at a minimum (4N was customized for Nvidia's needs, and that increases pricing), with a maximum that might be twice that. 24GB of GDDR6X is probably $6 per 2GB chip at a minimum, which would mean at least $72 for memory; it could be more. Card, cooler, and fans: probably $75–$150, depending on the model. Now add in all the PMICs, VRMs, capacitors, resistors, and other surface-mounted devices, and I think a rough estimate of $50–$100 for all of those isn't out of the question.

So based on my estimates, it would be more like $300 as an absolute minimum (reality is probably somewhat close to $300 for a card like the RTX 4080 12GB), but the maximum would be more like double that, maybe more. Yeah, that's still $600 in base costs on a card selling for $1,600, but as Spongie points out, R&D and a lot of other things come into play, and this is the halo card for now. It was never going to be affordable.
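Rolling those ranges up (the per-wafer die count is an assumption for illustration; the other low/high figures follow the estimates in this post):

```python
# Low/high BoM estimates per component, in USD (illustrative assumptions):
components = {
    "GPU die ($8k-$16k wafer / ~90 gross dies)": (8_000 / 90, 16_000 / 90),
    "24GB GDDR6X (12 chips at $6+ each)":        (72, 144),
    "board, cooler, fans":                       (75, 150),
    "PMICs, VRM, other SMDs":                    (50, 100),
}
low  = sum(lo for lo, hi in components.values())
high = sum(hi for lo, hi in components.values())
print(f"Estimated BoM range: ${low:.0f} to ${high:.0f}")  # ~$286 to ~$572
```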
 
  • Like
Reactions: ivan_vy

ien2222

Distinguished
I kinda did the math. TSMC wants around $4k per wafer, ~80 GPUs per wafer, ~$6/GB for GDDR6X, and probably $50 for the rest of the card and cooler. Their COGS delivered is going to be well under $300 for each 4090. How much margin do they need to cover all of the costs in the business?

For TSMC, the cost per wafer at 5nm is around $17,000; $4k is about what they charge for 28nm, and I believe their 7nm is going for about $12k.
 
Looks like AMD and Nvidia are about to get a stiff reality check.
Except AMD will lower prices, unlike Nvidia.

Happened with their last big flop GPU (the one that used HBM memory, I think?).

TBH, if prices keep rising, I'm sorry, but there's a limit to what ANYONE will pay outside of server-grade stuff.

They'll just buy used or lower-tier cards. And if it gets too extreme, I could see consoles becoming the answer to the high price of gaming. (I'm PC master race, but there IS a limit.)
 
  • Like
Reactions: sherhi

TEAMSWITCHER

Distinguished
Aug 7, 2008
206
5
18,685
Both AMD and Intel are launching new processors and platforms this fall, and both vendors will be HEAVILY targeting gamers in all of their marketing. But here's the rub: if you have a decent machine that you built in the last 2-3 years, you could spend as much on a platform upgrade as one of these new RTX 4000-series cards costs, or more, and see far less benefit in the games you're playing. If you're already gaming at 4K, the difference will be maybe a handful of extra frames per second on average. So I think I may just give Nvidia my $1,600 here... and call it good.
 

TEAMSWITCHER

Distinguished
Aug 7, 2008
206
5
18,685
Except AMD will lower prices, unlike Nvidia.

Happened with their last big flop GPU (the one that used HBM memory, I think?).

TBH, if prices keep rising, I'm sorry, but there's a limit to what ANYONE will pay outside of server-grade stuff.

They'll just buy used or lower-tier cards. And if it gets too extreme, I could see consoles becoming the answer to the high price of gaming. (I'm PC master race, but there IS a limit.)

AMD doesn't lower prices... rather, they lack the features and performance to raise their prices any higher. I'm not sure saving a few dollars is always a good thing. Sometimes you're paying less because you're getting less.
 

TEAMSWITCHER

Distinguished
Aug 7, 2008
206
5
18,685
I kinda did the math. TSMC wants around $4k per wafer, ~80 GPUs per wafer, ~$6/GB for GDDR6X, and probably $50 for the rest of the card and cooler. Their COGS delivered is going to be well under $300 for each 4090. How much margin do they need to cover all of the costs in the business?

Value pricing. It's good to be king. If I were in charge at Nvidia, I would do the exact same thing.
 

RedBear87

Commendable
Dec 1, 2021
150
114
1,760
AMD started beating Intel on CPUs when it switched to chiplets, especially on cost, and now they're about to do the same for GPUs.
Beating Intel on cost in the sense that they started charging more? Do you guys remember AMD's launch MSRPs for Zen 3 compared to Alder Lake's? Or do you live in an alternate universe? You can find the info easily on Wikipedia if you've forgotten: the Zen 3 line started at $300 with the 5600X and topped out with the $799 5950X. Alder Lake, on the other hand, started as low as $42 for a Celeron G6900 (AMD can't effectively scale its Zen 3 architecture below 6 cores, leaving the whole low end to Intel on paper) and ended with the $589 12900K; even the 12900KS halo chip released later was "only" $739.
 
  • Like
Reactions: UnCertainty08