News Why Nvidia's RTX 4080, 4090 Cost so Much


lmcnabney

Prominent
Aug 5, 2022
For TSMC, the cost per wafer at 5nm is around $17,000; $4k is about what they charge for 28nm, and I believe their 7nm is going for about $12k.

Nobody pays rack sale prices - especially not big players like AMD and Nvidia. If you have a custom design and want to run 50 wafers - yeah, that's your price. The big players are paying a small fraction of that.
 

JTWrenn

Distinguished
Aug 5, 2008
Both AMD and Intel are launching new processors and platforms this fall. Both vendors will be HEAVILY targeting gamers in all of their marketing. But here's the rub... If you have a decent machine that you have built in the last 2-3 years, you could spend as much or more than these new 4000 series RTX cards and see far less benefit in the games that you are playing. If you are gaming at 4K already, then the difference will be maybe a handful of extra frames per second on average. So, I think I may just give Nvidia my $1600 here... and call it good.

I don't think the price-per-performance argument you are making here will hold up when the new AMD cards hit. They moved past Nvidia architecturally by going to chiplets. I think AMD is going to trounce Nvidia this gen in price per performance, and that is 100% where the rubber meets the road. Nvidia will likely hold the overall performance crown... but maybe not, if AMD's design scales well on the high end, because they could just make a monster packed with a billion chiplets.

This is feeling a bit like Intel vs. Apple to me. One made the investment and the architecture jump early, and the other is playing catch-up. Edit: because they had a lead and milked their current tech too long.
 
Last edited:
People need to remember the Radeon HD 4870.

I wish AMD went back to that time in how they did things instead of trying to blow up their cards for a few extra frames. Same with nVidia, mind you.

Regards.
 

ien2222

Distinguished
Nobody pays rack sale prices - especially not big players like AMD and Nvidia. If you have a custom design and want to run 50 wafers - yeah, that's your price. The big players are paying a small fraction of that.

Well, sure. But I extremely doubt it's $4k; I figure Nvidia is paying around $8k-$10k, and Jarred is guessing around $8k or so. I've usually heard the discount they receive is somewhere in the 40-60% range.
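For rough numbers, that discount math works out as follows. This is a minimal sketch using the thread's figures (a ~$17k 5nm list price and a 40-60% volume discount), which are forum estimates rather than confirmed TSMC pricing:

```python
# Rough effective wafer prices for large customers, using the thread's
# guesses: ~$17k list for TSMC 5nm and a 40-60% volume discount.
LIST_PRICE_5NM = 17_000  # USD per wafer (thread's estimate, not confirmed)

def effective_price(list_price: float, discount: float) -> float:
    """Wafer price after a fractional volume discount."""
    return list_price * (1 - discount)

low = effective_price(LIST_PRICE_5NM, 0.60)   # deepest discount
high = effective_price(LIST_PRICE_5NM, 0.40)  # shallowest discount
print(f"Effective 5nm wafer price: ${low:,.0f} - ${high:,.0f}")
# Effective 5nm wafer price: $6,800 - $10,200
```

Which happens to land right on the $8k-$10k guess above.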
 

SunMaster

Commendable
Apr 19, 2022
AMD doesn't lower prices... rather they lack the features and performance such that they cannot raise their prices any higher. I'm not sure saving a few dollars is always a good thing. Sometimes, you are paying less because you are getting less.

I think it's extremely wishful thinking that AMD lacks features and performance. Prior to the nv4k launch, the world suddenly forgot about DLSS 1/2 due to AMD's FSR, which worked on all cards. All of a sudden, Nvidia's "3.0" happens to only work on the latest gen. I'm not buying it; there's no reason AMD won't/can't keep improving FSR and nullifying the "advantage" Nvidia has with their interpolated frames technology.

I'm quite sure AMD's chiplet technology has enabled a far cheaper manufacturing process for them. In addition, they lack the stockpile of previous-generation GPUs to get rid of. Nvidia is maybe about to enter the same <Mod Edit> Intel's been in the last few years. Both companies due to their own arrogance.
 
Last edited by a moderator:
Beating Intel on cost in the sense that they started charging more? Do you guys remember AMD's launch MSRPs for Zen 3 compared to Alder Lake's? Or do you live in an alternative universe? You can find the info easily on Wikipedia if you've already forgotten: the line started from the $300 5600X and ended with a $799 5950X. Alder Lake, on the other hand, started from as low as $42 for a Celeron G6900 (AMD can't effectively scale down its Zen 3 architecture below 6 cores, leaving on paper the whole low end to Intel) and ended with a $589 12900K; even the 12900KS halo chip released later was "only" $739.
Ryzen is the only reason Intel has reasonable prices at all (to the point they actually cut prices of stuff... which they hadn't done in ages).

Competition means better prices.

Intel used to charge a ton for what you got... AMD had nothing near their performance, and Intel could charge the fee.

AMD is now in that position. They are unmatched in anything multi-core, and the world is more multithreaded now than ever before (so it matters a lot).

Neither Intel, Nvidia, nor AMD is a charity. They are businesses and they have investors.
They will charge as much as they can get away with.

About the low end... yes, AMD ditched it. Not much you can say about that (it's not the greatest-profit sector).

And they do have a sub-6-core part coming in the 5100 (a 4-core/8-thread CPU on Zen 3), and it will likely be around $100 or less (given the price for the next tier up is $129).

And look at the 12900K vs the 5800X3D:
Intel's is $580+
AMD's? $420

In anything that can use the cache, it beats the Intel CPU.
Outside that? They trade wins (depending on what you're doing).

Again, neither one is a charity.
 
Beating Intel on cost in the sense that they started charging more? Do you guys remember AMD's launch MSRPs for Zen 3 compared to Alder Lake's? Or do you live in an alternative universe? You can find the info easily on Wikipedia if you've already forgotten: the line started from the $300 5600X and ended with a $799 5950X. Alder Lake, on the other hand, started from as low as $42 for a Celeron G6900 (AMD can't effectively scale down its Zen 3 architecture below 6 cores, leaving on paper the whole low end to Intel) and ended with a $589 12900K; even the 12900KS halo chip released later was "only" $739.
Of course, when did Ryzen 5000 series chips launch compared to Alder Lake? That's right: November 5, 2020 for the AMD 5000 series, and exactly one year later for the Alder Lake 12900K. There's a reason Intel dropped generational pricing, and it's that AMD had actually launched a competitive CPU. Core 11th Gen Rocket Lake CPUs did not do well at all against Zen 3. Intel hit AMD hard with Alder Lake.

To the other point on pricing a leading product, though, I absolutely agree. If RDNA 3 actually has better performance than Ada Lovelace, AMD isn't going to price it far below the RTX 40-series. If AMD can match RTX 4090 performance, it will probably charge $1,200 at least.

And we still need to factor in DLSS 2/3, which are used in far more games (DLSS 2 specifically) than FSR 2/2.1 right now. It's good that AMD has an alternative upscaler, but it needs more adoption. If DLSS 3 can double the FPS and even look close to native, though, that's going to be hard to beat. I frankly have no problem with the idea of frame generation that doesn't depend on the CPU doing a bunch of work, and assuming it looks good I hope we see widespread support. I want widespread support for FSR 2.1 and XeSS as well! Because choice is a fine thing to have.
 

RedBear87

Commendable
Dec 1, 2021
Of course, when did Ryzen 5000 series chips launch compared to Alder Lake? That's right: November 5, 2020 for the AMD 5000 series, and exactly one year later for the Alder Lake 12900K. There's a reason Intel dropped generational pricing, and it's that AMD had actually launched a competitive CPU. Core 11th Gen Rocket Lake CPUs did not do well at all against Zen 3. Intel hit AMD hard with Alder Lake.
Intel's CPU prices weren't much above AMD's even in the previous generation, when AMD actually introduced chiplets in desktop CPUs and supposedly gained a pricing advantage because of it. The 9600K was $262, the 3600 $250; the 9700K was $374, the 3800X $399; the 9900K was $488, the 3900X $499.
Ryzen is the only reason Intel has reasonable prices at all (to the point they actually cut prices of stuff... which they hadn't done in ages).

Competition means better prices.

Intel used to charge a ton for what you got... AMD had nothing near their performance, and Intel could charge the fee.

AMD is now in that position. They are unmatched in anything multi-core, and the world is more multithreaded now than ever before (so it matters a lot).

Neither Intel, Nvidia, nor AMD is a charity. They are businesses and they have investors.
They will charge as much as they can get away with.

About the low end... yes, AMD ditched it. Not much you can say about that (it's not the greatest-profit sector).

And they do have a sub-6-core part coming in the 5100 (a 4-core/8-thread CPU on Zen 3), and it will likely be around $100 or less (given the price for the next tier up is $129).
Since the release of Alder Lake, AMD is not really unmatched in multi-core, partly because the high-end parts have those additional small cores (but even parts like the 12400 can slightly beat the equivalent 5600X in Cinebench), and Intel has a lead in single thread. What AMD really has going for it is energy efficiency, but they still benefit from the positive initial coverage of Zen 3. It's getting to the point that AMD fanboys have stopped relying so much on Cinebench because it makes Intel parts look better...

About the 5100, the last time I read about it was in March, I think? Before the release of all those obsolete 4100/4500 parts. I'm not sure it's really going to come out, and if it does, it might have a limited release like the 3300X back in its day, in my opinion. No idea about pricing, but I think you should compare it to the closest Zen 3 part, which is the 5500 at $149, so I'd guess it could fall around $120 (MSRP) or so.
 
Last edited:

Barefoot

Distinguished
May 30, 2013
Don't get me wrong, I understand the "it costs more, and it needs more power, but it finishes the work faster" argument, but not everyone can go from a 500-600 watt PSU to a 750 to 1000 watt unit at the same time they need to pay for the new GPU.

From what I'm gathering, you're going to need a much larger PSU than 1000 watts for this card! JayzTwoCents runs it on an approximately 1600 watt unit.
 

dimar

Distinguished
Mar 30, 2009
Why do people even care about the price? Nobody's forcing anybody to buy anything. I might go for the cheapest 4090 if I can sell my 3090 at a good price. But I'm thinking of going 5090 instead.
 

tamalero

Distinguished
Oct 25, 2006
The Nvidia RTX 4090 costs a lot, but part of that has to do with the design decisions Nvidia made. Here's our analysis of the AD102 die shots Nvidia has posted, as well as AD103 and AD104 thoughts, along with a breakdown of why monolithic dies are hitting the end of the road.

Why would you attempt to compare core and shader counts when the architectures are not even similar?
 
D

Deleted member 2783327

Guest
We just started to recover from RTX 30 series pricing, after two years of insane prices. Barely any relief, and then the same pricing insanity.

It's like Nvidia doesn't want people to own their new tech. "You will own an antique GPU and you will be happy." Or maybe they learned from the last two years that some people will pay absurd prices for GPUs: the "life is not worth living if I can't have the latest [insert tech]" crowd.

<sigh> I guess some of us will still be running 10th gen GPUs when humans have long since settled Mars :)
 

Red_In_The_Sky

Commendable
Jul 27, 2020
Even if Nvidia has to charge this much for the cards for various reasons, the 4080s are a ripoff, with the 4090 just getting a pass because the top model is always overpriced; the 4080 16GB is only 60% of Lovelace's resources, compared to the base 3080 being 80% of Ampere, and the 3080 Ti being 97%. It and the 4080 12GB are way more cut down than they have to be, even allowing for future Ti versions.
 

InvalidError

Titan
Moderator
Even if Nvidia has to charge this much for the cards for various reasons, the 4080s are a ripoff, with the 4090 just getting a pass because the top model is always overpriced; the 4080 16GB is only 60% of Lovelace's resources, compared to the base 3080 being 80% of Ampere
I doubt many people look at GPUs gen-on-gen based on the percentage of die they get at any given tier relative to the previous generation. Most are far more interested in how much extra performance they can get for a given price or how much performance they are getting per dollar - it doesn't matter how big or small the die is or how much of it you get to use as long as the price for the performance makes some degree of sense.
 
From what I'm gathering, you're going to need a much larger PSU than 1000 watts for this card! JayzTwoCents runs it on an approximately 1600 watt unit.

For the RTX 4090, maybe, but what I wrote was more about the RTX 4060 and 4070 lines. I have never been able to aim higher than the "70" segment (not that I ever needed to anyway).

I mainly play sim trucking games (ETS 2, ATS and SnowRunner) at 1440p @ 60Hz. I would love to play at higher refresh rates, but 60Hz is more than enough for now. Lucky me, the RTX 2070 in my system is capable of giving me that level of performance with graphics settings on very high and ultra.
 

Mr. Smith

BANNED
Dec 25, 2021
Because 3000 series cards are still expensive; if they price the 4000 series cards any lower, they will cannibalize the sales of 3000 series cards.
 

InvalidError

Titan
Moderator
Because 3000 series cards are still expensive; if they price the 4000 series cards any lower, they will cannibalize the sales of 3000 series cards.
The 3000 series' sales are already being "cannibalized" by high prices making people not buy them. Even higher prices mean even more people will not buy them harder. Something will eventually have to give, since the alternative is an even steeper sales downturn, reduced income, and a growing stock of depreciating inventory. Even if you hold stock to sell at inflated prices, everyone who holds stock still has to pay for space to store it, insurance in case something goes wrong, wages of the people managing the bloated inventory, etc. At some point, someone will want to cut their losses.
 

BILL1957

Commendable
Sep 8, 2020
I kinda did the math. TSMC wants around $4k per wafer, ~80 GPUs per wafer, ~$6/GB for GDDR6X, and probably $50 for the rest of the card and cooler. Their COGS delivered is going to be well under $300 for each 4090. How much margin do they need to cover all of the costs in the business?
Do you think the equipment on the assembly lines, the factory buildings they are housed in, the workers who man those lines, and the electricity for those factories come at $0 cost?
Do you think shipping a container of these halfway around the world is free as well?
Even when an engineer on the R&D team takes a week's vacation, those costs are covered how? By the profits made on the GPUs said engineer helped design!
The actual costs figured into a product reach a lot further than just the cost of the components that go in the box!
And oh yeah, that box, and the box art on it, weren't free either!
Look at the entire picture and you can see these companies have a lot of expenses to cover.
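For what it's worth, the quoted bill-of-materials arithmetic can be written out explicitly. This is a minimal sketch using only the figures from the quoted post ($4k per wafer, ~80 dies per wafer, ~$6/GB GDDR6X, $50 for board and cooler, plus the 4090's 24GB of VRAM); all of these are rough forum estimates, and as the reply points out, they cover components only, not R&D, logistics, or overhead:

```python
# Back-of-the-envelope cost of goods for an RTX 4090, using the figures
# quoted above (all rough forum estimates, not confirmed numbers).
wafer_price = 4_000        # USD, the quoted (heavily discounted) wafer price
dies_per_wafer = 80        # usable AD102 dies per wafer, per the quote
gddr6x_per_gb = 6          # USD per GB of GDDR6X, per the quote
vram_gb = 24               # RTX 4090 memory capacity
board_and_cooler = 50      # USD for PCB, VRM, cooler, etc., per the quote

die_cost = wafer_price / dies_per_wafer
memory_cost = gddr6x_per_gb * vram_gb
cogs = die_cost + memory_cost + board_and_cooler
print(f"Die: ${die_cost:.0f}, VRAM: ${memory_cost}, total COGS: ${cogs:.0f}")
# Die: $50, VRAM: $144, total COGS: $244
```

So the "well under $300" claim does follow from the quoted inputs; the disagreement in the thread is over what those inputs leave out.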
 

hannibal

Distinguished
You have to wonder... For how long nVidia will blindly believe their most hardcore fans will keep buying all the crap they feed them?

I have to say... a long, long time!
AMD has had a better price/performance ratio many times and still people buy Nvidia, so Nvidia doesn't have to worry about people going to buy AMD... They still buy Nvidia no matter what. Same thing as Apple vs Android phones. The price means nothing to some people. And it seems that group is the big majority! About 80% of the customers.