Why Nvidia's RTX 4080, 4090 Cost So Much


lmcnabney

Prominent
Aug 5, 2022
Do you think that the equipment on the assembly lines, the factory buildings they're housed in, the workers to man those lines, or the electricity for those factories come at $0 cost?
Do you think shipping a container of these halfway around the world is free as well?
Even when an engineer on their R&D team takes a week's vacation, how are those costs covered? By the profits made on the GPUs said engineer helped design!
The actual costs figured into a product reach a lot further than just the cost of the components that go in the box!
And oh yeah, that box and the box art on it weren't free either!
Look at the entire cost and you can see these companies have a lot of expenses to cover.
It is almost like you didn't read the last sentence in my post. You know, the one that indicated that the margin had to cover all of the costs of the business.
 
If it is a true 8nm to 4nm move and the die size is roughly the same, then you could have close to 16x the transistors. Yet only a 2x improvement? Something's not passing the sniff test. Not that I'm doubting the die size, but that is a horrible decrease in efficiency. Something is wrong.
 
Why would you attempt to compare core and shader counts when the architectures aren't even similar?
LOL. Everything Nvidia has said about core and shader counts basically indicates that Ada is refined Ampere, on a new process node, with some enhancements that will specifically require developer intervention.

SER? Needs the dev to use NVAPI for it. OMM? Same. DMM? Same. DLSS 3? Same. Why would we not look at and talk about theoretical performance? ¯\_(ツ)_/¯

We're not talking about Nvidia cores vs AMD cores vs Intel cores. We're talking about Nvidia current gen cores vs next gen cores. Is it perfectly apples to apples? No, of course not. But this is about as close as you'll get to having teraflops actually mean something. And it's a lot different than Ampere vs. Turing where FP32 pipelines were doubled on each SM.
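
For anyone who wants to reproduce the teraflops comparison: theoretical FP32 throughput is just shader cores × boost clock × 2 (one fused multiply-add per core per clock). A minimal sketch, treating the published launch core counts and boost clocks as assumed inputs:

```python
# Theoretical FP32 TFLOPS = shader cores * boost clock (GHz) * 2 FLOPs per clock (FMA).
# The core counts and boost clocks below are assumed launch specs, not measured values.

def fp32_tflops(cores: int, boost_ghz: float) -> float:
    return cores * boost_ghz * 2 / 1000  # GFLOPS -> TFLOPS

print(f"RTX 3090 Ti (Ampere): {fp32_tflops(10752, 1.86):.1f} TFLOPS")  # ~40.0
print(f"RTX 4090 (Ada):       {fp32_tflops(16384, 2.52):.1f} TFLOPS")  # ~82.6
```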
 
From what I'm gathering, you're going to need a much larger PSU than 1000 watts for this card! JayzTwoCents runs it at approximately 1600 watts.
JayzTwoCents says a lot of factually incorrect things and gives some bad advice. He also gets free hardware like most people testing this stuff, which means he can run whatever is "best" and not worry about cost.

If it's 450W TBP (Total Board Power) for the RTX 4090, add in 250W for the CPU, motherboard, and other stuff, and you're at 700W peak power draw.
Most PSUs run at optimal efficiency between 50% and 70% load, so worst case that's 1400W, yes, but 1000W should also be perfectly fine.
And if you have a higher quality 80 Plus Platinum PSU, you could even run an 850W model and be okay.

Now if you want a custom overclocked model with a 600W TBP, then yes, go for a 1500-1600W PSU. Give yourself some headroom. 1300W would still suffice as well, but it's probably easier to just find a 1500/1600W model.
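
To put numbers on that sizing rule, here's a minimal sketch of the 50-70% sweet-spot math, assuming the figures above (the GPU's TBP plus roughly 250W for the rest of the system):

```python
# Sketch of the PSU sizing logic above: keep peak system draw inside the PSU's
# 50-70% efficiency sweet spot. Wattage inputs are the assumptions from this post.

def psu_range_w(gpu_tbp_w: float, rest_of_system_w: float = 250.0) -> tuple[float, float]:
    """Return (low, high) PSU wattage, putting peak draw at 70% and 50% load."""
    peak_w = gpu_tbp_w + rest_of_system_w
    return peak_w / 0.70, peak_w / 0.50

for tbp in (450, 600):  # stock RTX 4090 vs. a 600W overclocked model
    low, high = psu_range_w(tbp)
    print(f"{tbp}W TBP: {low:.0f}W to {high:.0f}W PSU")  # 450W -> 1000-1400W
```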

Nvidia is also talking about better transients with RTX 4090 vs. RTX 3090 Ti, which means if you have a PSU that can handle a 3090 Ti, it should definitely handle a 4090. It will put less of a strain on the PSU because it won't spike or dip as much.
 

Dean0919

Honorable
Oct 25, 2017
If they keep increasing prices for video cards, the number of buyers will keep decreasing, because not everyone can afford $1,000 to $1,500 for a video card. For some people that's a whole month's salary, and in poorer countries they don't even make that much. It's pretty sad that the company doesn't care about lower-end gamers anymore. Personally I'm stuck with a GTX 1070, and I can't afford an upgrade, sadly.
 
If it is a true 8nm to 4nm move and the die size is roughly the same, then you could have close to 16x the transistors. Yet only a 2x improvement? Something's not passing the sniff test. Not that I'm doubting the die size, but that is a horrible decrease in efficiency. Something is wrong.
The actual numbers are largely meaningless these days. But we don't scale in three dimensions, just two, so a 50% linear shrink is potentially 4X the density, and you need more spacing (proportionally) with smaller nodes. In actuality, GA102 was 45.04 MTrans/mm^2, and AD102 is 125.41 MTrans/mm^2. So density is 2.78X higher, and that's quite impressive.
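
As a quick sanity check on that math (a sketch; the per-chip density figures are the ones quoted above):

```python
# Shrinks scale in two dimensions, not three, and real layouts need proportionally
# more spacing at smaller nodes, so the naive ratio is only an upper bound.
# MTrans/mm^2 figures are the ones quoted in the post.

ga102_density = 45.04    # GA102 (Samsung 8N), million transistors per mm^2
ad102_density = 125.41   # AD102 (TSMC 4N)

naive_2d_limit = (1 / 0.5) ** 2              # a "50% linear shrink" caps out at 4x
actual_gain = ad102_density / ga102_density  # ~2.78x

print(f"Naive 2D upper bound: {naive_2d_limit:.0f}x")
print(f"Actual GA102 -> AD102: {actual_gain:.2f}x")
```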
 

sherhi

Distinguished
Apr 17, 2015
They'll just buy used or lower tier cards.
And if it gets too extreme, I could see consoles becoming the answer to the high price of gaming. (I'm PC master race, but there IS a limit.)

Well, I reached my limit already. Nvidia decided that the average GPU price = new console price, so mathematically it means that any solid lower/mid tier card (3060/3060 Ti) will cost around 500€ (and in the EU it does). My options are:
  1. replace my PC for 1500-2000€, or
  2. buy a 500€ console for my 4K TV, or
  3. buy a 500€ GPU to play on my 1080p monitor.

I am not the only one facing this dilemma; the Steam hardware survey says most people have older 1080p systems...
 

King_V

Illustrious
Ambassador
AMD has had a better price/performance ratio many times and still people buy Nvidia, so Nvidia doesn't have to worry about people switching to AMD... They still buy Nvidia no matter what. Same thing as Apple vs Android phones. The price means nothing to some people. And it seems that that group is the big majority! About 80% of the customers.



I mean, there is still a lot of misinformation out there.

I gave some advice to someone just the other day... They wanted to get a 3440x1440 GSync monitor and were asking for a recommendation.

I explained that GSync would lock them in to Nvidia-only GPUs, and that FreeSync monitors (designated "GSync compatible" if you want to be super-cautious) are the way to go, letting them have adaptive sync whether they went Nvidia or AMD with a future GPU.

They replied that they had thought they NEEDED a GSync monitor for compatibility purposes, and were relieved to learn that this wasn't the case.

So, whether it's brand loyalty, or people pushing incorrect info, there's still that "you HAVE to get Nvidia" sort of mentality lingering.

Nvidia, I have little doubt, knows this, and, I also have little doubt, shamelessly takes advantage of it.

I miss the days when their behavior was that of the little guy.
 
Nvidia is also talking about better transients with RTX 4090 vs. RTX 3090 Ti, which means if you have a PSU that can handle a 3090 Ti, it should definitely handle a 4090. It will put less of a strain on the PSU because it won't spike or dip as much.
Highlight mine, and an important nitpick that is worth mentioning over and over: they mentioned the FE card has better power delivery, not the overall 4090 design(s) they've shared with the partners. As always, I'd love to be wrong on such matters, but I doubt it.

As for the overall statement, I'll agree with it initially, since it appears the 3090 Ti was more of a test bed for what insane power envelope they and the AIBs could fit into a card.

Regards.
 
If they keep increasing prices for video cards, the number of buyers will keep decreasing, because not everyone can afford $1,000 to $1,500 for a video card. For some people that's a whole month's salary, and in poorer countries they don't even make that much. It's pretty sad that the company doesn't care about lower-end gamers anymore. Personally I'm stuck with a GTX 1070, and I can't afford an upgrade, sadly.
Who or what is forcing people to upgrade?
 

Tac 25

Estimable
Jul 25, 2021
So, whether it's brand loyalty, or people pushing incorrect info, there's still that "you HAVE to get Nvidia" sort of mentality lingering.

Nvidia, I have little doubt, knows this, and, I also have little doubt, shamelessly takes advantage of it.

I miss the days when their behavior was that of the little guy.

Not brand loyalty for me. It's the terrible review at gpubench that made me choose Nvidia. The Fighter 6600 XT and Asus RTX 3050 were both priced at around 18,500 pesos (approx. 320 USD) brand new with warranty in the local store I went to a few months ago. I purchased the RTX 3050 without a second thought, even though I knew the 6600 XT is the stronger GPU.

 

Chung Leong

Reputable
Dec 6, 2019
Semiconductor manufacturing is very energy-intensive, and the cost of electricity is skyrocketing everywhere. The new chips are made using EUV lithography, which requires 10 times more energy than previous-generation tech.
 

Phaaze88

Titan
Ambassador
Not brand loyalty for me. It's the terrible review at gpubench that made me choose Nvidia. The Fighter 6600 XT and Asus RTX 3050 were both priced at around 18,500 pesos (approx. 320 USD) brand new with warranty in the local store I went to a few months ago. I purchased the RTX 3050 without a second thought, even though I knew the 6600 XT is the stronger GPU.

Holy-!
You used UserBenchmark to justify that purchase...
That site is biased against AMD's CPUs and GPUs...
 

SunMaster

Commendable
Apr 19, 2022
Not brand loyalty for me. It's the terrible review at gpubench that made me choose Nvidia. The Fighter 6600 XT and Asus RTX 3050 were both priced at around 18,500 pesos (approx. 320 USD) brand new with warranty in the local store I went to a few months ago. I purchased the RTX 3050 without a second thought, even though I knew the 6600 XT is the stronger GPU.

Nobody ever got fired for choosing Microsoft.... or Intel.... or nVidia. Unfortunately. At least it was your own money :)
 

Tac 25

Estimable
Jul 25, 2021
Holy-!
You used UserBenchmark to justify that purchase...
That site is biased against AMD's CPUs and GPUs...

Yeah, their black propaganda worked. I'm now brainwashed.

I'm using the 3050 right now. It runs all my PlayStation 4 and PlayStation 5 games beautifully with no lag, so it feels like the money went to good use. The 3050 even runs Dead or Alive VI at max graphics easily, something that the 1050 Ti in my backup PC cannot do.

If I had purchased the 6600 XT, who knows what evil things might have happened to my rig - just joking with that. lol
 

Phaaze88

Titan
Ambassador
Yeah, their black propaganda worked. I'm now brainwashed.

I'm using the 3050 right now. It runs all my PlayStation 4 and PlayStation 5 games beautifully with no lag, so it feels like the money went to good use. The 3050 even runs Dead or Alive VI at max graphics easily, something that the 1050 Ti in my backup PC cannot do.

If I had purchased the 6600 XT, who knows what evil things might have happened to my rig - just joking with that. lol
Every now and then someone uses links from cpu/gpubench not realizing that it's just UBM.

Well, what's done is done... enjoy it.
 

chalabam

Distinguished
Sep 14, 2015
I have two broken Nvidia cards. They failed suddenly.

The frequency at which they fail makes spending $1,000 on a single card even more expensive. And a card that consumes so much power is even riskier.
 
And how AMD's RDNA3 chiplets could seriously undercut Nvidia
AMD likely could undercut Nvidia, but I'm not sure they will, at least not by much. Maybe the AMD of a few years ago would have, but lately they seem to be reducing costs, but not actually passing those savings on to consumers. If they have a card roughly as fast as the 4090, then it wouldn't be surprising to see them price it similarly. And if they have anything notably faster, it might even be priced higher. I would like to see them undercut Nvidia by a decent amount, but I have some doubts that's going to happen, considering the not-so-competitive pricing of their new Ryzen lineup.

A decade ago, the GTX Titan released at $1,000 MSRP.

Today, the RTX 4080 releases at $1,200.

Bringing multiple times the performance for a 20% higher price.
That's not a great example though, seeing as the Titan was generally considered a terrible value at the time, and was barely any faster than the GTX 780 that launched around the same time. Most sites didn't even bother reviewing Titan cards, and Nvidia generally didn't send out review samples. Of course, the 3090 was much the same relative to the 3080, or at least would have been if it weren't for crypto-shortages pushing the 3080's pricing up to a similar level.

And while the Titan was the top-end consumer card, meant for those willing to spend an exorbitant amount to get leading performance even if it only meant a few extra frames, that doesn't apply to the 4080 16GB, which is going to be well behind the 4090 in terms of performance. Though even there, the 4090 doesn't seem to be as far ahead of the 4080 cards as it probably should be, considering it has nearly double the hardware resources. The scaling seems really bad in Nvidia's charts, perhaps due to power and heat limitations.

And the 4080 12GB and 4080 16GB are weird. They are totally different cards, but given the same name. Not only do they have different VRAM amounts, but they use entirely different graphics chips. The 16GB card has 27% more shader cores, 33% more cache, 40% more TMUs, and 46% higher memory bandwidth. The "4080 12GB" was probably meant to be more of a "4070", but the marketing department likely decided that people would be up in arms if the "x80" card cost $1200 and the "x70" cost $900, so they did some shenanigans with model names once again to make the price hikes not look quite as bad. So now we have two different cards with the same name, and people will see reviews about how fast the 4080 16GB is, and buy the 12GB model not realizing it's actually a different card at a lower performance level.
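
If you want to sanity-check that bandwidth gap, it falls straight out of bus width × per-pin data rate. A minimal sketch, using assumed GDDR6X data rates that reproduce the ~46% figure quoted above (treat the exact Gbps values as assumptions, not final shipping specs):

```python
# Peak memory bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps).
# Data rates below are assumptions chosen to match the ~46% gap quoted above.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

bw_16gb = bandwidth_gbs(256, 23.0)  # "4080 16GB" (AD103): 256-bit bus
bw_12gb = bandwidth_gbs(192, 21.0)  # "4080 12GB" (AD104): 192-bit bus

print(f"16GB model: {bw_16gb:.0f} GB/s, 12GB model: {bw_12gb:.0f} GB/s")  # 736 vs. 504
print(f"Advantage: {bw_16gb / bw_12gb - 1:.0%}")                          # ~46%
```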

People need to remember the Radeon HD 4870.

I wish AMD went back to that time in how they did things instead of trying to blow up their cards for a few extra frames. Same with nVidia, mind you.
That was more ATI though. AMD had already bought the company prior to the launch of that series, but had owned it for less than two years at that point, and the companies were likely still operating mostly separately from one another. The cards were still branded as "ATI" cards, and no doubt were designed by the same crew that had been designing ATI's prior cards up to that point. By now, the Radeon division has undoubtedly been integrated more into AMD, so it's arguably a different company.

Not brand loyalty for me. It's the terrible review at gpubench that made me choose Nvidia. The Fighter 6600 XT and Asus RTX 3050 were both priced at around 18,500 pesos (approx. 320 USD) brand new with warranty in the local store I went to a few months ago. I purchased the RTX 3050 without a second thought, even though I knew the 6600 XT is the stronger GPU.
Userbench is so bad. Their writeups for hardware can hardly be considered "reviews"; they're more like the factually incorrect rantings of a 12-year-old fanboy in some tech forum. :P And their benchmarks are synthetic: often okay for getting a rough idea of the relative performance of hardware released years apart, and generally decent at comparing cards using identical architectures, but often inaccurate when comparing different architectures. And they clearly have an agenda against AMD, perhaps for stock manipulation, if not getting paid off directly by AMD's competitors. When comparing cards currently on the market, it's a much better idea to look at actual reviews that test performance in real games and make at least some attempt to be objective with their recommendations.
 

Dean0919

Honorable
Oct 25, 2017
Who or what is forcing people to upgrade?
No one is forcing people to upgrade, but when your card gets so old that it struggles in most games, it means it's time to upgrade. And when a card costs two or three times more than it used to in the past, you're basically stuck with your current card, because Jensen Huang, or whatever his name is, decided to become richer and doesn't give a rat's ass about middle or low class people. You understand it now?
 

Deleted member 14196

Guest
No one is forcing people to upgrade, but when your card gets so old that it struggles in most games, it means it's time to upgrade. And when a card costs two or three times more than it used to in the past, you're basically stuck with your current card, because Jensen Huang, or whatever his name is, decided to become richer and doesn't give a rat's ass about middle or low class people. You understand it now?
Yes, the only reason for this is pure greed, and I don't understand why people keep giving them their money.
 