Rumors are that Nvidia has cooked up a new piece of silicon for the GeForce RTX 4080.
Nvidia Prepping RTX 4080 With New Silicon, Report Claims : Read more
> Despite the steep price tag, the GeForce RTX 4080 is still one of the best-selling graphics cards on Newegg.

Best seller? Flying off the shelves?

Umm... they are literally gathering dust on the shelves and selling very poorly. Where do you guys get this stuff?
> Based on my previous analysis, the RTX 4080 should have a list price of about $1049, to be in line with how the RTX 4070 Ti and RTX 4090 are priced. I would like it to be much cheaper than that, of course, as that's still much more than I'd spend on a GPU for mere personal use.

Why? The 4090 was $100 above the 3090. The 3080 was $699, so the proper price for the 4080 would be $799. And the 4070 Ti was the last one out from Nvidia and should be priced... $699. You're working it out in the wrong order. The latest released graphics cards shouldn't determine the price points for earlier released SKUs. That's starting from the wrong end.
> Why?

Based on $ per mm^2 and $ per GB. The RTX 4080 is an outlier in both dimensions:
GPU | Die Area (mm^2) | Memory (GB) | MSRP | $/mm^2 | $/GB |
---|---|---|---|---|---|
RTX 4070 Ti | 294.5 | 12 | $799 | $2.71 | $66.58 |
RTX 4080 | 378.6 | 16 | $1,199 | $3.17 | $74.94 |
RTX 4090 | 608.5 | 24 | $1,599 | $2.63 | $66.63 |
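For anyone who wants to check the math, here's a quick Python sketch that reproduces the $/mm^2 and $/GB columns from the table and then backs into a "family-aligned" 4080 price. The averaging step at the end is my own guess at the method, not necessarily how the ~$1049 figure was derived:

```python
# Reproduce the $/mm^2 and $/GB columns from the table above.
cards = {
    "RTX 4070 Ti": (294.5, 12, 799),   # (die area mm^2, memory GB, MSRP $)
    "RTX 4080":    (378.6, 16, 1199),
    "RTX 4090":    (608.5, 24, 1599),
}
for name, (area, mem, msrp) in cards.items():
    print(f"{name}: ${msrp / area:.2f}/mm^2, ${msrp / mem:.2f}/GB")

# Back-of-envelope: price the 4080 at its siblings' average rates.
# (A guess at the method -- the post only says "about $1049".)
per_mm2 = (799 / 294.5 + 1599 / 608.5) / 2   # ~$2.67 per mm^2
per_gb  = (799 / 12 + 1599 / 24) / 2         # ~$66.60 per GB
print(f"by die area: ${378.6 * per_mm2:.0f}")  # ~$1011
print(f"by memory:   ${16 * per_gb:.0f}")      # ~$1066
```

Both estimates bracket the ~$1049 figure, so the "outlier within its own family" claim holds up either way.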
> Based on $ per mm^2 and $ per GB. The RTX 4080 is an outlier in both dimensions:
>
> GPU | Die Area (mm^2) | Memory (GB) | MSRP | $/mm^2 | $/GB |
> ---|---|---|---|---|---|
> RTX 4070 Ti | 294.5 | 12 | $799 | $2.71 | $66.58 |
> RTX 4080 | 378.6 | 16 | $1,199 | $3.17 | $74.94 |
> RTX 4090 | 608.5 | 24 | $1,599 | $2.63 | $66.63 |
>
> To better align with both costs and its relative performance, it should be about $1049.
>
> Note that I'm not arguing $1049 represents a reasonable value for consumers. Just that it's an outlier within its own family.

If you factor in die size, the 4080 is only around 60% of the size of the 4090, and it has less VRAM. Then you have the calculus for the 3080: a minor difference in die size vs. the 3090, but a hell of a lot cheaper. You can't always calculate this way. And neither did Nvidia. They defended the much higher price, citing the increased performance.
> You can't always calculate this way. And neither did Nvidia. They defended the much higher price, citing the increased performance.

Their costs correlate pretty closely with die size and GDDR memory. So does performance, in fact.
> Porsche pricing strategy (constrain supply to enable massive margins) only works when there is an ocean of lower-priced car(d)s to support the market.

Unlike a couple years ago, there are tons of lower-priced cards readily available.
> Now Nvidia should focus on getting next gen cards into the hands of PC gamers,

The die is cast, so to speak. I think their upper-end designs are simply too large and costly to make at the prices people want to pay, which essentially seems to be their old price scale from a couple generations ago. Pay close attention to the amount of L2 cache they used in this generation, compared with all prior generations - that chews up a lot of die space, especially in newer process nodes.
> Based on $ per mm^2 and $ per GB. The RTX 4080 is an outlier in both dimensions:
>
> GPU | Die Area (mm^2) | Memory (GB) | MSRP | $/mm^2 | $/GB |
> ---|---|---|---|---|---|
> RTX 4070 Ti | 294.5 | 12 | $799 | $2.71 | $66.58 |
> RTX 4080 | 378.6 | 16 | $1,199 | $3.17 | $74.94 |
> RTX 4090 | 608.5 | 24 | $1,599 | $2.63 | $66.63 |
>
> To better align with both costs and its relative performance, it should be about $1049.
>
> Note that I'm not arguing $1049 represents a reasonable value for consumers. Just that it's an outlier within its own family.

> Case in point: the 1080 was ~$500. The 4080's "MSRP" is ~240% the price in 3 generations.

Thanks for confirming my earlier point about price expectations.
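The ~240% figure checks out. A quick sanity check, taking the GTX 1080 at ~$500 (as quoted above) and the RTX 4080 at its $1,199 MSRP:

```python
# GTX 1080 (~$500, per the comment above) vs. RTX 4080 ($1,199 MSRP),
# three generations apart (10-series -> 20 -> 30 -> 40).
old, new, generations = 500, 1199, 3
print(f"{new / old:.0%} of the 1080's price")                 # ~240%
print(f"~{(new / old) ** (1 / generations) - 1:.0%} per gen") # ~34%
```

That works out to roughly a 34% price hike per generation, compounded.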
It's just greed. AMD is no better. And they are paying for it.
> I see Nvidia's consumer net profits plummeting in 6 months. They are running out of suckers who have the cash.

The RTX 4070 Ti reportedly outperforms the RTX 3090. We have yet to see how their lower-end 4000-series GPUs perform, but they could represent a compelling value at more affordable prices. Waiting is the part that sucks.
We here in Portland, OR have two Best Buys, and the wire cages are stocked full of 4080s. I counted about 30 4080s overall sitting in the cages at close to MSRP. They simply are not moving off the shelves, even with Oregon having no state sales tax! For most buyers, the two latest AMD GPUs remain a dirty word, and of course they are available in big numbers as well, right next to the 4080s. The masses, or at least the enthusiasts, still simply want 4090s at MSRP, and 'partner' cards always go first. I was told that they hadn't had any 4090s in stock for any length of time during the past 4-6 months. The only 4090s which were available were 'open box' items, which took just days to sell instead of hours. It's a crazy world out there, with cash supposedly being in short supply.
> Based on $ per mm^2 and $ per GB. The RTX 4080 is an outlier in both dimensions:
>
> GPU | Die Area (mm^2) | Memory (GB) | MSRP | $/mm^2 | $/GB |
> ---|---|---|---|---|---|
> RTX 4070 Ti | 294.5 | 12 | $799 | $2.71 | $66.58 |
> RTX 4080 | 378.6 | 16 | $1,199 | $3.17 | $74.94 |
> RTX 4090 | 608.5 | 24 | $1,599 | $2.63 | $66.63 |
>
> To better align with both costs and its relative performance, it should be about $1049.
>
> Note that I'm not arguing $1049 represents a reasonable value for consumers. Just that it's an outlier within its own family.

The thing is, the "4080 12GB" was going to be $899, and that comes out to $3.05 per mm² and $74.92 per GB; much closer to the current 4080's pricing than the 4090's in terms of value. The 4070 Ti being a decent value (relatively) is purely due to backlash. The 4080 16 GB was already launched, so Nvidia changing the price was just not going to happen. The new AD103 die will probably be part of a second wave of 4080s launching at a reduced price. The die itself isn't that much cheaper, but if it needs a new PCB, it'll probably go into simpler designs with smaller coolers and maybe a $1099 MSRP.
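The $3.05/mm² and $74.92/GB figures for the cancelled "4080 12GB" follow directly from its announced $899 MSRP on the same 294.5 mm² die and 12 GB as the 4070 Ti:

```python
# Cancelled "RTX 4080 12GB": same AD104 die area and memory as the
# 4070 Ti, but at the originally announced $899 MSRP.
msrp, area_mm2, mem_gb = 899, 294.5, 12
print(f"${msrp / area_mm2:.2f}/mm^2")  # $3.05
print(f"${msrp / mem_gb:.2f}/GB")      # $74.92
```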
What I don't understand is how you reach the conclusion that it's "just greed".
> Moore's law says 2x transistor density in 18 months. I know that is no longer feasible. But transistor density is largely what determines performance and cost per transistor. With each generation, the cost of each transistor should go down.
>
> That said, yes, wafer costs are going up. However, this merely slows down the price decrease per transistor per generation.
>
> Looking at transistor count and cost per transistor based on wafer node, the margins Nvidia and AMD are pulling are insane compared to what they used to be.

Determining what a graphics card should cost based on the price of a transistor is about as bad a method as there is. The actual cost of making a GPU is a smaller percentage of the overall cost of developing and selling a graphics card each generation. Nvidia's R&D costs have increased every quarter going back to 2016. They spent about $6.9 billion over the past year. Just 4 years ago, or 2 GPU generations, they had spent about $2.25 billion the previous year. That's more than triple the expenditure in 4 years. Where in your cost-per-transistor calculation is that cost increase reflected? How much do you think it costs Nvidia to pay their increasingly large engineering staff? I bet that cost isn't decreasing. Is their marketing budget stagnant year after year? Probably not. The real-world costs of developing high-tech products today cannot be simplified down to one basic metric like cost per transistor.
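To make the quoted cost-per-transistor argument concrete, here's a toy model. Every number in it is hypothetical, chosen only to illustrate the claim that rising wafer prices slow, but don't reverse, the per-transistor cost decline:

```python
# Toy model only -- wafer costs and densities below are hypothetical.
def cost_per_transistor(wafer_cost, transistors_per_wafer):
    return wafer_cost / transistors_per_wafer

old_node = cost_per_transistor(10_000, 1.0e14)  # hypothetical mature node
new_node = cost_per_transistor(16_000, 2.0e14)  # 60% pricier wafer, 2x density
print(f"new node: {new_node / old_node:.0%} of old cost per transistor")  # 80%
```

The reply's counterpoint is that die cost is only one slice of the total, so this metric alone can't set the "right" price.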
> Determining what a graphics card should cost based on the price of a transistor is about as bad a method as there is. The actual cost of making a GPU is a smaller percentage of the overall cost of developing and selling a graphics card each generation. Nvidia's R&D costs have increased every quarter going back to 2016. They spent about $6.9 billion over the past year. Just 4 years ago, or 2 GPU generations, they had spent about $2.25 billion the previous year. That's more than triple the expenditure in 4 years. Where in your cost-per-transistor calculation is that cost increase reflected? How much do you think it costs Nvidia to pay their increasingly large engineering staff? I bet that cost isn't decreasing. Is their marketing budget stagnant year after year? Probably not. The real-world costs of developing high-tech products today cannot be simplified down to one basic metric like cost per transistor.

I agree with you. I couldn't care less about the uber-high end; I'm more of a one-or-two-steps-down-the-product-stack buyer, but not above $750 for the #2 card. Six years back, $750 would have gotten you the #1 card, and $550 the #2.
The last time Nvidia sold a halo single GPU gaming card under $1000 was 2012 with the 600 series.
> The thing is, the "4080 12GB" was going to be $899, and that comes out to $3.05 per mm² and $74.92 per GB; much closer to the current 4080's pricing than the 4090's in terms of value.

The RTX 4070 Ti is the RTX 4080 12 GB. They simply renamed it!

> The 4070 Ti being a decent value (relatively) is purely due to backlash.

Whatever the reason, its pricing is in line with the RTX 4090, which makes the RTX 4080 the outlier.

> The 4080 16 GB was already launched, so Nvidia changing the price was just not going to happen.

Price changes happen all the time, though I think it is harder for Nvidia to reprice something that's already on the market and in the channel. They're probably hoping some of that inventory will burn off before they take the financial hit of doing it. Perhaps they're even waiting so that the financial impact falls in a different quarter of their financial year.

> The new AD103 die will probably be part of a second wave of 4080s launching at a reduced price. The die itself isn't that much cheaper, but if it needs a new PCB, it'll probably go into simpler designs with smaller coolers and maybe a $1099 MSRP.

Let's hope. It would be progress.