Nvidia has a list of RTX 4070 cards going on sale tomorrow, priced from $599 to $699. There are a lot of $599 models, and plenty more that are within $30 of that mark.

Most cards are not going to be $599; most will cost more.
RTA. There are screenshots on the OC and test setup page, showing HWiNFO64. I think the FE maxed out at under 70C.

What are the VRAM temps like on the 4000 series? I assume the GDDR6X has sensors like the 3000 series did.
Is this also why the 4080 (the new one, not the current 4070 Ti, which Nvidia wanted to sell for $900 as the 12GB 4080) jumped in price compared to the 3080? Did the cost of making it suddenly double?! (From the $700 3080 to the $1200 16GB 4080, or $900 for the 12GB 4080.) This is not simply manufacturing cost at work; it's more of an "RT performance tax." Even if we assume it were true, Nvidia is holding the technology back for only its highest-grade cards and giving us meager lower-tier mainstream cards. (All previous 70-series cards enjoyed that generation's improvements at a lower cost than before; the $350 GTX 970, for example, overpowered the $700 GTX 780 Ti while drawing less power. Ever since RT was introduced with the Turing series, the 70-series and below have been deprived of their true potential.) This widening performance gap is pushing PC gaming toward being a rich man's hobby.

You say that based on what exactly? TSMC 4N at 294.5mm^2 versus Samsung 8N at 392.5mm^2, 50% more memory with significantly higher clocks, R&D costs... Given most people say TSMC N4 costs at least twice as much per mm^2 as Samsung 8N, that basically means the AD104 is equivalent to a ~600mm^2 8N part. So by that metric, the card probably costs almost the same to make as the RTX 3080. Well, maybe. PCB costs went down (192-bit interface), and a 200W TGP means it's cheaper to make than a 320W TGP board. (A rough version of this die-cost arithmetic is sketched after this comment.)
But my point is that I would not confidently assert that "it almost certainly costs less than a 3070 to make." On the contrary, given we know that TSMC had a ton of demand for 5nm-class manufacturing last year and a lot of companies had to "overpay" for capacity, I'd wager it's more likely that it costs slightly more to make than its predecessor. Then factor in inflation plus economic conditions and $599 feels about right. I don't love the price, but I can't really get angry over it for what is mostly a luxury item.
I know in the past year or so, my monthly food and gas expenses (for the family) easily increased by 25%. I have it in my budget spreadsheet. So by that metric $599 almost feels like a good deal.
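A rough Python sketch of the die-cost arithmetic above, taking the cited "at least twice as much per mm^2" ratio at face value (actual wafer prices aren't public, so the 2.0 multiplier is an assumption, and the GA102 size is included purely for comparison):

# Back-of-the-envelope die-cost comparison, assuming TSMC 4N costs ~2x per mm^2 vs Samsung 8N.
AD104_MM2 = 294.5          # TSMC 4N die used by the RTX 4070
GA102_MM2 = 628.4          # Samsung 8N die used by the RTX 3080
N4_VS_8N_COST_RATIO = 2.0  # assumed per-mm^2 cost multiplier (not a published figure)

equivalent_8n_area = AD104_MM2 * N4_VS_8N_COST_RATIO
print(f"AD104 is cost-equivalent to a ~{equivalent_8n_area:.0f} mm^2 8N die")
print(f"GA102 (RTX 3080) is {GA102_MM2} mm^2, so the silicon bill lands in the same ballpark")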
Obviously Nvidia is charging as much as it feels it can get away with, and I think it was hoping demand would be higher on the rest of the 40-series after the 4090 sold out for several months. The thing is, the 4090 is an absolute monster at "everything," where the other 40-series are whittling away the core performance. You want great AI? 4090. You want great rasterization? 4090. You want great RT? 4090. You want a prosumer card for video editing or other professional tasks? 4090.

First, thank you for your articles (I've been enjoying reading them for years now; thank you very much and God bless you).
Is this also why the 4080 (the new one, not the current 4070 Ti, which Nvidia wanted to sell for $900 as the 12GB 4080) jumped in price compared to the 3080? Did the cost of making it suddenly double?! (From the $700 3080 to the $1200 16GB 4080, or $900 for the 12GB 4080.) This is not simply manufacturing cost at work; it's more of an "RT performance tax." Even if we assume it were true, Nvidia is holding the technology back for only its highest-grade cards and giving us meager lower-tier mainstream cards. (All previous 70-series cards enjoyed that generation's improvements at a lower cost than before; the $350 GTX 970, for example, overpowered the $700 GTX 780 Ti while drawing less power. Ever since RT was introduced with the Turing series, the 70-series and below have been deprived of their true potential.) This widening performance gap is pushing PC gaming toward being a rich man's hobby.
Also, we have the TSMC 6N Arc A770 with a 406 mm² die costing $329! Yet more reason to think Nvidia's pricing is more about greed than reality.
The launch price of the 3090 was also $1500, yet the 4090 is only $100 more ($1600).
Awww, they deserve a cookie from Nvidia!!

TechPowerUp gave it a 5-star review.
That's why I didn't consider anything but the 4090 when upgrading from the 3090. I was sold once I saw the reviews showing it's a massive improvement over the 3090... which it is. No other card has 24GB either so it was an easy decision despite the cost.
What's funny is I'd be willing to bet the same people who complain about GPU pricing are also the same people buying $1400 iPhones from Apple... as if those aren't equally price inflated.
Of course you think this price is great. After all, your PC costs over $10,000!
We peasants don't have the luxury of spending that much unless we sell a kidney or 2.
I actually don't think the price is "great"... I'd much rather it be $699 like my 1080 Ti cost back in 2017.
As for my $10,000 PC... yes... that's what I spent... but you also have to look at the stuff I bought... OLED display, professional 3D printer... Flight Simulator hardware... color laser printer... 2 different VR headsets... none of which is really required for a gaming PC, and without all that my PC cost is cut in half.
$4-5k is the norm for a top-of-the-line gaming PC, and that's where I'd be without all the extras.
Completely agree... Sure, you might get away with 12GB at 1080p, but at 1440p, this is now an entry-level card.

If there were no previous-gen high-end GPUs available, this would be a good buy, but at this price you can just buy a new AIB 6950 XT.
I can't see a $699 price ever coming back for a top mainstream GPU from Nvidia without a demand crash. For one, $699 isn't even $699 anymore; $699 in March 2017 dollars is about $865 in 2023 dollars.
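That inflation figure is just a consumer price index ratio; a minimal sketch, using approximate US CPI-U values (the exact result shifts slightly depending on which months you pick):

# Inflation adjustment via CPI ratio: price_then * (CPI_now / CPI_then).
CPI_MARCH_2017 = 243.8  # approximate US CPI-U, March 2017
CPI_EARLY_2023 = 301.8  # approximate US CPI-U, early 2023

price_2017 = 699
price_in_2023_dollars = price_2017 * (CPI_EARLY_2023 / CPI_MARCH_2017)
print(f"${price_2017} in March 2017 is about ${price_in_2023_dollars:.0f} in 2023 dollars")
# -> roughly $865, matching the figure above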
I'm tired of the inflation argument. In 2017, an 8-core Zen launched at $499. In 2022, the 5700X launched at $299.
You can be tired of it all you want, but the fact is that the value of money has changed. $1 in 2023 buys a lot less of everything, from goods to services to employees to raw materials to housing, than it did in 2017. $500 in 2023 is literally cheaper than $500 in 2017 was.
If we accept the argument that AMD released the 5700X at a relative bargain out of the goodness of their hearts -- highly debatable, since the 5700X wasn't remotely a "new" product in any meaningful sense -- then the fact that inflation happened would just make AMD look more generous in this case.
The 4060 at 8GB and 4060 Ti at 12GB would be a possibility, though.

Obviously Nvidia is charging as much as it feels it can get away with, and I think it was hoping demand would be higher on the rest of the 40-series after the 4090 sold out for several months. The thing is, the 4090 is an absolute monster at "everything," where the other 40-series are whittling away the core performance. You want great AI? 4090. You want great rasterization? 4090. You want great RT? 4090. You want a prosumer card for video editing or other professional tasks? 4090.
The step down to the 4080 is basically an equal drop in price and performance, but you lose VRAM and it's no longer the halo part. 3080 Ti would have been an $800-ish part had crypto not happened, but then post-crypto Nvidia decided maybe $1200 was a good price for the second tier 40-series. The 4070 Ti followed that pattern at $800 (formerly $900 as the 4080 12GB). The 4070 finally gets us back to mostly equal footing with the 30-series. I mean, 3070 was $500 and 3070 Ti was $600, but a $100 upcharge isn't the end of the world (and RTX 2070 FE was $600 at launch as well). Going from $700 3080 to $1200 4080 was a different story, and pretending the 4070 Ti was a 3080 replacement was equally disingenuous (plus it was a $100 upcharge still).
I still worry about where Nvidia is going to go with 4060 and 4050. Like, 8GB really isn't going to fly in 2023. We had 1070 with 8GB in 2016 for $380! But I don't see how Nvidia can do anything other than 8GB on the 4060 and 4050. Which means, for all intents and purposes, the "good" or at least truly desirable 40-series parts bottom out at the 4070, perhaps a 4060 Ti but I suspect not. 4070 I'm okay with. $200 extra for the 4070 Ti? Not so great. $400 extra to go from 4070 Ti to 4080? Definitely greedy. $400 more to go from 4080 to the 4090 halo? Eh, it's the halo part, so sure.
Maybe if 4050 costs $250 (for real), 8GB is justifiable. 4060 with 8GB would also need to be $300 at most, but with 12GB I'd be okay with up to $400.
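For readers keeping score, here is a quick tally of the rounded launch MSRPs cited in this thread, with the generational upcharge at each tier (the 4070 Ti entry uses its $800 launch price, after the "$900 4080 12GB" rename):

# Rounded launch MSRPs (USD) as cited in the discussion, 30-series vs. 40-series.
launch_msrp = {
    "x070":    (500, 600),   # RTX 3070 -> RTX 4070
    "x070 Ti": (600, 800),   # RTX 3070 Ti -> RTX 4070 Ti
    "x080":    (700, 1200),  # RTX 3080 -> RTX 4080 16GB
    "x090":    (1500, 1600), # RTX 3090 -> RTX 4090
}

for tier, (ampere, ada) in launch_msrp.items():
    print(f"{tier}: ${ampere} -> ${ada} (+${ada - ampere})")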
That's the issue, though: the 1060 3GB and 6GB were both still using a 192-bit interface, though the 6GB card had a few more SMs and shaders. The 3060 8GB and 12GB are entirely different beasts, and the former (8GB) was really only introduced late in the product cycle as a way to try and get more money out of a GA106 GPU. The RTX 3060 8GB ends up being barely faster than the RTX 3050 in many cases, but with a price that's often the same as the 3060 12GB.

And who knows, just as there were 3GB and 6GB variants of the venerable GTX 1060, maybe we'll see some sort of 8GB and 12GB variants of the 4060, at least from 3rd-party vendors.
I have always considered a price bracket of up to 350 USD as the upper limit of what could be considered "gaming mainstream." The non-gaming mainstream is actually what >90% of users get: an integrated GPU.

No, 600 dollars for a graphics card IS NOT MAINSTREAM. Why are all news outlets saying this as if it's the norm???? It's an extremely high price, regardless of how the market is.
Jarred, thanks a lot for this hugely informative reply! It's like an encyclopaedia of GPUs, but much more interesting and fun to read!

That's the issue, though: the 1060 3GB and 6GB were both still using a 192-bit interface, though the 6GB card had a few more SMs and shaders. The 3060 8GB and 12GB are entirely different beasts, and the former (8GB) was really only introduced late in the product cycle as a way to try and get more money out of a GA106 GPU. The RTX 3060 8GB ends up being barely faster than the RTX 3050 in many cases, but with a price that's often the same as the 3060 12GB.
If we get a 4060 8GB and a 4060 12GB, the latter will require an AD104 chip with a 192-bit memory interface, while the former could be an AD106 chip with a 128-bit interface. It could also be a severely cut-down AD104, just to get rid of "bad" chips that would otherwise have no use. But I'm really not expecting RTX 4060 12GB to be a thing. I think an RTX 4060 Ti with GDDR6 memory (instead of GDDR6X) could happen, but that's as far as I think Nvidia would go. Well, or it skips the RTX 4060 Ti and just does that GDDR6 part as the RTX 4060. 🤷
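The bus-width math behind those capacities is straightforward: each GDDR6/GDDR6X chip sits on a 32-bit channel and currently tops out at 2GB, so capacity follows directly from the interface width. A minimal sketch (assuming 2GB modules and no clamshell mounting, which would double the totals):

# VRAM capacity from memory bus width: one GDDR6/GDDR6X chip per 32-bit channel,
# 2GB per chip. Clamshell boards put two chips per channel and double these numbers.
GB_PER_CHIP = 2

def vram_capacity(bus_width_bits: int) -> int:
    chips = bus_width_bits // 32
    return chips * GB_PER_CHIP

for bus in (128, 192, 256):
    print(f"{bus}-bit bus -> {vram_capacity(bus)}GB")
# 128-bit -> 8GB (the hypothetical AD106-based 4060), 192-bit -> 12GB (AD104 / RTX 4070),
# 256-bit -> 16GB (RTX 4080)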