Nvidia GeForce RTX 4070 Review: Mainstream Ada Arrives

I'm a little bummed the 4070 Ti is only 12GB and not 16GB; Nvidia is stingy. However, I haven't had great luck with my last few AMD cards' drivers, one issue or another. I'm not brand loyal. The 7900 XTX would be great, but I don't trust the software... (I know tons of people have no issues) but that has not been my experience.
 
First, thank you for your articles (I've been enjoying reading them for years; thank you very much and God bless you).
You say that based on what, exactly? TSMC 4N at 294.5mm^2 versus Samsung 8N at 392.5mm^2, 50% more memory with significantly higher clocks, R&D costs... Given most people say TSMC N4 costs at least twice as much per mm^2 as Samsung 8N, that basically means the AD104 is equivalent to a ~600mm^2 8N part. So by that metric, the card probably costs almost the same to make as the RTX 3080. Well, maybe. PCB costs went down (192-bit interface), and a 200W TGP means it's cheaper to make than a 320W TGP board.
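To make that die-cost math concrete, here's a back-of-the-envelope sketch. The 2x per-mm^2 cost ratio is an assumption based on industry chatter, not a published foundry figure:

```python
# Back-of-the-envelope die cost comparison: AD104 (TSMC 4N) vs. GA104 (Samsung 8N).
# The 2x per-mm^2 cost ratio is an assumption, not a published foundry price.
ad104_area_mm2 = 294.5     # AD104 die size (RTX 4070 / 4070 Ti)
ga104_area_mm2 = 392.5     # GA104 die size (RTX 3070)
n4_vs_8n_cost_ratio = 2.0  # assumed cost per mm^2 of TSMC 4N relative to Samsung 8N

# Express AD104 as a cost-equivalent Samsung 8N die area.
equivalent_8n_area = ad104_area_mm2 * n4_vs_8n_cost_ratio
print(f"AD104 cost-equivalent 8N area: ~{equivalent_8n_area:.0f} mm^2")                  # ~589 mm^2
print(f"Relative silicon cost vs. GA104: ~{equivalent_8n_area / ga104_area_mm2:.1f}x")   # ~1.5x
```

That ~589mm^2 lands right next to the 628mm^2 GA102 in the RTX 3080, which is where the "costs almost the same to make as the RTX 3080" estimate comes from.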

But my point is that I would not confidently assert that "it almost certainly costs less than a 3070 to make." On the contrary, given we know that TSMC had a ton of demand for 5nm-class manufacturing last year and a lot of companies had to "overpay" for capacity, I'd wager it's more likely that it costs slightly more to make than its predecessor. Then factor in inflation plus economic conditions and $599 feels about right. I don't love the price, but I can't really get angry over it for what is mostly a luxury item.

I know in the past year or so, my monthly food and gas expenses (for the family) easily increased by 25%. I have it in my budget spreadsheet. So by that metric $599 almost feels like a good deal.
Is this also why the 4080 (the new one, not the current 4070 Ti, which Nvidia wanted to sell for $900 as the 12GB 4080) jumped in price compared to the 3080? Did the cost of making it suddenly double (from $700 for the 3080 to $1,200 for the 16GB 4080, or $900 for the 12GB 4080)? This is not simply manufacturing cost at work; it's more of an "RT performance tax." Even if we assume it were true, Nvidia is reserving the technology for only the highest-grade cards and giving us meager lower-tier mainstream cards. Every previous 70-series card enjoyed that generation's improvements at a lower cost than before: the $350 GTX 970, for example, overpowered the $700 GTX 780 Ti while drawing less power. Ever since RT was introduced with Turing, the 70-series and below have been deprived of their true potential. This widening performance gap is pushing PC gaming toward being a rich man's hobby.
Also, we have the TSMC N6 Arc A770, with a 406mm² die, costing $329! Yet more reason Nvidia's pricing is more greed than reality.
The 3090's launch price was also $1,500, yet the 4090 is only $100 more ($1,600).
 
First, thank you for your articles (I've been enjoying reading them for years; thank you very much and God bless you).

Is this also why the 4080 (the new one, not the current 4070 Ti, which Nvidia wanted to sell for $900 as the 12GB 4080) jumped in price compared to the 3080? Did the cost of making it suddenly double (from $700 for the 3080 to $1,200 for the 16GB 4080, or $900 for the 12GB 4080)? This is not simply manufacturing cost at work; it's more of an "RT performance tax." Even if we assume it were true, Nvidia is reserving the technology for only the highest-grade cards and giving us meager lower-tier mainstream cards. Every previous 70-series card enjoyed that generation's improvements at a lower cost than before: the $350 GTX 970, for example, overpowered the $700 GTX 780 Ti while drawing less power. Ever since RT was introduced with Turing, the 70-series and below have been deprived of their true potential. This widening performance gap is pushing PC gaming toward being a rich man's hobby.
Also, we have the TSMC N6 Arc A770, with a 406mm² die, costing $329! Yet more reason Nvidia's pricing is more greed than reality.
The 3090's launch price was also $1,500, yet the 4090 is only $100 more ($1,600).
Obviously Nvidia is charging as much as it feels it can get away with, and I think it was hoping demand would be higher on the rest of the 40-series after the 4090 sold out for several months. The thing is, the 4090 is an absolute monster at "everything" where the other 40-series are whittling away the core performance. You want great AI? 4090. You want great rasterization? 4090. You want great RT? 4090. You want a prosumer card for video editing or other professional tasks? 4090.

The step down to the 4080 is basically an equal drop in price and performance, but you lose VRAM and it's no longer the halo part. 3080 Ti would have been an $800-ish part had crypto not happened, but then post-crypto Nvidia decided maybe $1200 was a good price for the second tier 40-series. The 4070 Ti followed that pattern at $800 (formerly $900 as the 4080 12GB). The 4070 finally gets us back to mostly equal footing with the 30-series. I mean, 3070 was $500 and 3070 Ti was $600, but a $100 upcharge isn't the end of the world (and RTX 2070 FE was $600 at launch as well). Going from $700 3080 to $1200 4080 was a different story, and pretending the 4070 Ti was a 3080 replacement was equally disingenuous (plus it was a $100 upcharge still).

I still worry about where Nvidia is going to go with 4060 and 4050. Like, 8GB really isn't going to fly in 2023. We had 1070 with 8GB in 2016 for $380! But I don't see how Nvidia can do anything other than 8GB on the 4060 and 4050. Which means, for all intents and purposes, the "good" or at least truly desirable 40-series parts bottom out at the 4070, perhaps a 4060 Ti but I suspect not. 4070 I'm okay with. $200 extra for the 4070 Ti? Not so great. $400 extra to go from 4070 Ti to 4080? Definitely greedy. $400 more to go from 4080 to the 4090 halo? Eh, it's the halo part, so sure.

Maybe if 4050 costs $250 (for real), 8GB is justifiable. 4060 with 8GB would also need to be $300 at most, but with 12GB I'd be okay with up to $400.
 
That's why I didn't consider anything but the 4090 when upgrading from the 3090. I was sold once I saw the reviews showing it's a massive improvement over the 3090... which it is. No other card has 24GB either, so it was an easy decision despite the cost.

What's funny is I'd be willing to bet the same people who complain about GPU pricing are also the same people buying $1400 iPhones from Apple... as if those aren't equally price inflated.

Of course you think this price is great. After all, your PC costs over $10,000!

We peasants don't have the luxury of spending that much unless we sell a kidney or two.
 
Of course you think this price is great. After all, your PC costs over $10,000!

We peasants don't have the luxury of spending that much unless we sell a kidney or two.

I actually don't think the price is "great"... I'd much rather it be $699 like my 1080 Ti cost back in 2017.

As for my $10,000 PC... yes, that's what I spent... but you also have to look at the stuff I bought: OLED display, professional 3D printer, Flight Simulator hardware, color laser printer, two different VR headsets... none of which is really required for a gaming PC, and without them my PC cost is cut in half.

$4-5K is the norm for a top-of-the-line gaming PC, and that's where I'd be without all the extras.
 
I actually don't think the price is "great"... I'd much rather it be $699 like my 1080 Ti cost back in 2017.

As for my $10,000 PC... yes, that's what I spent... but you also have to look at the stuff I bought: OLED display, professional 3D printer, Flight Simulator hardware, color laser printer, two different VR headsets... none of which is really required for a gaming PC, and without them my PC cost is cut in half.

$4-5K is the norm for a top-of-the-line gaming PC, and that's where I'd be without all the extras.

I can't see a $699 ever coming back for a top mainstream GPU from Nvidia without a demand crash. For one, $699 isn't even $699 anymore; $699 in March 2017 dollars is about $865 in 2023 dollars.
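For anyone who wants to check that adjustment, here's a quick sketch using approximate CPI-U index values (rounded figures; the exact numbers come from BLS data):

```python
# Inflation adjustment via the CPI ratio. Index values below are approximate
# CPI-U figures used for illustration; consult BLS data for exact numbers.
cpi_march_2017 = 243.8  # approx. CPI-U, March 2017 (GTX 1080 Ti launch)
cpi_early_2023 = 301.8  # approx. CPI-U, early 2023

price_2017 = 699
price_2023 = price_2017 * (cpi_early_2023 / cpi_march_2017)
print(f"${price_2017} in March 2017 is about ${price_2023:.0f} in 2023 dollars")  # ~$865
```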
 
If there were no previous-gen high-end GPUs available, this would be a good buy, but at this price you can just buy a new AIB 6950 XT.
Completely agree... Sure, you might get away with 12GB at 1080p, but at 1440p this is now an entry-level card.

Regardless of inflation, $600 for an entry-level GPU just seems WAY too high. This is the price of an Xbox or a PlayStation.

Moving forward, these are my projected entry-level VRAM minimums for the next generation, by 2025:
12GB - 1080p
16GB - 1440p
20GB - 4K
 
I can't see a $699 ever coming back for a top mainstream GPU from Nvidia without a demand crash. For one, $699 isn't even $699 anymore; $699 in March 2017 dollars is about $865 in 2023 dollars.

Oh, I agree. I've said many times the $699 flagship card will never happen again. I personally didn't have a problem paying $1,750 for the 4090, considering it's basically a 50% improvement over the 3090, which is the biggest generational leap between equivalent models that I think we've ever seen.

The best products generally are the most expensive.

The rest of the 40-series isn't all that impressive, though.
 
The 4080 is the real kick in the teeth. That should be at most $1,000 if inflation is accounted for: the 2080 was $799/$699 and the 3080 was $699/$799 (12GB), so those would be roughly $1,000 today.

To have it launch at the price of the 80 Ti-class cards of the last two generations... and they haven't even talked about a 4080 Ti...

AMD got the flagship price closer to correct, though the 7900 XT is a little overpriced. All the pricing seems designed to make the leap to the top tier look reasonable.
 
I'm tired of the inflation argument. In 2017, an 8-core Zen launched at $499. In 2022, the 5700X launched at $299.

You can be tired of it all you want, but the fact is that the value of money has changed. $1 in 2023 buys a lot less of everything, from goods to services to employees to raw materials to housing than it did in 2017. $500 in 2023 is literally cheaper than $500 in 2017 was.

If we accept the argument that AMD released the 5700X at a relative bargain out of the goodness of their hearts -- highly debatable, since the 5700X wasn't remotely a "new" product in any meaningful sense -- then inflation would only have made AMD more generous in this case.
 
You can be tired of it all you want, but the fact is that the value of money has changed. $1 in 2023 buys a lot less of everything, from goods to services to employees to raw materials to housing than it did in 2017. $500 in 2023 is literally cheaper than $500 in 2017 was.

If we accept the argument that AMD released the 5700X at a relative bargain out of the goodness of their hearts -- highly debatable, since the 5700X wasn't remotely a "new" product in any meaningful sense -- then inflation would only have made AMD more generous in this case.

I know money is worth less. I don't think most people would have as much of a problem if salaries had kept up. For most, they sure seem not to have.
 
After Elon Musk admitted that "everybody and their dog is buying GPUs right now" for AI purposes, it becomes very clear why GPUs are so expensive, and why even the 4070 is not exactly a great value proposition.

GPUs were affordable when there was just a gaming market.

Their prices started to increase when VFX artists entered the market (for rendering and video acceleration), and they ultimately skyrocketed when crypto mining became a thing.
Now, with the crypto market on the decline, AI has become the new mega-market for GPUs. And yes, Nvidia does have dedicated GPUs for AI (which sell for something like $30,000 per unit). But innumerable smaller AI developers, as well as larger ones, still populate their workstations and server farms with dozens or hundreds of standard GPUs. So there goes the gaming market again, shared with tons of other groups that are all offering their money to get as many GPUs as possible.

That's where I see an opportunity for lower-tier GPUs such as the upcoming 4060 range, because such cards might be less attractive for heavy AI workloads while still delivering great performance for 1080p gamers (who are the absolute majority).
With the Ada Lovelace chips on a substantially smaller and more efficient process node, I also expect the power consumption of the upcoming 4060 models to be significantly tamer than their previous-gen counterparts.
So an efficient 4060 with 12GB on board, a TDP of less than 150W, and a price under $449 would probably be the best value proposition for price- and energy-conscious gamers.

Let's see if we can get such a card anytime soon...
 
Obviously Nvidia is charging as much as it feels it can get away with, and I think it was hoping demand would be higher on the rest of the 40-series after the 4090 sold out for several months. The thing is, the 4090 is an absolute monster at "everything" where the other 40-series are whittling away the core performance. You want great AI? 4090. You want great rasterization? 4090. You want great RT? 4090. You want a prosumer card for video editing or other professional tasks? 4090.

The step down to the 4080 is basically an equal drop in price and performance, but you lose VRAM and it's no longer the halo part. 3080 Ti would have been an $800-ish part had crypto not happened, but then post-crypto Nvidia decided maybe $1200 was a good price for the second tier 40-series. The 4070 Ti followed that pattern at $800 (formerly $900 as the 4080 12GB). The 4070 finally gets us back to mostly equal footing with the 30-series. I mean, 3070 was $500 and 3070 Ti was $600, but a $100 upcharge isn't the end of the world (and RTX 2070 FE was $600 at launch as well). Going from $700 3080 to $1200 4080 was a different story, and pretending the 4070 Ti was a 3080 replacement was equally disingenuous (plus it was a $100 upcharge still).

I still worry about where Nvidia is going to go with 4060 and 4050. Like, 8GB really isn't going to fly in 2023. We had 1070 with 8GB in 2016 for $380! But I don't see how Nvidia can do anything other than 8GB on the 4060 and 4050. Which means, for all intents and purposes, the "good" or at least truly desirable 40-series parts bottom out at the 4070, perhaps a 4060 Ti but I suspect not. 4070 I'm okay with. $200 extra for the 4070 Ti? Not so great. $400 extra to go from 4070 Ti to 4080? Definitely greedy. $400 more to go from 4080 to the 4090 halo? Eh, it's the halo part, so sure.

Maybe if 4050 costs $250 (for real), 8GB is justifiable. 4060 with 8GB would also need to be $300 at most, but with 12GB I'd be okay with up to $400.
A 4060 at 8GB and a 4060 Ti at 12GB would be a possibility, though.
And who knows: just as there were 3GB and 6GB variants of the venerable GTX 1060, maybe we'll see some sort of 8GB and 12GB variants of the 4060, at least from third-party vendors.
 
And who knows: just as there were 3GB and 6GB variants of the venerable GTX 1060, maybe we'll see some sort of 8GB and 12GB variants of the 4060, at least from third-party vendors.
That's the issue, though: 1060 3GB and 6GB were both still using a 192-bit interface, though the 6GB card had a few more SMs and shaders. The 3060 8GB and 12GB are entirely different beasts, and the former (8GB) was really only introduced late in the product cycle as a way to try and get more money out of a GA106 GPU. The RTX 3060 8GB ends up being barely faster than RTX 3050 in many cases, but with a price that's often the same as the 3060 12GB.

If we get 4060 8GB and 4060 12GB, the latter will require an AD104 chip with 192-bit memory interface, while the former could be an AD106 chip with 128-bit interface. It could also be a severely cut down AD104, just to get rid of "bad" chips that would otherwise have no use. But I'm really not expecting RTX 4060 12GB to be a thing. I think RTX 4060 Ti with GDDR6 memory (instead of GDDR6X) could happen, but that's as far as I think Nvidia would go. Well, or it skips RTX 4060 Ti and just does that GDDR6 part as RTX 4060. 🤷
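For readers wondering why bus width pins down those capacities: each GDDR6/GDDR6X package occupies a 32-bit channel, and 2GB (16Gb) was the highest density shipping at the time. A quick sketch of that constraint (the helper function is just for illustration):

```python
# VRAM capacity follows from bus width: one GDDR6/GDDR6X package per 32-bit
# channel, times the per-package density. Clamshell mode doubles the package
# count by pairing two chips per channel (as on the RTX 3090), at extra cost.
def vram_gb(bus_width_bits: int, gb_per_chip: int = 2, clamshell: bool = False) -> int:
    channels = bus_width_bits // 32              # one package per 32-bit channel
    packages = channels * (2 if clamshell else 1)
    return packages * gb_per_chip

print(vram_gb(192))  # AD104-style 192-bit bus -> 12 GB
print(vram_gb(128))  # AD106-style 128-bit bus ->  8 GB
print(vram_gb(384, gb_per_chip=1, clamshell=True))  # RTX 3090: 384-bit, 24x 1GB chips -> 24 GB
```

Which is why a hypothetical 12GB RTX 4060 would need 192-bit AD104 silicon rather than the 128-bit AD106.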
 
No, 600 dollars for a graphics card IS NOT MAINSTREAM. Why are all the news outlets saying this as if it's the norm? It's an extremely high price, regardless of how the market is.
 
No, 600 dollars for a graphics card IS NOT MAINSTREAM. Why are all the news outlets saying this as if it's the norm? It's an extremely high price, regardless of how the market is.
I have always considered a price bracket of up to 350 USD the upper limit of "gaming mainstream." The non-gaming mainstream is actually what >90% of users get: an integrated iGPU.
Back to the mid-range gaming market: in my opinion, none of the RTX x070 models have ever fit into that price bracket. They have always been way too expensive, and the current 4070 is overpriced as well.

According to Steam's survey data, the top three spots for most frequently used GPUs go to the 3060, 2060, and 1060. The 3070 is #4, followed by further x060 and x050 models and even 1660-family members.

Currently, you can get a last-gen 3060 (12GB) for around 350 bucks. So, for me, that's the mainstream segment: suitable for 1080p gaming at moderate-to-high detail/effects levels.
 
That's the issue, though: 1060 3GB and 6GB were both still using a 192-bit interface, though the 6GB card had a few more SMs and shaders. The 3060 8GB and 12GB are entirely different beasts, and the former (8GB) was really only introduced late in the product cycle as a way to try and get more money out of a GA106 GPU. The RTX 3060 8GB ends up being barely faster than RTX 3050 in many cases, but with a price that's often the same as the 3060 12GB.

If we get 4060 8GB and 4060 12GB, the latter will require an AD104 chip with 192-bit memory interface, while the former could be an AD106 chip with 128-bit interface. It could also be a severely cut down AD104, just to get rid of "bad" chips that would otherwise have no use. But I'm really not expecting RTX 4060 12GB to be a thing. I think RTX 4060 Ti with GDDR6 memory (instead of GDDR6X) could happen, but that's as far as I think Nvidia would go. Well, or it skips RTX 4060 Ti and just does that GDDR6 part as RTX 4060. 🤷
Jarred, thanks a lot for this hugely informative reply! It's like an encyclopaedia of GPUs but much more interesting and fun to read!
 
Just got around to seeing the reviews for this card.
This one is my favorite review out of the 10 or so that I've seen/read. Just wanna give you some props; you do awesome work, Jarred. I love how much depth you go into, and how you answer further questions here in the forum.

As someone who is still on an RTX 2060, and who tried but failed to get a 3080, the 4070 is exactly what I'm looking for: the same performance as the 3080 I wanted, with a lower price and a little more VRAM. I was planning to buy a 4070 Ti within the next month or two, but I'm glad I got to see the 4070 benchmarks first. I can't justify an extra $200 for the roughly 20% increase that the Ti offers. This would be more than good enough for me.

I guess my eyes aren't too picky. I'm at 1440p... I'm "okayish" with 30 FPS if necessary. I'm happy as a clam with 50 FPS, but absolutely ecstatic with 90 FPS, and I can't ask for more (because I can't tell a difference beyond that). I can't tell a difference between high and ultra in most games, either. And DLSS Balanced looks good to me unless I pixel-peep; I can rarely tell the difference from native res during gameplay. But with all of that said, I'm a huge fan of ray tracing, especially shadows and ambient occlusion.

So seeing these charts that use a lot of ultra settings and Quality DLSS makes me optimistic that this card could last me quite a good while. Hopefully DLSS 3 will extend the life of the card even further. I don't get the hate for DLSS 3. In marketing, sure, it's terrible. But some people act like its very existence is evil and poisonous. It's not mandatory for anyone... I'm excited to try it out. I'm not too sensitive to latency, and I don't play twitchy games.

Power is actually a big deal to me. My small room in a hot climate gets warm pretty quickly when gaming on a 190W card. So it's good to see how relatively efficient this card is.

Although I'm excited for it and will almost certainly buy it... I gotta say, with such large performance deltas between each 40-series card, this kinda seems like a low-end card with pricing that still feels high-end to me.
But I dig this card.

/random scattered ramblings