News Nvidia's GeForce RTX 5070 at $549 — How does it stack up to the previous generation RTX 4070?

No need for more memory when you are forced to use upscaling anyway...

smh
Who is forcing you to use anything?

The GPUs Nvidia introduced yesterday will still have the best raster performance you can get right now, for almost every GPU they put out.

I do think it would be dumb not to take advantage of the DLSS improvements in games, but if you want to die on that purity-test hill, you can bet that from the 5080 up, if not the 5070 Ti, there won't be any competing solution that even comes close in raster performance anyway.
 
But frankly, it seems like every generation they push this AI-generated stuff more and more. Individually it might not take much effort, but when game developers have to cater to every generation of DLSS and FSR in their code, the likely result is more AAA games with bloated graphics and increasingly hollow content...
I know it might be a shocking revelation, but game content quality and "AI-generated stuff" are not really related. Having AI-driven graphics does not mean the game needs to be trash.

Case in point: BG3, which is a great game through and through despite having that evil DLSS in it, or, well... CP77.
 
Except it's not universal.

A friend of mine went to a 9800 xt a month ago (the reason being they needed more VRAM) and has had issues resulting in blue screens.

It's not "going to happen to everyone," but it is more common to have AMD driver issues than Nvidia ones.

Between your claim and theirs, I would side more with my friend, as I have personally seen it happen.
And I can say... knock on wood, I have had a mix of ATi/AMD video cards over the last 30 years and never had an issue. Well, I would be lying; I did have a few with my ATi All-in-Wonder card.

Point is... a lot of people, depending on their games and software, have issues with both. Nvidia is no better. Bashing one over the other is being biased, which is OK. You can have your favorite, just don't bash one because you like the other more.

Me, I like Intel CPUs... I refuse to use an AMD one, maybe my loss, maybe not... but I wouldn't tell people not to buy them, as my reasons are anecdotal. I do have a 5700G, and when it came out I was actually amazed.
 
Right. All my issues with AMD cards over the years are just propaganda 🙄

I can't remember when I last had issues with my Nvidia cards. Even two months ago, when I wiped my driver because I had monitor issues, it ultimately turned out to be Windows, not the driver, and I overreacted by DDU-wiping it (though that should be done regularly anyway, and I'd been lazy on that count since getting this card two years ago, so eh...). Meanwhile, I had AMD cards that were overheating for no reason and had constant driver issues, and I'm far from alone. Even today there is advice out there not to update AMD drivers every time a new one drops. But of course, your own experience is more true, and it's all just propaganda from "Nvidi0ts"...
And Nvidia has connectors melting...
I'm not taking a stab at you, just pointing out that they both have issues and will continue to... no one is perfect.
 
This generation of cards from Nvidia is lackluster, to say the least. The 5070 is the most egregious, with 1,024 fewer CUDA cores, 32 fewer TMUs, and 16 fewer ROPs compared to the outgoing 4070 Super. No wonder they are comparing the 5070 to the old 4070; at least then we see a +4% increase in raster resources, which is still incredibly weak for a whole new series. Without the AI fake-frame tech, this entire lineup minus the 5090 is a wash.
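As a back-of-the-envelope check on that +4% figure, here's a quick sketch; the absolute core counts are assumptions pulled from commonly cited spec listings, not from this thread, so treat them as approximate.

```python
# Rough sketch: percent change in CUDA cores (a proxy for raster resources).
# Core counts below are assumed from commonly cited spec listings and may
# differ from final retail specs.
cuda_cores = {
    "RTX 4070": 5888,
    "RTX 4070 Super": 7168,  # 1,024 more than the RTX 5070
    "RTX 5070": 6144,
}

new = cuda_cores["RTX 5070"]
for baseline in ("RTX 4070", "RTX 4070 Super"):
    old = cuda_cores[baseline]
    print(f"RTX 5070 vs {baseline}: {100 * (new - old) / old:+.1f}% CUDA cores")
# -> roughly +4.3% vs the 4070 and -14.3% vs the 4070 Super
```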
 
Except it's not universal.

A friend of mine went to a 9800 xt a month ago (the reason being they needed more VRAM) and has had issues resulting in blue screens.

It's not "going to happen to everyone," but it is more common to have AMD driver issues than Nvidia ones.

Between your claim and theirs, I would side more with my friend, as I have personally seen it happen.
Should have DDU'd the old Nvidia drivers before installing the AMD drivers.
 
I know it might be a shocking revelation, but game content quality and "AI-generated stuff" are not really related. Having AI-driven graphics does not mean the game needs to be trash.

Case in point: BG3, which is a great game through and through despite having that evil DLSS in it, or, well... CP77.
Actually, for quite a few years now, since RTX and even more graphics tricks came into the picture, more and more AAA titles have fallen into the "photorealistic graphics and effects but no content/innovation" category.

BG3 has DLSS support, BUT they didn't really focus on graphics or textures; the styling is that of graphics from 1-2 decades ago.

CP77 is kind of OK, but not really that great, nor does it have a ton of innovation or addictive gameplay.

What I mean is that with every new generation of cards adding yet another DLSS frame-gen feature to support, we've left the era when graphics didn't take forever to code and all those fantastic gameplay ideas were being invented. One could argue that back then there were so many stones left unturned that it was easy to create new yet addictive gameplay mechanics and ideas, but something as simple as an in-depth, coherent campaign story isn't available in 99% of games now. Just slap together some good-looking trailers with all the eye candy the new DLSS or AI allows, and release the game in a broken state or with a hollow story line (e.g. COD6).

It's like a race to eye candy only. All these new versions of DLSS and FSR make the coding more complicated every year, and trying to optimize all of them alongside raster kills the dev/debug time needed to make sure none of it is messed up. Another apparent trend is that gameplay time/depth generally decreases: build a better-looking textured world at 100+ GB, but compress the story line so the game doesn't get overly large.
 
Actually, for quite a few years now, since RTX and even more graphics tricks came into the picture, more and more AAA titles have fallen into the "photorealistic graphics and effects but no content/innovation" category.

BG3 has DLSS support, BUT they didn't really focus on graphics or textures; the styling is that of graphics from 1-2 decades ago.

CP77 is kind of OK, but not really that great, nor does it have a ton of innovation or addictive gameplay.

What I mean is that with every new generation of cards adding yet another DLSS frame-gen feature to support, we've left the era when graphics didn't take forever to code and all those fantastic gameplay ideas were being invented. One could argue that back then there were so many stones left unturned that it was easy to create new yet addictive gameplay mechanics and ideas, but something as simple as an in-depth, coherent campaign story isn't available in 99% of games now. Just slap together some good-looking trailers with all the eye candy the new DLSS or AI allows, and release the game in a broken state or with a hollow story line (e.g. COD6).

It's like a race to eye candy only. All these new versions of DLSS and FSR make the coding more complicated every year, and trying to optimize all of them alongside raster kills the dev/debug time needed to make sure none of it is messed up. Another apparent trend is that gameplay time/depth generally decreases: build a better-looking textured world at 100+ GB, but compress the story line so the game doesn't get overly large.
Sorry, but those are some wild logic leaps there.

I will repeat it again: blaming DLSS and other AI-based techniques for studios putting out games with bad gameplay is a very long reach, and there are loads of obvious examples where these were used and yet the games somehow are good.

Wukong, Stalker 2. Heck, most of the hit games of 2024 use these techniques, and somehow they are still good.


BioWare putting out a failure like DA:V, or Ubisoft's memes, was not because DLSS made them do it.
 
Sorry, but those are some wild logic leaps there.

I will repeat it again: blaming DLSS and other AI-based techniques for studios putting out games with bad gameplay is a very long reach, and there are loads of obvious examples where these were used and yet the games somehow are good.

Wukong, Stalker 2. Heck, most of the hit games of 2024 use these techniques, and somehow they are still good.

BioWare putting out a failure like DA:V, or Ubisoft's memes, was not because DLSS made them do it.
Sorry, no, I don't mean that DLSS or other AI stuff is the sole culprit behind bad games. It's the push for more eye candy, and the fact that the latest and greatest engines optimize for those AI tools instead of optimizing more for general rendering methods like rasterization, that makes it worse.

Most of the hit games are not that focused on the latest and best graphics possible; Wukong isn't even that open-world a game, and TBH, they are still lacking in gameplay compared to older games.

I say that the DLSS optimization is causing more "trouble" because I am personally involved in beta testing add-ons for some games. Nowadays, other than testing for general glitches or giving ideas on what to improve on the gameplay/system side of things, more time goes into checking whether anything is wrong with DLSS, frame gen, RT, FSR and then raster, over and over, to debug glitches in the upscaling tech. And there's a trend: when these tools are available, devs put far less effort into texture selection and the like to optimize the game for lower-end or older systems, or it becomes "since DLSS now enables that perfect pool reflection, we'll spend two more weeks perfecting that reflection instead of adding more Easter-egg-style gameplay to encourage replaying the game."

Sure, there are still some developers who don't fall into the trap and focus more on the gameplay side of things, but their number is dwindling real quick. Some good games still support this tech, but most of them don't really benefit much from, or need, DLSS to run smoothly at ultra settings; a 4070 can run Stalker 2 at 1440p native and doesn't gain much from DLSS Quality, in stark contrast to the games maxing out every bit of eye candy.
 
Also, I just find $600 after tax to be really expensive. That's more than a mid-range motherboard, CPU, and memory combined. Forget the top-end $1K cards unless you truly have the $10K system they were teasing.
The RTX 2070 launched in October 2018 at $599 MSRP.

The RTX 3070 launched at $499.

The RTX 4070 launched at $599.

Now the RTX 5070 is launching at $549, $50 less than the RTX 2070 launched at, before even accounting for 6 years of inflation (24% according to the CPI inflation calculator; $599 in October 2018 is equivalent to $747 today).

Please explain to me exactly how the 5070 is so unreasonably priced.

The fastest 70-series GPU yet, at a price that’s less than the same class of GPU 6 years ago. I swear, people will dunk on Nvidia for being greedy no matter how they price their GPUs.
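For anyone who wants to reproduce that adjustment, here's a minimal sketch; the 24% CPI factor is simply the figure from this post, so swap in the actual CPI ratio for whatever period you care about.

```python
# Minimal sketch: compare launch prices in inflation-adjusted terms.
# The CPI factor is taken from the post above (an assumption, not an
# authoritative figure); replace it with the real CPI ratio as needed.
msrp_2070_oct_2018 = 599
msrp_5070_2025 = 549
cpi_factor = 1.24  # ~24% cumulative inflation since October 2018

adjusted_2070 = msrp_2070_oct_2018 * cpi_factor
print(f"RTX 2070 launch price in today's dollars: ${adjusted_2070:.0f}")  # ~$743
print(f"Nominal difference:    ${msrp_2070_oct_2018 - msrp_5070_2025}")   # $50
print(f"Real-terms difference: ${adjusted_2070 - msrp_5070_2025:.0f}")    # ~$194
```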
 
The RTX 2070 launched in October 2018 at $599 MSRP.
The RTX 3070 launched at $499.

The RTX 4070 launched at $599.

Now the RTX 5070 is launching at $549, $50 less than the RTX 2070 launched at, before even accounting for 6 years of inflation (24% according to the CPI inflation calculator; $599 in October 2018 is equivalent to $747 today).

Please explain to me exactly how the 5070 is so unreasonably priced.

The fastest 70-series GPU yet, at a price that’s less than the same class of GPU 6 years ago. I swear, people will dunk on Nvidia for being greedy no matter how they price their GPUs.
They have all been expensive to me. 50% frugal, 50% on principle.
 
The RTX 2070 launched in October 2018 at $599 MSRP.

The RTX 3070 launched at $499.

The RTX 4070 launched at $599.

Now the RTX 5070 is launching at $549, $50 less than the RTX 2070 launched at, before even accounting for 6 years of inflation (24% according to the CPI inflation calculator; $599 in October 2018 is equivalent to $747 today).

Please explain to me exactly how the 5070 is so unreasonably priced.

The fastest 70-series GPU yet, at a price that’s less than the same class of GPU 6 years ago. I swear, people will dunk on Nvidia for being greedy no matter how they price their GPUs.

The MSRP of the NVIDIA GeForce GTX 1070 when it was released in June 2016 was $379. How convenient of you to ignore that. There was a lot of grumbling about price increases when the 20 series was released.
 
It's like a race to eye candy only. All these new versions of DLSS and FSR make the coding more complicated every year, and trying to optimize all of them alongside raster kills the dev/debug time needed to make sure none of it is messed up.
These don't really take much of anything to develop and the wildly varying quality of implementation should tell you how little debug time is spent on many.
Another apparent trend is that gameplay time/depth generally decreases: build a better-looking textured world at 100+ GB, but compress the story line so the game doesn't get overly large.
This is another time problem: for any of a myriad of reasons, big studios just bring in more artists to create new textures for everything to save time. That's why you can have fantastic-looking indie games with the same amount of content as a AAA game at a quarter the size.
 
These don't really take much of anything to develop and the wildly varying quality of implementation should tell you how little debug time is spent on many.

This is another time problem: for any of a myriad of reasons, big studios just bring in more artists to create new textures for everything to save time. That's why you can have fantastic-looking indie games with the same amount of content as a AAA game at a quarter the size.
You're all going to love the upcoming AI-generated game stories that seem to be part of the plan... LOL
 
The RTX 2070 launched in October 2018 at $599 MSRP.

The RTX 3070 launched at $499.

The RTX 4070 launched at $599.

Now the RTX 5070 is launching at $549, $50 less than the RTX 2070 launched at, before even accounting for 6 years of inflation (24% according to the CPI inflation calculator; $599 in October 2018 is equivalent to $747 today).

Please explain to me exactly how the 5070 is so unreasonably priced.
The 1070 was ~25% slower than the 2070, the 2070 ~28% slower than 3070, the 3070 ~17% slower than the 4070. You can't just look at those prices in isolation without considering the performance as well.

The 2070 was a price increase which pretty much matched the performance increase (like most of the 20 series). The 3070 was a reset (like most of the 30 series) and then the 4070 was just a price increase with a performance gain that only situationally covered the price increase (like most of the 40 series).

Until we know the performance, it's impossible to say how "overpriced" the 5070 may or may not be, but it certainly won't be another 3070 situation.
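To make the point concrete, here's a rough sketch that folds those generational performance deltas into the launch prices; every number is taken from this thread (and the 2070 price is itself disputed later: $599 FE vs. $499 MSRP), so treat this as an illustration, not benchmark data.

```python
# Rough sketch: relative performance per dollar across 70-class launches.
# "slower_than_next" is the thread's estimate of how much slower each card
# is than its successor; prices are the launch figures quoted in the thread.
cards = [
    # (name, launch_price_usd, slower_than_next)
    ("GTX 1070", 379, 0.25),   # ~25% slower than the 2070
    ("RTX 2070", 599, 0.28),   # ~28% slower than the 3070
    ("RTX 3070", 499, 0.17),   # ~17% slower than the 4070
    ("RTX 4070", 599, None),   # 5070 performance still unknown
]

relative_perf = 1.0  # GTX 1070 = baseline
for name, price, slower_than_next in cards:
    print(f"{name}: ${price}  perf {relative_perf:.2f}x  perf/$100 {100 * relative_perf / price:.2f}")
    if slower_than_next is not None:
        relative_perf /= 1.0 - slower_than_next  # successor's relative perf
```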
 
The 5070 should be about 30% faster than the 4070 in real terms; the supposed 4090 performance is just mumbo jumbo from Nvidia trying to convince you that Frame Generation in its most aggressive form is real-time rendering performance. Most of the games that are really worth playing depend on your timing and fast reactions (such as multiplayer games). You can't really count on those Multi Frame Generation frames in those games.
I don't see how you're expecting larger per-SM gains than Ada when there's no new node or clock-speed bump. The 4070 is the only Ada card without a change in SM count or core count, and it was 20% faster than a 3070…
 
50W more power draw than the 4070 is going to make the 5070 a no-go for a lot of people who may not have a power supply that can handle it. I hope the 5060 Ti manages to stay under 200W; otherwise I'll just get a 4070. Blackwell uses way too much power. It's as bad as buying an AMD card.
 
This generation of cards from Nvidia is lackluster, to say the least. The 5070 is the most egregious, with 1,024 fewer CUDA cores, 32 fewer TMUs, and 16 fewer ROPs compared to the outgoing 4070 Super. No wonder they are comparing the 5070 to the old 4070; at least then we see a +4% increase in raster resources, which is still incredibly weak for a whole new series. Without the AI fake-frame tech, this entire lineup minus the 5090 is a wash.
You always compare like for like. The 4070 Super is not like for like, the 4070 is. When/If Nvidia releases a 5070 Super, that's when you can use the 4070 Super for comparison.
 
You always compare like for like. The 4070 Super is not like for like, the 4070 is. When/If Nvidia releases a 5070 Super, that's when you can use the 4070 Super for comparison.
No, you compare against the direct counterpart from the current generation, which is the 4070 Super. Only marketers would justify a 4070-to-5070 comparison, because the number designation is simply a tool for them to manipulate the consumer. If Nvidia were honest and compared their new lineup to their current lineup (the Super refreshes), they would have called the RTX 5070 a 60 Ti-class product, and it would actually be a good card at the $449-$499 price point.

If a company releases refreshes to their product stack mid-cycle, then that is their current line-up, not the old line-up. Just because they had enough old stock of 4070 tier parts to keep selling them after the mid-cycle refresh doesn’t mean it is current.

Any other market would operate like I described. For example, when the new Mazda MX-5 (aka the ND Miata) came out, no one compared it to the original NC1 Miata from 2005-2008; they compared it to the second refresh, the NC3 Miata from 2013-2015, which came with a higher-tuned version of the engine, a forged crank, pistons, and connecting rods, better-tuned suspension, chassis bracing, etc. The moral of the story is to sift through the marketing BS to find out whether you are actually getting a better product with the new generation versus the best version of its equivalent in the outgoing line-up.
 
The MSRP of the NVIDIA GeForce GTX 1070 when it was released in June 2016 was $379. How convenient of you to ignore that. There was a lot of grumbling about price increases when the 20 series was released.
The $600 2070 price he quoted was for the FE model, which is not what you quoted for the 1070. The MSRP of the 2070 was $500, and the FE price for the 1070 was $450. How convenient that you didn't note that. The 10 series was also the last of the high-end GTX cards. There truly was an industry shift with the 20-series release.
 
I think AMD has come a long way in this regard. I don't think this is really a talking point anymore?
I agree. Zero issues with my Sapphire 7900 XT besides the ~30W idle power draw early in 2024, but that's long been corrected. If that was my worst issue... yeah, AMD has come a long way in graphics driver stability. I think their biggest issue is that they have shot themselves in the foot for the longest time in regard to marketing and still do to this day. As some would say "AMD never misses an opportunity to miss an opportunity."

At this point, it's just "nVidia is best" -- who doesn't want that, right?? Like, what price discount, performance advantage, or combination of both does it take for some (or many) gamers to "step down" to the #2 brand? In the world of hi-tech, almost everything is pretty much a two-dog race, and indeed the top dog tends to have a decent lead over #2 (look at Intel vs. AMD CPUs for the longest time; heck, it's still taking AMD years to chip away at the incumbent market leader's share).

That's ok though, green always finds a way to get a new talking point. In the last few years, it's their upscaling tech and in general, their AIAIAIAIAIAIAIAI.
 
Except it's not universal.

A friend of mine went to a 9800 xt a month ago (the reason being they needed more VRAM) and has had issues resulting in blue screens.

It's not "going to happen to everyone," but it is more common to have AMD driver issues than Nvidia ones.

Between your claim and theirs, I would side more with my friend, as I have personally seen it happen.
What 9800 XT? From circa 2003? Do you mean an RX 7900 XT? Do you even do your homework on the competition?

"Its not "going to happen to everyone" but it is more common to have amd driver issues than nvidia."
Can you quantify this? It's way too easy for folks on socials and elsewhere to keep saying things like this. Rumors and dis/misinformation spread like wildfire for a reason: it's easier and more natural for humans to give in to confirmation bias.

Manufacturers can also ship defective cards, whether Radeon or GeForce. Besides using DDU, as others have mentioned, there could be several reasons for the BSODs.

Lastly, no, user experiences are never universal. Not everyone had the MAJOR stuttering issues I had in practically every game with my previous GPU, a GTX 1080, but some did. If it weren't for the impressive performance and efficiency when it worked right, I would have scrapped it sooner. But alas, we're talking about anecdotes. I'm not going to say anecdotes don't matter, just that it's easy to amplify them and create a disproportion.
 