News Nvidia's GeForce RTX 5070 at $549 — How does it stack up to the previous generation RTX 4070?

Multi frame generation does NOT generate frames in between two processed frames! Instead, it generates them after ONE, so don't expect added latency from this tech! This is a very different approach from the 40-series frame generation. So if you can hit 120 fps, this will let you hit an astronomical 480 fps. Will the generated frames look good, though? That's what we'll have to wait and find out.
Digital Foundry already has a video up comparing latency at x2 FG versus x4 FG. x4 FG adds latency over x2 FG (although less than 10 ms), so it does still increase latency.

View: https://youtu.be/xpzufsxtZpA?si=dyA3tBjLsPwtTc3_
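Rough frame-pacing math on what x2/x4 frame generation means at a 120 fps base, using only the numbers from the posts above; these are illustrative figures, not measurements.

```python
# Illustrative frame-pacing math for multi frame generation.
# The 120 fps base rate is taken from the post above; nothing here is measured.

def frame_times_ms(base_fps: float, gen_factor: int) -> tuple[float, float]:
    """Return (time per rendered frame, time per displayed frame) in ms."""
    rendered_ms = 1000.0 / base_fps          # only these frames see new input
    displayed_ms = rendered_ms / gen_factor  # generated frames fill the gaps
    return rendered_ms, displayed_ms

for factor in (1, 2, 4):                     # native, x2 FG, x4 FG
    rendered, displayed = frame_times_ms(120.0, factor)
    print(f"x{factor}: {1000.0 / displayed:.0f} fps shown, "
          f"rendered frame every {rendered:.2f} ms, "
          f"displayed frame every {displayed:.2f} ms")

# x1: 120 fps shown, rendered frame every 8.33 ms, displayed frame every 8.33 ms
# x2: 240 fps shown, rendered frame every 8.33 ms, displayed frame every 4.17 ms
# x4: 480 fps shown, rendered frame every 8.33 ms, displayed frame every 2.08 ms
```

The displayed rate quadruples, but new input still only enters the pipeline once per rendered frame, which is why the interpolation-versus-extrapolation question later in the thread matters for how it actually feels.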
 
12GB of VRAM on a mid-range GPU in 2025 should make it a non-starter at any price. It just exists so the 16GB 5070 Ti can exist at a far higher price.
As I mentioned in the articles, this really depends on whether Nvidia has a way to override the regular texture compression with its AI compression techniques. 1/3 the memory requirement for better image quality? That means a 12GB Nvidia 5070 could in theory function like a 24~36 GB non 50-series.

But this is a very big IF. I'd love to see it happen but until it's officially said it's a feature, I would assume this is only for future games that specifically want to implement NTC or whatever "RTX Neural Materials" ends up being called.
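To put numbers on that "24~36 GB" claim, here is a quick sketch assuming compressed textures take half to a third of their current space; those ratios come from the post above and are not confirmed figures.

```python
# Effective texture capacity if AI compression shrank texture memory use.
# The 1/2 and 1/3 ratios are assumptions for illustration, not Nvidia specs.
physical_vram_gb = 12

for ratio in (1 / 2, 1 / 3):     # compressed size as a fraction of the original
    effective_gb = physical_vram_gb / ratio
    print(f"compressed to {ratio:.0%} of original -> holds what ~{effective_gb:.0f} GB does today")

# compressed to 50% of original -> holds what ~24 GB does today
# compressed to 33% of original -> holds what ~36 GB does today
```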
 
I really wanted info on the darn chip in the Switch 2, or Super Switch or whatever it will be called. Imagine if they crammed DLSS 4 into that and really optimized it for the system. It's just so different when you are writing code for a console that the optimization for DLSS could get much deeper.
 
The closest thing to real-world performance there is Far Cry 6, and it also has RT enabled. The indication is that 50-series RT is superior to 40-series, and that looks to be somewhere around a 30% improvement. The ones with big improvements are due to the new frame generation.

I'm sure these cards will be an improvement across the board, but I'm sick and tired of companies flagrantly misrepresenting their products.
I tried to explain this to someone in the wccftech forums, I know ... mistake, and they started spouting that the game was rendered at 1080p like the other titles despite the graph stating otherwise. Point being, FC6 is our best title on the list to extrapolate the true performance of the 5090 in 4K gaming, as it doesn't use DLSS in any form. The list of games is a lot of hype. DLSS is nice to have if you need it, but it is never my first choice to turn it on. But hey, that is just me.
 
The 4070 Ti is genuinely at a similar gaming performance level to the 3090/3090 Ti, though. Yes, without DLSS or Frame Generation. So it absolutely is possible for the 5070 to be more than "slightly faster" than the 4070, even beat the 4080. Will it beat the 4090? Who knows. Time will tell, I guess.
What you're saying would only make sense if we ONLY had the model name without any other information. The 4070 Ti only had a ~20% core-count deficit to the 3090 and gained nearly 40% clock speed from the node change. The 5070 has a 70% core-count deficit to the 4090 and no clock speed bump…
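Putting rough numbers on that comparison, using only the percentages from the post above (a crude cores-times-clock estimate that ignores IPC, memory bandwidth, and everything else):

```python
# Crude theoretical throughput ratio = core-count ratio * clock-speed ratio.
# Percentages come from the post above; real performance depends on much more.

def relative_throughput(core_ratio: float, clock_ratio: float) -> float:
    return core_ratio * clock_ratio

# 4070 Ti vs 3090: ~20% fewer cores, ~40% higher clocks
print(f"4070 Ti vs 3090: ~{relative_throughput(0.80, 1.40):.2f}x")  # ~1.12x

# 5070 vs 4090: ~70% fewer cores, roughly the same clocks
print(f"5070 vs 4090:    ~{relative_throughput(0.30, 1.00):.2f}x")  # ~0.30x
```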
 
I really wanted info on the darn chip in the Switch 2, or Super Switch or whatever it will be called. Imagine if they crammed DLSS 4 into that and really optimized it for the system. It's just so different when you are writing code for a console that the optimization for DLSS could get much deeper.
That is NOT going to happen when the chip they're talking about uses ancient technology on an ancient node.
 
I think AMD has come a long way in this regard. I don't think this is really a talking point anymore?

It isn't a talking point anymore. I have had more issues with Nvidia updates than with AMD updates the past few years, and I deal with both. My main desktop has an RX 6800, my last laptop had a 1660 Ti, and my current one has a 4060.


Regardless, I will probably skip this generation too. My RX 6800 is still plenty for my needs, as I only play WoW, at 1440p, on the desktop. Even my laptop with the 4060, at 1080p, is plenty. I am more CPU bound than anything.
 
The issue is that AMD's drivers and software are not as good as Nvidia's, and some people care about "it just works" more than a small performance difference.
No more than Nvidia's drivers. AMD hasn't had major driver issues any more than Nvidia has since it bought ATI.

I do agree that DLSS and RT are better on Nvidia, but for pure rasterization, drivers are fine in both camps.
 
No, this is incorrect. Listen to this:
View: https://www.youtube.com/watch?v=qQn3bsPNTyI&t=127s


"These software and hardware innovations enable DLSS 4 on RTX 50-series GPUs to generate up to three additional frames between traditionally rendered frames."

I have seen and heard nothing that implies multi frame gen will use frame projection. And the latency penalty is the same as with regular framegen: you render two frames and generate stuff in between... now with three frames instead of one.

This is also why you'll be able to use the Nvidia App to override regular framegen to multi-framegen on RTX 50-series. They're doing the same basic work, just with 1, 2, or 3 intermediate frames now. Smoke and mirrors!

UPDATE: Nope, I was wrong. DLSS 4 multi frame generation is apparently using frame projection technology!
Posting this here: We just had a Q&A with Jensen Huang, and I specifically asked about DLSS 4 multi frame generation. Much to my surprise, he said it is not using interpolation. "DLSS 4 predicts the future rather than interpolating the past" is effectively what he said. Which could really change how it feels.

Article: https://www.tomshardware.com/pc-com...crease-framerates-without-introducing-latency
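A toy timeline model of why the interpolation-versus-projection distinction matters for latency; the 120 fps base and the one-frame hold are assumptions for illustration, not Nvidia's published numbers.

```python
# Toy latency model: interpolation must hold a rendered frame until the next
# real frame exists, so everything reaches the screen about one rendered-frame
# time later. Extrapolation (frame projection) can show the real frame right
# away and predict the in-between frames, trading latency for prediction error.

render_ms = 1000.0 / 120.0   # one rendered frame at an assumed 120 fps base

interpolation_hold_ms = render_ms   # wait for frame N+1 before showing anything
extrapolation_hold_ms = 0.0         # show frame N immediately, predict the rest

print(f"extra display delay with interpolation : ~{interpolation_hold_ms:.1f} ms")
print(f"extra display delay with extrapolation : ~{extrapolation_hold_ms:.1f} ms")
```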
 
Yes, but now Nvidia is telling us it's all DLSS or nothing. They've put all of their chips on it this card generation, and probably forever, since they are handing the ball to "AI". It's the result that counts, I guess, no matter how you got there. If an electric car can beat a petrol car in a race, it doesn't matter that it worked a fraction as hard.
Nvidia can bill it however they like. The question is whether consumers and the industry buy into the idea/hype. Plenty of Nvidia features that were pushed have gone by the wayside over time, some of which I miss. SLI, PhysX cards, and yes, quadrilateral polygons on the NV1/Sega Saturn all faded from their once-prominent roles. Point being, not everything Nvidia wants its consumer base and the industry to adopt survives the test of time.

I suspect DLSS will stick around, but how it is actually utilized in the long run is not yet set in stone. There's no telling what disadvantages DLSS 4 (5, 6, etc.) may or may not introduce; I suspect latency is a potential issue if DLSS 3 is any indicator. Don't get me wrong, I am hoping for the best, but I am not blindly boarding the Nvidia hype train just because Jensen said it's worth it.

A time 'may' come when DLSS testing is all that is done in reviews, but for the foreseeable future I don't see that happening. And frankly, unless Nvidia opens up DLSS to everyone, it is far more likely that an open standard will replace it in the long term.
 
That is propaganda from Nvidi0ts. I have been on every RDNA uarch and I had ZERO issues. In fact, I had more issues with my GTX 1080 than with all my AMD GPUs.
Except it's not universal.

A friend went to a 9800 xt a month ago (the reason being they needed more VRAM) and has had issues resulting in blue screens.

It's not "going to happen to everyone," but it is more common to have AMD driver issues than Nvidia ones.

Between your claim and theirs, I would side more with my friend, as I have personally seen it happen.
 
Except it's not universal.

A friend went to a 9800 xt a month ago (the reason being they needed more VRAM) and has had issues resulting in blue screens.

It's not "going to happen to everyone," but it is more common to have AMD driver issues than Nvidia ones.

Between your claim and theirs, I would side more with my friend, as I have personally seen it happen.

What is a 9800xt? Blue screens could be an issue of not properly removing the old drivers via DDU before the upgrade, or a PSU that isn't up to the task. I have even had boards that needed a CMOS reset when you changed the GPU or it wouldn't work properly.

I had an Nvidia driver, back when I had my 1660 Ti-equipped laptop, that screwed up my WoW install. Even after DDU and a reinstall of the previous working driver, my fps was a slideshow. Thankfully I keep a backup for it, so I was up and running quickly after I deleted my install. My work PCs have Nvidia cards too, and I have had instances of a driver killing simulation performance in my CAD-based inspection software.

My RX 6800 hasn't given me any issues on the driver front in the nearly four years I have had it.
 
The "AMD drivers are bad" dupe is a little tiring, as is the you need upscaling to play games now lie.
team greed is getting more zealots on the propaganda train every day.
What is really bad is they are getting game studios to buy in to the same shtick to find a way out of doing their jobs as well.
 
The 5070 should be about 30% faster than the 4070 in real terms; the supposed 4090 performance is just mumbo jumbo from Nvidia trying to convince you that Frame Generation in its most aggressive form is real-time rendering performance. Most of the games that are really worth playing depend on your timing and fast reactions (such as multiplayer games), and you can't really count on those Multi Frame Generation frames in those games.
 
As I mentioned in the articles, this really depends on whether Nvidia has a way to override the regular texture compression with its AI compression techniques. 1/3 the memory requirement for better image quality? That means a 12GB Nvidia 5070 could in theory function like a 24~36 GB non 50-series.

But this is a very big IF. I'd love to see it happen but until it's officially said it's a feature, I would assume this is only for future games that specifically want to implement NTC or whatever "RTX Neural Materials" ends up being called.

Plus we'd need to see that it ACTUALLY reduces the required VRAM to a third while providing equal or better quality, and especially whether it could be used in games that did not implement it. If it were an open standard that could be applied across consoles, handhelds, and PCs, then I'd have more faith, but as it is, it's not anything close to a selling point for me, especially not once the OEM variant price increases take this $549.99 card well over $600.
 
Plus we'd need to see that it ACTUALLY reduces the required VRAM to a third while providing equal or better quality, and especially whether it could be used in games that did not implement it. If it were an open standard that could be applied across consoles, handhelds, and PCs, then I'd have more faith, but as it is, it's not anything close to a selling point for me, especially not once the OEM variant price increases take this $549.99 card well over $600.
So I mentioned this elsewhere, but I asked about this and RTX Neural Materials will require content creators to use the new features. And it will only work on the 50-series, as it has some architectural changes that enable the use of shaders and neural code in the same instruction queue or whatever.

That's a pretty massive double whammy. It would be an Nvidia-only feature, and an RTX 50-series-or-later-only feature. But Nvidia has managed to get plenty of games to use DLSS, and that's also RTX-only, with DLSS 3 being 40-series and later, so it could still see a lot of uptake.

Will it reduce VRAM use by 67%? No. There's more in VRAM than just textures. But it should reduce texture VRAM requirements by 67%, and that's the lion's share of VRAM use. The rest is geometry and such, and while that does use memory, I suspect it's probably more like 500~1000 MB tops for a modern game, where textures can easily consume 12GB or more. So 1000MB plus 4GB (AI compressed) would easily fit in 8GB of VRAM.

The demo Nvidia showed really did look impressive. I don't know what game that is, but I want to see more games with that level of visual fidelity. Will it happen? And will the games be worth playing? That's an entirely different set of questions. 🙂
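Writing out the VRAM budget from that last paragraph (the 12 GB texture footprint, the 0.5~1 GB for everything else, and the one-third ratio are all rough estimates from the post, not measured data):

```python
# Back-of-the-envelope VRAM budget using the rough estimates above.
textures_today_gb = 12.0    # texture footprint in a texture-heavy modern game
other_data_gb = 1.0         # geometry, buffers, etc. (upper end of the estimate)
neural_ratio = 1 / 3        # assumed compressed size relative to today

compressed_textures_gb = textures_today_gb * neural_ratio
total_gb = compressed_textures_gb + other_data_gb
print(f"compressed textures: {compressed_textures_gb:.1f} GB")           # 4.0 GB
print(f"estimated total:     {total_gb:.1f} GB (fits in an 8 GB card)")  # 5.0 GB
```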
 
So I mentioned this elsewhere, but I asked about this and RTX Neural Materials will require content creators to use the new features. And it will only work on the 50-series, as it has some architectural changes that enable the use of shaders and neural code in the same instruction queue or whatever.

That's a pretty massive double whammy. It would be an Nvidia-only feature, and an RTX 50-series-or-later-only feature. But Nvidia has managed to get plenty of games to use DLSS, and that's also RTX-only, with DLSS 3 being 40-series and later, so it could still see a lot of uptake.
The real question on uptake would be whether utilizing the new features adds development time or not, because the only reason DLSS FG is being implemented this widely is that it doesn't really require much work. The arbitrary hardware limitations are also something that absolutely wouldn't be happening if Nvidia didn't have the majority of the market and mindshare.
 
Really struggling to see the value of the 5070. For 1440p it is a very pricey option, and for 4K it seems to have a limited future with only 12GB of VRAM. Why, in 2025, would they hobble it like that?
 
Really struggling to see the value of the 5070. For 1440p it is a very pricey option, and for 4K it seems to have a limited future with only 12GB of VRAM. Why, in 2025, would they hobble it like that?
For this gen, it seems like only the 5070 Ti and 5080 offer anything resembling value...
The 5090 is wayyy too expensive, plus it gets the same single 12V-2x6 plug and pulls 575W by spec. Not looking convincing for two grand.
 
That is propaganda from Nvidi0ts. I have been on every RDNA uarch and I had ZERO issues. In fact, I had more issues with my GTX 1080 than with all my AMD GPUs.
Right. All my issues with AMD cards over the years are just propaganda 🙄

I can't remember when I last had issues with my Nvidia cards. Even two months ago, when I wiped my driver because I had monitor issues, it ultimately turned out to be Windows, not the driver, and I overreacted by DDU-wiping it (though that should be done regularly anyway, and I had been lazy on that count since I got this card two years ago, so eh...). Meanwhile, I had AMD cards that were overheating for no reason and had constant driver issues, and I'm far from alone. There is advice out there, even today, not to update AMD drivers every time a new one drops. But of course, your own experience is more true, and it's all just propaganda from "Nvidi0ts"...
 
The real question on uptake would be whether utilizing the new features adds development time or not, because the only reason DLSS FG is being implemented this widely is that it doesn't really require much work. The arbitrary hardware limitations are also something that absolutely wouldn't be happening if Nvidia didn't have the majority of the market and mindshare.
I suspect that using RTX Neural Materials probably wouldn't be too hard. Basically, a lot of games have much higher resolution assets from the artists that need to get downgraded to fit into consumer GPU constraints. So whatever tools are used to do that, potentially you just feed the data into an "RTX Neural Materials" converter and end up with smaller files that have higher quality.

But it still takes time, and the bean counters at game companies are real. Probably Nvidia will basically give free developer resources to some games to try and get the feature used. If it gains traction, maybe more games start doing it as well.

Realistically, every game that wants to do RTX Neural Materials will also need to have Standard Materials available for many years to come, which means more data to store, download, etc. It would be interesting if, instead of HD texture packs, some games started having RTX texture packs or whatever.

I know AMD also talked about some neural compression stuff. It's not clear to me how close that might be to an actual real-world implementation, or if it might be able to do via an open API / library what Nvidia will do as proprietary. Best-case for the long-term would be if we get something like S3TC (later DXTC) as a standard that works on everything, rather than a vendor specific solution that only works on 50-series and later.
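To put a number on the "more data to store, download" point: if a game had to ship a standard texture set alongside a neural set, the install grows by roughly the size of the compressed pack. The 60 GB baseline and one-third ratio below are made-up illustrative figures, not data from any shipping game.

```python
# Storage cost of shipping both texture formats (illustrative numbers only).
standard_pack_gb = 60.0   # assumed size of a game's standard texture data
neural_ratio = 1 / 3      # assumed size of the neural-compressed variant

both_formats_gb = standard_pack_gb * (1 + neural_ratio)
print(f"standard textures only:   {standard_pack_gb:.0f} GB")
print(f"standard + neural pack:   {both_formats_gb:.0f} GB "
      f"(about {neural_ratio:.0%} more to download and store)")
```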
 
I suspect that using RTX Neural Materials probably wouldn't be too hard. Basically, a lot of games have much higher resolution assets from the artists that need to get downgraded to fit into consumer GPU constraints. So whatever tools are used to do that, potentially you just feed the data into an "RTX Neural Materials" converter and end up with smaller files that have higher quality.

But it still takes time, and the bean counters at game companies are real. Probably Nvidia will basically give free developer resources to some games to try and get the feature used. If it gains traction, maybe more games start doing it as well.

Realistically, every game that wants to do RTX Neural Materials will also need to have Standard Materials available for many years to come, which means more data to store, download, etc. It would be interesting if, instead of HD texture packs, some games started having RTX texture packs or whatever.

I know AMD also talked about some neural compression stuff. It's not clear to me how close that might be to an actual real-world implementation, or if it might be able to do via an open API / library what Nvidia will do as proprietary. Best-case for the long-term would be if we get something like S3TC (later DXTC) as a standard that works on everything, rather than a vendor specific solution that only works on 50-series and later.
But frankly, it seems like every gen they are pushing these AI-generated features more and more. Individually each one might not take much effort, but for game developers to cater to all these generations of DLSS and FSR in their code combined, it will likely result in more AAA games with ever more bloated graphics while the game content becomes more and more hollow...