News Nvidia's GeForce RTX 5070 at $549 — How does it stack up to the previous generation RTX 4070?

Be very careful when comparing RTX 50 to 40. RTX 40 doesn't have DLSS 4. I wanna see a true, genuine, 1 to 1, apples to apples, side by side comparison, both on the exact same hardware, both with DLSS turned off, both running the exact same race, and see what happens.
It’ll be a little faster than a 4070 that’s what’ll happen. No need to wait and see.

They make excellent GPUs but no way they’re getting that much more raw performance out of so much less hardware in one generation.
 
Be very careful when comparing RTX 50 to 40. RTX 40 doesn't have DLSS 4. I wanna see a true, genuine, 1 to 1, apples to apples, side by side comparison, both on the exact same hardware, both with DLSS turned off, both running the exact same race, and see what happens.
That may never come to pass. It seems to me that the 50 series has further dedicated tech intended to make better use of DLSS, and it wouldn't be fair if that tech was not allowed to operate. It's becoming a murky situation where these cards may depend a whole lot on AI-driven processing on top of straight instruction execution. I fear that, in addition to nerfing hardware on lower models, the software could be heavily modulated to weaken or strengthen cards in the lineup.
 
That may never come to pass. It seems to me that the 50 series has further dedicated tech intended to make better use of DLSS, and it wouldn't be fair if that tech was not allowed to operate. It's becoming a murky situation where these cards may depend a whole lot on AI-driven processing on top of straight instruction execution. I fear that, in addition to nerfing hardware on lower models, the software could be heavily modulated to weaken or strengthen cards in the lineup.
While I agree on things getting murky... even now in reviews it's pretty standard to test with and without DLSS/frame gen running so people can see all 3+ sets of numbers. I know gamers who refuse to use frame gen or DLSS, swearing it feels laggy, and some who won't touch DLSS for visual reasons, preferring to game only at native resolution. Personally I don't tend to notice DLSS/frame gen lag in most scenarios, but there are a few where I pick up on it slightly, so it's fair to request benchmarks without the tech enabled, and I don't see that changing anytime soon.
 
It’ll be a little faster than a 4070 that’s what’ll happen. No need to wait and see.

They make excellent GPUs but no way they’re getting that much more raw performance out of so much less hardware in one generation.
That's kinda the point. Nvidia is basically lying. In a real apples to apples test, the 5070 will not beat the 4090. The 5080 might. Imagine I come out with a test... a 1050Ti vs a 5090... and say my 1050Ti achieved a higher FPS than the 5090.

The 5090 test: Avatar
The 1050Ti test: Terraria

"My 1050Ti got a higher framerate than your 5090"

It's basically lying, which is exactly what Nvidia did when they said the 5070 has the performance of a 4090. Imagine someone runs that comparison on a game with no DLSS support at all, the real results come out, and people start calling it false advertising. I get that it's marketing for Nvidia, but really, it's lying. Put DLSS 4 on the 4090 and see how the 5070 does. Make it a true, 1 to 1, side by side, apples to apples test. All software and hardware the same. Let's keep it fair and honest, come on now lol
 
The 5070 is using 28 Gbps memory per MSI, and the Nvidia spec page shows 672 GB/s of memory bandwidth, which confirms that.
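As a quick sanity check on that figure, here's a minimal sketch (assuming the announced 192-bit bus on the 5070; the 4070 line is its published 21 Gbps GDDR6X spec, included only for comparison):

```python
# Peak memory bandwidth from per-pin data rate and bus width (a rough sanity check,
# not a measurement): GB/s = Gbps per pin * bus width in bits / 8 bits per byte.

def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(memory_bandwidth_gbs(28, 192))  # 672.0 GB/s -> matches the 5070 spec page
print(memory_bandwidth_gbs(21, 192))  # 504.0 GB/s -> the 4070, for comparison
```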

This is the sort of official nonsense slide they're sharing right now (slide from Wccftech):
[Nvidia's performance comparison slide]

The closest thing to real-world performance there is Far Cry 6, and it also has RT enabled. The indication is that 50-series RT is superior to 40-series RT, and that looks to be somewhere around a 30% improvement. The ones with big improvements are due to the new frame generation.

I'm sure these cards will be an improvement across the board, but I'm sick and tired of companies flagrantly misrepresenting their products.
 
It’ll be a little faster than a 4070 that’s what’ll happen. No need to wait and see.

They make excellent GPUs but no way they’re getting that much more raw performance out of so much less hardware in one generation.
The 4070 Ti is genuinely at a similar gaming performance level to the 3090/3090 Ti, though. Yes, without DLSS or Frame Generation. So it absolutely is possible for the 5070 to be more than "slightly faster" than the 4070, or even beat the 4080. Will it beat the 4090? Who knows. Time will tell I guess.
 
The multi-frame-generation does NOT generate frames in between two processed frames! Instead, it generates them after ONE. So don’t expect latency from this tech! This is a very different approach than the 40 series frame generation! So if you can hit 120fps, this will let you hit an astronomical 480fps. Will the generated frames look good though? That’s what we’ll have to wait and find out about.
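For the arithmetic behind that 120 → 480 claim, here's a minimal sketch (it assumes "4x" means three generated frames per rendered frame, as Nvidia describes it, and ignores the overhead of generating frames, which lowers the rendered rate somewhat):

```python
# Displayed frame rate when frame generation inserts extra frames alongside rendered ones.

def displayed_fps(rendered_fps: float, generated_per_rendered: int) -> float:
    # Every rendered frame is shown along with its generated companions.
    return rendered_fps * (1 + generated_per_rendered)

print(displayed_fps(120, 1))  # 240.0 -> 40-series style 2x frame generation
print(displayed_fps(120, 3))  # 480.0 -> DLSS 4 "4x" multi frame generation, as in the post above
```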
 
While I agree on things getting murky... even now in reviews it's pretty standard to test with and without DLSS/frame gen running so people can see all 3+ sets of numbers. I know gamers who refuse to use frame gen or DLSS, swearing it feels laggy, and some who won't touch DLSS for visual reasons, preferring to game only at native resolution. Personally I don't tend to notice DLSS/frame gen lag in most scenarios, but there are a few where I pick up on it slightly, so it's fair to request benchmarks without the tech enabled, and I don't see that changing anytime soon.
Yes, but now Nvidia is telling us it's all DLSS or nothing. They've put all of their chips on it this card generation, and probably forever, since they're handing the ball to "AI". It's the result that counts I guess, no matter how you got there. If an electric car can beat a petrol car in a race, it doesn't matter that it worked a fraction as hard.
 
Be very careful when comparing RTX 50 to 40. RTX 40 doesn't have DLSS 4. I wanna see a true, genuine, 1 to 1, apples to apples, side by side comparison, both on the exact same hardware, both with DLSS turned off, both running the exact same race, and see what happens.
It looks like we might have an apples to apples comparison in the Nvidia charts, but only with A Plague Tale. The note says that game only supports DLSS 3, so I presume that means that both cards are running that game with DLSS 3 only. In that case, it looks like the 50 series is about 30% better than the 40 series.

Other than that, I agree. I don't really care about MFG until I've tried it to see how the latency feels. I'm not a fan of FG.
 
The multi-frame-generation does NOT generate frames in between two processed frames! Instead, it generates them after ONE. So don’t expect latency from this tech! This is a very different approach than the 40 series frame generation! So if you can hit 120fps, this will let you hit an astronomical 480fps. Will the generated frames look good though? That’s what we’ll have to wait and find out about.
But can it actually improve input lag?

Most people I know are trying to get to either 60, 120, or 144 with the highest graphics settings they can manage. If they're quad-pumping a 36 fps input into 144 fps output, is that actually gonna feel good?
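To put rough numbers on that question, here's a minimal sketch (it only looks at frame time and ignores engine, driver, and display latency): the game still samples input and reacts at the rendered rate, so responsiveness follows 36 fps even if the screen shows 144 fps.

```python
# Frame-time arithmetic for the "36 fps quad-pumped to 144 fps" scenario.
# Simplification: input responsiveness tracks the rendered frame time, not the displayed one.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

print(f"rendered 36 fps   -> {frame_time_ms(36):.1f} ms per game update")   # ~27.8 ms
print(f"displayed 144 fps -> {frame_time_ms(144):.1f} ms per shown frame")  # ~6.9 ms
print(f"native 144 fps    -> {frame_time_ms(144):.1f} ms for both")         # what it would feel like without frame gen
```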
 
The multi-frame-generation does NOT generate frames in between two processed frames! Instead, it generates them after ONE. So don’t expect latency from this tech! This is a very different approach than the 40 series frame generation! So if you can hit 120fps, this will let you hit an astronomical 480fps. Will the generated frames look good though? That’s what we’ll have to wait and find out about.
No, this is incorrect. Listen to this:
https://www.youtube.com/watch?v=qQn3bsPNTyI&t=127s

"These software and hardware innovations enable DLSS 4 on RTX 50-series GPUs to generate up to three additional frames between traditionally rendered frames."

I have seen and heard nothing that implies multi frame gen will use frame projection. But the latency penalty is the same as with regular framegen. You render two frames, and generate stuff in between... now with three frames instead of one.

This is also why you'll be able to use the Nvidia App to override regular framegen to multi-framegen on RTX 50-series. They're doing the same basic work, just with 1, 2, or 3 intermediate frames now. Smoke and mirrors!
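To make that concrete, here's a minimal sketch of the interpolation timing model described above (my illustration, not Nvidia's implementation): nothing between frame N and N+1 can be shown until frame N+1 has been rendered, so at least one rendered frame is held back whether 1 or 3 frames are inserted.

```python
# Presentation timestamps when interpolating between two already-rendered frames.

def presentation_times(t_prev_ms: float, t_next_ms: float, intermediates: int) -> list[float]:
    # Evenly spaced generated frames, followed by the real next frame.
    step = (t_next_ms - t_prev_ms) / (intermediates + 1)
    return [t_prev_ms + step * i for i in range(1, intermediates + 2)]

# Two rendered frames 16.7 ms apart (a 60 fps base rate):
print(presentation_times(0.0, 16.7, 1))  # regular frame gen:  [8.35, 16.7]
print(presentation_times(0.0, 16.7, 3))  # 4x multi frame gen: [4.175, 8.35, 12.525, 16.7]
# Either way, the frame at t=16.7 must exist before anything after t=0 can be shown.
```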

UPDATE: Nope, I was wrong. DLSS 4 multi frame generation is apparently using frame projection technology!
 
What a joke that the 5070 is comparable to a 4090.
We all knew AI was going to be big for the 50 series, but I dunno. With AI tech as the biggest selling point, rather than hardware improvements, this generation is going to be bad... OK, "bad" might be a poor choice of wording, but I don't think it'll live up to NVIDIA's expectations.
 
"5070 = 4090 if you use all these features that will make game look worse than if you just used a 4090"
ray tracing was the worst thing to happen to tech as it muddies actual performance as they have all these hoops to jump throguh now.
dlss, framegen, etc etc.

In the rasterization era it was just "this is what you get, no hoops".
Unfortunately, this is where we are.
We will see when true benchmarking hits. I am curious what all of this hoopla actually does.
I need to upgrade my 6700K and I am going to. I was holding out on Intel to see if they can "BIOS-fix" the Core Ultra series, and the 5070 was on my list, but I may just buy a B580 or B770 and call it a day.
 
The 5070 having "4090 levels of performance" is straight-away dishonest. Have some sense of modesty and say "4080 levels of performance", as that is still a good performance and value gain -- if true -- and it probably does "beat" the 4080 in the majority of games when using DLSS 4 vs. DLSS 3.5 (in fps terms, not so much input lag, which is becoming a more relevant metric in many cases).

It's easy to see from the spec sheet that the 5070 would run close to the 4070 when not using DLSS, perhaps with slightly larger gains when both use the same DLSS version like 3.5, even with architecture improvements. For one, going from TSMC 4N ("5" nm Gen 2) to 4NP ("5" nm Gen 3) doesn't provide a big gain.
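As a back-of-the-envelope illustration of that spec-sheet reading, here's a minimal sketch (the core counts and boost clocks are announced/approximate figures, so treat them as assumptions rather than measured data): raw FP32 throughput scales with CUDA core count times clock, and on that metric the 5070's uplift over the 4070 is small next to the gap to a 4090.

```python
# Rough peak-FP32 comparison from spec-sheet numbers (announced/approximate, not measured).

def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    # Peak FP32 TFLOPS = cores * 2 ops per clock (FMA) * clock in GHz / 1000.
    return cuda_cores * 2 * boost_ghz / 1000

rtx_4070 = fp32_tflops(5888, 2.48)    # ~29 TFLOPS
rtx_5070 = fp32_tflops(6144, 2.51)    # ~31 TFLOPS
rtx_4090 = fp32_tflops(16384, 2.52)   # ~83 TFLOPS

print(f"5070 vs 4070: {rtx_5070 / rtx_4070:.2f}x")  # a modest on-paper uplift
print(f"5070 vs 4090: {rtx_5070 / rtx_4090:.2f}x")  # nowhere near parity without DLSS doing the lifting
```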