It DEPENDS! Lol, I completely hate the way you guys are framing this discussion; it's hardly any better than a YouTube comment section full of AMD fanboys, sorry to say.
I don't appreciate being called a fanboy, since I'm trying to take a very data-driven approach. I don't consider myself to have a stake in the outcome, since I'm not anywhere near buying either of these products.
When you say "performance," what exactly are we talking about!? RT? AI? Plain old raster?
I'm talking about across-the-board scaling. Absent any data on whether the RTX 5000 series prioritizes one area over another, that's all we can really go on at this stage.
And if we ARE talking plain old raster, are we including 1% lows, frame latency, or power draw?
I'm talking mainly about average fps, because I believe 1% lows are more heavily influenced by factors like CPU bottlenecks.
Let's take the 1080 Ti vs. the 2080 Ti, since you brought it up. The 2080 Ti COMPLETELY DESTROYS the 1080 Ti in RT performance,
Thanks for mentioning it, but I'm well aware of that. I only used that example because virtually all of the games people were actually playing at the time used raster. So, that's clearly what mattered, and yet Nvidia was willing to launch a product with very little improvement on that front. I get why they did it, but it's still worth keeping in mind that they haven't always held themselves to the kind of generational improvement Jarred mentioned.
People don't generally buy new GPUs every generation, comparing the minute performance uplift from two years ago! Most people buy new GPUs more than one generation apart,
Yes, and the article was saying "if you're thinking about an RTX 4090, here's why you should wait". Presumably, the target audience is on a much older GPU. Only a couple of folks in this thread were even contemplating an upgrade from RTX 4090 -> RTX 5090, but I agree that's not typical.
I did give a nod to the idea that you might get an RTX 4090 now, as a hedge against what happens with the timing and pricing of RTX 5090s, and then consider flipping it and going ahead with the upgrade. It's not very cost-efficient, but it is an answer for someone who's anxious about the whole situation and considers a few hundred dollars a worthwhile price for making sure they have a high-end GPU to use and aren't left high and dry, waiting for the RTX 5090 to become available at non-scalper prices.
they expect performance increases, but MORE IMPORTANT are the transformational FEATURE IMPROVEMENTS from gen to gen, like frame generation and DLSS in general! The way techtubers and tech experts on social media talk about GPUs is significantly disconnected from the real reasons why people buy GPUs in the real world!
Understood. However, Nvidia hasn't said what new features it will have, and we know the transistor count will be similar. So, I see no good reason to expect anything more than incremental improvements to the existing feature set. Going back to the example of the GTX 1080 Ti -> RTX 2080 Ti, they increased the transistor count by 55%, which is almost double what we're looking at here.
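(For reference, if I have the die specs right, that's roughly 12 billion transistors on the 1080 Ti's GP102 vs. about 18.6 billion on the 2080 Ti's TU102, so 18.6 / 12.0 ≈ 1.55 — which is where that ~55% figure comes from.)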