Don't waste money on a high-end graphics card right now — RTX 4090 is a terrible deal

This is basically my position. There are lots of unknowns about the RTX 5090, but you can be reasonably sure of strong demand at launch and that it will be even more expensive. In terms of perf/$, I think it should be an improvement, but maybe not by a lot.

So, if the RTX 4090 is already at the top of your price range or you want something soon and aren't willing to pay ridiculous scalper premiums, then I think there's a decent argument to be made for the RTX 4090. When a good opportunity to upgrade comes along, I assume they will hold their resale value reasonably well, especially if I'm right about the RTX 5090 not offering much better perf/$.
I agree with everything stated here except the claim that the 5090 will not provide a large leap over the 4090. First of all, the 4090 was indeed a massive leap over the 3090 in literally everything. Second of all, and again, IT DEPENDS ON WHAT KIND OF PERFORMANCE we're talking about. I think for RT and AI we will indeed see as big a generational leap as the 4090, maybe not quite as much for applications outside of that. But still, even there, the 5090 will have many more, and faster, CUDA cores and SMs than the 4090.
 
It DEPENDS! Lol. I completely hate the way you guys are framing this discussion; this is hardly any better than a YouTube comment section full of AMD fanboys, sorry to say.
I don't appreciate being called a fanboy, since I'm trying to take a very data-driven approach. I don't consider myself to have a stake in the outcome, since I'm not anywhere near buying either of these products.

When you say "performance" what exactly are we talking about!? RT? AI? Plain old raster?
I'm talking about across-the-board scaling. Absent any data on the RTX 5000 series prioritizing one area vs. another, that's all we can do at this stage.

And if we ARE talking plain old raster, are we including 1% lows, frame latency, or power draw?
I'm talking mainly about average fps, because I believe 1% lows are influenced more by factors like CPU bottlenecks.

Let's take the 1080 Ti vs. the 2080 Ti, since you brought it up. The 2080 Ti COMPLETELY DESTROYS the 1080 Ti in RT performance,
Thanks for mentioning, but I'm well aware of that. I only used it as an example, because virtually all of the games people were actually playing used raster. So, that's clearly what mattered, and yet Nvidia was willing to launch a product with very little improvement on that front. I get why they did it, but it's still worth keeping in mind that they haven't always held themselves to the kind of generational improvement Jarred mentioned.

People don't generally buy new GPUs every generation, comparing the minute performance uplift from two years ago! Most people buy new GPUs more than one generation apart,
Yes, and the article was saying "if you're thinking about an RTX 4090, here's why you should wait". Presumably, the target audience is on a much older GPU. Only a couple of folks in this thread were even contemplating an upgrade from RTX 4090 -> RTX 5090, but I agree that's not typical.

I did give a nod to the idea that you might get a RTX 4090 now, as a hedge against what happens with timing & pricing of RTX 5090's, but then consider flipping it and going ahead with the upgrade. It's not very cost-efficient, but it is an answer to someone who's anxious about the whole situation and considers a few hundred $ to be worthwhile for making sure they have a high-end GPU to use and aren't left high and dry, waiting for the RTX 5090 to become available at non-scalper prices.

they expect performance increases but MORE IMPORTANT are the transformational FEATURE IMPROVEMENTS from gen-to-gen, like frame generation and DLSS in general! The way techtubers and tech experts on social media talk about GPUs is significantly disconnected from the real reasons why people buy GPUs in the real world!
Understood. However, Nvidia hasn't said what new features it will have and we know the transistor count will be similar. So, I see no good reason to expect anything more than incremental improvements to the existing feature set. Going back to the example of GTX 1080 Ti -> RTX 2080 Ti, they increased the transistor count by 55%, which is almost double what we're looking at here.
 
the 4090 was indeed a massive leap over the 3090 in literally everything.
I already addressed this point. The RTX 4090 has 2.6 times as many transistors (i.e. a 160% increase) as the RTX 3090. The information we have on the RTX 5090 is that it will have only about 1.29 times as many transistors (i.e. a 29% increase) as the 4090. In terms of power efficiency, going from TSMC 4N to 4NP should be a very small gain, very much unlike when Nvidia moved from Samsung 8nm to TSMC 4N.
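To make the ratio-vs-percentage comparison easier to eyeball, here's a trivial Python sketch using the figures quoted in this thread. Note that the ~1.29x number for the 5090 is still just a rumor, and the 2.6x and 55% figures are approximate rather than official specs.

```python
# Quick sketch: converting "X times as many transistors" into a % increase.
# The 2.6x (3090 -> 4090) and ~1.29x (4090 -> 5090, rumored) figures are the
# approximate numbers quoted in this thread, not official specs.

def pct_increase(ratio: float) -> float:
    """Turn a 'times as many' ratio into a percentage increase."""
    return (ratio - 1.0) * 100.0

print(f"RTX 3090 -> 4090: +{pct_increase(2.6):.0f}% transistors")          # ~160%
print(f"RTX 4090 -> 5090: +{pct_increase(1.29):.0f}% transistors")         # ~29% (rumored)
print(f"GTX 1080 Ti -> RTX 2080 Ti: +{pct_increase(1.55):.0f}% transistors")  # ~55%
```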

The data just doesn't give us any reason to think this generational uplift will be like the previous one. At least, not at the 5090 tier.
 
The argument stated here makes perfect sense for precisely everyone EXCEPT the 4090 buyer! It's EXACTLY THOSE PEOPLE WHO WANT A 4090 RIGHT NOW for whom the argument in this article doesn't really hold, because the only GPU that will beat it will likely be the 5090, which will probably cost more and potentially be less available to purchase when it first arrives.

I live in Greece.

Right now, anyone in my country wishing to purchase a 4090 will have to pay between 2,500€ and 3,700€ (depending on the version of the 4090 they want to obtain).

And that's for a 2-year-old GPU that will most likely be excluded from Nvidia's DLSS 4.0, a.k.a. the Holy Grail of Graphics Rendering.

The way I see it, it makes perfect sense for any potential 4090 buyer to wait for the 5090: a card that will be equally ridiculous in price, but with close to a 50% performance increase, 8 GB more VRAM, and full support for DLSS 4.0.
 
Remind me of the RTX 2080 Ti's improvement vs. GTX 1080 Ti, again?
At launch, running traditional rasterization games (because DXR games weren't really available), my figures showed the 2080 Ti delivering 33% higher performance at 4K ultra than the GTX 1080 Ti. If you drop to 1440p, the gap shrinks to 27%, and at 1080p it was only 18%. But that's expected for the top GPUs as you start hitting CPU limitations. (I only had an i7-8700K at the time, which was the fastest CPU of the era.)

Now, more recent testing is far more favorable, even just sticking to pure rasterization games. The last time I tested both GPUs (earlier this year), the RTX 2080 Ti was 49% faster at 4K ultra, 52% faster at 1440p ultra, 46% faster at 1080p ultra, and 40% faster at 1080p medium. And when I retest both with my 2024 test suite, I expect we'll see the gaps increase yet again.

The problem people really had with the RTX 2080 Ti wasn't that it delivered a mediocre performance upgrade over the 1080 Ti, it was the fact that it cost $1200 instead of $700. And many overlooked the fact that it had a massive 754 mm^2 die on TSMC 12nm (basically tweaked 16nm) while the Pascal GP102 was 471 mm^2 on TSMC 16nm. I'm not saying it should have been $1200, but 1080 Ti was the last time Nvidia had a halo GPU priced aggressively.
 
The way I see it, it makes perfect sense for any potential 4090 buyer to wait for the 5090: a card that will be equally ridiculous in price, but with close to a 50% performance increase, 8 GB more VRAM, and full support for DLSS 4.0.
What if it's only 30% faster, but costs 25% more, and DLSS 4.0 is fully supported on the RTX 4090?

Also, if the price of the RTX 4090 seems ridiculous to you, then you're probably not buying either and therefore you're not who it's really addressed to.
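For what it's worth, here's the perf/$ arithmetic behind that what-if as a rough Python sketch. The 30%-faster / 25%-more-expensive figures are purely hypothetical numbers from this post, and the 50% figure is just the rumor mentioned earlier in the thread.

```python
# Rough perf-per-dollar sketch for the hypothetical above. None of these
# numbers are confirmed; they're the what-if figures from this post and
# the ~50% rumor mentioned earlier in the thread.

def perf_per_dollar_change(perf_ratio: float, price_ratio: float) -> float:
    """% change in perf/$ when performance and price both scale."""
    return (perf_ratio / price_ratio - 1.0) * 100.0

print(f"30% faster, 25% pricier: {perf_per_dollar_change(1.30, 1.25):+.0f}% perf/$")  # ~+4%
print(f"50% faster, 25% pricier: {perf_per_dollar_change(1.50, 1.25):+.0f}% perf/$")  # ~+20%
```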
 
The problem people really had with the RTX 2080 Ti wasn't that it delivered a mediocre performance upgrade over the 1080 Ti, it was the fact that it cost $1200 instead of $650.
The GTX 1080 Ti launched at $699, not $650. The Founders Edition of the 2080 Ti was $1200, but the 3rd party MSRP was only $999.

But yeah, cost-wise it was a pretty big shock to get Titan-level pricing on a x80 Ti card.

And many overlooked the fact that it had a massive 754 mm^2 die on a newer Samsung 8N (basically refined 10nm) while the Pascal GP102 was 471 mm^2 on TSMC 12nm.
HUH? Pascal was made on TSMC 16FF. Turing was made on TSMC 12FFN.
 
What if it's only 30% faster, but costs 25% more, and DLSS 4.0 is fully supported on the RTX 4090?

Also, if the price of the RTX 4090 seems ridiculous to you, then you're probably not buying either and therefore you're not who it's really addressed to.
I already own a 4090. I'm just trying to put myself in the position of someone who doesn't have one but is contemplating the purchase at this particular moment, and I end up concluding that there's just no way I would go for it.

Also, I think it's highly unlikely the 5090 will end up costing more than the ridiculous prices we're currently getting for the 4090.

Maybe stock will run dry at first, but I doubt that situation will last more than a few weeks.

Eventually, we'll get a steady supply of 5090s, and prices will inevitably smooth out.

Just like what happened in my country back in spring/summer of 2023, when you could buy a 4090 for almost 1,700€, but a 3090/Ti would still cost 2,500€-3,000€, mainly because it had essentially become a collector's item.

And even in the worst-case scenario where Nvidia gives us a 3,000€ MSRP (which they won't, but even if they do), the 5090 will still make way more sense than buying a 4090: for the same price and more VRAM, I'd take the 5090 every day of the week and twice on Sunday, even for just a 10% performance increase.
 
Also, I think it's highly unlikely the 5090 will end up costing more than the ridiculous prices we're currently getting for the 4090.
I'm not going to speculate about that, because I don't have any information better than the rumors already mentioned in this thread.

Eventually, we'll get a steady supply of 5090s, and prices will inevitably smooth out.
RTX 4090 pricing has not been a totally smooth ride. Here's a model that launched at about $1720. Nine months later, it dropped as low as about $1620, but then promptly went back up to $1700 and then $1800. Even now, after two whole years, it's selling for almost $100 above where it launched.
 
I'm pretty satisfied with just gaming on my 6950 XT for the time being. However, I'll probably wait until after next gen (i.e. 2027) to upgrade to whatever top-tier viable option(s) are available at that point.
 
Well, I'm not risking the likely tariff price hike. I just purchased a complete tower with an R9 7900X and 4080 Super. Then I'll wait for the cost of everything to go up before selling my 3070 and 4070 laptops. The increase in performance over my laptop will be way more than enough to hold me over until (hopefully) things become affordable again.
 
The GTX 1080 Ti launched at $699, not $650. The Founders Edition of the 2080 Ti was $1200, but the 3rd party MSRP was only $999.

But yeah, cost-wise it was a pretty big shock to get Titan-level pricing on a x80 Ti card.


HUH? Pascal was made on TSMC 16FF. Turing was made on TSMC 12FFN.
Sorry, was pulling things from memory and made some mistakes.

980 Ti was $649 launch. 12nm and 16nm TSMC were basically the same, just a rebranding. And then I got Ampere and Turing processes confused. I'm getting old and senile, clearly! LOL
 