News Don't waste money on a high-end graphics card right now — RTX 4090 is a terrible deal

Admin

Administrator
Staff member
  • Like
Reactions: iLoveThe80s

bit_user

Titan
Ambassador
The article said:
the best advice for anyone looking to buy a high-end GPU is to wait.
For how long, though? If demand for an RTX 4090 is this strong, even though it's 2 years old and about to be replaced, what do you think the chances are of even finding an RTX 5090 when it launches? And then how much will it finally cost, if/when you do get one?

The article said:
Hopefully the RTX 5080 comes in closer to $999, though we suspect it's more likely to inherit the $1,199 MSRP of its predecessor
With a bigger die, I doubt it will be cheaper. I think the best we can reasonably hope for is that it launches at the same price point.

As for the stuff about AMD and Intel, the rumors about their performance don't really justify mentioning them in the same article as RTX 5090 & 5080.
 

DS426

Upstanding
May 15, 2024
262
194
360
The 4090 is an especially terrible deal when one considers that the 4080 Super can be had right now at many places for $1,000, at least in the US. Even at $1,200, I'm pretty sure its value is on par with or better than a >$2,000 4090.
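If anyone wants to sanity-check that value claim, here's a quick back-of-the-envelope sketch. The ~1.28x figure for the 4090's 4K lead over the 4080 Super is my assumption, roughly in line with review averages; the exact gap varies by game:

```python
# Rough price/performance comparison (illustrative numbers only).
# Relative 4K performance is normalized to the 4080 Super = 1.00;
# the 1.28x figure for the 4090 is an assumption, not a measured average.
cards = {
    "RTX 4080 Super": {"price": 1000, "rel_perf": 1.00},
    "RTX 4090 (street)": {"price": 2000, "rel_perf": 1.28},
}

for name, c in cards.items():
    per_1000 = c["rel_perf"] / c["price"] * 1000  # performance per $1000 spent
    print(f"{name:18s} ${c['price']:>4}  perf/$1000 = {per_1000:.2f}")
```

Even granting the 4090 its full performance lead, you're paying twice the money for roughly 28% more frames.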

We'll just have to wait and see if scalpers are successful in jacking up the price of the 5090.
 
  • Like
Reactions: iLoveThe80s
Tariffs are going to be quite a wildcard. Even though we can look at what's been said about them, I still think it's too early to try and guess how they will actually play out.
I'm pretty sure it will play out exactly as it did last time: graphics card assembly will be pushed out of China and into other places, so that the cards aren't "made in China" and can thus avoid tariffs. If you look at what actually happened previously, a few companies massively increased prices in the short term and then had to find alternative routes that would avoid the tariffs to remain competitive with other companies.
 
  • Like
Reactions: iLoveThe80s

DS426

Upstanding
May 15, 2024
262
194
360
Tariffs are going to be quite a wildcard. Even though we can look at what's been said about them, I still think it's too early to try and guess how they will actually play out.
Agreed. We don't know what the final rates will be, or exactly which items and components they'll apply to. There will also be differences in short-, medium-, and long-term pricing as supply chains are reworked.

It's not a political matter to ask the philosophical question: did we really think we could go forever on costs that are held artificially low?
 

bit_user

Titan
Ambassador
I'm pretty sure it will play out exactly as it did last time: graphics card assembly will be pushed out of China and into other places, so that the cards aren't "made in China" and can thus avoid tariffs.
Again, not to say what will happen, but what's been proposed is a blanket import tariff. So, only moving them to the US (or maybe Mexico/Canada, if the USMCA holds; edit: as of now, that looks unlikely) would avoid them. Obviously, on-shoring that much manufacturing is virtually impossible, especially in a compressed timeframe, not to mention the promised mass deportations of the very workers who would probably make up most of the factory workforce. While these are problems that can be worked through, sorting out the inefficiencies would take much longer, and short-term costs would probably far exceed the 20% overhead of the tariffs, if not in the medium and longer term as well.
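For a sense of scale, a toy landed-cost sketch (all figures are hypothetical; a tariff is assessed on the declared customs value of the import, not the shelf price, and the margin structure here is invented purely for illustration):

```python
# Toy landed-cost calculation for a 20% blanket import tariff.
# All figures are hypothetical; tariffs apply to declared customs value,
# not to the retail price.
customs_value = 900.0                    # assumed import value of a $1,599 card
retail_margin = 1599.0 - customs_value   # distribution + retail margin, pre-tariff
tariff = customs_value * 0.20

old_retail = customs_value + retail_margin
new_retail = customs_value + tariff + retail_margin
print(f"tariff paid: ${tariff:.0f}")
print(f"old retail:  ${old_retail:.0f}")
print(f"new retail:  ${new_retail:.0f}  (+{tariff / old_retail:.1%} at the shelf)")
```

Under those made-up numbers, a 20% tariff shows up as roughly an 11% shelf-price bump, which is why the relocation chaos could easily cost more than the tariff itself.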

However, let's also not forget the carve-out that was made for certain electronic components, which is why the PC sector didn't feel the brunt of the tariffs last time. Just like back then, I don't think there's the political will to truly impose a 20% blanket import tax.

And with that, I'm just going to sit back and watch how things unfold, like @stonecarver said.

If you look at what actually happened previously, a few companies massively increased prices in the short term and then had to find alternative routes that would avoid the tariffs to remain competitive with other companies.
If you really want to get into it, you ought to look at the tariffs on metals, which did not have any "alternative routes", and see how it affected US companies who were big consumers of those commodities.
 
Last edited:

DavidLejdar

Respectable
Sep 11, 2022
286
179
1,860
It's simpler than that: trade deficits can't be sustained indefinitely. That's not prescribing what to do about them, it's just stating an economic fact.
Mercantilism is dated. Sure, a trade deficit may not sound great. But imports create additional value: a company can produce something with the laptops it bought, without needing the staff it would otherwise take to assemble those laptops itself.

There's also the fact that it may be U.S. shareholders who take most of the profit from a factory in, e.g., Mexico.

And in the case of Mexico: imports to Mexico from the U.S. amount to more than $300 billion yearly, likely paid largely with money earned from exports to the U.S. That's a lot of money that may stop circulating soon.

Btw, in central Europe there used to be some 1,800 customs areas until the Zollverein was created in 1834 (one of the founding members was Prussia*). It boosted the economy, and the states didn't miss out on income, since a lot of that comes from VAT / sales tax here.

*
https://en.wikipedia.org/wiki/Prussia–United_States_relations
 

bit_user

Titan
Ambassador
Mercantilism is dated. Sure, a trade deficit may not sound great.
Capital outflows and inflows need to equalize, at some point.
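That's just the balance-of-payments accounting identity. In the simple textbook convention:

```latex
% Balance-of-payments identity (simple textbook convention): a current-account
% (trade) deficit is financed by an equal net capital inflow.
\[
CA + KA = 0
\qquad\Longrightarrow\qquad
\text{trade deficit} \approx \text{net capital inflow}
\]
```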

But imports create additional value: a company can produce something with the laptops it bought, without needing the staff it would otherwise take to assemble those laptops itself.
Yeah, that's why people buy imports.
 

TeamRed2024

Upstanding
Aug 12, 2024
200
131
260
For how long, though? If demand for an RTX 4090 is this strong, even though it's 2 years old and about to be replaced, what do you think the chances are of even finding an RTX 5090 when it launches? And then how much will it finally cost, if/when you do get one?

Exactly this. My 4090 is selling for $400 more today than it was when I bought it at MSRP.

I'll get a 5090 at MSRP and not before. The good news is there is nothing that I do with my PC that requires a 5090. The 4090 is still a beast card 2 years later... and if I have to wait 6 months to find a 5090 at MSRP... so be it.
 
  • Like
Reactions: valthuer

emitfudd

Distinguished
Apr 9, 2017
573
82
18,990
Exactly this. My 4090 is selling for $400 more today than it was when I bought it at MSRP.

I'll get a 5090 at MSRP and not before. The good news is there is nothing that I do with my PC that requires a 5090. The 4090 is still a beast card 2 years later... and if I have to wait 6 months to find a 5090 at MSRP... so be it.
I have never been able to get a GPU at MSRP. I always end up overpaying because of a claimed shortage or some other reason. The 4090 is still more than enough GPU for any CPU out there. No reason I can see to even consider a 5000 series. 6000, maybe?
 
  • Like
Reactions: TeamRed2024

halfcharlie

Prominent
Dec 21, 2022
27
11
535
For those of us not in the US (where sellers can just charge whatever they want, whether for cars with the dealership mafia, or electronics, or otherwise), MSRP is easy; in stock is the hard part. It's not like the 5090 is for 4090 owners, and the only real advantage it could have is significantly better raytracing performance, like a 100% increase at least; full raytracing is the only thing that pushes the 4090.
 

Mama Changa

Proper
Sep 4, 2024
83
56
110
The 4090 is an especially terrible deal when one considers that the 4080 Super can be had right now at many places for $1,000, at least in the US. Even at $1,200, I'm pretty sure its value is on par with or better than a >$2,000 4090.

We'll just have to wait and see if scalpers are successful in jacking up the price of the 5090.
The 5090 will be a bare minimum of $2K; third parties have already said Nvidia was testing reactions to prices in the $2-2.5K range. And it's even worse news for the 5080, expected to hit at least $1,399. Nvidia is now reportedly telling stores that complain about the huge price increases to market the 5080 and 5090 as professional devices and to take the 5070 Ti instead.

So even if scalpers do jack up the price, a lot of people will be gobsmacked at the 5090 RRP alone. And I'm ignoring Chump's trade war.
 

TeamRed2024

Upstanding
Aug 12, 2024
200
131
260
Have you played games lately?

Have you played games lately at 4K Ultra and wanted to do so at higher framerates like 120/144?

He’s probably like me… playing in 4K Ultra at 60 fps. Nothing pushes my 4090 at all other than the max RT/PT.

Competitive gamer FPS isn’t a requirement for me… I play MMOs… RPGs… ARPGs… and flight/racing sims.

I could honestly skip the 5000 series and not lose any sleep over it… any purchase would be mainly so I can get decent value for my 4090 now rather than 2-3 years from now when it would be like the 3090 is today.

Still want to see benchmarks too. Expecting the same 4090 to 5090 boost that we saw 3090 to 4090.

If it’s a low bump that would be more incentive to pass. There’s nothing I’ve done on this PC so far that has me wishing for a better card.
 

bit_user

Titan
Ambassador
Still want to see benchmarks too. Expecting the same 4090 to 5090 boost that we saw 3090 to 4090.
Have you not read that it's made on virtually the same process node as the 4090 and only 22% bigger? How on earth do you expect them to manage such a feat?

Seriously, the RTX 3090 was made on Samsung's 8nm, which was bad even for a supposed 8nm node. The RTX 4000 series jumped to a tuned version of TSMC N5, called "4N". To put it in more concrete terms, the GA102 had 28.3 billion transistors, while the AD102 has 76.3 billion. That's 2.7 times as many!

By contrast, the RTX 5090 should have maybe upwards of 1.3 times as many as the RTX 4090. I think this is going to be a lot like the GTX 1080 Ti -> RTX 2080 Ti and nothing at all like the increase from RTX 3090 -> RTX 4090. Where did you get the idea that the 5090 would be anything like such an upgrade?
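To put those generational gaps side by side (the GA102 and AD102 counts are published figures; the GB202 number below is just the "upwards of 1.3 times" assumption from above, not a confirmed spec):

```python
# Transistor-count ratios across flagship generations.
# GA102 and AD102 counts are published; GB202 is a rumor-based assumption.
counts_billion = {
    "GA102 (RTX 3090)": 28.3,
    "AD102 (RTX 4090)": 76.3,
    "GB202 (RTX 5090, assumed)": 76.3 * 1.3,
}

gens = list(counts_billion.items())
for (prev_name, prev), (next_name, nxt) in zip(gens, gens[1:]):
    print(f"{prev_name} -> {next_name}: {nxt / prev:.2f}x transistors")
# GA102 -> AD102: ~2.70x. AD102 -> GB202: ~1.30x. Not the same kind of jump.
```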


P.S. I find your username a little ironic for a Nvidia fan, given that Red was an ATI thing. Prior to that, AMD's colors were black, white, and green.

[Image: AMD Athlon XP logo]


Source: https://en.wikichip.org/wiki/File:AMD_Athlon_XP_logo.svg
 
Last edited:

abufrejoval

Reputable
Jun 19, 2020
592
426
5,260
I'd agree only partially with your advice.

First of all: don't buy anything you don't need (or you know you want)!
And try to avoid buying too far ahead for any "future needs".

In both cases you're far too likely to overspend today.

Where I agree is that it might be worth waiting, if your need isn't there yet.
And if those 4090s then turn up used at around €1000, they might be pretty good value.

For me, upgrading from a 3090 to the 4090 was an easy choice, driven by the wider range of variable-precision data types in CUDA for machine-learning experiments, and I needed the RAM for the bigger LLMs.
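For anyone wondering what that looked like in practice: the Ampere-to-Ada jump mostly mattered for the newer low-precision formats (Ada, compute capability 8.9, added FP8 tensor-core support). A minimal sketch, assuming a CUDA build of PyTorch and an Nvidia GPU present:

```python
import torch

# Check which reduced-precision formats the installed GPU generation supports.
# Sketch only; assumes a CUDA build of PyTorch and an Nvidia GPU.
assert torch.cuda.is_available(), "needs a CUDA-capable GPU"
major, minor = torch.cuda.get_device_capability(0)
print(f"compute capability: {major}.{minor}")
print(f"bf16 supported:     {torch.cuda.is_bf16_supported()}")  # Ampere (8.0) and up
print(f"FP8 tensor cores:   {(major, minor) >= (8, 9)}")        # Ada (8.9) and up
```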

That the 4090 was a pretty good gaming GPU as well didn't entirely escape me; I try to keep an eye on secondary use cases. One of my sons was happy to take the 3090 off me a year later at 50% of the price, and no Christmas presents.

The LLM stuff has moved quite beyond what you can reasonably experiment with in a home lab, so it's not very likely an RTX 5090 will even remotely keep pace, and I can't quite fit a Cerebras into my budget.

So will I need a 5090 for gaming? It doesn't seem very likely, at least if it's really just 30% extra performance over the 4090.

I spend most of my working day (and far too much of my leisure time) in front of a computer screen, but very little of that is gaming. So naturally it has to be big, flat, crisp, and easy on the eyes, also by avoiding visual noise (such as reflections).

And that's how I bought my original primary: an LG IPS 4k at 43", non-glare, great for 2D work, and rather affordable at the time (much less than the EIZO EV2411W 1920x1200 it replaced).

When all these new high-refresh, curved, wide, HDR, or even OLED displays came out, it was my kids who chose them; they can afford to use their PCs (my leftovers) primarily for gaming and fun, not work.

Eventually I got tired of their games looking so much better, so I compromised on a Gigabyte high-refresh VA panel with HDR as my new primary, but still 43", flat and 4k, because that still pays for what little fun I get. I run both side-by-side via a KVM, but often keep the older 2nd screen off, unless I want to monitor things there.

Unfortunately, while TrueHD and 3k may look rather good on displays with those native resolutions, on a 4k screen chosen for visual detail those lower resolutions look obviously scaled. So whatever GPU I use, it has to be able to deliver at 4k native.

And the gap from THD or even 3k to 4k is quadratic, and huge. I went through quite a few GPUs trying to fill it, in vain: R9 290X, GTX 980 Ti, GTX 1080 Ti, RTX 2080 Ti, RTX 3090, and eventually RTX 4090.
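"Quadratic" in the literal sense: pixel count scales with both dimensions. A quick tally (I'm reading "3k" as 2560x1440 here, which is an assumption about the shorthand):

```python
# Pixels per frame: the step up to 4k is bigger than the names suggest.
# "3k" is interpreted as 2560x1440, an assumption about the shorthand.
resolutions = {
    "THD (1920x1080)": (1920, 1080),
    "3k  (2560x1440)": (2560, 1440),
    "4k  (3840x2160)": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP  ({px / base:.2f}x THD)")
# 4k pushes 4x the pixels of THD and 2.25x those of 1440p, every single frame.
```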

Ryzens replaced big Xeons with high core counts but slow clocks, so even there I'm at last year's premium now (Ryzen 9 7950X3D), but it just happens that my two favorite games, ARK (both variants) and Flight Simulator (again, 2020 and 2024), suck up hardware without delivering the fluid performance I expect.

ASE (the first ARK, and a pioneer on Unreal 4) takes ages to load, because it uses tens of thousands of individual files and Windows just happens to be really slow at opening them (it starts faster from HDDs on Linux than from NVMe on Windows!), but at least it now delivers on graphics. ASA (the second ARK) uses a few huge files and thus starts much faster, but is a very early Unreal 5 title with huge demands on GPU and CPU power.

And for the flight sims (2020 and 2024 equally), you either run an ultra-wide curved big screen, or a triple setup with two angled side displays, or my preferred variant, a VR setup, because otherwise the immersion just isn't there, even on a big flat screen.

For VR I started with the original Oculus DK1, got a DK2, a CV1, and then an HP Reverb that was finally good enough in terms of screen quality (2k x 2k per eye at 90 Hz), but the VR experience inside FS2020 was really, really bad, with the world just shuddering and stuttering, when that's exactly what can never happen in the real world, at least until the point of impact. And FS2024 has improved nothing; a Quest 3 with a very similar resolution only got rid of the cable (and escaped the dreadful end of Microsoft's AR support).

I keep an RTX 4070 in my 24x7 system, on the same KVM and screens, and I was both delighted and annoyed to see how it kept creeping up in what it could deliver, thanks to ever better DLSS support and general driver improvements in all "normal" games.

I'd dare say that the 4070 at 4k is often as good as the 4090 was at launch, and it's two years of software evolution (games and drivers) that made all the difference!

And it's there, in the DLSS arena, where I'm not seeing anything new coming with the 5000 series. Perhaps Nvidia is really just keeping things under wraps, but if a 30% uplift for every class at 20% extra cost is all it delivers, perhaps 4090s will last for a very long time yet.

Well, at least in theory. Should it crash and burn (still a distinct possibility; after all, only the PNY 4070 still has a normal 8-pin connector), I'm much more worried about having to pay for a replacement at current prices.

Quite honestly, I expect nothing but disappointment from the 5000 series, because where it improves beyond the 4090, it will very likely be of little practical benefit, and for the many other classes the spread in potential performance will be extremely wide. In some cases a much lower-ranked GPU might already deliver the very best performance and quality at 4k that a game can actually offer; in other cases all that extra hardware won't help, or it's really the quality-enhancement features like DLSS (and their AMD/Intel/Sony variants) that make or break the gamer experience.

As for Intel's GPUs: I tried an A770 as a replacement for a GTX 1060, but it refused to work with DP and/or the KVM, so I went with the 4070. I then tried again more than a year later with a mobile A770, because it came basically for free with a Serpent Canyon NUC. That was OK for the price, but I'm happy to have passed it on, and its idle power disqualified it from the µ-server use I really bought it for.

AMD's Radeons: I started with ATI via an 8514/A clone in 1990, the very first graphics accelerator they ever produced, and I stayed with them until the R9 290X (actually, I did try an R9 Nano with HBM, too, but returned it). On the GPU side, AMD has constantly managed to annoy me since, mostly via bad driver support (e.g. on Linux), early retirements, or by refusing to work on Windows Server editions.

I've been much more of a fan of their APUs, even if those share quite a few of the driver complaints.

Ultimately it was CUDA which got me tied into team green for dGPUs... somewhat reluctantly at first, but hard to dislodge now.

For me, the PNY dual-fan Verto 4070, with its strictly dual-slot compact format and a single 8-pin power connector, is the card to beat for a console-class gamer PC, which I can build on a mobile-on-desktop Mini-ITX board. If AMD or Intel can match that power at a significantly lower price, or exceed the performance at the same price and similar wattage, they might be worth a thought.

The only thing "wrong" with the 4090 is its current price when bought new; if you need one now, the only reasonable alternative may be to get one 2nd hand.

I bought my 3090 shortly before the 4090 came out, and even if it wasn't cheap by any means (also €1500, I believe), it let me do my job better than the 2080 Ti, which didn't support many of the critical ML data types. Staying ahead got me paid, and that then allowed me to upgrade to the 4090 to do the job even better.

Using consumer GPUs for serious AI work seems a window that's pretty much shut these days; Nvidia put huge resources into making that happen.

It might eat into the success of the 5090, because that secondary market for the 3090 and 4090 high-end consumer GPUs might have been bigger than they ever realized.

High-end consumer GPUs rode really high on an ML wave for a few years, but AI workloads have moved far beyond them at the upper end, and games are really still tied to console-class hardware for mass appeal: more of a [vendor] player market than a [consumer] player market, I'd estimate, if I were any good at predicting the future.
 
Last edited:

Elusive Ruse

Estimable
Nov 17, 2022
456
595
3,220
Have you not read that it's made on virtually the same process node as the 4090 and only 22% bigger? How on earth do you expect them to manage such a feat?

Seriously, the RTX 3090 was made on Samsung's 8nm, which was bad even for a supposed 8nm node. The RTX 4000 series jumped to a tuned version of TSMC N5, called "4N". To put it in more concrete terms, the GA102 had 28.3 billion transistors, while the AD102 has 76.3 billion. That's 2.7 times as many!

By contrast, the RTX 5090 should have maybe upwards of 1.3 times as many as the RTX 4090. I think this is going to be a lot like the GTX 1080 Ti -> RTX 2080 Ti and nothing at all like the increase from RTX 3090 -> RTX 4090. Where did you even get the idea that the 5090 would be anything like such an upgrade?


P.S. I find your username a little ironic for a Nvidia fan, given that Red was an ATI thing. Prior to that, AMD's colors were black, white, and green.
I think the biggest chunk of performance uplift will come from GDDR7.
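The raw numbers would back that up: bandwidth is just bus width times data rate. A quick comparison (the 4090's figures are official; the 5090's 512-bit bus and 28 Gbps GDDR7 are still rumors at this point):

```python
# Memory bandwidth = bus width (bits) / 8 * data rate (Gbps) -> GB/s.
# RTX 4090 figures are official; the RTX 5090 configuration is rumored.
def bandwidth_gb_s(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

bw_4090 = bandwidth_gb_s(384, 21.0)  # GDDR6X
bw_5090 = bandwidth_gb_s(512, 28.0)  # GDDR7 (rumored)
print(f"RTX 4090:           {bw_4090:.0f} GB/s")
print(f"RTX 5090 (rumored): {bw_5090:.0f} GB/s (+{bw_5090 / bw_4090 - 1:.0%})")
```

That would be close to +78% raw bandwidth, a far bigger jump than the rumored compute increase.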
 
  • Like
Reactions: valthuer