Nvidia GeForce RTX 4090 Review: Queen of the Castle

Admin
Our testing of the GeForce RTX 4090 proves it's the undisputed performance leader among current GPUs, beating previous-generation cards by 50% or more. DLSS 3 should further extend that lead, and upgraded ray tracing hardware means you can finally max out the settings in most games. Just mind the sticker shock, as the card doesn't come cheap.

Nvidia GeForce RTX 4090 Review: Queen of the Castle: Read more
 
I'm still on a 3090, but on my 165Hz 1440p display, so it maxes most things just fine. I think I'm going to wait for the 50-series GPUs. I know this is a major bump, but dang, it's expensive! I simply can't afford to be making these kinds of investments in depreciating assets for FUN.
You could still possibly get $800 for the 3090. Then it's "only" $800 to upgrade! LOL. Of course, if you sell on eBay, it's $800 minus the ~15% fee.
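For what it's worth, the napkin math on that (assuming the 4090 FE's official $1,599 MSRP and the ~15% eBay fee mentioned above; a rough sketch, not exact fee accounting):

```python
# Rough upgrade-cost math: sell the 3090, put the proceeds toward a 4090.
# Uses the 4090 FE's official $1,599 MSRP and the ~15% eBay fee cited above.
RTX_4090_MSRP = 1599.00
SALE_PRICE = 800.00
EBAY_FEE = 0.15  # approximate final-value fee

net_proceeds = SALE_PRICE * (1 - EBAY_FEE)   # $680 after fees
upgrade_cost = RTX_4090_MSRP - net_proceeds  # $919 out of pocket

print(f"Net from eBay sale: ${net_proceeds:,.2f}")
print(f"Effective upgrade cost: ${upgrade_cost:,.2f}")
```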
 

kiniku
A review like this, comparing the 4090 to an expensive sports car we should be in awe and envy of, is a bit misleading. PC Gaming systems don't equate to racing on the track or even the freeway. But the way this review is worded, if you don't buy this GPU, anything "less" is a compromise. That couldn't be further from the truth. People with "big pockets" aren't fools either, except for maybe the few readers here who have convinced themselves, and posted, that they need one or who spend everything they make on their gaming PCs. Most gamers don't want or need a 450-watt-sucking, three-slot space heater to enjoy an immersive, solid 3D experience.
 

spongiemaster
Most gamers don't want or need a 450-watt-sucking, three-slot space heater to enjoy an immersive, solid 3D experience.
Congrats on stating the obvious. Most gamers have no need for a halo GPU that can be CPU limited sometimes even at 4K. A 50% performance improvement while using the same power as a 3090 Ti shows outstanding efficiency gains. Early reports are showing excellent undervolting results: a 150W decrease with only a 5% loss in performance.
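If those early reports hold up, the efficiency math is striking (a quick sketch; the 450W figure is the 4090's rated board power, and the 150W/5% numbers are the early undervolting reports, not my own testing):

```python
# Perf-per-watt change from the reported undervolt: -150W for -5% performance.
# 450W is the 4090's rated board power; the undervolt figures are early
# community reports, not verified measurements.
STOCK_POWER = 450.0  # watts
STOCK_PERF = 1.00    # normalized performance

uv_power = STOCK_POWER - 150.0  # 300 W
uv_perf = STOCK_PERF * 0.95     # 95% of stock performance

efficiency_gain = (uv_perf / uv_power) / (STOCK_PERF / STOCK_POWER)
print(f"Undervolted perf/W vs. stock: {efficiency_gain:.2f}x")  # ~1.42x
```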

Any chance we could get some 720p benchmarks?
 

LastStanding
the RTX 4090 still comes with three DisplayPort 1.4a outputs

the PCIe x16 slot sticks with the PCIe 4.0 standard rather than upgrading to PCIe 5.0.

These missing features are selling points now, especially knowing Nvidia's rival(s?) support the updated standards, so, IMO, this should have been included as a "con" too.

Another thing: why would enthusiasts only value "average metrics" when the average barely tells the complete story?! It doesn't show a program's stability, any frame-pacing/hitching issues, etc., so a VERY big oversight here, IMO.

What I also find weird is the DLSS benchmarks. Why champion the extra fps buuuut... never, EVER, any mention of DLSS's awful included sharpening pass?! 😏 What's the sense of having faster fps when the results show the imagery smeared, ghosting, and/or artifacted to hades? 🤔
 
I am NOT a halo-card kinda guy. That said: alright... I'm impressed. That's quite a gen-on-gen jump.

If the lower-tier cards follow suit with proportionally similar leaps, then AMD is really going to have their work cut out for them.
I'm very curious to see how Navi 31 competes.

  1. If the rumored 308mm^2 die size for the main GPU is correct, that's quite a bit smaller than Nvidia's AD102 (rough math sketched below).
  2. Even with the memory chiplets factored in, Nvidia potentially spent a lot more transistors on performance.
  3. Omitting the Matrix/Tensor core equivalent (found only in CDNA) probably doesn't make a massive difference in die size, but it's still something.
  4. Also, no DLSS 3 hardware to worry about, and possibly no dual encoders.

I expect AMD will get beaten badly in ray tracing again but might do very well in rasterization tests. I will be happy to be proven wrong, though!
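Napkin math on those numbers (AD102's die size and transistor count are Nvidia's published figures; the Navi 31 GCD size is the rumor above, and the memory-chiplet total is purely my assumption, so treat the whole thing as speculation):

```python
# Rough die-area comparison, Ada vs. rumored Navi 31.
# AD102's figures are Nvidia's published specs; the Navi 31 GCD size is the
# rumored 308 mm^2 from above, and the MCD total is an assumed placeholder.
AD102_AREA_MM2 = 608.5
AD102_TRANSISTORS_B = 76.3

NAVI31_GCD_MM2 = 308.0        # rumored graphics compute die
NAVI31_MCD_TOTAL_MM2 = 225.0  # assumed total for the memory chiplets

navi31_total = NAVI31_GCD_MM2 + NAVI31_MCD_TOTAL_MM2
print(f"AD102: {AD102_AREA_MM2} mm^2, {AD102_TRANSISTORS_B}B transistors")
print(f"Navi 31 (rumored + assumed): {navi31_total} mm^2 of total silicon")
print(f"AD102 is {AD102_AREA_MM2 / NAVI31_GCD_MM2:.1f}x the GCD alone")
```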
 

Phaaze88
It looks impressive - BUT:
1) You know the halo is coming later... @King_V , this ain't it, Chief.
2) This one is scary if you think about it: how, or where, are the AIBs supposed to compete? 'Cause all I see is looks, and that's subjective.
OCing sucks (it has for some time now); the performance/efficiency is very good at stock. No need for a more expensive kilowatt-plus PSU.
The cooler is damn good this time around. I look at AIO/CLCs like the Suprim Liquid and go, "Uhh..."
It sells at MSRP + needs a less expensive PSU = for the folks who look this stuff up, the FE offers the best 'value' out of the gate.
That $2,000 Strix looks pitiful at $400-plus over MSRP...
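To put rough numbers on that 'value' point (the $1,599 FE MSRP is Nvidia's official price; the ~$2,000 Strix figure is from above, and the PSU premium is an invented illustrative number):

```python
# Ballpark platform-cost comparison: 4090 FE at MSRP vs. a ~$2,000 AIB card.
# The FE's $1,599 MSRP is official; the Strix price is the figure above, and
# the kilowatt+ PSU premium is an invented illustrative number.
FE_PRICE = 1599.00
STRIX_PRICE = 2000.00  # roughly, per the post above
PSU_PREMIUM = 100.00   # assumed extra cost of stepping up to a kilowatt+ PSU

fe_total = FE_PRICE                      # stock card, existing PSU suffices
strix_total = STRIX_PRICE + PSU_PREMIUM  # worst case for the AIB route

print(f"FE route: ${fe_total:,.0f}")
print(f"Strix route: ${strix_total:,.0f} (+${strix_total - fe_total:,.0f})")
```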


PC Gaming systems don't equate to racing on the track or even the freeway.
PC gaming systems are a luxury, like the racing and sports cars you bring up.
The core of a PC is versatility; that comes with the territory and at a cost. If all one wants to do is play games, with none of the versatility included, maybe they should consider a console...


Can this be used in an Aorus B550 Pro AC?
I don't see why not, unless your board's PCIe slot is somehow physically different from all the other motherboards out there...
 

Fates_Demise
What I also find weird is the DLSS benchmarks. Why champion the extra fps buuuut... never, EVER, any mention of DLSS's awful included sharpening pass?! 😏 What's the sense of having faster fps when the results show the imagery smeared, ghosting, and/or artifacted to hades? 🤔
Dude, testers have barely had any time with the card, like holy <Mod Edit>, quit whining like an AMD fanboy.
And DLSS smearing? Run the damn thing in Quality mode; nobody is telling you to run it in full Performance mode... Obviously doubling fps is gonna have some drawbacks, and Quality mode has been far and wide regarded as better than native in almost every aspect.
 
I have to agree with others here, it really is an impressive gen-to-gen performance jump in most scenarios. This trend will probably consolidate and even improve as drivers are updated.

Then again, it sucks big time that we are not going to see midrange-tier RTX 4xxx GPUs for a while. And we don't know if those other GPUs will get the same level of gains vs. the previous gen.

This will get interesting when RDNA 3 launches.

Nvidia and the AIBs still have lots of RTX 3000-series stock lying around, and prices are not going down like they were a few weeks ago. I bet they also have some major stock of midrange RTX 4000-series cards on hold, ready to be delivered to major stores because of the old-stock issue.

It's going to turn into a chess match between Nvidia and AMD to see who gets the next big slice of the midrange-GPU cake first. I believe most informed gamers and professional users who know new cards are around the corner won't be so keen on buying last gen (unless prices drop more and more).
 
I find this card interesting in that its only universal success comes at 4K, where it's way better than everything else. It really makes me wonder how the rest of the stack will hold up against the prior-generation cards.

I'm happy with what I have, and unhappy with the industry in general due to pricing, but from a technical standpoint this generation of GPUs seems like it's going to be fascinating.
 
I bet they also have some major stock of midrange RTX 4000-series cards on hold, ready to be delivered to major stores because of the old-stock issue.
Not a chance. Nvidia most likely delayed a lot of the AD106/AD107 orders and turned the wafers into AD102/103/104. But obviously none of the card manufacturers are anywhere close to launching an RTX 4070 or 4060, because there have been zero leaked images. We started getting 4090 image leaks at least a month ahead of the reveal, which means, given the lack of images, that there are presently no AIB partner cards using lower-spec 40-series GPUs.

I'm also still waiting to see what Nvidia announces for a 4070, given that the 4080 12GB is a thing. Putting 12GB on a 4080 was really stupid, because a 4070 12GB will probably be right in the same performance ballpark, while a 4070 with less than 12GB will get seriously panned by reviewers and consumers alike. 8GB is basically only fit for budget GPUs now, or the bottom of the midrange. Mainstream GPUs need 12GB or more to be taken seriously, and high-end should all be 16GB or more.

But it will certainly be interesting to see how this all plays out. I too have noticed that GPU prices aren't dropping much anymore. Either they're as low as companies are willing to go, or they're hoping to see an uptick in sales with the holidays now approaching. I'd still love to see AMD ship a compelling new midrange GPU that puts pressure on Nvidia. There's no good reason for the 3080 to still be selling at $750+ when 4080 models are less than a month away, other than to take advantage of people who don't know as much about PC hardware.
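That VRAM rule of thumb, spelled out (the cutoffs are just my paraphrase of the paragraph above, not any official market segmentation):

```python
# VRAM rule of thumb from the post above: 8GB is budget/bottom-midrange,
# 12GB is the mainstream minimum, and high-end should be 16GB or more.
# These cutoffs paraphrase the post, not any official segmentation.
def vram_tier(vram_gb: int) -> str:
    if vram_gb >= 16:
        return "high-end"
    if vram_gb >= 12:
        return "mainstream or above"
    if vram_gb >= 8:
        return "budget / bottom of the midrange"
    return "budget only"

for card, vram in [("RTX 4090", 24), ("RTX 4080 12GB", 12), ("RTX 2070", 8)]:
    print(f"{card}: {vram}GB -> {vram_tier(vram)}")
```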
 
Not a chance. Nvidia most likely delayed a lot of the AD106/AD107 orders and turned the wafers into AD102/103/104. ...

Well, alright, you probably have way more inside info than me, but I don't think you can delay wafer production for that long. Chips are always made way in advance of the actual GPU (the card as a whole, I mean); partners and even Nvidia need time to test tons of parameters and make adjustments to the actual cards before launch, and even so they make "mistakes" or miss some important tuning every now and then. But yeah, perhaps this time around that's not the case.
Sorry, I found it hard to believe, because while mining was booming Nvidia and the AIBs only wanted to rake in more and more cash, and the change from proof of work to proof of stake was announced and delayed so many times that there was a chance it would get postponed once more.
In fact, I would even consider that Nvidia could have done this launch a month or more ago but delayed it because mining, more likely than ever before, was getting closed down for business for real.
But I'm probably wrong. lol

On another point, I agree 100% with you: my RTX 2070 has 8GB, and that's a four-year-old card. So the RTX 4070 deserves at least 12GB of VRAM. The RTX 4080 12GB should not exist at all.
 

LastStanding
Dude, testers have barely had any time with the card

So, having it for... days... is barely enough time these days to test the metrics that most enthusiasts/potential buyers are interested in, huh?! 😏

Edit: just watched some YouTubers' feeds that INCLUDED these metrics that were MIA here... hmmm. It must be in the water for some territories, huh?! 🙄

And an observation: fanaticism gets you likes, even from the author, for name-calling me, etc.?

Wow!

So much for having a valid opinion on TH.

Good to know!
 
Chips are always made way in advance of the actual GPU (the card as a whole, I mean); partners and even Nvidia need time to test tons of parameters and make adjustments to the actual cards before launch, and even so they make "mistakes" or miss some important tuning every now and then. But yeah, perhaps this time around that's not the case.
This is why I say Nvidia probably converted some wafer starts that would normally have been used for 4060/4070 into AD102 4090 wafers. They're almost certainly delaying the midrange 40-series launches at this point, and possibly delayed the 4090 as well (though October isn't terribly out of the ordinary as a launch date -- just a few weeks later than the 20-series launch). The fact that the 4080 didn't launch at the same time as the 4090 is telling, though.

As for the cards and boards, a lot of the same core design from the 4090 can easily transfer over to the 4080 and lower tiers. If you have working 4090 boards, the 4080 should be easy. I'm sure the 4080 AIB cards are also in production at this point; they have to be for a November launch. But the 4070 probably won't be announced until January at the earliest. It will depend on whether the 30-series sells out (more or less) during the holidays. Nvidia probably has all the major design elements finalized on the boards, and even has some hardware in hand, but that sort of thing is kept under much tighter wraps than what AIB partners do.
 
Der8auer brought it up here: https://www.youtube.com/watch?v=60yFji_GKak


WTH... set a whopping 50-60% power target, and it still craps all over the cards before it.
I can assure you that you can’t drop your power limit to 60% and still get 95% of the base performance all of the time, or even most of the time.

The thing is, there are benchmarks where the 4090 is totally CPU limited. In those cases, reducing its power limit won’t really affect performance. But when you run a test where you are GPU limited, dropping the power limit to 60% will also reduce performance quite a bit.

It’s basically the opposite of overclocking. Even a massive overclock of your GPU won’t do much for your gaming performance if you are CPU limited.

The other thing you have to understand is that Nvidia is first to market with its next-generation graphics cards. It can't reasonably increase the power limit after launch, so whatever it ships with is what it will have for the rest of its existence. If AMD comes to market with an RX 7950 XT, and performance is close to what an undervolted and underclocked 4090 can manage, then Nvidia made the "wrong" choice. End users can undervolt and underclock on their own, but Nvidia needs to give itself the best chance to succeed in the market, and for a part like the 4090 that means maximum stable performance.
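A toy model of that CPU-limit interaction (the numbers here are invented purely to illustrate the point, not measurements):

```python
# Toy model of why a power-limit cut only hurts when you're GPU limited:
# delivered fps = min(what the CPU can feed, what the GPU can render).
# The 0.85 scaling at a reduced power limit is an invented example figure.
def delivered_fps(cpu_fps: float, gpu_fps_stock: float,
                  gpu_perf_scale: float = 1.0) -> float:
    return min(cpu_fps, gpu_fps_stock * gpu_perf_scale)

# CPU-limited case: the power cut is invisible.
print(delivered_fps(140, 200))        # 140 at stock
print(delivered_fps(140, 200, 0.85))  # still 140 when power limited

# GPU-limited case: the power cut shows up almost 1:1.
print(delivered_fps(300, 200))        # 200 at stock
print(delivered_fps(300, 200, 0.85))  # 170 when power limited
```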
 
Holy smack, I knew this was going to be a performance beast, but certainly not at this level. The 4090 appears to be the first true 8K-ready GPU, at least judging by its margin over everything else at 4K. And thanks, Jarred, for referencing MSFS performance, as upgrading from my 3080 Ti was in the back of my mind if the latest service updates had improved performance and eliminated the CPU bottlenecks. Apparently not. A 25% FPS improvement between the 3080 Ti and 4090 at 4K is notable in that sim, but definitely not worth a new PSU on top of whatever the GPU will cost in AIB vendor options.