Nvidia GeForce RTX 4090 Review: Queen of the Castle

I can assure you that you can’t drop your power limit to 60% and still get 95% of the base performance all of the time, or even most of the time.

The thing is, there are benchmarks where the 4090 is totally CPU limited. In those cases, reducing its power limit won’t really affect performance. But when you run a test where you are GPU limited, dropping the power limit to 60% will also reduce performance quite a bit.

It’s basically the opposite of overclocking. Even a massive overclock of your GPU won’t do much for your gaming performance if you are CPU limited.
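As a toy illustration (a minimal sketch with made-up numbers, not measurements from the review): the delivered framerate is roughly the lower of what the CPU can feed and what the GPU can render, so cutting the power limit only shows up on screen when the GPU side is the cap.

```python
# Toy bottleneck model: all fps caps here are hypothetical.
def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float, power_scale: float = 1.0) -> float:
    # Crude assumption: lowering the power limit scales GPU throughput by
    # some factor < 1 (the real voltage/frequency curve is nonlinear).
    return min(cpu_fps_cap, gpu_fps_cap * power_scale)

# CPU-limited game: a 40% power cut still leaves the GPU above the CPU cap.
print(effective_fps(140, 250, 1.0))  # 140 fps
print(effective_fps(140, 250, 0.6))  # 140 fps -- no visible loss
# GPU-limited game: the same cut now shows up directly as lost performance.
print(effective_fps(300, 200, 1.0))  # 200 fps
print(effective_fps(300, 200, 0.6))  # 120 fps
```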

The other thing you have to understand is that Nvidia is first to market with its next-generation graphics cards. It can't reasonably increase the power limit later, so whatever it ships with is what it will have for the rest of its existence. If AMD comes to market with an RX 7950 XT, and performance is close to what an undervolted and underclocked 4090 can manage, then Nvidia made the "wrong" choice. End users can undervolt and underclock on their own, but Nvidia needs to give itself the best chance to succeed in the market, and for a part like the 4090 that is going to be maximum stable performance.
I don't believe either of us said that would be seen in everything, but it's still impressive. Now, if people pick this up for anything less than 4K, that's on them.

Overclocking hasn't been noteworthy, save for LN2. Even when not CPU limited, the cost of pushing it further looks underwhelming.
Undervolt or 90-95% power limit, and nearly match the default profile? Yes please.

Even if it turns out Nvidia made the 'wrong' choice later, they'll probably still be 'right'. The mindshare is too strong.
 
Forgive me, but doesn't DLSS 3 just sound like the image interpolation setting on modern TVs, where it creates a frame in between frames to smooth out motion? I know it's more involved than that, but I couldn't resist pointing this out lol.

Joke time: I hope Nvidia fixed it so my games don’t have a soap opera look to them haha
 
Here are some results. There was no option to change the voltage using MSI Afterburner, so I just put the power limit at 65% for an "undervolt/underclock." The overclock was +180MHz on the GPU and +800MHz on the GDDR6X. All of these are ray tracing games, which were the games most likely to come close to the default 450W power limit.

View attachment 146

What you'll see with this testing is that my results are nowhere near 95% of the base performance. In fact, on average, performance dropped 20% for a 31% decrease in power consumption. Alternatively, with the overclock, performance increased by 17% while using 9% more power. Also notice that the clocks on average dropped 33% with the power limit, or increased 9% with the overclock.

If DerBauer was able to also change voltage in his testing, that might improve things, but clearly with my testing the power savings didn't really match the performance drop. Technically it was a bit more efficient (0.146 fps/W versus 0.125 fps/W), though even the OC ended up more efficient than stock (0.138 fps/W). YMMV in other words, but I don't really agree with the assertion that tweaking the power limit alone and dropping it 30% or 40% is going to be great for performance. ¯\_(ツ)_/¯
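For reference, here's a quick back-of-the-envelope check of those fps/W figures, working backwards from the averages quoted above; the ~56 fps stock average is inferred from 0.125 fps/W at 450W, so treat the exact values as approximate.

```python
# Rough sanity check of the efficiency numbers quoted above.
stock_fps = 0.125 * 450          # ~56 fps average implied by 0.125 fps/W at 450W
stock_w = 450.0

limited_fps = stock_fps * (1 - 0.20)   # ~20% performance drop at the 65% power limit
limited_w = stock_w * (1 - 0.31)       # ~31% lower power draw
oc_fps = stock_fps * 1.17              # ~17% faster with the overclock
oc_w = stock_w * 1.09                  # ~9% more power

for name, fps, watts in (("stock", stock_fps, stock_w),
                         ("65% power limit", limited_fps, limited_w),
                         ("overclock", oc_fps, oc_w)):
    print(f"{name}: {fps / watts:.3f} fps/W")
# Prints roughly 0.125, 0.145, and 0.134 fps/W -- in the same ballpark as the
# measured 0.125 / 0.146 / 0.138, since the percentages above are rounded.
```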

Looking at his power testing, I see three major problems:
  1. 3DMark Time Spy Extreme. It's a dumb benchmark because it's not an actual game.
  2. Using DLSS in Cyberpunk tends to drop power use quite a bit (less rendering work, less power, Tensor upscaling is way more efficient).
  3. PUBG at "esports settings" probably means less detail, so the 4090 might be spinning its wheels a bit. Power draw dropped a lot, but there's no reason a 140W difference in power use should only reduce performance by 5%, other than it being a bad test.
The reason for the high power limit is simple: Nvidia wanted to maximize performance. It's like AMD's socket AM4, which was totally holding back the Ryzen 5950X and 3950X. So AMD gave Zen 4 a bunch more headroom, perhaps more than strictly needed, but it wasn't a dumb or bad decision. Performance wins in the minds of most people, so you don't want to restrict it any more than necessary.
 
Sorry. I should've stopped, because "I can assure you that you can't drop your power limit to 60% and still get 95% of the base performance all of the time, or even most of the time" pretty much said it all.
 
I've read and watched most of the 4090 reviews I care about, and it is quite impressive as a new flagship, but there are a few things that sour the impressive showing for me...

1.- DP 1.4 feels like an awfully stupid shortcoming for a $1600 GPU. You top out at 4K 120Hz unless you use chroma subsampling, which basically reduces the colour information (see the rough bandwidth math after this list). Even more so when you see them touting DLSS 3's frame interpolation.
2.- Locking AIB reviews for a day later than FE cards? Like... Seriously? Why did the whole tech media agree to that? WHY?!
3.- Power delivery circuitry is not part of the reference design they gave to AIBs, so AIB cards may have horrible spikes; maybe. To be seen tomorrow?
EDIT: I forgot the fourth!
4.- It seems like Nvidia forbade reviewers from using the latest Win11 22H2 update because of bugs, so if you have that version of Windows, well, you're in for some waiting until Nvidia fixes those, or something along those lines. Pretty scummy.
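On point 1, the DP 1.4 ceiling comes down to simple bandwidth math (a rough sketch that ignores blanking overhead and DSC):

```python
# DP 1.4 (HBR3): 4 lanes x 8.1 Gbps raw = 32.4 Gbps, minus 8b/10b encoding overhead.
effective_gbps = 32.4 * 8 / 10   # ~25.92 Gbps of usable bandwidth

def payload_gbps(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(payload_gbps(3840, 2160, 120, 30))  # ~29.9 Gbps for 10-bit RGB (HDR): doesn't fit
print(payload_gbps(3840, 2160, 120, 24))  # ~23.9 Gbps for 8-bit RGB: only just fits
print(effective_gbps)                     # 25.92 Gbps available
# Hence 4K 120Hz HDR needs chroma subsampling (or DSC) on DP 1.4.
```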

Overall, I'm more curious as to what AMD has cooking with RDNA3, since they apparently already knew the 4090 was going to be around 80% better than the 3090, and that assumption was pretty close (MLiD mentioned this; use salt as you please). Given that they didn't do terribly with the 6900 XT, I think they may do well this time around as well, even with Nvidia having a slight process and memory advantage.

Regards.
 
Benchmarks look good, and ray tracing is still meh at 4K, but at least it's an improvement. The main problem is power consumption. At a time when climate change is a real problem and utility fees are skyrocketing, this much power consumption for a personal gaming card is really ridiculous.
 
Undervolting results are going to vary depending on multiple things. Here are some results other sites are reporting:

In the first set, dropping power from 448W to 329W results in an fps drop of about 8 fps (from 125 to 117). That's a 119W drop for about a 7% fps drop.
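Working those numbers through (as I read the chart, so treat them as approximate):

```python
# Percentage changes implied by the first set of results quoted above.
power_before, power_after = 448, 329   # watts
fps_before, fps_after = 125, 117

print(f"power drop: {(power_before - power_after) / power_before:.1%}")  # ~26.6%
print(f"fps drop:   {(fps_before - fps_after) / fps_before:.1%}")        # ~6.4%
```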

Another one:
33% power drop for a 5% fps drop. These are unlikely to be typical results, but they are real-world results that sites are coming up with.
 
I look forward to the RDNA3 reviews, where I'm sure you will only point out negative things about AMD's launch. I have complete faith, based on your post history, that that's what you're going to do.
 
Let's keep in mind that these are early drivers, so performance should get better with time, as it has happened before with older generations from both Nvidia and AMD.

I would love to know (since I don't live there): does anyone have info about people already waiting in line outside Micro Center and/or Best Buy to buy this card, like with the RTX 3000 series launch?

And another question

Will scalpers want to spend this much money for an unknown profit now that mining is dead?
 
Your overclocked results for Metro Exodus seem a little too good to be true. Maybe a typo?
 
Going by Turing and Ampere's history of few or nonexistent driver performance uplifts (fewer than five such drivers since 2018), it's easy to say: don't hold your breath.
 
Thanks for the in-depth review, Jarred!

  1. Wow. Just wow. Rasterisation = wow.
  2. DLSS, FSR, and XeSS: My thoughts about them have changed over the years: if the player doesn't notice during gameplay, is it a bad thing?

    DLSS was very noticeable during gameplay until very recently, causing blur.

    AMD's FSR was similar until version 2.1.

    I'm HIGHLY impressed with XeSS because they've exceeded both FSR 2.1 and DLSS 2.x with their first shot at it. I was looking at still images of Hitman over at TPU a few days ago, and I thought to myself "not bad!" until I realised that I had confused the native 4K image with the XeSS 4K image.
  3. I really haven't come across any game that I would like to play that real-time ray tracing makes the world of difference in, so I'm still not sold on RTRT in games. Having said that, the 4090 sure is very impressive!
 
This is beyond preposterous! Now a $1600 card (and that's even before the "partners" start releasing their $2000+ versions), plus major cooling AND a "minimum" 1000W PSU, is the new "great deal"??

You can buy, RIGHT NOW, a fully loaded PC with either a 3090 or a 6950 XT for less than that amount. And that PC will play ANY game you throw at it in 4K for the next 6-7 years, if not even longer!!
 

No one is forcing you to buy this card, and we all knew this was coming for a long, long time now.

Don't worry too much about the $1600 tag; it won't look so bad once the RTX 4090 Ti gets announced :) , but that may be a few months away (depending on how well the top-tier AMD RDNA3 cards do in benchmarks next month).
 
Well, that sucks. The 4090 FE went instantly out of stock at Best Buy. I was able to get the MSI and Gigabyte cards in my cart, but I don't want either. Odd, since it asked for account verification which sent a text to your phone, so there was no way to script that. Best Buy must have had 12 FE models. Oh well. I can wait.
 
The FE model seems to be out of stock in most places.

There are other models available, but if that's the one for you, then that's the one for you!

Good luck

Edit: the ASUS TUF model at the same price does look decent enough: https://www.newegg.com/asus-geforce...814126596?Item=N82E16814126596&quicklink=true
 
I'm only interested in the FE model. It has a better water block selection and it is quite compact with a block. I can wait. The kids will be happy too, as they weren't expecting to have running water or food this month.
 

Uhm... You missed the very important piece of this disclaimer:

though improvements will vary based on your specific system setup, and the game settings used. In our testing, performance increases were found in a wide variety of DirectX 12 games, across all resolutions:
  • Assassin’s Creed Valhalla: up to 24% (1080p)
  • Battlefield 2042: up to 7% (1080p)
  • Borderlands 3: Up to 8% (1080p)
  • Call of Duty: Vanguard: up to 12% (4K)
  • Control: up to 6% (4K)
  • Cyberpunk 2077: up to 20% (1080p)
  • F1® 22: up to 17% (4K)
  • Far Cry 6: up to 5% (1440p)
  • Forza Horizon 5: up to 8% (1080p)
  • Horizon Zero Dawn: Complete Edition: up to 8% (4K)
  • Red Dead Redemption 2: up to 7% (1080p)
  • Shadow of the Tomb Raider: up to 5% (1080p)
  • Tom Clancy’s The Division 2: up to 5% (1080p)
  • Watch Dogs: Legion: up to 9% (1440p)