RTX 4080 Super vs RX 7900 XTX GPU faceoff: Battle for the high-end

kiniku

"...We think people buy high-end cards to be able to crank up all the settings, and there are RT-enabled games where the 4080 Super delivers a playable result while the 7900 XTX does not."
(y) (y) (y)
 

Notton

Ray-tracing works surprisingly well now.
DLSS and frame-gen have matured enough that they don't feel janky, unlike back when the 40-series launched.
 
"...We think people buy high-end cards to be able to crank up all the settings, and there are RT-enabled games where the 4080 Super delivers a playable result while the 7900 XTX does not."
(y) (y) (y)
Meh, out of all the titles they have listed for RT I only have two, Spider-Man Remastered and Miles Morales, and they both play fine for me at 3440x1440 UW with RT on.

"4080 Super used just 16.5W while idle, 20.8W while playing back a 4K AV1 video using VLC, and 31.0W when playing the same video directly on YouTube. The 7900 XTX used 19.1W while idle, 57.8W while playing our test video in VLC, and 96.8W when viewing the video on YouTube."

Idle power draw will depend on monitor setup.

I idle at 7 watts on my 7900 XTX with a single display, not the 19 watts you are seeing in this review.

YouTube video playback for me sits at 32 watts, not the 96.8 watts you are seeing, which is 3x higher...

VLC video playback on my system is 26 watts.
 
Meh, out of all the titles they have listed for RT I only have two, Spider-Man Remastered and Miles Morales, and they both play fine for me at 3440x1440 UW with RT on.

"4080 Super used just 16.5W while idle, 20.8W while playing back a 4K AV1 video using VLC, and 31.0W when playing the same video directly on YouTube. The 7900 XTX used 19.1W while idle, 57.8W while playing our test video in VLC, and 96.8W when viewing the video on YouTube."

Idle power draw will depend on monitor setup.

I idle at 7 watts on my 7900 XTX with a single display, not the 19 watts you are seeing in this review.

YouTube video playback for me sits at 32 watts, not the 96.8 watts you are seeing, which is 3x higher...

VLC video playback on my system is 26 watts.
Looks like TH didn't go into Windows power settings and hit "return to defaults" or something. What else would cause this disparity?
 
Looks like TH didn't go into Windows power settings and hit "return to defaults" or something. What else would cause this disparity?
Numerous things.

The display you are using matters, as does its refresh rate. Having a second display hooked up to the GPU, even if it's turned off, can also affect this.

Is FreeSync on? Is the system really idle, or are there other processes running?

Any tweaking done on the system? Some people turn off hardware acceleration in their browsers, etc.

There are so many variables to it.
 
Is the first graph correct? Because on the next one Nvidia is orange, while on the first one it's blue.
Sorry, I reversed the card order on the ProViz tests, so the numbers were correct but the colors swapped. I've updated the charts now so that all have the 4080 Super in blue. (The real issue is that I generated both chart variants, one with Nvidia blue and AMD orange, the other with AMD blue and Nvidia orange... and then I accidentally uploaded the wrong charts for ProVizAI.)
 
Meh, out of all the titles they have listed for RT I only have two, Spider-Man Remastered and Miles Morales, and they both play fine for me at 3440x1440 UW with RT on.

"4080 Super used just 16.5W while idle, 20.8W while playing back a 4K AV1 video using VLC, and 31.0W when playing the same video directly on YouTube. The 7900 XTX used 19.1W while idle, 57.8W while playing our test video in VLC, and 96.8W when viewing the video on YouTube."

Idle power draw will depend on monitor setup.

I idle at 7 watts on my 7900 XTX with a single display, not the 19 watts you are seeing in this review.

YouTube video playback for me sits at 32 watts, not the 96.8 watts you are seeing, which is 3x higher...

VLC video playback on my system is 26 watts.
The videos you play will also affect power use, as will the codec for the video. Since you don't know what video I used for testing, the results you give aren't comparable. And I used a PCAT v2 to capture power data over the same ~60 seconds of video (because the content of the video at any given time also affects power use). So yes, power can be higher on the AMD card — and even higher still with the launch drivers.

I had a 144Hz 4K display running at 144Hz — the sort of monitor you'd want to use with a $1000 graphics card, in other words. If you have a 1080p or 1440p or even 3440x1440 monitor, power draw can be lower. You also can't depend on software to accurately report the power use of a graphics card, particularly on certain AMD models (though RDNA3 seems better about reporting GPU-only power rather than full board power).
 
The videos you play will also affect power use, as will the codec for the video. Since you don't know what video I used for testing, the results you give aren't comparable. And I used a PCAT v2 to capture power data over the same ~60 seconds of video (because the content of the video at any given time also affects power use). So yes, power can be higher on the AMD card — and even higher still with the launch drivers.

I had a 144Hz 4K display running at 144Hz — the sort of monitor you'd want to use with a $1000 graphics card, in other words. If you have a 1080p or 1440p or even 3440x1440 monitor, power draw can be lower. You also can't depend on software to accurately report the power use of a graphics card, particularly on certain AMD models (though RDNA3 seems better about reporting GPU-only power rather than full board power).
Correct, I don't have the same video file; I can only play back what I have on my system, and it was also an AV1 video file.

My numbers are correct for my own system, which is 3440x1440 @ 144Hz.

I guess a simple way to check that would be to set your display to 1440p @ 144Hz and compare against the 4K @ 144Hz numbers, if possible. From what I've seen, it's refresh rate that causes high idle power, not resolution.

And yes, using a PCAT v2 should be more accurate than software, but I still find those numbers higher than what others with single displays are reporting. I'm using a reference model, not an AIB card.

RDNA 3 is much better than RDNA 2 when it comes to reporting power consumption.
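For anyone who wants a rough software-side check on their own card (keeping in mind, per the PCAT discussion above, that driver-reported numbers are not equivalent to hardware measurements), a minimal polling sketch might look like the following. It assumes nvidia-smi is on the PATH for GeForce cards and a Linux amdgpu hwmon sensor for Radeon cards; both only report whatever the driver chooses to expose.

```python
import glob
import statistics
import subprocess
import time

def read_nvidia_power_w():
    """Driver-reported power from nvidia-smi, in watts."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        text=True,
    )
    return float(out.strip().splitlines()[0])

def read_amdgpu_power_w():
    """Driver-reported power from the amdgpu hwmon interface (Linux).
    power1_average is in microwatts; whether it represents GPU-only or
    full board power varies by generation, so treat it as a rough indicator."""
    paths = glob.glob("/sys/class/drm/card*/device/hwmon/hwmon*/power1_average")
    if not paths:
        raise RuntimeError("no amdgpu hwmon power sensor found")
    with open(paths[0]) as f:
        return int(f.read().strip()) / 1_000_000

def log_power(read_fn, seconds=60, interval=1.0):
    """Poll the chosen sensor for `seconds` and report min/avg/max watts."""
    samples = []
    end = time.time() + seconds
    while time.time() < end:
        samples.append(read_fn())
        time.sleep(interval)
    return min(samples), statistics.mean(samples), max(samples)

if __name__ == "__main__":
    # Swap in read_nvidia_power_w for a GeForce card.
    lo, avg, hi = log_power(read_amdgpu_power_w, seconds=60)
    print(f"min {lo:.1f} W, avg {avg:.1f} W, max {hi:.1f} W")
```

Run it while idle, then again during VLC or YouTube playback, to get numbers comparable in spirit (though not in accuracy) to the figures quoted from the review.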
 

bit_user

For those interested in AI, memory capacity could be a bigger factor than raw compute performance. If you look at what the TinyBox ended up supporting, the two options they give are either the 7900 XTX or the RTX 4090 - both 24 GB cards. Even though the Nvidia option is much more expensive (like $24k instead of $16k, IIRC), they didn't even bother to offer a version with the RTX 4080, because memory bandwidth and capacity is that important for their customers.
 
Numerous things.

The display you are using matters, as does its refresh rate. Having a second display hooked up to the GPU, even if it's turned off, can also affect this.

Is FreeSync on? Is the system really idle, or are there other processes running?

Any tweaking done on the system? Some people turn off hardware acceleration in their browsers, etc.

There are so many variables to it.
Yes, there are lots of variables. But interestingly, when using a single monitor, Nvidia's power draw tends to be pretty much the same on multiple different displays. It seems as though AMD hasn't properly tuned the various GPU firmwares / drivers to better optimize for lower idle power. I don't have a ton of different monitors to test with, so I can only report my findings with either four different 4K 60Hz displays, the 4K 144Hz display (which is what I used), or a 4K 240Hz (with DSC) display. Switching monitors would require rearranging a bunch of other stuff as well, however, as the monitor footprint is larger on the last one.

So to reiterate:

1) The idle value is full idle, while the monitor is on and showing content, but everything is static (no open windows that aren't minimized). Long idle where the display powers off is a different metric and tends to be even lower because the card really doesn't need to do anything. Most GPUs should have long idle power use of <10W, assuming there's no background GPU compute task running, but I haven't tried to test this.

2) G-Sync and FreeSync are both disabled, depending on the GPU, even though the monitor can support VRR. This is done for purposes of consistency. Why force a constant 144Hz refresh rate? Because that's more of the "worst case scenario" and high-end GPUs will often be paired with high-end, high refresh rate monitors. Also, it's one less variable (i.e. VRR on or off introduces variability). Resolution is 3840x2160 4:2:2 YCbCr, though, because the monitor doesn't have DSC support and you can't do 144Hz at 4K with full 4:4:4 color. (A quick bandwidth calculation after this list shows why.)

3) Test system is Core i9-13900K, the same in both cases. Power use is measured via PCAT v2 while idle, while decoding a 4K30 AV1 video in VLC, and while playing that same video off YouTube. The video, if you want to look, is this one:
https://www.youtube.com/watch?v=qZ4n-0162nY
Power data logging starts right after the black screen disappears and Houston Jones starts talking (~3 seconds into the video) and stops 60 seconds later (where Scott says Girthmaster 5000 or whatever).
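To put rough numbers on the 4:2:2 point above: assuming a DisplayPort 1.4 HBR3 link (typical for a 4K 144Hz monitor without DSC) and ignoring blanking intervals, which add several percent on top, the active-pixel data rates work out like this:

```python
# Rough check: which 3840x2160 @ 144 Hz formats fit within DisplayPort 1.4's
# usable bandwidth (HBR3, 4 lanes: 32.4 Gbps raw, ~25.92 Gbps after 8b/10b)?
# Blanking intervals are ignored, so real requirements are somewhat higher.

DP14_PAYLOAD_GBPS = 32.4 * 8 / 10   # ~25.92 Gbps

def data_rate_gbps(width, height, refresh, bits_per_pixel):
    """Active-pixel data rate in Gbps (no blanking, no DSC)."""
    return width * height * refresh * bits_per_pixel / 1e9

formats = {
    "4:4:4 8-bit (24 bpp)":  24,
    "4:4:4 10-bit (30 bpp)": 30,
    "4:2:2 8-bit (16 bpp)":  16,
    "4:2:2 10-bit (20 bpp)": 20,
}

for name, bpp in formats.items():
    rate = data_rate_gbps(3840, 2160, 144, bpp)
    verdict = "fits" if rate <= DP14_PAYLOAD_GBPS else "needs DSC or a lower refresh rate"
    print(f"{name}: {rate:.1f} Gbps -> {verdict}")
```

Even 8-bit 4:4:4 comes out to roughly 28.7 Gbps, which is why a non-DSC monitor falls back to 4:2:2 at 4K 144Hz.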

AMD's behavior on various GPUs is quite odd in terms of power use, and appears to fluctuate a lot with monitor, refresh rate, and resolution AFAICT. The RX 7600 uses more power while idle than an RX 7900 XT, and the selected refresh rate definitely matters. 4K 60 Hz idle power is ~8W on the 7600, but anything higher (75Hz, 82Hz, 98Hz, 120Hz, 144Hz) jumps up to around 17W. Idle power on the 7900 XT and XTX is generally the same at 4K, regardless of refresh rate (maybe 1~2W lower at 60Hz). The 7800 XT and 7700 XT both seem to have higher power use above 82Hz, and I think the 7900 GRE had lower power at 120Hz and below — so it only "didn't like" 144Hz where power use jumped by maybe 15W.
 
For those interested in AI, memory capacity could be a bigger factor than raw compute performance. If you look at what the TinyBox ended up supporting, the two options they give are either the 7900 XTX or the RTX 4090 - both 24 GB cards. Even though the Nvidia option is much more expensive (like $24k instead of $16k, IIRC), they didn't even bother to offer a version with the RTX 4080, because memory bandwidth and capacity is that important for their customers.
The interesting thing with AI workloads is that quantization and other optimizations can affect VRAM requirements. I know for example that Stable Diffusion can run 768x768 with Nvidia GPUs that only have 6GB of VRAM (RTX 2060), but it craps out on the RX 6650 XT and below because of a lack of VRAM — while oddly working on RX 7600.

If the number format is the same (e.g. FP16), and the project is properly optimized, 24GB can run a 50% larger model than 16GB. Determining which projects are properly optimized is a different story of course. :-\
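To put rough numbers on that scaling, here is a weights-only sketch; activations, KV cache, and framework overhead all eat into the headroom, so these are upper bounds rather than practical limits.

```python
# Weights-only upper bound on model size for a given VRAM pool.
# Real limits are lower once activations, KV cache, and framework overhead are counted.
BYTES_PER_PARAM = {"FP32": 4, "FP16": 2, "INT8": 1, "INT4": 0.5}

def max_params_billions(vram_gb, fmt):
    # 1 GB of VRAM holds roughly 1e9 bytes, so GB / bytes-per-param ~= billions of params.
    return vram_gb / BYTES_PER_PARAM[fmt]

for vram_gb in (16, 24):
    print(f"{vram_gb} GB @ FP16 -> ~{max_params_billions(vram_gb, 'FP16'):.0f}B params")
# 16 GB -> ~8B, 24 GB -> ~12B: the same 24/16 = 1.5x (50% larger) ratio at any fixed precision.
```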
 
2) G-Sync and FreeSync are both disabled, depending on the GPU, even though the monitor can support VRR. This is done for purposes of consistency. Why force a constant 144Hz refresh rate? Because that's more of the "worst case scenario" and high-end GPUs will often be paired with high-end, high refresh rate monitors. Also, it's one less variable (i.e. VRR on or off introduces variability). Resolution is 3840x2160 4:2:2 YCbCr, though, because the monitor doesn't have DSC support and you can't do 144Hz at 4K with full 4:4:4 color.
For a one-off test, can you turn on FreeSync? That may be the issue right here.

For those interested in AI, memory capacity could be a bigger factor than raw compute performance. If you look at what the TinyBox ended up supporting, the two options they give are either the 7900 XTX or the RTX 4090 - both 24 GB cards. Even though the Nvidia option is much more expensive (like $24k instead of $16k, IIRC), they didn't even bother to offer a version with the RTX 4080, because memory bandwidth and capacity is that important for their customers.
True, LLMs like lots of VRAM.
 
For a one-off test, can you turn on FreeSync? That may be the issue right here.
I already know that lower refresh rates can reduce AMD power use. It seems to be related to VRAM capacity and speed as well, but it’s unclear precisely how. Playing back full screen 4K30 content should drop the refresh rate to 30 if VRR is on.

Anyway, 3440x1440x144Hz is 40% fewer pixels than 3840x2160x144Hz. Again, whatever the reason, AMD GPUs seem to have differing thresholds where power jumps from the minimum amount to a higher level. Rendering 67% more pixels likely crosses that threshold.

But that’s all ignoring the fact that Nvidia’s GPUs do not exhibit the same behavior. Plugging in more than one monitor can increase idle power use (because two RAMDACs are active), but I’m looking at a simple scenario of just a single display.
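For reference, the 40% and 67% figures above are just the ratio of the two pixel counts:

```python
uw  = 3440 * 1440   # 4,953,600 pixels (ultrawide 1440p)
uhd = 3840 * 2160   # 8,294,400 pixels (4K)
print(f"Ultrawide has {(1 - uw / uhd):.0%} fewer pixels than 4K")  # ~40%
print(f"4K has {(uhd / uw - 1):.0%} more pixels than ultrawide")   # ~67%
```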
 

Notton

I think the key takeaway here is that AMD has tons of room to improve the power efficiency of their chiplet GPUs.
AFAIK, this doesn't happen on their monolithic designs.
 

Peksha

The interesting thing with AI workloads is that quantization and other optimizations can affect VRAM requirements. I know for example that Stable Diffusion can run 768x768 with Nvidia GPUs that only have 6GB of VRAM (RTX 2060), but it craps out on the RX 6650 XT and below because of a lack of VRAM — while oddly working on RX 7600.

If the number format is the same (e.g. FP16), and the project is properly optimized, 24GB can run a 50% larger model than 16GB. Determining which projects are properly optimized is a different story of course. :-\
No chance to run Llama 2/3 70B locally (without using system RAM) on 16GB. Only the 24GB 7900 XTX/4090.
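For context, here is a weights-only footprint estimate for a 70B-parameter model at common precisions; KV cache and runtime overhead come on top of these figures, so they are optimistic floors rather than real requirements.

```python
# Weights-only footprint of a 70B-parameter model at common precisions.
# KV cache, activations, and runtime overhead add to these numbers.
params_billions = 70
for fmt, bytes_per_param in {"FP16": 2, "INT8": 1, "INT4": 0.5}.items():
    print(f"{fmt}: ~{params_billions * bytes_per_param:.0f} GB of weights")
# FP16 ~140 GB, INT8 ~70 GB, INT4 ~35 GB: nothing here fits in a 16 GB card,
# which is why local 70B runs lean on 24 GB cards, aggressive quantization,
# and/or offloading part of the model to system RAM.
```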
 

vinay2070

The problem with AMD CPUs and GPUs is their higher power consumption at idle, and nobody is making enough noise about this. All those benchmarkers show you that AMD is more efficient when it comes to CPU processing, but they don't tell you that on an average day the AMD system would have consumed more power due to higher idle wattage. The only way to gauge this for now is to look at laptop battery life, as there aren't many desktop idle comparisons. I myself own a 7800X3D, so I am not being a fanboy. AMD will NOT fix this until reviewers make a big deal out of it.

@JarredWaltonGPU Is it possible for you to make an Intel vs AMD CPU comparison with a power meter connected that shows watt-hours consumed over a typical day of web browsing, gaming, and idling? Let's say 24 hours: 14 hours idling, 4 hours gaming, and 6 hours of web surfing and YouTube playback, with Nvidia GPUs installed (to take GPU efficiency out of the equation, since we know AMD GPUs are worse there). Or better, use the iGPUs for each when not gaming.

https://youtu.be/mdWAfPfYTnU?si=R2cgLfMOW041YCBW&t=769

https://youtu.be/dXNxouijfJI?si=qznEGrgL37IW0SXV&t=691
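Something like the back-of-the-envelope sketch below is presumably what that test would boil down to; every wattage in it is a made-up placeholder to show the arithmetic, not a measured figure for any particular CPU or GPU.

```python
# Hypothetical daily energy comparison for the 14 h idle / 4 h gaming /
# 6 h browsing-and-video split suggested above. Every wattage below is a
# placeholder, not a measured value for any specific system.

def daily_kwh(idle_w, gaming_w, browsing_w,
              idle_h=14, gaming_h=4, browsing_h=6):
    wh = idle_w * idle_h + gaming_w * gaming_h + browsing_w * browsing_h
    return wh / 1000

system_a = daily_kwh(idle_w=60, gaming_w=450, browsing_w=90)   # placeholder numbers
system_b = daily_kwh(idle_w=85, gaming_w=420, browsing_w=120)  # placeholder numbers

print(f"System A: {system_a:.2f} kWh/day, System B: {system_b:.2f} kWh/day")
print(f"Difference over a year: {(system_b - system_a) * 365:.0f} kWh")
```

With these made-up inputs the gap works out to roughly 0.4 kWh per day, or about 150 kWh per year, which shows why idle and light-load draw can matter more than peak gaming efficiency over a realistic usage mix.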
 

bit_user

The problem with all AMD hardware is their higher power consumption on standby. Be it CPUs or GPUs.
Not APUs.

Unless reviewers make a big deal out of AMD consuming more power while idling, they WILL NOT fix it.
AM5 was a new platform. Let's see how the next generation of chipsets and CPUs behave on it.

All those benchmarkers show you that AMD is more efficient when it comes to CPU processing
You're really off-topic, here. This thread is about the RX 7900 XTX vs. the RTX 4080 Super (i.e. dGPUs, not CPUs), in case you didn't notice.
 

vinay2070

You're really off-topic, here. This thread is about the RX 7900 XTX vs. the RTX 4080 Super (i.e. dGPUs, not CPUs), in case you didn't notice.
A little off-topic is fine; I am not talking politics here anyway. And I am making a request to Jarred to run some tests.

Also, AM5 or not, people are still buying it, and it needs to be exposed.
 

bit_user

Also, AM5 or not, people are still buying it, and it needs to be exposed.
It's hardly new or news. Certainly not worth thread-jacking, IMO.

It would be better to start a new thread, or at least bring it up in comments of a relevant CPU article (i.e. one involving AM5 CPUs).

However, at this point in the product cycle, it makes sense just to wait and see how the Zen 5 CPUs and boards fare. You're a couple years too late to influence those products. As for feedback influencing future generations of products, AMD is probably only going to be paying attention to how Zen 5 is received, at this point.
 