RTX 4080 Super vs RX 7900 XTX GPU faceoff: Battle for the high-end

Really, this is the same Radeon story we've seen over the past few generations. When there was a significant cost difference between Radeon and GeForce, basically the pre-RTX era, Radeon could be considered a true "value alternative." You suffered through perhaps late and/or dodgy driver releases and didn't have access to Nvidia-exclusive features, but the price difference was such that it was justified, except for the HD 6000 series, which were moved to legacy support before their warranties were up. But now, in the RTX era, AMD is pricing its cards too close to Nvidia. At the prices currently listed in the links ($999.99 for the 4080 Super, $915.99 for the 7900 XTX), AMD is roughly 8% less expensive yet scores about 10% lower in performance, and of course does not have access to Nvidia-exclusive features. This strategy is what has kept AMD's market share (per the Steam hardware survey) consistently around 15% for ages.

Imagine if the 7900 series were priced 25% lower than Nvidia's. Not only would we have a proper price war, but every single tech site would say the same thing: "Nvidia is faster, Nvidia has more features, Nvidia has better software, but the price premium over AMD just makes it impossible to recommend them over AMD unless you need those features."
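For reference, a minimal sketch of how those percentages fall out of the two listed prices (the ~10% performance gap is the figure claimed above, not something computed here):

```python
# Price gap between the listed cards (prices from the text above).
price_4080_super = 999.99
price_7900_xtx = 915.99

discount = (price_4080_super - price_7900_xtx) / price_4080_super * 100  # XTX relative to 4080S
premium = (price_4080_super / price_7900_xtx - 1) * 100                  # 4080S relative to XTX

print(f"7900 XTX is {discount:.1f}% cheaper than the 4080 Super")   # ~8.4%
print(f"4080 Super costs {premium:.1f}% more than the 7900 XTX")    # ~9.2%
```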
 
Reactions: valthuer

vinay2070 (Distinguished, joined Nov 27, 2011)

It's hardly new or news. Certainly not worth thread-jacking, IMO.

It would be better to start a new thread, or at least bring it up in comments of a relevant CPU article (i.e. one involving AM5 CPUs).

However, at this point in the product cycle, it makes sense just to wait and see how the Zen 5 CPUs and boards fare. You're a couple years too late to influence those products. As for feedback influencing future generations of products, AMD is probably only going to be paying attention to how Zen 5 is received, at this point.
Too late? Nothing's too late. Being a Patreon supporter, I have made this request directly to HUB and other tech reviewers. Nobody wants to run these tests. Even with the next gen out, people will still buy the cheaper B6xx and X6xx boards because they will be cheaper, etc. It's never too late to run some tests and set baseline benchmarks to compare against the next gen. And it's not about ME trying to influence anything; it's about having fair benchmarks out there regardless of fanboyism. I'm already on AM5 anyway.
 

vinay2070 (Distinguished, joined Nov 27, 2011)

Really, this is the same Radeon story we've seen over the past few generations. When there was a significant cost difference between Radeon and GeForce, basically the pre-RTX era, Radeon could be considered a true "value alternative." You suffered through perhaps late and/or dodgy driver releases and didn't have access to Nvidia-exclusive features, but the price difference was such that it was justified, except for the HD 6000 series, which were moved to legacy support before their warranties were up. But now, in the RTX era, AMD is pricing its cards too close to Nvidia. At the prices currently listed in the links ($999.99 for the 4080 Super, $915.99 for the 7900 XTX), AMD is roughly 8% less expensive yet scores about 10% lower in performance, and of course does not have access to Nvidia-exclusive features. This strategy is what has kept AMD's market share (per the Steam hardware survey) consistently around 15% for ages.

Imagine if the 7900 series were priced 25% lower than Nvidia's. Not only would we have a proper price war, but every single tech site would say the same thing: "Nvidia is faster, Nvidia has more features, Nvidia has better software, but the price premium over AMD just makes it impossible to recommend them over AMD unless you need those features."
I have been using AMD/Nvidia/Intel hardware as an opportunist and have built plenty of systems for friends and family using hardware from various companies. My first high-powered GPU was the Radeon VE, lol.

I think this time around AMD messed up big time. Their mantra is to give a 10% discount for a 30% reduction in features and expect the extra VRAM to make up for it. Unfortunately, with that strategy they will go nowhere, and they keep wondering why they are not making enough sales. I myself didn't want to upgrade from my GTX 1080 to either of these 4000- or 7000-series cards, seeing how much milking was going on, and I ended up with a used 3080 Ti for $500 a year and a half ago.

Others' mileage may vary, but gaming on a 4K 42" screen while sitting 60 cm away, I find FSR unusable in most games and DLSS barely noticeable versus native (HUB has a good video on that). So I always end up comparing Nvidia's DLSS performance to AMD's native performance, and AMD's 10% discount doesn't stand a chance; 25% off is where I'd consider AMD appropriate. People make a huge deal about VRAM, but most games out there run fine on 12GB. If it's an issue, just drop the texture setting a bit and you still get better FPS and visual quality than AMD cards with FSR. I was recently watching a video on how the 3080 Ti with DLSS and RT absolutely destroys a 7900 XTX in Alan Wake 2 at 4K.

AMD is definitely improving on both fronts, but despite better business and income, they are not putting enough effort into improving their software/features side. I hope AMD comes up with something better on the FSR front next gen; either way, it will be a good 2.5 years before I upgrade to anything AMD, even if I wanted to jump ship.
 
Reactions: 35below0
Please stop with the propaganda; we already know the results...

[Attached image: 4K.png]
 

NickyB (Distinguished, joined Nov 14, 2008)

I'll stick with my 4080 Super, thanks for letting me know. And next series from NV, I am going with a 5090 if it isn't outrageous in cost. I am not even considering AMD at all for my video card. Never have. I get 4K 120 fps with DLAA and DLSS at max settings. On my LG C2, I have zero need for an AMD card. Get whatever you like; I stick with Nvidia. Go buy donkey joe brand for a video card if that makes you happy. AMD's best card is competing with NV's second best. Not sure how that is fair, but OK. NV drivers just work. I have zero problems. Not sure what this post is supposed to do. Get people to like AMD video cards?

Sorry.
 

Notton (Commendable, joined Dec 29, 2023)

I have a 4070 Ti (I got it for free; I would never have bought it myself).
It only has 12GB of VRAM, but I never saw usage spike to more than 8GB in the games I was playing.
Those games were Darktide, Helldivers 2, and Palworld.
 

Silas Sanchez (Proper, joined Feb 2, 2024)

The problem with AMD CPUs and GPUs is their higher power consumption at idle, and nobody is making enough noise about this. All those benchmarkers show you that AMD is more efficient when it comes to CPU processing, but they don't tell you that on an average day, AMD would have consumed more power due to higher idling watts.

@JarredWaltonGPU Is it possible for you to make an Intel vs AMD CPU comparison with a power meter connected that will show watt-hours consumed over a typical day of web browsing, gaming, and idling? Let's say 24 hours: 14 hours idling, 4 hours gaming, and 6 hours web surfing and YouTube playback, with Nvidia GPUs installed (to take GPU efficiency out of the equation, as we know AMD GPUs are worse). Or better, use iGPUs for each when not gaming.
My take on this is that if you're concerned with saving all that extra energy, you should not buy such hardware. To be honest, I don't really buy it; I don't think the savings make enough of a difference in a world where everything is so expensive and going up and up. If I was going to game, for example, it would be on a console: less electricity cost and heat produced. As well, GPU prices have destroyed PC gaming for me.

My 7950X with 40GB of RAM in use (Brave browser with many tabs, a heavy game paused in the background, many miscellaneous things open, playing a 4K H.265 60 fps video) consumes around 38-44 watts. Compare that to my ThinkPad CPU, which is something like 10 watts, IIRC.

Doing some quick and dirty math with my usage:
The PC is on every day for 12+ hours and sleeps 10+ hours. During those 12 hours the CPU idles at 25-30 watts for 50% of the time (not being used) and draws 35-44 watts the other 50% while multitasking. If I could somehow get that idle figure of 25-30 watts down to 6-10, I'd save around 2,500-3,000 Wh a week. That's worth as much as running my small AC for 5 hours a week, so not much of a huge impact, to be honest.
Sure, it's something, but then again, all I have to do is keep more lights off, and sleeping my PC during those breaks would, for me, undermine the advantage of the fabled 7-watt CPU.
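A minimal sketch of that back-of-the-envelope calculation, using only the figures stated above (12 on-hours a day, half of them idle at 25-30 W, and a hypothetical 6-10 W idle target); the result depends heavily on whether you count just the idle-hours delta or treat the whole baseline draw as avoidable:

```python
# Weekly energy math from the usage pattern described above. The 6-10 W
# target idle is the hypothetical "fixed" figure, not a measurement.
on_hours_per_day = 12
idle_share = 0.5                 # half the on-time is spent idling
idle_watts = (25, 30)            # current idle draw
target_idle_watts = (6, 10)      # hoped-for idle draw
days_per_week = 7

idle_hours_per_week = on_hours_per_day * idle_share * days_per_week   # 42 h

# Savings if only the idle hours improve (delta between current and target idle):
save_lo = (idle_watts[0] - target_idle_watts[1]) * idle_hours_per_week
save_hi = (idle_watts[1] - target_idle_watts[0]) * idle_hours_per_week
print(f"Idle-delta savings: {save_lo:.0f}-{save_hi:.0f} Wh/week")      # ~630-1,010 Wh

# Upper bound if the entire 25-30 W baseline were eliminated across all on-hours,
# which is roughly where a 2,500-3,000 Wh/week figure comes from:
total_lo = idle_watts[0] * on_hours_per_day * days_per_week
total_hi = idle_watts[1] * on_hours_per_day * days_per_week
print(f"Whole-baseline upper bound: {total_lo}-{total_hi} Wh/week")    # 2,100-2,520 Wh
```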

Now, when on battery power, every single watt is vital, which is why, if I'm on battery, I use a laptop for my desktop work and sleep my machine every chance I get.

My 4080 Super gets 25-27 fps in Portal RTX at 1600p, no DLSS, all native, loaded to 99% and drawing 310+ watts. Turn on DLSS and the difference in image quality is quite obvious: DLSS makes it somewhat blurry, and the reflections on the dimpled textures look unrealistic. Overall it was bad enough that I actually preferred the slow frame rate. It makes me think I might as well go console gaming. It also used 12GB of VRAM!
Since work and energy are fundamental, intimately linked concepts of the physical world, since shrinking transistors doesn't give as big a performance gain as it used to, and since games have stagnated so badly (they don't have path tracing or those extreme Unreal Engine 5 tech-demo graphics that make Crysis look like N64), we must accept that we will soon be drawing well over a kilowatt just to game; either that, or they force that DLSS nonsense on us because we're "damaging the environment." The PS5 is said to draw only 200 watts, which sounds efficient, right? But it also only does about a quarter of the work a high-end PC does. Even the 5090 won't stand much chance of doing 4K path tracing with the true next-gen graphics that Unreal Engine 5 is already capable of.
 

mhmarefat (Distinguished, joined Jun 9, 2013)

My 4080 Super gets 25-27 fps in Portal RTX at 1600p, no DLSS, all native, loaded to 99% and drawing 310+ watts. Turn on DLSS and the difference in image quality is quite obvious: DLSS makes it somewhat blurry, and the reflections on the dimpled textures look unrealistic. Overall it was bad enough that I actually preferred the slow frame rate. It makes me think I might as well go console gaming. It also used 12GB of VRAM!
It is INSANE how Nvidia has gotten away with "if you want RT you must choose Nvidia," yet even in games that are specifically optimized for Nvidia hardware, turn on RT and you lose a HUGE amount of performance, bringing you back to decades-ago standards (30 to 60 FPS). Turn on "real RT" (path tracing) and you go back to decades-ago resolution (1080p) as well if you want smooth FPS! (And certainly you can tell the difference between RT on and off to justify 450 (!) watts of energy consumption... right?) And this comes from GPUs that cost $1K to $2K! Wow!
 
@JarredWaltonGPU Is it possible for you to make an Intel vs AMD CPU comparison with a power meter connected that will show watt-hours consumed over a typical day of web browsing, gaming, and idling? Let's say 24 hours: 14 hours idling, 4 hours gaming, and 6 hours web surfing and YouTube playback, with Nvidia GPUs installed (to take GPU efficiency out of the equation, as we know AMD GPUs are worse). Or better, use iGPUs for each when not gaming.
I don't have the hardware for testing CPUs and collecting power data — Paul's the CPU guy. Plus you're linking to laptop videos that are an entirely different ballgame. I definitely don't do much with the mobile side of testing these days.

You'd need to be very sure that the laptops are equal (as much as possible) on the hardware configurations, including screens, storage, memory, battery, etc. More importantly, you'd need two laptops that have equivalently tuned firmware, which is almost impossible to prove without testing lots of laptops to determine what "tuned" should look like. Grabbing two Clevo or Asus or whatever laptops with the same chassis isn't enough to do this, basically.

And then with all of that, you need performance comparisons, running both plugged in and unplugged. The AC results should show how high performance can get in "ideal" circumstances, while DC work would show what sort of loss in performance you get by unplugging. With battery power, you'd also be looking at how long the laptops last unplugged while delivering whatever level of performance you get — and it's entirely possible to tune for higher unplugged performance with worse battery life, or vice versa.

Nine years ago (and before), when I was doing primarily laptop reviews at AnandTech, this is something I might have tried to do. Today, though, it's "Somebody Else's Problem." LOL
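For what it's worth, if such a comparison were ever run, the headline metrics are easy to derive from a few numbers per laptop; here is a minimal sketch with purely hypothetical inputs (none of these values come from any review):

```python
# Plugged-in vs on-battery summary for one hypothetical laptop. All inputs are
# made-up placeholders to illustrate the tradeoff described above.
ac_score = 9200              # benchmark score on wall power
dc_score = 7100              # same benchmark on battery
battery_wh = 75.0            # battery capacity in watt-hours
runtime_hours = 1.6          # battery life while running that workload

retention_pct = dc_score / ac_score * 100     # performance kept when unplugged
avg_draw_w = battery_wh / runtime_hours       # rough average system draw on battery
points_per_watt = dc_score / avg_draw_w       # captures the perf-vs-battery-life tuning tradeoff

print(f"Unplugged retention: {retention_pct:.0f}% of AC performance")
print(f"Average battery draw: {avg_draw_w:.0f} W, {points_per_watt:.0f} points per watt")
```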
 
Reactions: bit_user

vinay2070 (Distinguished, joined Nov 27, 2011)

I don't have the hardware for testing CPUs and collecting power data — Paul's the CPU guy. Plus you're linking to laptop videos that are an entirely different ballgame. I definitely don't do much with the mobile side of testing these days.

You'd need to be very sure that the laptops are equal (as much as possible) on the hardware configurations, including screens, storage, memory, battery, etc. More importantly, you'd need two laptops that have equivalently tuned firmware, which is almost impossible to prove without testing lots of laptops to determine what "tuned" should look like. Grabbing two Clevo or Asus or whatever laptops with the same chassis isn't enough to do this, basically.

And then with all of that, you need performance comparisons, running both plugged in and unplugged. The AC results should show how high performance can get in "ideal" circumstances, while DC work would show what sort of loss in performance you get by unplugging. With battery power, you'd also be looking at how long the laptops last unplugged while delivering whatever level of performance you get — and it's entirely possible to tune for higher unplugged performance with worse battery life, or vice versa.

Nine years ago (and before), when I was doing primarily laptop reviews at AnandTech, this is something I might have tried to do. Today, though, it's "Somebody Else's Problem." LOL
Thanks for getting back. And I didn't mean to say use laptops for the comparison; I meant desktop CPUs. I only gave laptops as an example to show that, over a long period of usage, Intel might actually be consuming less power due to AMD always idling at higher watts, throwing all the efficiency claims out of the window. A cheap meter with energy monitoring should do - https://www.amazon.com.au/TP-Link-Tapo-Smart-Wi-Fi-Socket/dp/B08SJ7MLRR/

And you don't even need to do it for 24 hours. You can proportionally reduce the usage to 2 hours by gaming for 20 minutes, browsing/YouTubing for 40 minutes, and idling for 60 minutes, and then calculate the watt-hours consumed for the whole day.

I would have done it myself, but I don't have an Intel system to compare :)
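The scaling itself is just proportional arithmetic; a minimal sketch, with placeholder Wh readings standing in for whatever the smart plug actually logs (the 4/6/14-hour split is the daily pattern proposed earlier):

```python
# Extrapolate a 2-hour metered session (20 min gaming, 40 min browsing,
# 60 min idle) to a full 24-hour day of 4 h gaming, 6 h browsing/YouTube,
# and 14 h idle. The Wh figures below are hypothetical placeholders.
segments = {
    # activity: (minutes measured, Wh logged by the plug, hours per day)
    "gaming":   (20, 60.0, 4),
    "browsing": (40, 30.0, 6),
    "idle":     (60, 28.0, 14),
}

daily_wh = 0.0
for name, (minutes, wh_logged, hours_per_day) in segments.items():
    avg_watts = wh_logged / (minutes / 60)      # average draw during the segment
    daily_wh += avg_watts * hours_per_day       # scale to the full day
    print(f"{name}: ~{avg_watts:.0f} W average")

print(f"Estimated consumption for a whole day: {daily_wh:.0f} Wh")
# Repeat on the Intel and AMD builds with identical workloads and compare totals.
```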
 

vinay2070 (Distinguished, joined Nov 27, 2011)

Doing some quick and dirty math with my usage:
The PC is on every day for 12+ hours and sleeps 10+ hours. During those 12 hours the CPU idles at 25-30 watts for 50% of the time (not being used) and draws 35-44 watts the other 50% while multitasking. If I could somehow get that idle figure of 25-30 watts down to 6-10, I'd save around 2,500-3,000 Wh a week. That's worth as much as running my small AC for 5 hours a week, so not much of a huge impact, to be honest.
Sure, it's something, but then again, all I have to do is keep more lights off, and sleeping my PC during those breaks would, for me, undermine the advantage of the fabled 7-watt CPU.
My whole request to benchmark power on both systems is not about saving money on electricity with one system over the other; I'm curious to see how much of a difference it makes in the long run. Go to any YouTube video and read the comments: everyone is praising AMD for efficiency. Why? Because every reviewer shows they consume less power under load. What if, in the long run, I mean over a 24-hour period, Intel is actually more efficient? We don't know, because nobody is doing these kinds of tests. And if it's true that AMD's total power consumption over a day is the same as or more than Intel's, it would force them to push a BIOS/firmware fix (just like they did for GPUs with multi-monitor setups), which would be awesome. Now multiply the power savings you calculated over millions of systems receiving these patches.

This is when my curiosity hit the roof.

View: https://youtu.be/bWOErOr7INg?si=v2lUV3K78edIwTBD&t=806
 
Thanks for getting back. And I didn't mean to say use laptops for the comparison; I meant desktop CPUs. I only gave laptops as an example to show that, over a long period of usage, Intel might actually be consuming less power due to AMD always idling at higher watts, throwing all the efficiency claims out of the window. A cheap meter with energy monitoring should do - https://www.amazon.com.au/TP-Link-Tapo-Smart-Wi-Fi-Socket/dp/B08SJ7MLRR/

And you don't even need to do it for 24 hours. You can proportionally reduce the usage to 2 hours by gaming for 20 minutes, browsing/YouTubing for 40 minutes, and idling for 60 minutes, and then calculate the watt-hours consumed for the whole day.

I would have done it myself, but I don't have an Intel system to compare :)
Like I said: Paul is the CPU guy. I have a few systems, yes, but mostly only one CPU on each. So I have 8700K, 9900K, 12900K, 13900K, and 7950X. (I've borrowed a couple of CPUs from Paul for an upcoming article, though!) He has all the stuff that would be necessary. I'm not sure what more you want done than what he's already doing in CPU reviews, though, as basically every test you add ends up adding that times 50 or whatever to our CPU/GPU hierarchies. Something that takes a few hours per CPU to run would be terrible in my book. Like, that's a short dead end road to reviewer burnout.