AMD Radeon RX 7900 XTX and XT Review: Shooting for the Top


systemBuilder_49

Distinguished
Dec 9, 2010
58
15
18,545
I want to say how utterly disappointing this release is for AMD. In general this generation for both AMD and Nvidia is underwhelming. For starters, everyone is over-charging for the performance given. The 7900XT(X) should have been the 7800XT(X) from what I am seeing in the benchmarks.
I have to agree that AMD's cards were mis-named, BUT theoretically their cards have WAY MORE POWER than the 4080. And as to the mis-naming - who knew that Nvidia would make a 4090 card with FIFTY percent more power than the 4080? That's crazy!

AMD should have reacted by renaming their cards the 7800 XT and 7850 XT. But they have BY FAR the best GFLOPS/$ of anything ever built: 60 for the 7900 XTX, 58 for the 7900 XT, vs a paltry 49 for the 4090 and 40 for the 4080. No other GPU is above the mid-30s per dollar.
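A quick back-of-the-envelope sketch of that GFLOPS-per-dollar math, using one commonly quoted set of peak FP32 TFLOPS figures and US launch MSRPs (my assumptions, not necessarily the poster's exact inputs); the results shift a few points depending on which clock figure you plug in, but the ordering holds:

```python
# Rough GFLOPS-per-dollar comparison from peak FP32 TFLOPS and launch MSRPs.
# The spec and price figures below are commonly quoted values, used here only
# to illustrate the calculation.
cards = {
    "RX 7900 XTX": (61.4, 999),   # (peak TFLOPS, USD MSRP)
    "RX 7900 XT":  (51.6, 899),
    "RTX 4090":    (82.6, 1599),
    "RTX 4080":    (48.7, 1199),
}

for name, (tflops, price) in cards.items():
    gflops_per_dollar = tflops * 1000 / price
    print(f"{name}: {gflops_per_dollar:.0f} GFLOPS per dollar")
```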

So I am very happy that people don't want to buy them, because that makes it easy for me to buy them, and I bought a 7900 XT today, happily, at MSRP. The chance that the 7900 XT gets 10% faster over time and exceeds the 4080 is high. The two cards have 10% and 30% higher peak TFLOPS than the 4080. The ray tracing of the XTX is 3090 Ti level and of the XT is 3080 Ti level, which is good enough for me.

I like to call ray tracing "shooting at puddles" because in multiplayer games it is virtually useless: even if you see an enemy in a puddle, the puddle will not support a ricochet shot, so what good is it? And would you rather play in the rain or in the sunshine?
 
Last edited:

AgentBirdnest

Respectable
Jun 8, 2022
178
152
2,070
I don't think so. I mean, new architecture with 67% more raw bandwidth, supposedly even more if you look at the total Infinity Cache bandwidth, and theoretically 160% more compute. And in practice it performs as if a lot of that bandwidth and compute isn't realized in the real world. Having to go over the extra Infinity Fabric to get to L3 cache could be a big part of this.
[snip]
It's also worth looking at relative chip size and performance. AMD has a 300mm^2 GCD plus 220mm^2 of MCDs. Some of that size is due to the Infinity Fabric linking the chiplets together. Nvidia meanwhile has a 379mm^2 die that has a lot of extra features (DLSS and DXR stuff). I'd wager the RTX 4080 actually costs less to manufacture than Navi 31, factoring in everything.

AMD is going to need to prove with RDNA 4 that chiplet GPUs can continue to scale to even higher levels of performance without sacrificing features. They certainly haven't done that with RDNA 3. A monolithic Navi 31 without chiplets probably would have been in the 400mm^2 range and offered even higher performance. That's just my rough estimate, and we can't know for certain, but I'd love to know how much AMD actually saved at the end of the day by doing chiplets.
I love your number crunching. :) Gotta give you proper props for providing info and estimates like that - I love this stuff.
It's really interesting, and makes you wonder what could have been with a monolithic die, and why use chiplets if it's not cheaper. (I'd guess they're expecting it to get cheaper with the next gen, or something?)
It'd be interesting to see someone do a deep-dive story on this topic.
 
  • Like
Reactions: JarredWaltonGPU

mjbn1977

Distinguished
I have to agree that AMD's cards were mis-named BUT theoretically their cards have WAY MORE POWER than the 4080

Please elaborate... do you mean power in terms of power limit (in that case it's not a good thing, since the 4080 is delivering the same performance for much less power)? Or do you mean power in terms of performance? In that case:

  1. The AMD cards use more power. It almost looks like they were supposed to be slower and (again) the power limit was pushed to the brink at the last minute in order to compete with the 4080. In non-RTX titles the 4080 runs at an average of 230 watts and in RTX titles around 300 to 310 watts. The AMD cards always run at their MAX power limit and use more power.
  2. Depending on the review and the games chosen, the XTX is on the same level as or up to about 3% faster than the 4080 in rasterization. The XT is quite a bit slower than the 4080. Wouldn't call that way more power. The 4080 also runs cooler on average.
  3. The 4080 definitely has more RTX "power" than the new AMD cards. Depending on the game, the difference between the XTX and the 4080 is up to 33% (Cyberpunk 2077), even though it got much better and the XTX is very RT capable...
  4. The 4080 has more game-ready driver support, and driver support is generally better. AMD's Adrenalin got better (and the driver name sounds cooler).
  5. Upscaling is better with DLSS 2 compared to FSR 2. But it's close.
  6. Nvidia has frame generation, AMD will have frame generation some time in the unspecified future.

Overall the 4080 is the better product, but AMD has the better price. Still, saying AMD's new cards have more power than the 4080 is just wrong, unless you mean power limit.

The only thing AMD has going for it with the new cards is the price (and maybe the smaller size of the card). But price aside, I don't see an argument for why AMD would have the better product...
 
  • Like
Reactions: Why_Me

mjbn1977

Distinguished
The chance that the 7900 XT gets 10% faster over time and exceeds the 4080 is high. The two cards have 10% and 30% higher peak TFLOPS than the 4080. The ray tracing of the XTX is 3090 Ti level and of the XT is 3080 Ti level, which is good enough for me.

TFLOPS never really translate 1-to-1 into gaming performance. I agree that they might get better drivers over time, but more so for the games they don't do well in at all right now. There are games the 7900s do really badly in (probably lacking driver support).

Because the cards already operate at their absolute max in terms of power, I don't think much more performance will be tweaked out of them.

I don't say they are bad cards at all. Just saying they have way more power than the 4080 is misleading; they don't have way more power. The XTX is trading blows with the 4080 in rasterization: sometimes the 4080 wins, sometimes the XTX wins. It's a wash because we're talking 10 to 15 fps at 200 fps in 1440p... what's the difference? But the 4080 is using less power to do that. And if you don't care about ray tracing eye candy, then the XTX or XT might be the better deal.

The only real advantage AMD has right now is the price. But that is only true for the XTX, the XT is too expensive....15% less performance for 10% less price....

All those cards are so freakin' fast... but expensive too. Everything over $500 is expensive and we're buying this stuff anyway... I call it Geek Tax...
 
Last edited:

TJ Hooker

Titan
Ambassador
I don't think so. I mean, new architecture with 67% more raw bandwidth, supposedly even more if you look at the total Infinity Cache bandwidth, and theoretically 160% more compute. And in practice it performs as if a lot of that bandwidth and compute isn't realized in the real world. Having to go over the extra Infinity Fabric to get to L3 cache could be a big part of this.

Put another way:
RX 6950 XT has 59% of the theoretical compute of the RTX 3090 Ti and 57% of the raw bandwidth. At 4K (rasterization) on the updated testbed, it delivers 88% of the performance.
RX 7900 XTX has 26% more theoretical compute than the RTX 4080 and 34% more raw bandwidth. At 4K (rasterization), it delivers 4% more performance.
So due to architectural changes plus chiplets, AMD has gone from delivering close to Nvidia performance with substantially lower paper specs, to now needing more paper specs to deliver comparable performance.
But how much of this comes from AMD doubling the FPUs per shader core, which doubles the theoretical FP32 performance but doesn't provide the same level of performance improvement in games? We saw the same thing from Nvidia with Ampere. E.g. the 3090 Ti had ~3x the compute of 2080 Ti, but only performed ~55% faster. So Nvidia took a hit to gaming performance per FLOP from 20 to 30 series, and now AMD is experiencing the same thing.
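To put a number on that last point, here is a tiny sketch (using only the percentages quoted above) of gaming performance delivered per unit of theoretical compute, relative to the Nvidia competitor in each generation:

```python
# Relative gaming performance divided by relative paper compute, versus the
# Nvidia competitor. The ratios are the ones quoted in the post above.
matchups = {
    "RX 6950 XT vs RTX 3090 Ti": {"rel_compute": 0.59, "rel_perf": 0.88},
    "RX 7900 XTX vs RTX 4080":   {"rel_compute": 1.26, "rel_perf": 1.04},
}

for name, m in matchups.items():
    perf_per_flop = m["rel_perf"] / m["rel_compute"]
    print(f"{name}: {perf_per_flop:.2f}x the gaming perf per paper TFLOP")

# Roughly 1.49x for RDNA 2 vs. roughly 0.83x for RDNA 3: the per-TFLOP
# advantage over Nvidia flipped, which is what the dual-issue FP32 change
# (and Ampere before it) would predict.
```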
 
Dec 14, 2022
1
0
10
Thanks for the review!
One way to check AI performance on Windows is DirectML. DirectML supports both Nvidia and AMD GPUs, though I haven't checked whether the 7900 is supported. Some sample models are available on GitHub, and it would be very helpful to see the throughput of those sample models on the 7900 XTX.
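For anyone who wants to try this, here is a minimal sketch of the kind of throughput check meant here, assuming the onnxruntime-directml package and a single-input sample model saved as model.onnx (a placeholder name, e.g. one of the ONNX model zoo classifiers); whether the DirectML provider actually picks up the 7900 XTX is exactly the open question:

```python
# Rough DirectML inference throughput check via ONNX Runtime.
# Requires: pip install onnxruntime-directml numpy
# "model.onnx" is a placeholder for a single-input sample model from GitHub.
import time
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("model.onnx", providers=["DmlExecutionProvider"])
inp = sess.get_inputs()[0]

# Build a random input matching the model's shape; dynamic dims (e.g. batch)
# are replaced with 1. Assumes a float32 input.
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
data = np.random.rand(*shape).astype(np.float32)

# Warm up, then time a batch of runs.
for _ in range(5):
    sess.run(None, {inp.name: data})

runs = 100
start = time.perf_counter()
for _ in range(runs):
    sess.run(None, {inp.name: data})
elapsed = time.perf_counter() - start
print(f"{runs / elapsed:.1f} inferences/sec via {sess.get_providers()[0]}")
```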
 

JarredWaltonGPU

Senior GPU Editor
Editor
But how much of this comes from AMD doubling the FPUs per shader core, which doubles the theoretical FP32 performance but doesn't provide the same level of performance improvement in games? We saw the same thing from Nvidia with Ampere. E.g. the 3090 Ti had ~3x the compute of 2080 Ti, but only performed ~55% faster. So Nvidia took a hit to gaming performance per FLOP from 20 to 30 series, and now AMD is experiencing the same thing.
Well, that's the thing. Nvidia, if I'm not mistaken, got a bigger relative boost than AMD. I mean, here's the takeaway comparing the RTX 3080 and RTX 2080 Ti versus the RX 7900 XTX and RX 6900 XT.

3080: 29.8 teraflops, 760 GB/s
2080 Ti: 13.4 teraflops, 616 GB/s
3080 offered 122% more compute and 23% more bandwidth
3080 is 39% faster in 4K rasterization, 47% faster in 1440p ray tracing

7900 XTX: 61.4 teraflops, 960 GB/s
6900 XT: 23.0 teraflops, 512 GB/s
7900 XTX offered 167% more compute and 88% more bandwidth
7900 XTX is 46% faster in 4K rasterization, 51% faster in 1440p ray tracing

That means, as a ratio, Nvidia Ampere vs. Turing improved by 0.32% for every theoretical 1% teraflops increase, and 1.7% for every 1% increase in bandwidth. RDNA 3 vs. RDNA 2 meanwhile improved by 0.28% for every 1% teraflops increase, and 0.52% for every 1% memory increase.

So the teraflops scaling is at least relatively close, but the bandwidth scaling greatly favors Nvidia. Which would make sense, as the chiplet stuff would have the biggest impact on memory latency and bandwidth, I would think.
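Here is the same ratio math as a small sketch, so it is easy to reproduce or extend to other cards (specs and uplift percentages are the ones from this post, with the 7900 XTX / 6900 XT figures as listed above):

```python
# Gaming performance gained per 1% of extra paper spec, generation over
# generation. Specs are (peak TFLOPS, memory bandwidth in GB/s); the raster
# gains are the 4K uplifts quoted above.
gens = {
    "Ampere vs Turing (3080 vs 2080 Ti)": {
        "new": (29.8, 760), "old": (13.4, 616), "raster_gain_pct": 39,
    },
    "RDNA 3 vs RDNA 2 (7900 XTX vs 6900 XT)": {
        "new": (61.4, 960), "old": (23.0, 512), "raster_gain_pct": 46,
    },
}

for name, g in gens.items():
    tflops_gain = (g["new"][0] / g["old"][0] - 1) * 100  # % more compute
    bw_gain = (g["new"][1] / g["old"][1] - 1) * 100      # % more bandwidth
    print(name)
    print(f"  {g['raster_gain_pct'] / tflops_gain:.2f}% perf per 1% more teraflops")
    print(f"  {g['raster_gain_pct'] / bw_gain:.2f}% perf per 1% more bandwidth")
```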
 

systemBuilder_49

Distinguished
Dec 9, 2010
58
15
18,545
I feel like the 7900 XT is not a bad card. If you compare it to the 3080 at a recent MSRP ($830), it's a great deal. No other Nvidia cards sell for less than $1000. So if your peak budget is $1000, the 7900 XT is only competing with the 6950 XT. But I wanted something a little better than the 6950 XT: better codecs, better ray tracing, more power efficient, better for 4K especially. The 7900 XT ticks all those boxes.
 

Colif

Win 11 Master
Moderator
I pre-ordered the PowerColor Red Devil RX 7900 XT version; it's good enough for me. I might even get it next week, but I need a bigger PSU before I can safely use it. It also depends on what you want out of the card; I wasn't really concerned about 4K performance or RT.

I know you have to compare against the previous gen, but the uplift from earlier generations is where most of the sales come from. And these are still better than those. People with a 30 series or a 6800 or above shouldn't be looking yet.

Wanting 4090 performance for $600 less is just a pipe dream. If AMD had that, it would have charged for it.

The XT may have had a better reception if it wasn't announced at the same time. And, well, the price was lower. The $100 difference translated to about $150 here after conversions and tax. If the XTX had been in stores longer then I may have bought one. But the XT was in the price range I had expected it to be.
 
  • Like
Reactions: RodroX and COLGeek
I pre-ordered the PowerColor Red Devil RX 7900 XT version; it's good enough for me. I might even get it next week, but I need a bigger PSU before I can safely use it. It also depends on what you want out of the card; I wasn't really concerned about 4K performance or RT.

I know you have to compare against the previous gen, but the uplift from earlier generations is where most of the sales come from. And these are still better than those. People with a 30 series or a 6800 or above shouldn't be looking yet.

Wanting 4090 performance for $600 less is just a pipe dream. If AMD had that, it would have charged for it.

The XT may have had a better reception if it wasn't announced at the same time. And, well, the price was lower. The $100 difference translated to about $150 here after conversions and tax. If the XTX had been in stores longer then I may have bought one. But the XT was in the price range I had expected it to be.
Actually... I've been looking at more AIB models, and particularly the Asus and XFX ones covered so far on TPU can actually get SUPER close to the 4090 in overall raster performance with their OC modes. Sure, the power consumption goes up as expected, but holy damn they close the gap while still costing less money than the 4080 16GB. I'd say that is impressive.

Especially the XFX one, which is about the same size as the reference card and can OC really well.

The after day 1 impressions are making the XTX look way better, but I still have serious doubts about the XT. Sorry Colif xD

Regards.
 

Colif

Win 11 Master
Moderator
The after day 1 impressions are making the XTX look way better, but I still have serious doubts about the XT. Sorry Colif xD

It depends what you're coming from as to what you want; an XT wouldn't be much of a step up from your card, but it is from a 2070 Super. Waiting for a comparison, but it's likely over 100% better given the XTX is 131% better. Don't read this review if you want to stay sane.
I have yet to buy a top-range card; the pattern continues.
 
It depends what you're coming from as to what you want; an XT wouldn't be much of a step up from your card, but it is from a 2070 Super. Waiting for a comparison, but it's likely over 100% better given the XTX is 131% better. Don't read this review if you want to stay sane.
I have yet to buy a top-range card; the pattern continues.
That's a fair point. Whatever you get to replace the 2070S is a nice upgrade for sure.

As I said in the Discord, I hope you don't have teething pains with the AMD drivers >_<

Regards.
 
My last 3 cards have been Nvidia; it helps to try something else occasionally or Nvidia takes you for granted and charges $1600 for a card... oh wait.

Guess I'll find out about drivers then. It'll give me useful experience for fixing their problems, I guess.
Haha, that made me chuckle.

Well, you will encounter gremlins for sure. My Vega 64 has been problem- and hassle-free for the most part, except the random crash here and there due to power limits. AMD still can't figure out how not to screw up power limits via drivers vs vBIOS; at least it doesn't burn down like nVidia cards, LOL. My 6900XT has been spotless, except for some weird driver bugs here and there with some games, but they have been fixed with releases over the months. The only glaring one was in VRChat, where they introduced a bug back in May and it was patched just last month (video playback; quite an infamous bug). And my 6800M (laptop) had a bug with Genshin Impact for about the same time as the 6900XT where, in a particular area of the game, I would just crash to desktop immediately. They introduced this in June and it was fixed in Nov as well.

That is to prepare yourself, Colif, haha. You may have to get used to moving back and forth between driver releases in the very, very worst case. That being said, AMD has made it very painless to do that, since they, as I mentioned in another post, are very self-aware that their drivers will have bugs. You can downgrade the drivers without uninstalling and it's great; it would be better if there were no bugs, but hey.

Regards :p
 

Colif

Win 11 Master
Moderator
I looked at the wrong specs - I saw 6800M and didn't realise it wasn't a desktop card. Upgrading from a 6900XT would be a waste of money; wait till next generation.
I see more people with problems with Nvidia drivers than AMD, but then there are more people with those cards. ATI lives on as the name of their driver files, if nothing else - Atikmdag stands for ATI Radeon Kernel Mode Driver Package.
 
Last edited:
I looked at the wrong specs - I saw 6800M and didn't realise it wasn't a desktop card. Upgrading from a 6900XT would be a waste of money; wait till next generation.
I see more people with problems with Nvidia drivers than AMD, but then there are more people with those cards. ATI lives on as the name of their driver files, if nothing else.
Yeah. RDNA2 also made its way to laptops, and they're decent mobile GPUs as well. AMD called it the "6800M", but it uses the same die as the 6700 XT and has the same memory config, just at a lower TDP. It performs slightly under it in pure GPU grunt tasks, and slower in overall "PC" (CPU- and GPU-demanding) tasks.

As for upgrading, it'll depend on the RDNA3+ rumours. If AMD goes ahead with a mid-cycle refresh, I may just wait for that, but otherwise, as you say, I'm better off waiting until RDNA4. That is, unless Valve releases their new HMD (VR headset) this year, which would be bonkers, and then I'll need something better than the 6900XT.

Regards.
 

test_123

Distinguished
Feb 22, 2012
97
2
18,535
I was thinking of getting the 7900
Considering how much AMD slammed Nvidia about the 12-pin adaptor and power draw in their November presentation, the rasterization power draw numbers of the RX 7000 cards made me LOL... come on AMD... they draw over 100 watts more than the 4080 for the same or less performance. No issue with performance, which seems pretty decent on the 7000-series cards all things considered (even though RT lags behind), but AMD sounded like "Don't buy the power hungry RTX 4000 series cards, get ours, we don't even need a weird 12-pin power connector"... LOL...
Exactly what I was thinking!!!! The 7900 XT recommends a 750W power supply while the 4080 recommends an 850W one. However, the 7900 XT has a 20W higher peak power usage than the 4080!!!!!!
 

Colif

Win 11 Master
Moderator
Some XTs need more than 750 watts.
So far the ASRock Taichi and Gigabyte Radeon RX 7900 XT GAMING OC 20G both need 800,
and the PowerColor Red Devil needs 850.

I expect the Sapphire Nitro+ does as well, but their website isn't making it obvious; it just shows TBP instead. I expect it's 850.

No idea about reference cards and sag. It's a little early to know.
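For what it's worth, those vendor numbers land roughly where a common rule-of-thumb sizing would put them; the sketch below is only an illustration, with the transient allowance and rest-of-system figure being my assumptions rather than anything the board partners publish:

```python
# Rough PSU sizing rule of thumb: card board power plus transient headroom,
# plus the rest of the system, rounded up to the next 50 W step. The 1.5x
# transient factor and 150 W rest-of-system figure are assumptions for
# illustration only.
import math

def recommended_psu(card_power_w, rest_of_system_w=150, transient_factor=1.5):
    peak = card_power_w * transient_factor + rest_of_system_w
    return math.ceil(peak / 50) * 50

print(recommended_psu(355))  # ~355 W card (reference XTX TBP) -> 700 W class
print(recommended_psu(420))  # a heavily overclocked AIB card  -> 800 W class
```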
 

mjbn1977

Distinguished
I was thinking of getting the 7900

Exactly what I was thinking!!!! The 7900 XT recommends a 750W power supply while the 4080 recommends an 850W one. However, the 7900 XT has a 20W higher peak power usage than the 4080!!!!!!

No, Nvidia recommends 750 watts for the 4080. Some board partners recommend 850 watts.

My 4080 also draws much less power in games without raytracing. In games like AC: Valhalla or Kingdom Come: Deliverance the power averages around 230 watts... only when I play games with all the bells and whistles, with raytracing and DLSS (Cyberpunk), do I see power between 300 and 310 watts... and that is with the GPU clocking at 1820MHz...

From what I saw in benchmark runs in videos and reviews, the 7900 series cards always max out the power limit, no matter whether with or without raytracing or DLSS...
 

Colif

Win 11 Master
Moderator
power usage seems to be something they need to fix on 7900, maybe in drivers. I don't know. I know there is a bug where idle usage changes depending on which display you attach.
 

test_123

Distinguished
Feb 22, 2012
97
2
18,535
No, Nvidia recommends 750 watts for the 4080. Some board partners recommend 850 watts.

My 4080 also draws much less power in games without raytracing. In games like AC: Valhalla or Kingdom Come: Deliverance the power averages around 230 watts... only when I play games with all the bells and whistles, with raytracing and DLSS (Cyberpunk), do I see power between 300 and 310 watts... and that is with the GPU clocking at 1820MHz...

From what I saw in benchmark runs in videos and reviews, the 7900 series cards always max out the power limit, no matter whether with or without raytracing or DLSS...
Thanks for the info!! Yeah, I was toying with getting the 7900 XT because I thought I would be OK with the power draw. Maybe the 4070 Ti will fit my needs better. Don't like the 12GB... wish it was 16GB.
 

test_123

Distinguished
Feb 22, 2012
97
2
18,535
power usage seems to be something they need to fix on 7900, maybe in drivers. I don't know. I know there is a bug where idle usage changes depending on which display you attach.
I hope you're right. Drawing more power for less performance is not a good look.
 

Phaaze88

Titan
Ambassador
What's with the ridiculous PSU estimations? Do folks 'live to FurMark' or something? Because from what TPU's power consumption data shows across various cards, that's when some of these over-850W recommendations start to make sense...