Review AMD Radeon RX 7900 XTX and XT Review: Shooting for the Top

That coil whine is a big no for me... Maybe with driver updates the AMD cards gain another 10 or 15%, and with an overclock maybe another 3 to 5%, so basically a 4090 for $1,000, but without the ray tracing... Maybe AMD has a 7990X "Hell edition" coming to give Nvidia some pain. Whatever, just get a 6950 XT for half the money and be happy.
 
Considering how much AMD slammed Nvidia about the 12-pin adapter and power draw in their November presentation, the rasterization power draw numbers of the RX 7000 cards made me LOL... Come on, AMD: they draw over 100 watts more than the 4080 for the same or less performance. No issue with performance, which seems pretty decent on the 7000-series cards all things considered (even if RT lags behind), but AMD sounded like "Don't buy the power-hungry RTX 4000 series cards, get ours, we don't even need a weird 12-pin power connector"... LOL.
 
With prices above $1,000 for both the XTX and the 4080 (certainly more in practice), most people will just spend the extra $200 for the 4080. If the XTX were around $900, that could be a game changer. And comparing the performance between the 6950 XT and the XTX, this isn't as large a generational increase as the delta between the 3000 and 4000 series, so far.
 
Stupid question here: why don't we have dedicated ray tracing cards, like 15 years ago when we had the PhysX add-in card, and get that workload off the GPU?

Too much data and latency to shuffle between the CPU and GPU?

If Nvidia offered a dedicated RT card, then nobody would buy one.
It was a lot more profitable for Nvidia to just force every single customer to buy RT hardware and raise all prices accordingly.
 
Stupid question here: why don't we have dedicated ray tracing cards, like 15 years ago when we had the PhysX add-in card, and get that workload off the GPU?

Too much data and latency to shuffle between the CPU and GPU?

I don't think it was possible back then. Real-time ray tracing is pretty intense, and 15 years ago we were pretty much 15 years away from the technology. We are just getting there... As processing power gets better and better, ray tracing will eventually completely replace rasterization. It is the way to go. Right now it is still a mix: rasterization and ray tracing used at the same time.

Eventually, this will also make it much easier for game developers to create photorealistic lighting effects. They won't have to spend endless time finding ways to "fake" realistic lighting... Ray tracing will basically do most of the work.
 
I don't think it was possible back then. Real-time ray tracing is pretty intense, and 15 years ago we were pretty much 15 years away from the technology. We are just getting there... As processing power gets better and better, ray tracing will eventually completely replace rasterization. It is the way to go. Eventually, this will not only look more realistic and stunning, but it will also make it much easier for game developers to create photorealistic lighting effects. They won't have to spend endless time finding ways to "fake" realistic lighting... Ray tracing will basically do most of the work.

Sorry, poorly written on my end. I was referring to how, 15 years ago, we had dedicated physics processing units. Nvidia then bought PhysX and started incorporating the functionality into GPUs with the GeForce 8000 series.

I'm merely wondering why we don't do the same today with ray tracing, since it's so computationally expensive. And for what it's worth, without DLSS tricks, ray tracing still renders most games nearly unplayable.
 
I don't think it was possible back then. Real-time ray tracing is pretty intense, and 15 years ago we were pretty much 15 years away from the technology. We are just getting there... As processing power gets better and better, ray tracing will eventually completely replace rasterization. It is the way to go. Right now it is still a mix: rasterization and ray tracing used at the same time.

Eventually, this will also make it much easier for game developers to create photorealistic lighting effects. They won't have to spend endless time finding ways to "fake" realistic lighting... Ray tracing will basically do most of the work.
What I dislike about ray tracing is that it takes a game that looks great and runs smoothly and turns it into stuttering, shuddering crap that needs to be patched up with upscaling to be presentable. It will get better with time and become the standard, as you say. But today, it's like playing Breath of the Wild on an 8-bit original NES.
 
When you're coming from a 2070 Super, both look good to me.

Same here, just upgraded from a 2080 to a 4080. The performance increase is insane. I was a little worried I'd have buyer's remorse after the RX 7000 reviews today, but no, all good. I play mostly at WQHD and crank everything up with ray tracing (when available). In rasterization-only titles, what's the difference between 180 and 200 fps (AC: Valhalla all on Ultra, for example)? But in Cyberpunk at WQHD with ray tracing on Psycho without DLSS, a 60 fps average (100 fps with DLSS Quality) vs. a 40 fps average is a big difference...
 
I'm merely wondering why we don't do the same today with ray tracing, since it's so computationally expensive. And for what it's worth, without DLSS tricks, ray tracing still renders most games nearly unplayable.

I am currently playing Cyberpunk 2077 on Ultra settings with ray tracing at the highest setting (Psycho) at 1440p without using DLSS. The average framerate is 60 fps. I call that very playable, and that is probably the most intense ray tracing title out there. No stuttering...
 
Like LTT showed, in Europe these GPUs are unaffordable due to how much power they use. If you want to sell a GPU in Europe, you'd better make sure it is under 200 watts.

 
I play a lot of FPS games, so RTX and DLSS have NEVER been used, and my in-game graphics are turned mostly to low as I try to get the highest framerate possible. Take that into consideration and the 7900 XTX beats the RTX 4080 every time, and for $200 LESS. I was going to upgrade my RTX 2080 to the 4080, but with market issues and scalpers I held off. Very glad I did!

I should also note all my monitors are FreeSync Premium Pro, not necessarily by choice; it's just that G-Sync monitors are few, so with an AMD card I'll get native support. This one is a no-brainer!
 
The performance hit as a consequence of using chiplets is negligible.
I don't think so. I mean, it's a new architecture with 67% more raw bandwidth, supposedly even more if you look at the total Infinity Cache bandwidth, and theoretically 160% more compute. And in practice it performs as if a lot of that bandwidth and compute isn't realized in the real world. Having to go over the extra Infinity Fabric hop to get to the L3 cache could be a big part of this.

Put another way:
RX 6950 XT has 59% of the theoretical compute of the RTX 3090 Ti and 57% of the raw bandwidth. At 4K (rasterization) on the updated testbed, it delivers 88% of the performance.
RX 7900 XTX has 26% more theoretical compute than the RTX 4080 and 34% more raw bandwidth. At 4K (rasterization), it delivers 4% more performance.
So due to architectural changes plus chiplets, AMD has gone from delivering close to Nvidia performance with substantially lower paper specs, to now needing more paper specs to deliver comparable performance.
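
To make that concrete, here's a quick back-of-the-envelope check. The FP32 TFLOPS and bandwidth figures are my assumptions taken from the public spec sheets (not from the review's test data), and the relative 4K results mirror the percentages quoted above:

```python
# Rough "performance per paper spec" comparison.
# FP32 TFLOPS and memory bandwidth are approximate spec-sheet values (assumed);
# the relative 4K rasterization numbers mirror the percentages quoted above.
cards = {
    #                FP32 TFLOPS, bandwidth GB/s, relative 4K raster perf
    "RX 6950 XT":   (23.7,  576, 0.88),   # baseline for this pair: RTX 3090 Ti = 1.00
    "RTX 3090 Ti":  (40.0, 1008, 1.00),
    "RX 7900 XTX":  (61.4,  960, 1.04),   # baseline for this pair: RTX 4080 = 1.00
    "RTX 4080":     (48.7,  717, 1.00),
}

def compare(card, baseline):
    """Print spec ratios and the performance delivered per unit of paper compute."""
    tflops, bw, perf = cards[card]
    b_tflops, b_bw, b_perf = cards[baseline]
    print(f"{card} vs {baseline}: "
          f"{tflops / b_tflops:.0%} compute, {bw / b_bw:.0%} bandwidth, "
          f"{perf / b_perf:.0%} performance -> "
          f"{(perf / b_perf) / (tflops / b_tflops):.2f}x perf per paper TFLOP")

compare("RX 6950 XT", "RTX 3090 Ti")   # RDNA 2: ~59% compute, ~88% perf -> ~1.5x
compare("RX 7900 XTX", "RTX 4080")     # RDNA 3: ~126% compute, ~104% perf -> ~0.8x
```

Run it and RDNA 2 delivers roughly 1.5x the performance per paper TFLOP of Ampere, while RDNA 3 drops to roughly 0.8x against Ada, which is exactly the regression described above.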

It's also worth looking at relative chip size and performance. AMD has a 300mm^2 GCD plus 220mm^2 of MCDs. Some of that size is due to the Infinity Fabric linking the chiplets together. Nvidia meanwhile has a 379mm^2 die that has a lot of extra features (DLSS and DXR stuff). I'd wager the RTX 4080 actually costs less to manufacture than Navi 31, factoring in everything.

AMD is going to need to prove with RDNA 4 that chiplet GPUs can continue to scale to even higher levels of performance without sacrificing features. They certainly haven't done that with RDNA 3. A monolithic Navi 31 without chiplets probably would have been in the 400mm^2 range and offered even higher performance. That's just my rough estimate, and we can't know for certain, but I'd love to know how much AMD actually saved at the end of the day by doing chiplets.
 
Like LTT showed, in Europe these GPUs are unaffordable due to how much power they use. If you want to sell a GPU in Europe, you'd better make sure it is under 200 watts.

Paying $0.52/kWh here in sunny California. Then again, we pay the most in the US due to inane policies.
I'm running a 4090, and I don't have time to game 24/7, so power usage isn't a major consideration.
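
For anyone weighing the power-draw argument in dollar terms, here's a minimal sketch of the math, assuming the ~100 W delta mentioned earlier in the thread and a purely illustrative 20 hours of gaming per week:

```python
# Back-of-the-envelope cost of an extra ~100 W of gaming power draw.
# The wattage delta and hours per week are assumptions for illustration only;
# the $0.52/kWh rate is the California figure quoted above.
extra_watts = 100        # assumed extra draw vs. a more efficient card while gaming
hours_per_week = 20      # assumed gaming time
rate_per_kwh = 0.52      # USD per kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_week * 52
print(f"Extra energy: {extra_kwh_per_year:.0f} kWh/year")
print(f"Extra cost:   ${extra_kwh_per_year * rate_per_kwh:.2f}/year")
# -> about 104 kWh and roughly $54 per year under these assumptions
```

Even at the highest US rates, that works out to a few dollars a month, which is why power draw reads more as an efficiency talking point than a real cost issue outside of Europe.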
 
Take that into consideration and the 7900 XTX beats the RTX 4080 every time, and for $200 LESS. I was going to upgrade my RTX 2080 to the 4080, but with market issues and scalpers I held off. Very glad I did!

If the 4080 were $1,200 here and not $2,500, I would look at it. Waiting to see what prices they charge here.

Starting to think either Australia got none, or they're waiting until tomorrow, even though everything says the 13th... which is today. I guess I don't normally buy anything on day one except games, and those too tend to be held back until the rest of the world catches up. Hardly fair, really.
 
First review I see about this launch (as I always do).
Another sorta disappointing launch. Perhaps AMD can do some tweaking in drivers/BIOS to get a little more performance and better power usage, but I don't think they will work any miracles here.
I'm guessing prices will have to go down for these cards, if they ever get a lot of stock on the streets. I'm almost sure it will be very limited for now.

I will go check other reviews. Thank you for this one!!!
 
Stupid question here: why don't we have dedicated ray tracing cards, like 15 years ago when we had the PhysX add-in card, and get that workload off the GPU?

Too much data and latency to shuffle between the CPU and GPU?
In the end, the only advantage of that approach (other than enabling full RT content) would be choosing rasterization and ray tracing performance separately. We could pair a mid-range GPU with a high-end RPU, or skip the RPU altogether. But that would also be a mess for developing and maintaining games.

I believe the two will stay merged for some time, until someone makes a card, and developers make games, that can run full RT at 1080p. At that point, it would be the end of rasterization (although I doubt it will happen). Anyway, those are all just wild guesses.

And thanks for reminding us of when Nvidia took a technology that could revolutionize physics in games and made it proprietary, thus eliminating its advantages from the gaming world for years. (But that is slightly off-topic)
 
In the end, the only advantage of that approach (other than enabling full RT content) would be choosing rasterization and ray tracing performance separately. We could pair a mid-range GPU with a high-end RPU, or skip the RPU altogether. But that would also be a mess for developing and maintaining games.

I believe the two will stay merged for some time, until someone makes a card, and developers make games, that can run full RT at 1080p. At that point, it would be the end of rasterization (although I doubt it will happen). Anyway, those are all just wild guesses.

And thanks for reminding us of when Nvidia took a technology that could revolutionize physics in games and made it proprietary, thus eliminating its advantages from the gaming world for years. (But that is slightly off-topic)

With PhysX it wasn't just PhysX add-in cards; it was any compatible GPU you had lying around, too. So you'd upgrade your GPU and repurpose your old one as the PhysX card. The biggest problem with that was that the game had to have GPU-accelerated PhysX, which was not every game and no console games. That basically rendered it DOA.

Personally, I did try it since I had the parts lying around. I can't say it was worth it or that it really made sense, though.
 
That, I believe, is the reason AMD won't increase its market share much this generation. Yes, rasterization is comparable, and so are power, memory, price, and even upscaling performance/quality. But it is a bad card for ray tracing, or at least that's the message, and between a full card and a crippled card, people will prefer the fully featured one. I know designing GPUs is a monstrously complex task, but they really needed to up their RT performance by at least 3x to be competitive. Now they will keep being "bang for the buck", which is nice, but never "the best".

Edit: by some rough calcs, if the XTX is ~40% faster than the 6950 without RT and ~50% faster with RT, then the RT-specific generational improvement is only ~7% (1.50 / 1.40 ≈ 1.07)? If so, that's hardly any improvement at all. Great cards and all that, but I'm very disappointed with the lack of focus on RT.
What on EARTH are you talking about? All the reviews I've seen put the 7900 XTX's ray tracing at 3090 Ti performance, and about 10% down from the 4080, which is consistent with Nvidia's naming scheme (the x090 in one generation becomes the x080 in the next).

https://www.guru3d.com/articles-pages/amd-radeon-rx-7900-xtx-review,24.html

Also, please tell me how many adversaries you've shot by looking at reflections in puddles. 10? 100? Oh, zero? Yeah, I thought so. Please stick to subjects that matter, not trolling topics.
 
Like LTT showed, in Europe these GPUs are unaffordable due to how much power they use. If you want to sell a GPU in Europe, you'd better make sure it is under 200 watts.

Affordable gaming is dead. Long live overpriced gaming hardware! 😂
 
When you throw DLSS into the mix... Nvidia murdered them. That said, these prices are nuts and I just can't bring myself to do it. Doubling the cost in a generation is just gross, and AMD coming out with something in between still makes me want to sit this out for a good long time. I hope sales drop through the floor on these high-end cards and these companies come back down to sanity.

I know people love their big super guns and all that, but... ugh, this is too much.
 
Vote with your $$$, just say no. They will relent on price. They are GOUGING the market.