AMD Radeon RX 9070 XT and RX 9070 review: An excellent value, if supply is good

Ok, actually, you have to break perf down between raster and RT, as the 9070 XT is faster than the 7900 XTX in raster.

Ehh where did you see that....

The 7900 XTX is faster at raster than the 9070 XT; the 9070 XT is much better at RT.

 
The RX 7900 XTX still beats the RX 9070 XT in raster geomean at all resolutions.
It's only when you introduce RT that it falls below the 9070 XT. And not even by that much - only about 10% or so.


It'll be viable for a long time, especially with 24 GB of VRAM.

It sounded to me like you bought the 7900 XTX because it was a good deal and I think it was. You knew it wasn't the fastest card available - it wasn't then and it isn't now. So, I think you should be satisfied with your choice and not stress over it.
Yet the TechPowerUp review using the Sapphire Nitro+ OC shows it on average 6% slower than the 7900 XTX in raster, but on average 15% better in RT performance.

Still not Nvidia-level RT performance, but not bad when the 7900 XTX (well, my Red Devil) was 2000 AUD.

I'm not 100% sold, but if I can snag a Sapphire Nitro+ OC for 1500 AUD or less, that might be a good deal!
 
I watched the HUB review, since it came out first. The short of it: 9070XT = 7900XT (not XTX), both in perf and power consumption. So if you think Nvidia (Huang) lied about 5070's perf, then AMD also lied.
First of all, AMD's marketing materials only compared the 9070 series to the 7900 GRE and then the 5070 Ti (at 4K), not the 7900 XT/XTX, so I'm not sure where the lies come in. Secondly, I'm not sure how you watched the HUB video and came to the conclusion that the 9070 XT = 7900 XT unless you're shilling for nvidia.
 
The RX 7900 XTX still beats the RX 9070 XT in raster geomean at all resolutions.
It's only when you introduce RT that it falls below the 9070 XT. And not even by that much - only about 10% or so.


It'll be viable for a long time, especially with 24 GB of VRAM.

It sounded to me like you bought the 7900 XTX because it was a good deal and I think it was. You knew it wasn't the fastest card available - it wasn't then and it isn't now. So, I think you should be satisfied with your choice and not stress over it.


In case I didn't say it, what I MEANT to say was that the 7900 XTX is faster in raster performance but sits between the 9070 and 9070 XT in ray tracing.

So as I read it, in raster it's pretty much 4080/4080 Super/5070 Ti level, give or take. In RT I think it's basically like a 5070, which honestly I'm okay with. At $800 the 7900 XTX was really a steal. When I bought it I figured Nvidia cards would either be way overpriced or almost impossible to get, both of which have proven true, plus AMD said they would not compete at the high end. Now that I see the cards AMD is bringing out have 8 GB less VRAM than my card and lower raster, with RT not much better on the higher model, I think the XTX is going to be a good card to just sit on this generation.

Definitely would not mind if they bring FSR 4 to the 7000 series cards, which I surmise they will later on. That would be nice to have, but if they don't, I'm using the card for high-refresh 1440p anyway, and it still has FSR 3.1 and frame generation, plus it's a beast of a card for the money I paid. So yeah, this card will definitely be good for a while. And if I do have issues, I bought Microcenter's two-year walk-in warranty on it, so back it goes and I'll trade it out then.
 
Looks like the 9070 XT is the card to get if you can justify the cost and are in that $500-700 range. The 9070 on the other hand is disappointing because it doesn't particularly justify its cost (as expected) and just makes the $549 price point from both AMD and nvidia look bad.

Having watched the DF video comparing FSR 3/4 and DLSS CNN/TF, it seems like AMD delivered what they needed to by landing between the two DLSS models. FSR 4 has a bit of a performance hit over 3, but given the improvement I think it's totally worthwhile. They tested Ratchet and Clank and showed the 9070 XT with FSR 4 Performance roughly matching the frame rate of the 5070 Ti with DLSS 4 TF Performance, and while the latter has better image quality, based on TPU's review the 5070 Ti should be around 10-13% faster than the 9070 XT at native resolution in this title.

I do find the ray tracing performance to be somewhat of a letdown, but that's mostly because Intel seems to have matched nvidia's ray tracing performance (B series vs 40 series) in similarly performing products. I'm happy to see that AMD massively improved ray tracing performance over RDNA 2/3, because it's no longer an anchor dragging their products down.
 
I do find the ray tracing performance to be somewhat of a letdown, but that's mostly because Intel seems to have matched nvidia's ray tracing performance (B series vs 40 series) in similarly performing products. I'm happy to see that AMD massively improved ray tracing performance over RDNA 2/3, because it's no longer an anchor dragging their products down.
Yeah, I was hoping it would match 5070 Ti RT performance across a multitude of games.

That would have been a real AMD win ..

I'm happy with a loss of 6% on average, give or take, in raster versus the 7900 XTX; it was a 1000 USD GPU at release.

But yes, disappointed with the RT performance as a whole!
 
As far as tariffs and trade markets I think some of that will get sorted out over time. Anyway, I think the 9070 XT is the more interesting of the two. Realistically, if I needed a GPU I'd buy whichever of these cards was available, of course preferring the 9070 XT.

However since I’m fortunate enough to be sitting on a 7900xtx, I think I’ll wait until UDNA comes out. If AMD hasn’t figured things out by then and Intel has an upper mid range to high end card, I may pick up an Intel card. However I’m pretty well set for my pc at the moment. Running a ryzen 7700 with 32gb ddr5 on an asus tuf gaming b650 board, roughly 4.5 tb of ssd between 3 drives, the 7900xtx of course and a 1200 watt psu. So I imagine my next upgrade will be probably a ryzen 7 10800x3d or whatever they call it at the time, or maybe more ram or storage. But for now this pc is fine as is.
 
>First of all, AMD's marketing materials only compared the 9070 series to the 7900 GRE and then the 5070 Ti (at 4K), not the 7900 XT/XTX, so I'm not sure where the lies come in.

This is what happens when people jump into argue mode. They read what they want to read, and ignore everything else.
No, I read what you typed, not what I think you typed. If you meant something other than what you put out, that's on you, not me.
To paraphrase, I said, "if you think Nvidia lied, then AMD also lied...Marketing does not actually lie, it just stretches the truth."

AMD exaggerates its XT-vs-GRE perf numbers. HUB itself acknowledges this. People (AMD fans) took those numbers to say "the 9070 XT is close to the XTX." Nobody is lying, per se. Nvidia didn't technically lie either. But it did stretch the truth by a lot. That's marketing.
No, Nvidia actually lied, and every time they compare FG to non-FG they're lying, just like when they compare MFG to FG.

AMD didn't exaggerate the performance numbers; they just used different games (+37% at 1440p and +33% at 4K). HUB tested two games where the 9070 XT underperforms compared to RDNA 3, which throws off their results. That doesn't mean they're wrong, or unimportant (knowing about performance outliers is, I'd argue, very important), just different. TPU shows ~34% at 4K and ~29% at 1440p for the 9070 XT over the 7900 GRE (and they test CS2, which is one of the outliers).
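To make the outlier point concrete, here's a rough sketch (hypothetical per-game uplift numbers, purely illustrative, not actual review data) of how just a couple of underperforming titles drag down an aggregate geomean:

```python
# Hypothetical 9070 XT vs 7900 GRE per-game ratios (1.35 = 35% faster). Not real data;
# the point is only how much two outlier titles move the aggregate.
from math import prod

typical_games = [1.35, 1.38, 1.31, 1.40, 1.33, 1.36]
outlier_games = [1.05, 1.08]  # e.g. titles where RDNA 4 underperforms relative to RDNA 3

def geomean(ratios):
    return prod(ratios) ** (1 / len(ratios))

print(f"without outliers: +{(geomean(typical_games) - 1) * 100:.0f}%")                  # ~+35%
print(f"with outliers:    +{(geomean(typical_games + outlier_games) - 1) * 100:.0f}%")  # ~+28%
```

Different game lists shift that aggregate by several points, which is all that's really happening between AMD's slides, HUB, and TPU.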

So you can roll out this whole "marketing stretches the truth" nonsense to try to make it sound like they're both the same, but they're not. Whether intentionally or not, you're providing cover for nvidia by playing this "both sides are the same" game, which is rarely a good take.
>Secondly, I'm not sure how you watched the HUB video and came to the conclusion that the 9070 XT = 7900 XT unless you're shilling for nvidia.

My view is that 1440p is the sweet spot for the -70 segment. In HUB's 18-game 1440p avg, 9070XT (119 avg) is almost identical to 7900XT (117), while XTX is at 136.

Yes, one can lean on the 4K avg to argue that the 9070 XT (74) sits between the XTX (82) and the XT (68). It's a toss-up here.
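For what it's worth, turning those HUB averages into relative percentages (using only the numbers quoted above, nothing else):

```python
# Relative deltas from the HUB 18-game averages quoted above (avg FPS).
fps = {
    "1440p": {"9070 XT": 119, "7900 XT": 117, "7900 XTX": 136},
    "4K":    {"9070 XT": 74,  "7900 XT": 68,  "7900 XTX": 82},
}

def pct_faster(a, b, table):
    return (table[a] / table[b] - 1) * 100

for res, t in fps.items():
    print(f"{res}: 9070 XT vs 7900 XT {pct_faster('9070 XT', '7900 XT', t):+.1f}%, "
          f"7900 XTX vs 9070 XT {pct_faster('7900 XTX', '9070 XT', t):+.1f}%")
# 1440p: 9070 XT vs 7900 XT +1.7%, 7900 XTX vs 9070 XT +14.3%
# 4K:    9070 XT vs 7900 XT +8.8%, 7900 XTX vs 9070 XT +10.8%
```

At 1440p it really does sit on top of the 7900 XT; at 4K it lands roughly midway between the XT and the XTX.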

I do note that there's a propensity for fanboys to reach for the "shill" label whenever they encounter opinions that they don't agree with.
So your opinion is based on what you think the cards should be used for (1440p and completely ignoring ray tracing) and a singular data point yet you claim I'm a "fanboy"? This would be funny if it didn't strike me as sad because you typically come across as an intelligent and thoughtful individual.
I'll ask you the same, are you a fanboy shilling for AMD?
Nope, I'm not, because I don't care about companies, just the product being released and the price involved. I read multiple data sources and check for outliers to get the best impression of what the product is, rather than finding something that fits what I want and running with it.
You see, it works both ways. Before reaching for epithets, recognize that people have different perspectives and different opinions. Not everyone will see what you see.
You appear to be seeing what you want to see rather than establishing a well informed opinion.
 
Let's be real: anyone buying a GPU primarily for AI still isn't going to go with AMD. It's great that they've improved their AI performance, but Nvidia is still well in the lead and has far better software support.

In case you missed it, Jarred tested AI (inferencing) performance here:

I'm not even going to quote any of the graphs, because Nvidia beat AMD on all of them. It's really not until you get to the non-AI workstation benchmarks that AMD has any wins. And a lot of that is just because AMD has kept a consistent focus on optimizing their workstation rendering performance.
I heard the same thing during the whole crypto shortage. AMD cards were not good for mining and blablabla, but all cards were out of stock and scalped anyways. The reality is that AI has a huge impact on the GPU market in general. And Nvidia selling GPUs like hotcakes has an effect on prices for all cards, no matter how good they are for AI.
 
As far as tariffs and trade markets I think some of that will get sorted out over time.
Some tariffs are here to stay, at least for as long as these products remain current. Given the node, it's possible AMD (which has already purchased wafer allocation in AZ) and even Nvidia can source chips from there, but TSMC has said the AZ wafer pricing is higher than Taiwan and demand could theoretically push pricing even to near the post-tariff prices of wafers from the Taiwan fabs.

Two reasons I think we're looking at a long-term tariff situation:
  1. If you really intend them to boost domestic production that's at a structural price disadvantage to current off-shore producers, then you need either tariffs or subsidies. There's just no way around that. This is why the tariffs on metals ostensibly came back.
  2. Things have been said that give the impression the administration is looking to substitute a significant amount of corporate and personal income tax with revenue from tariffs. If true, then we're going to be looking at tariffs on a substantial amount of imports, for as long as this policy continues to be pursued.

I think I'm not wrong about #1. At this point, #2 is up in the air. I don't really want to debate either point - I only mention them because I think people should seriously consider that tariffs are going to be in the picture for the foreseeable future.
 
I heard the same thing during the whole crypto shortage. AMD cards were not good for mining and blablabla, but all cards were out of stock and scalped anyways. The reality is that AI has a huge impact on the GPU market in general. And Nvidia selling GPUs like hotcakes has an effect on prices for all cards, no matter how good they are for AI.
Wait, am I remembering it wrong? I thought for crypto, it was the AMD cards that did it better.

Or was it different between CryptoCraze 1 vs CryptoCraze 2?
 
Wait, am I remembering it wrong? I thought for crypto, it was the AMD cards that did it better.
IIRC, AMD was cost-effective at Ethereum, since performance mainly seemed dependent on memory bandwidth.

For crypto mining, the main thing is just to have a profitable setup. A card doesn't need to be the fastest or the most cost-effective, just profitable, and any card that was profitable would sell out.
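As a back-of-the-envelope illustration (made-up hashrates, payouts, and electricity prices, just to show why "profitable is enough"):

```python
# Toy mining profitability: revenue per day minus electricity. All inputs are invented.
def daily_profit(hashrate_mhs, usd_per_mhs_day, power_w, usd_per_kwh):
    revenue = hashrate_mhs * usd_per_mhs_day
    electricity = (power_w / 1000) * 24 * usd_per_kwh
    return revenue - electricity

# A modest, efficient card and a big, hungry card: both clear a profit,
# so during a boom both sell out regardless of which one is "best".
print(f"${daily_profit(60, 0.06, 130, 0.12):.2f}/day")   # ~$3.23
print(f"${daily_profit(120, 0.06, 350, 0.12):.2f}/day")  # ~$6.19
```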
 
Looks like the 9070 XT is the card to get if you can justify the cost and are in that $500-700 range. The 9070 on the other hand is disappointing because it doesn't particularly justify its cost (as expected) and just makes the $549 price point from both AMD and nvidia look bad.

I know you got a response to this, but I'll flat out say it: because people can't read. AMD said "~40% better than the 7900 GRE at 4K", showing some numbers and that claim in text, but everyone is ignoring that the game selection also included titles where the 9070 XT completely "demolishes" the 7900 GRE in RT while running single-digit framerates (slightly exaggerated, but I'm guessing you get the point). So this skews the results in its favour in a simple linear (or worse, geometric) average.

So this is half AMD's fault for not fully clarifying that in the marketing material and leaving it to reading comprehension. I hate the "see footnote XYZ" approach in all of AMD's slides in particular, since they leave them all for the last portion of their presentations and never instruct (EDIT: this is my assumption; a correction is welcome) people looking at the slides to CONSIDER THOSE NOTES when communicating things with the proper context.

Regards.
 
Looks good. Both should be $50 cheaper, so the OEM custom editions would come in at $599/$549 instead, but I'm not sure I'd be willing to bet in excess of $600 of my own money to have to deal with AMD's drivers again. Way too many negative experiences...
 
I know you got a response to this, but I'll flat out say it: because people can't read. AMD said "~40% better than the 7900 GRE at 4K", showing some numbers and that claim in text, but everyone is ignoring that the game selection also included titles where the 9070 XT completely "demolishes" the 7900 GRE in RT while running single-digit framerates (slightly exaggerated, but I'm guessing you get the point). So this skews the results in its favour in a simple linear (or worse, geometric) average.

So this is half AMD's fault for not fully clarifying that in the marketing material and leaving it to reading comprehension. I hate the "see footnote XYZ" approach in all of AMD's slides in particular, since they leave them all for the last portion of their presentations and never instruct (EDIT: this is my assumption; a correction is welcome) people looking at the slides to CONSIDER THOSE NOTES when communicating things with the proper context.

Regards.
While I agree AMD should have given separate averages, they did separate the RT/raster graphs and show them side by side. One didn't need to read footnotes or anything to figure out the comparison for those, just be willing to do some addition and division. I always cringe when looking through marketing slides, but in this case they were surprisingly clear, even if not presented the way I'd have liked.
 
Those temperature and noise results on the PowerColor 9070 XT show that huge, triple-slot coolers are completely unnecessary on these cards. Kudos to them for such a strong showing with what is, ostensibly, their "cheapest" model.

I want to see how small a 9070 can get for mini-itx builds. That's the only space where I see the non-XT model making any sense given its much lower power draw. Otherwise, just spend the extra $50 for the XT and reap huge performance gains.
 
While I agree AMD should have given separate averages, they did separate the RT/raster graphs and show them side by side. One didn't need to read footnotes or anything to figure out the comparison for those, just be willing to do some addition and division. I always cringe when looking through marketing slides, but in this case they were surprisingly clear, even if not presented the way I'd have liked.
Yes, but they didn't do it in all slides and places when talking about performance. That's why people got slightly different messages depending on which slide they were looking at, or decided to remember, haha.

Still, just to be clear: this was a great improvement over RDNA3's kerfuffle marketing and much clearer to me, personally. That's why I did not get any surprises when reading reviews. It landed exactly where they predicted with very small variance.

EDIT: Grammorz.

Regards.
 
Looks good. Both should be $50 cheaper, so the OEM custom editions would come in at $599/$549 instead, but I'm not sure I'd be willing to bet in excess of $600 of my own money to have to deal with AMD's drivers again. Way too many negative experiences...
Nvidia has had several high-profile driver snafus this year while reviewers are remarking that the 9070 drivers are the best AMD's ever had. Now might be the time to give AMD another look.
 
@JarredWaltonGPU ,

Thanks for the review! I do have one concern, which is that I've read elsewhere that VRAM temps on some of these cards can run hot. How was it looking on the PowerColor models you tested?
I saw a maximum of about 90C on the GDDR6 for the 9070 XT, which should be well within spec (TjMax is typically 105C, maybe 110C). Maybe a few degrees warmer than on 7900 XTX cards? But there's definitely potential for luck of the draw. Maybe some cards have misplaced thermal pads or similar issues.

[Chart: Cyberpunk 2077 VRAM temperatures]

Thanks for the review @JarredWaltonGPU, excellent work as always.

I will say this: it's definitely correct to say that the 9070XT is the better price/performance card, no doubt about it. It's also what all the pre-release numbers made pretty clear, so, completely expected. But, I'd say that the 9070 isn't MUCH worse than the 9070XT in that regard.

The XT is 15% faster for 9% more money. Or, if I haven't bungled the math, the non-XT is 13% slower for 8% less money, if you like looking at it from the other direction. Either way, it doesn't seem like a large decrease in price/performance. And I, for one, do like the idea of "it's close, but uses 27.6% less power" (or the XT uses 38.2% more power).
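For reference, here's the math spelled out. The MSRPs and board power figures are my assumptions ($599/$549 and roughly 304 W / 220 W for the XT / non-XT); swap in measured numbers if you prefer:

```python
# Quick sanity check of the ratios above. MSRPs and board power are assumptions:
# $599 / $549 and ~304 W / ~220 W for the 9070 XT / 9070 respectively.
xt  = {"perf": 1.15, "price": 599, "power": 304}   # perf normalized to the non-XT
non = {"perf": 1.00, "price": 549, "power": 220}

print(f"XT is {(xt['perf'] / non['perf'] - 1) * 100:.0f}% faster "
      f"for {(xt['price'] / non['price'] - 1) * 100:.0f}% more money")      # 15% faster, 9% more
print(f"non-XT is {(1 - non['perf'] / xt['perf']) * 100:.0f}% slower "
      f"for {(1 - non['price'] / xt['price']) * 100:.0f}% less money")      # 13% slower, 8% less
print(f"non-XT draws {(1 - non['power'] / xt['power']) * 100:.1f}% less power; "
      f"XT draws {(xt['power'] / non['power'] - 1) * 100:.1f}% more")       # ~27.6% / ~38.2%
```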
Yeah, I think that's a fair assessment. I often get stuck in tunnel vision while trying to wrap up these reviews, and I'm usually sleep deprived. (I went to bed at 9am this morning.) It's interesting that AMD dropped the boost clocks so much in order to drop the power as well. Like, was a 220W target that important, as opposed to 275W? It just feels like an attempt to push people to the higher card.

So...I get doing this as part of your verdict:

Cons

  • Concerns with retail availability and pricing

But why make that a con when it wasn't even made a point for Nvidia's 5080/90 reviews at release? NVIDIA has had a more pronounced track record over the most recent generations of having supply issues, so much so that you'd want to believe they're doing it on purpose for profit margins' sake. Jensen even said supply would be an issue this time around before release, yet that wasn't listed at the time for the 80/90 reviews, only subsequent models. Seems a bit one-sided and typical to me.

In any case, pointing this out as a con is a good thing, but talk about changing your narrative to fit: in the comments section regarding the 5070 Ti you were recently saying that stock should get better in the next couple of weeks. Yeah, okay.
Hindsight is always 20-20. At the time the 5090 launched, and the 5080, there was still some hope things wouldn't suck as bad as they suck. We don't actually know how many 5090 cards or 5080 cards have shipped and been sold, but it certainly feels woefully inadequate. I have talked to a few industry contacts at AIBs that say the total numbers to them seem similar to the 40-series, but that demand is just higher.

Nvidia has claimed that it's producing a lot more 5070 Ti and 5070 cards, but without hard numbers. Like for the 5070 Ti launch, someone said (paraphrasing), "Oh, we've produced a lot of cards and I think that anyone who wants a 5070 Ti and is ready to buy at launch will be able to get one at close to MSRP." Obviously, that proved to be incredibly naïve or just outright gaslighting. And the 5070 didn't do much better this morning (though I could have nabbed a card for $599 if I had wanted — it was at least still available at Newegg for maybe 5~10 minutes after launch).

This is so true. It was not an issue for the 5070 Ti because hey, everything will magically get resolved in two weeks, but for the AMD cards it's a real problem and even worth mentioning as a con. My prediction is that it's going to be a huge problem for both AMD's and Nvidia's cards, and for a while. In 2020-2022 it was crypto, now it's AI. And unlike crypto mining, AI is not going anywhere.
I have noted concerns with supply in the 5070 and 5070 Ti reviews for sure. I didn't list it on the 5080, but I've added it now. It was on the 5090 as well. So please, stop trying to pretend I glossed over supply. All of these reviews get written and published before actual retail availability. That makes it a guessing game as to what will actually happen. Post-5090 and 5080 launches, it became abundantly clear that supply and demand were seriously out of whack and I don't think it's going away, so that has become an increasingly important consideration on the latest 5070 and 9070 XT reviews.

Wait, am I remembering it wrong? I thought for crypto, it was the AMD cards that did it better.

Or was it different between CryptoCraze 1 vs CryptoCraze 2?
AMD cards were better with phase 0 and 1 (2011 and 2014), as well as phase 2 (~2017). Nvidia was hit hardest with phase 3 (2020~2021).
 
Let's be real: anyone buying a GPU primarily for AI still isn't going to go with AMD. It's great that they've improved their AI performance, but Nvidia is still well in the lead and has far better software support.

In case you missed it, Jarred tested AI (inferencing) performance here:

I'm not even going to quote any of the graphs, because Nvidia beat AMD on all of them. It's really not until you get to the non-AI workstation benchmarks that AMD has any wins. And a lot of that is just because AMD has kept a consistent focus on optimizing their workstation rendering performance.
The poor AI performance is 100% why I chose to grab a 3090 Ti over the 7900XTX last year, despite me being an unabashed AMD fanboy. To be fair to RDNA4, I fully expect performance to improve once software is updated to take advantage of the new int8 support. Still, AMD has a lot of ground to cover just to catch up to Ada, let alone Blackwell on AI. It seems at a hardware level they might've done it but at the software level? Not even close. I don't know if AMD has anything to compete with Triton & sage attention for example but on Ada cards it's a huge performance boost.
 
I don’t even think it’s a big deal for professional use. I’ve been running DP1.4a at 4K 240Hz with DSC for over two years and never notice any issues. I now have a DP2.1 display that can do 4K 240Hz without DSC (I think… maybe only 180 Hz without DSC?) and don’t see a difference.

DSC’s “visually lossless” compression does work well in practice.
@DS426 So the response from AMD is that only two DisplayPort outputs can run in DP2.1 mode, and a third display would drop to DP1.4 mode. Meaning all three can be used simultaneously, just not at maximum DP2.1 speeds. Which, honestly, anyone that has triple DP2.1 monitors probably should splurge on a more potent GPU, because money clearly isn't a bottleneck. LOL
 
The poor AI performance is 100% why I chose to grab a 3090 Ti over the 7900XTX last year, despite me being an unabashed AMD fanboy. To be fair to RDNA4, I fully expect performance to improve once software is updated to take advantage of the new int8 support. Still, AMD has a lot of ground to cover just to catch up to Ada, let alone Blackwell on AI. It seems at a hardware level they might've done it but at the software level? Not even close. I don't know if AMD has anything to compete with Triton & sage attention for example but on Ada cards it's a huge performance boost.
This is totally accurate. AMD has some tools that it has created to try to get around its lack of software support — or rather, tools it acquired instead of creating. Amuse.ai is a good example. It provides an easy to use front-end for a bunch of AI content generation stuff. But what it doesn't do is provide the same off the shelf functionality as you get with Nvidia.

You can go to Github and grab a repository and compile and it generally works on Nvidia, because that's the default target 99% of the time for AI it seems. Amuse does a bunch of the dirty work, optimizes things with ONNX, and runs decently fast. But you end up with a different interface and it's hard to get fully apples to apples comparisons.
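To illustrate the "default target" point, here's a minimal sketch (my own illustration, not from any particular repo) of the device-selection pattern most of those GitHub projects ship with. PyTorch's ROCm builds expose the same "cuda" device string via HIP, which is how AMD piggybacks on Nvidia-first code, right up until something calls a CUDA-only extension:

```python
import torch

# The default pattern in most AI repos: use the GPU if the CUDA backend reports one, else CPU.
# On ROCm builds of PyTorch, torch.cuda.is_available() also returns True (HIP stands in for
# CUDA), so this path works on Radeon cards too -- until a CUDA-only extension gets imported.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(512, 512).to(device)
x = torch.randn(8, 512, device=device)
print(device, model(x).shape)
```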

I said in the cons, "Nvidia still wins on software support and features" for this review, and that includes AI, FSR uptake (especially FSR 3.1 and above, plus backporting), FSR image quality, ROCm and the lack of support there, dev support in general... AMD just doesn't feel like it gets how important software is.

Nvidia brags all the time about having more software engineers than hardware engineers. It's why CUDA is winning so badly right now. It's why DLSS is winning. AMD always seems to fall back to hardware first. Lisa Su was involved with the PS3, a notoriously difficult platform to program for, which was made so much worse by the lack of proper software support. And even now, over a decade later, when nearly everyone agrees that, yes, the PS3 was an interesting design that was ahead of its time in some ways but was severely lacking in software support... Su doesn't seem to want to acknowledge that.

This is a brutal breakdown of the software problem at AMD.
Here's another one pointing out the problems.

The solution, IMO, is that AMD needs to hire about 10X more software people for its GPU division. I don't know, maybe only 3X would suffice, but whatever it has right now isn't enough. And even if it hired 5,000 people today as an example, it would take months to get them all up to speed. So this problem isn't going away any time soon... and even worse, AMD doesn't even seem to fully acknowledge that it is a problem. "ROCm? Oh, yeah, RDNA 4 will be supported — at some point in the future." Gee, I don't know why devs prefer CUDA to ROCm!
 
