Ok, actually, you have to break perf between raster and RT as the 9070 XT is faster than the 7900 XTX in raster.
Ehh where did you see that....
The 7900 XTX is faster at raster than the 9070 XT; the 9070 XT is much better at RT.


> Ok, actually, you have to break perf between raster and RT as the 9070 XT is faster than the 7900 XTX in raster.
Nothing below $720 by 11:00 am......
Yet the TechPowerUp review, using the Sapphire Nitro+ OC, has it on average 6% slower than the 7900 XTX but on average about 15% better in RT performance.
> The RX 7900 XTX still beats the RX 9070 XT in raster geomean at all resolutions.
> It's only when you introduce RT that it falls below the 9070 XT. And not even by that much - only about 10% or so.
> It'll be viable for a long time, especially with 24 GB of VRAM.
> It sounded to me like you bought the 7900 XTX because it was a good deal and I think it was. You knew it wasn't the fastest card available - it wasn't then and it isn't now. So, I think you should be satisfied with your choice and not stress over it.
First of all, AMD marketing materials only compared the 9070 series to the 7900 GRE and then the 5070 Ti (at 4K), not the 7900 XT/XTX, so I'm not sure where the lies come in. Secondly, I'm not sure how you could state that you watched the HUB video and came to the conclusion that the 9070 XT = 7900 XT unless you're shilling for Nvidia.
> I watched the HUB review, since it came out first. The short of it: 9070 XT = 7900 XT (not XTX), both in perf and power consumption. So if you think Nvidia (Huang) lied about the 5070's perf, then AMD also lied.
The RX 7900 XTX still beats the RX 9070 XT in raster geomean at all resolutions.
It's only when you introduce RT that it falls below the 9070 XT. And not even by that much - only about 10% or so.
It'll be viable for a long time, especially with 24 GB of VRAM.
It sounded to me like you bought the 7900 XTX because it was a good deal and I think it was. You knew it wasn't the fastest card available - it wasn't then and it isn't now. So, I think you should be satisfied with your choice and not stress over it.
Yeah, I was hoping for matched 5070 Ti RT performance across a multitude of games.
> I do find the ray tracing performance to be somewhat of a letdown, but that's mostly because Intel seems to have mostly matched Nvidia's ray tracing performance (B-series vs 40-series) in similar performance products. I'm happy to see that AMD massively improved the ray tracing performance over RDNA 2/3 because it's no longer an anchor dragging their products down.
No, I read what you typed, not what I think you typed. If you meant something other than what you put out, that's on you, not me.
> First of all, AMD marketing materials only compared the 9070 series to the 7900 GRE and then the 5070 Ti (at 4K), not the 7900 XT/XTX, so I'm not sure where the lies come in.
This is what happens when people jump into argue mode. They read what they want to read, and ignore everything else.
No, Nvidia actually lied, and every time they compare FG to non-FG they're lying, just like when they compare MFG to FG.
> To paraphrase, I said, "if you think Nvidia lied, then AMD also lied... Marketing does not actually lie, it just stretches the truth."
AMD exaggerates its XT-vs-GRE perf numbers. HUB itself acknowledges this. People (AMD fans) took those numbers to say "9070 XT is close to XTX". Nobody is lying, per se. Nvidia didn't technically lie either. But it did stretch the truth by a lot. That's marketing.
So your opinion is based on what you think the cards should be used for (1440p and completely ignoring ray tracing) and a singular data point, yet you claim I'm a "fanboy"? This would be funny if it didn't strike me as sad, because you typically come across as an intelligent and thoughtful individual.
> Secondly, I'm not sure how you could state that you watched the HUB video and came to the conclusion that the 9070 XT = 7900 XT unless you're shilling for Nvidia.
My view is that 1440p is the sweet spot for the -70 segment. In HUB's 18-game 1440p avg, 9070XT (119 avg) is almost identical to 7900XT (117), while XTX is at 136.
Yes, one can lean on the 4K avg to claim that the 9070 XT (74) sits between the XT (68) and the XTX (82). It's a toss-up here.
I do note that there's a propensity for fanboys to reach for the "shill" label whenever they encounter opinions that they don't agree with.
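For anyone who wants to sanity-check those HUB averages, here is a rough back-of-the-envelope script; the FPS figures are just the ones quoted above, so treat the resulting deltas as approximate:

```python
# Relative performance from the HUB averages quoted in the comment above.
averages = {
    "1440p": {"9070 XT": 119, "7900 XT": 117, "7900 XTX": 136},
    "4K":    {"9070 XT": 74,  "7900 XT": 68,  "7900 XTX": 82},
}

for res, cards in averages.items():
    base = cards["9070 XT"]
    for card, fps in cards.items():
        if card != "9070 XT":
            # Positive means the 9070 XT is ahead, negative means it trails.
            print(f"{res}: 9070 XT vs {card}: {100 * (base / fps - 1):+.1f}%")
```

That works out to roughly +2% over the XT and -12% versus the XTX at 1440p, and roughly +9% / -10% at 4K, which is why both the "equal to the XT" and the "between the XT and the XTX" readings are defensible.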
Nope, I'm not, because I don't care about companies, just the product being released and the price involved. I read multiple data sources and checked for outliers to get the best impression of what the product is, rather than finding something that fits what I want and running with it.
> I'll ask you the same: are you a fanboy shilling for AMD?
You appear to be seeing what you want to see rather than establishing a well-informed opinion.
> You see, it works both ways. Before reaching for epithets, recognize that people have different perspectives and different opinions. Not everyone will see what you see.
I heard the same thing during the whole crypto shortage. AMD cards were not good for mining and blablabla, but all cards were out of stock and scalped anyways. The reality is that AI has a huge impact on the GPU market in general. And Nvidia selling GPUs like hotcakes has an effect on prices for all cards, no matter how good they are for AI.
> Let's be real: anyone buying a GPU primarily for AI still isn't going to go with AMD. It's great that they've improved their AI performance, but Nvidia is still well in the lead and has far better software support.
> In case you missed it, Jarred tested AI (inferencing) performance here:
> I'm not even going to quote any of the graphs, because Nvidia beat AMD on all of them. It's really not until you get to the non-AI workstation benchmarks, where AMD has any wins. And a lot of that is just because AMD has kept a consistent focus on optimizing their workstation rendering performance.
Some tariffs are here to stay, at least for as long as these products remain current. Given the node, it's possible AMD (which has already purchased wafer allocation in AZ) and even Nvidia can source chips from there, but TSMC has said the AZ wafer pricing is higher than in Taiwan, and demand could theoretically push pricing up to near the post-tariff prices of wafers from the Taiwan fabs.
> As far as tariffs and trade markets, I think some of that will get sorted out over time.
Wait, am I remembering it wrong? I thought for crypto, it was the AMD cards that did it better.
> I heard the same thing during the whole crypto shortage. AMD cards were not good for mining and blablabla, but all cards were out of stock and scalped anyways. The reality is that AI has a huge impact on the GPU market in general. And Nvidia selling GPUs like hotcakes has an effect on prices for all cards, no matter how good they are for AI.
IIRC, AMD was cost-effective at Ethereum, since performance mainly seemed dependent on memory bandwidth.
> Wait, am I remembering it wrong? I thought for crypto, it was the AMD cards that did it better.
Looks like the 9070 XT is the card to get if you can justify the cost and are in that $500-700 range. The 9070 on the other hand is disappointing because it doesn't particularly justify its cost (as expected) and just makes the $549 price point from both AMD and nvidia look bad.
While I agree AMD should have given separate averages, they did separate the RT/raster graphs and show them side by side. One didn't need to read footnotes or anything to figure out the comparison for those, just be willing to do some addition and then division. I always cringe when looking through marketing slides, but in this case they were surprisingly clear, if not presented the way I'd have liked.
> I know you got a response to this, but I'll flat out say it: because people can't read. AMD said "~40% better than the 7900 GRE at 4K", showing some numbers and that in text, but everyone is ignoring that the game selection also included titles where the 9070 XT completely "demolishes" the 7900 GRE in RT while having single-digit framerates (slightly exaggerated, but I'm guessing you get the point). So this skews the results in its favour in a simple linear (or worse, geometric) average.
> So this is half AMD's fault for not fully clarifying that in the marketing material and leaving it to reading comprehension. I hate the "see footnote XYZ" in all of AMD's slides in particular, since they leave them all for the last portion of their presentations and never instruct (EDIT: this is my assumption; a correction is welcome) people looking at the slides to CONSIDER THOSE NOTES so the numbers get communicated with the proper context.
> Regards.
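To illustrate the skew being described above, here's a small sketch with made-up per-game numbers (not AMD's actual slide data): a couple of RT-heavy titles with huge relative gains can pull a blended average well above what the raster-only games show.

```python
from statistics import fmean, geometric_mean

# Hypothetical gains of a "9070 XT" over a "7900 GRE", as ratios (1.20 = 20% faster).
# The last two entries mimic RT-heavy titles with low absolute FPS but big relative gains.
gains = [1.20, 1.18, 1.22, 1.15, 1.25, 1.90, 2.00]

print(f"blended arithmetic mean: {(fmean(gains) - 1) * 100:.0f}% faster")
print(f"blended geometric mean:  {(geometric_mean(gains) - 1) * 100:.0f}% faster")
print(f"raster-only geomean:     {(geometric_mean(gains[:5]) - 1) * 100:.0f}% faster")
```

With those invented numbers the blended averages land around 38-41% even though the raster-only games sit near 20%, which is exactly the kind of headline-number skew being complained about.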
Yes, but they didn't do it in all slides and places when talking performance. That's why people got slightly different messages, depending on the slide they were looking at, or decided to remember, haha.
> While I agree AMD should have given separate averages, they did separate the RT/raster graphs and show them side by side. One didn't need to read footnotes or anything to figure out the comparison for those, just be willing to do some addition and then division. I always cringe when looking through marketing slides, but in this case they were surprisingly clear, if not presented the way I'd have liked.
Nvidia has had several high-profile driver snafus this year while reviewers are remarking that the 9070 drivers are the best AMD's ever had. Now might be the time to give AMD another look.
> Looks good, both should be $50 cheaper so the OEM custom editions would come in at $599/$549 instead, but I'm not sure I'd be willing to bet in excess of $600 of my own money to have to deal with AMD's drivers again. Way too many negative experiences...
Agreed, it was very nice to see such detailed reporting so soon. It's clear Alex is ecstatic over the improvements, no matter how sleep-deprived he must be at this point.
> Worth looking at Digital Foundry's YouTube channel; there is a review of FSR 4 vs Nvidia's old and new DLSS upscaling.
I saw a maximum of about 90C on the GDDR6 for the 9070 XT, which should be well within spec (TjMax is typically either 105C or maybe 110C). Maybe a few degrees warmer than on 7900 XTX cards? But there's definitely potential for luck of the draw. Maybe some cards have misplaced thermal pads or similar issues.
> @JarredWaltonGPU,
> Thanks for the review! I do have one concern, which is that I've read elsewhere that VRAM temps on some of these cards can run hot. How was it looking on the PowerColor models you tested?
Yeah, I think that's a fair assessment. I often get stuck in tunnel vision while trying to wrap up these reviews, and I'm usually sleep deprived. (I went to bed at 9am this morning.) It's interesting that AMD dropped the boost clocks so much in order to drop the power as well. Like, was a 220W target that important, as opposed to 275W? It just feels like an attempt to push people to the higher card.
> Thanks for the review @JarredWaltonGPU, excellent work as always.
> I will say this: it's definitely correct to say that the 9070 XT is the better price/performance card, no doubt about it. It's also what all the pre-release numbers made pretty clear, so, completely expected. But I'd say that the 9070 isn't MUCH worse than the 9070 XT in that regard.
> The XT is 15% faster for 9% more money. Or, if I haven't bungled the math, the non-XT is 13% slower for 8% less money, if you like looking at it from the other direction. Either way, it doesn't seem like a large decrease in price/performance. And I, for one, do like the idea of "it's close, but uses 26.6% less power" (or, the XT uses 38.2% more power).
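Those reciprocal percentages are easy to double-check; here's the conversion spelled out, using only the relative figures quoted above rather than the actual prices or wattages:

```python
# If A is x% above B, then B is 1 - 1/(1 + x) below A.
def reverse_pct(x_pct: float) -> float:
    return (1 - 1 / (1 + x_pct / 100)) * 100

print(f"15% faster       -> {reverse_pct(15):.1f}% slower the other way")
print(f"9% more money    -> {reverse_pct(9):.1f}% less money the other way")
print(f"38.2% more power -> {reverse_pct(38.2):.1f}% less power the other way")
```

The performance and price pairs line up with the comment (13.0% and 8.3%); the power pair comes out at 27.6% rather than 26.6%, so the two quoted power figures don't quite describe the same pair of wattages, though they're in the same ballpark.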
Hindsight is always 20/20. At the time the 5090 launched, and the 5080, there was still some hope things wouldn't suck as badly as they do. We don't actually know how many 5090 or 5080 cards have shipped and been sold, but it certainly feels woefully inadequate. I have talked to a few industry contacts at AIBs who say the total numbers to them seem similar to the 40-series, but that demand is just higher.
> So... I get doing this as part of your verdict:
> Cons
> - Concerns with retail availability and pricing
> But why make that a con when that wasn't even made a point for Nvidia's 5080/90 on review release? Nvidia has had a more pronounced track record over the most recent generations of having supply issues, so much so that you'd want to believe they're doing it on purpose for profit margins' sake. Jensen even said supply would be an issue this time around before release, yet that wasn't listed at the time for the 80/90 reviews, only for subsequent models. Seems a bit one-sided and typical to me.
> In any case, pointing this out as a con is a good thing, but talk about changing your narrative to fit: in the comments section for the 5070 Ti you were recently talking about how stock should get better in the next couple of weeks. Yeah, okay.
I have noted concerns with supply in the 5070 and 5070 Ti reviews for sure. I didn't list it on the 5080, but I've added it now. It was on the 5090 as well. So please, stop trying to pretend I glossed over supply. All of these reviews get written and published before actual retail availability, which makes it a guessing game as to what will actually happen. Post-5090 and 5080 launches, it became abundantly clear that supply and demand were seriously out of whack, and I don't think that's going away, so it has become an increasingly important consideration in the latest 5070 and 9070 XT reviews.
> This is so true. It was not an issue for the 5070 Ti because hey, everything will magically get resolved in two weeks, but for the AMD cards it's a real problem and is even worth being mentioned as a con. My prediction is that it's going to be a huge problem for both AMD's and Nvidia's cards, and for a while. In 2020-2022 it was crypto, now it's AI. And unlike crypto mining, AI is not going anywhere.
AMD cards were better with phases 0 and 1 (2011 and 2014), as well as phase 2 (~2017). Nvidia was hit hardest with phase 3 (2020-2021).
> Wait, am I remembering it wrong? I thought for crypto, it was the AMD cards that did it better.
> Or was it different between CryptoCraze 1 vs CryptoCraze 2?
The poor AI performance is 100% why I chose to grab a 3090 Ti over the 7900 XTX last year, despite me being an unabashed AMD fanboy. To be fair to RDNA 4, I fully expect performance to improve once software is updated to take advantage of the new INT8 support. Still, AMD has a lot of ground to cover just to catch up to Ada, let alone Blackwell, on AI. It seems at a hardware level they might've done it, but at the software level? Not even close. I don't know if AMD has anything to compete with Triton and SageAttention, for example, but on Ada cards they're a huge performance boost.
> Let's be real: anyone buying a GPU primarily for AI still isn't going to go with AMD. It's great that they've improved their AI performance, but Nvidia is still well in the lead and has far better software support.
> In case you missed it, Jarred tested AI (inferencing) performance here:
> I'm not even going to quote any of the graphs, because Nvidia beat AMD on all of them. It's really not until you get to the non-AI workstation benchmarks, where AMD has any wins. And a lot of that is just because AMD has kept a consistent focus on optimizing their workstation rendering performance.
@DS426 So the response from AMD is that only two DP outputs can be in DP2.1 mode, and a third display would drop to DP1.4 mode. Meaning all three can be used simultaneously, just not at maximum DP2.1 speeds. Which, honestly, anyone that has triple DP2.1 monitors should probably splurge on a more potent GPU, because money clearly isn't a bottleneck. LOL
> I don't even think it's a big deal for professional use. I've been running DP1.4a at 4K 240Hz with DSC for over two years and never notice any issues. I now have a DP2.1 display that can do 4K 240Hz without DSC (I think… maybe only 180 Hz without DSC?) and don't see a difference.
> DSC's "visually lossless" compression does work well in practice.
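For context on why DSC enters the picture at all, here's a rough bandwidth estimate; it assumes 10 bits per channel and approximates blanking and line-coding overheads, so the numbers are ballpark rather than exact:

```python
# Rough uncompressed video bandwidth vs. approximate DisplayPort payload rates.
def uncompressed_gbps(w, h, hz, bpc=10, blanking=1.05):
    return w * h * hz * bpc * 3 * blanking / 1e9  # RGB, ~5% blanking overhead

# Approximate usable payload after line coding (8b/10b for DP 1.4, 128b/132b for DP 2.1).
links = {"DP 1.4 (HBR3)": 25.9, "DP 2.1 (UHBR13.5)": 52.4, "DP 2.1 (UHBR20)": 77.6}

for hz in (180, 240):
    need = uncompressed_gbps(3840, 2160, hz)
    summary = ", ".join(
        f"{name}: {'OK' if rate >= need else 'needs DSC'}" for name, rate in links.items()
    )
    print(f"4K {hz} Hz 10-bit needs ~{need:.0f} Gbps -> {summary}")
```

On that math, 4K 240 Hz at 10-bit lands in the low 60s of Gbps, which is why DP 1.4a always relies on DSC for it, a UHBR13.5 link only clears 180 Hz uncompressed, and UHBR20 can do 240 Hz without DSC.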
This is totally accurate. AMD has some tools that it has created to try to get around its lack of software support — or rather, tools it acquired instead of creating. Amuse.ai is a good example. It provides an easy-to-use front-end for a bunch of AI content generation stuff. But what it doesn't do is provide the same off-the-shelf functionality as you get with Nvidia.
> The poor AI performance is 100% why I chose to grab a 3090 Ti over the 7900 XTX last year, despite me being an unabashed AMD fanboy. To be fair to RDNA 4, I fully expect performance to improve once software is updated to take advantage of the new INT8 support. Still, AMD has a lot of ground to cover just to catch up to Ada, let alone Blackwell, on AI. It seems at a hardware level they might've done it, but at the software level? Not even close. I don't know if AMD has anything to compete with Triton and SageAttention, for example, but on Ada cards they're a huge performance boost.