AMD Radeon RX 9070 XT and RX 9070 review: An excellent value, if supply is good

I watched the HUB review, since it came out first. The short of it: 9070 XT = 7900 XT (not XTX), both in perf and power consumption. So if you think Nvidia (Huang) lied about the 5070's perf, then AMD also lied.

AMD said it was at parity with the 5070 Ti, which it is!

[Image: 2160p benchmark chart]


Jensen said the 5070 was FASTER than the 4090, which is a blatant lie.
 
One thing to remember: there are lies, damned lies, and statistics. Depending on your test suite and your wording, you can get any result you want and push any agenda you wish.
It was compared to the GRE because of its MSRP... $550...

You are the one being dishonest here, with the fanboy agenda.

Matter of fact, this GPU is the best value on the market... AND... will be the only thing available in the channel.
 
You misunderstood; I'm not being dishonest. I expected it to be around the 7900 XT. That it is quicker and approaches the XTX in some cases is impressive. The reported boost to RT is really good.

I agree about value, so long as the RRP holds.

My criticism, if any, is in the way HUB presented its data.
 
Good review. If I were in the market for a GPU, I definitely would be hunting for a 9070 XT. However, I was fortunate enough to find a 7900 XTX for $800 in December. Nice to see it seems to sit between the 9070 and the XT in RT and still has the edge in raster with an extra 8GB of VRAM. I think I'll definitely be sitting on the 7900 XTX through this generation, especially if they bring FSR 4 to the 7000 series, which I've heard they might do later. But good job on this one, AMD. You've already got my $$ for this generation, but if I were GPU shopping, the XT would likely be the pick.
 
This is not accurate.
The 9070 XT is closer to the 7900 XTX in performance than to the 7900 XT.
The 9070 is faster than the 5070.

Nvidia, without price adjustments, is no longer competitive in this market segment, IMO.
They are no longer competitive in anything. The only thing they have is the halo product that is the 5090, if you really want absolute performance.

Everything below that is either a 4080 in disguise or a crippled offering (the 5070).

I have no hope whatsoever that the 5060 and 5060 Ti will be any better than the 5070.
 
While it all looks okay, anyone saying a 600-dollar GPU is saving PC gaming has several screws loose, again.
But it seems the real battle will be the supply chain. AMD will be the winner if you can actually walk into a Best Buy and purchase one, or get one on Amazon or Newegg, without having to be on the lookout for stock notifications for the next three months.
 
How do you define mainstream gaming? I still don't think these cards are mainstream, like at all.

In the Steam Hardware Survey, the 50th-percentile gaming PC is playing at 1080p with <8GB of VRAM and 16GB of RAM (with most of the other half at 32GB, which was a "massive swing").
It's disappointing that PC gaming has been stuck at 1080p for like a decade, but it is what it is.

From a typical user perspective, these 1440p60+ ultra cards are targeting a market segment that is higher than mainstream.

Another way of looking at what is mainstream is by what is popular:
[Image: February 2025 Steam Hardware Survey, top 10 GPUs]


The top 10 GPUs are definitely getting more expensive lately, especially since Nvidia halted manufacture of <$300 gaming cards - but only 2/10 cross the $500 barrier, and you're still looking at an average MSRP of around $400-$450, depending on which 3060 version and how you price a laptop GPU.

February is an outlier, though.

January was closer to typical, with only one >$500 MSRP card in the top 10 and an average of around $360.
[Image: January 2025 Steam Hardware Survey, top 10 GPUs]
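For anyone who wants to reproduce that kind of back-of-envelope figure, here's a minimal sketch of how an "average MSRP" for a top-10 list can be computed. The card names, prices, and survey shares below are made-up placeholders, not actual Steam Hardware Survey data; the point is just the difference between a simple mean and a share-weighted mean.

```python
# Hypothetical sketch: estimating an "average MSRP" for a top-10 GPU list.
# Names, launch prices (USD), and survey shares (%) are illustrative
# placeholders, NOT actual Steam Hardware Survey data.
cards = [
    ("GPU A", 299, 5.1),
    ("GPU B", 329, 4.8),
    ("GPU C", 399, 3.9),
    ("GPU D", 549, 3.2),
    ("GPU E", 249, 2.9),
]

# Simple mean treats every card in the list equally.
simple_mean = sum(msrp for _, msrp, _ in cards) / len(cards)

# Share-weighted mean counts popular cards more heavily.
total_share = sum(share for _, _, share in cards)
weighted_mean = sum(msrp * share for _, msrp, share in cards) / total_share

print(f"Simple mean MSRP:   ${simple_mean:.0f}")
print(f"Weighted mean MSRP: ${weighted_mean:.0f}")
```

Which of the two you pick shifts the answer, since the cheapest cards also tend to have the biggest shares.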


A third way to look at it is to just say "Nvidia controls the market and is smarter than us, so mainstream gaming is whatever they theoretically say their xx60 cards are worth." Right now that's either $300 or nothing, because there is no RTX 5060, and RTX 4060 production has probably been stopped in favor of higher-margin cards.

Steam surveys are not the best metric, and applying US MSRP pricing to a global snapshot of usage is not so good either - but I'm not finding any way of looking at it where it's justified to start calling $550 "mainstream".
I don't think it's a good idea to base what we call mainstream on the currently offered product stack though, because people do not have to buy a GPU to play games. If Nvidia stopped making every card except $2000 5090s, that would not suddenly turn the RTX 5090 into an entry level card. PC gaming is not an essential, and other forms of entertainment are easily available everywhere, for free. When people are faced with the choice of a $500+ gaming GPU and nothing... most of them are going to choose nothing... Or, they choose a PlayStation.

I think a debate over what counts as mainstream gaming *today* should start somewhere in the ballpark of a 1080p60 ultra card in the $300-$500 price range.
The RX 9070 XT and RX 9070 just don't meet that definition.

Of course, all this is just limiting the topic to dedicated PC gaming and ignoring true mainstream gaming, because 99%+ of all games are just played on people's phones.
 
While it all looks okay, anyone saying a 600-dollar GPU is saving PC gaming has several screws loose, again.
But it seems the real battle will be the supply chain. AMD will be the winner if you can actually walk into a Best Buy and purchase one, or get one on Amazon or Newegg, without having to be on the lookout for stock notifications for the next three months.

There's the key. This generation the bar is lower, IMO. The key point is that they don't really have to win the performance battle. As long as they can win the supply battle and actually keep a greater number of cards in stock at close to MSRP, they may win by default.

In other words, let's say you're shopping for a 5070 Ti, but for a month or two at a time they are out of stock, while you can find these more regularly at $600-650. Knowing these are something like 2% slower than the 5070 Ti for less money, I would think one of these would be a tempting offer if you need a GPU: a card that you can actually buy and that is close to the one you wanted.

I don't know if this will happen, but I'd like to see them give Nvidia the old one-two and bring out 9060 cards with 12GB of VRAM as the baseline. That would put pressure on Nvidia in the entry-level segment.
 
Thanks for the review @JarredWaltonGPU, excellent work as always.

And it makes me feel pretty good about this release from AMD. I'm not in the market (I've BARELY made my RX 6700 do anything yet... I need to get back into gaming), but this feels like it's better than I was expecting.

I will say this: it's definitely correct to say that the 9070 XT is the better price/performance card, no doubt about it. It's also what all the pre-release numbers made pretty clear, so, completely expected. But I'd say that the 9070 isn't MUCH worse than the 9070 XT in that regard.

XT is 15% faster for 9% more money. Or, if I haven't bungled the math, the non-XT is 13% slower for 8% less money, if you like looking at it from the other direction. Either way, it doesn't seem like a large decrease in price/performance. And, I, for one, do like the idea of "it's close, but uses 26.6% less power." (or the XT uses 38.2% more power)
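As a quick sanity check on those reciprocal percentages, here's a small sketch. It assumes the ~15% perf gap from above, the $599/$549 MSRPs, and the 304W/220W total board power specs rather than measured draw. One note: the "percent less" and "percent more" figures are reciprocals of the same ratio, so they should pair as 27.6%/38.2% (from the specs) or 26.6%/36.2%; measured numbers in the review may land slightly differently.

```python
# Sanity check on "X% faster" vs. "Y% slower" reciprocal math.
# Perf gap (~15%) and MSRPs ($599 / $549) come from the post above;
# power uses the 304W / 220W total board power specs, not measured draw.
def pct_more(a: float, b: float) -> float:
    """How much larger a is than b, in percent (negative = smaller)."""
    return (a / b - 1) * 100

perf_xt, perf_non = 1.15, 1.00
price_xt, price_non = 599, 549
power_xt, power_non = 304, 220

print(f"XT perf:  +{pct_more(perf_xt, perf_non):.1f}% "
      f"(non-XT: {pct_more(perf_non, perf_xt):.1f}%)")
print(f"XT price: +{pct_more(price_xt, price_non):.1f}% "
      f"(non-XT: {pct_more(price_non, price_xt):.1f}%)")
print(f"XT power: +{pct_more(power_xt, power_non):.1f}% "
      f"(non-XT: {pct_more(power_non, power_xt):.1f}%)")
# Prints roughly: +15.0% / -13.0%, +9.1% / -8.3%, +38.2% / -27.6%
```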

If you don't really need that extra 15% performance, the lesser heat and power draw could be very compelling, considering there isn't too much of a loss of price/performance.

Or maybe I'm just the guy always looking at "Can I live with a step lower, and do I need max settings?"
 
So...I get doing this as part of your verdict:

Cons

  • Concerns with retail availability and pricing

But why make that a con when it wasn't even made a point for Nvidia's 5080/5090 at review release? Nvidia has had a more pronounced track record over the most recent generations of supply issues, so much so that you'd want to believe they're doing it on purpose for profit margins' sake. Jensen even said supply would be an issue this time around before release, yet that wasn't listed at the time for the 80/90 review, only for subsequent models. Seems a bit one-sided and typical to me.

In any case, pointing this out as a con is a good thing, but talk about changing your narrative to fit: in the comments section regarding the 5070 Ti, you were recently saying stock should get better in the next couple of weeks. Yeah, okay.
 
Oh, meant to address this, but forgot in my last post:
The 9070 XT meanwhile ends up using more power across the test suite compared to the 5070 Ti. That's interesting, as Nvidia uses more power with the 5070 relative to the 9070, while the 5070 Ti offers more performance than the 9070 XT while drawing less power.

This was definitely a very big what-the-heck for me - Nvidia's bigger card is the more efficient one, whereas for AMD it's the smaller card? I suppose we've seen "the lower-power card is less efficient" before, when the lower-powered card was a cut-down version of a bigger one with clocks pushed higher (past their sweet spot) to make up for fewer hardware resources. But it looks like the 5070 and 5070 Ti have about the same clocks, so it's a bit confusing here.

For AMD, on the other hand, 9070 vs 9070 XT efficiency difference makes sense, given the significantly higher clock speed of the XT.
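For reference, the "efficiency" being compared here is just performance divided by power draw. A minimal sketch of how the ranking falls out; the fps and wattage values are placeholders I made up to mirror the shape of the discussion, not the review's measured data.

```python
# Illustrative performance-per-watt ranking. The fps and wattage values
# are placeholders for the shape of the comparison, NOT review data.
cards = {
    "RTX 5070":    {"fps": 100, "watts": 250},
    "RTX 5070 Ti": {"fps": 125, "watts": 285},
    "RX 9070":     {"fps": 105, "watts": 220},
    "RX 9070 XT":  {"fps": 120, "watts": 300},
}

# Sort by fps per watt, most efficient first.
ranked = sorted(cards.items(),
                key=lambda kv: kv[1]["fps"] / kv[1]["watts"],
                reverse=True)
for name, d in ranked:
    print(f"{name:12s} {d['fps'] / d['watts']:.3f} fps/W")
```

With placeholder numbers like these, the 9070 and 5070 Ti top the list while the 5070 and 9070 XT trail, which is the pattern the post above is puzzling over.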
 
My take on this is that the actual 9070 XT models are vastly different. The XFX Mercury is running a 3100MHz boost clock, which gives you better out-of-the-box performance than, say, the one HUB and GN were using.

That said, as Steve noted in HUB's conclusion, the XFX Mercury magnetic-fan model is hitting $750 to $800, which defeats the purpose of the $600 price tag.

Not sure I'm sold on the 9070 XT. I really was expecting MUCH better RT performance to make the switch from my 7900 XTX Red Devil to the 9070 XT: a drop in raster, yes, but for a decent boost in RT, and I just didn't see it.

I'm not disappointed that RT isn't AMAZING; I just can't see a decent uplift in RT versus the 7900 XTX.
 
@JarredWaltonGPU ,

Thanks for the review! I do have one concern, which is that I've read elsewhere that VRAM temps on some of these cards can run hot. How was it looking on the PowerColor models you tested?
TPU covered it with all of the variants they had, and while it seems okay, it does make me wonder about the design (AMD is using Hynix 20Gbps GDDR6, so maybe that's it?), with the GPU core running that much cooler:
https://www.techpowerup.com/review/sapphire-radeon-rx-9070-xt-nitro/39.html
https://www.techpowerup.com/review/sapphire-radeon-rx-9070-xt-nitro/40.html
 
The awakening might be brutal. We are already talking about $850 for the 9070 XT (according to JayzTwoCents). Those MSRP prices might not be seen anytime soon. You are all praising AMD right now, but we might end up with a card slightly more powerful than the 7900 XT starting at $850. I would wait before saying Nvidia is cooked.

In any case, pointing this out as a con is a good thing, but talk about changing your narrative to fit: in the comments section regarding the 5070 Ti, you were recently saying stock should get better in the next couple of weeks. Yeah, okay.
This is so true. It was not an issue for the 5070 Ti because, hey, everything will magically get resolved in two weeks, but for the AMD cards it's a real problem and is even worth mentioning as a con. My prediction is that it's going to be a huge problem for both AMD's and Nvidia's cards, and for a while. In 2020-2022 it was crypto; now it's AI. And unlike crypto mining, AI is not going anywhere.
 
Looks like the XTX is between the 2 cards.
The RX 7900 XTX still beats the RX 9070 XT in raster geomean at all resolutions.


It's only when you introduce RT that it falls below the 9070 XT. And not even by that much - only about 10% or so.
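Side note on "geomean," since comparisons like that rely on it: assuming the chart computes it the usual way, it's the geometric mean of per-game results, which keeps one outlier game from dominating the average the way an arithmetic mean would. A minimal sketch with made-up fps numbers:

```python
# How a raster "geomean" is typically computed: the geometric mean of
# per-game average fps. The numbers below are made up for illustration.
import math

fps_card_a = [142, 98, 176, 88, 120]   # hypothetical per-game fps
fps_card_b = [130, 105, 160, 95, 112]

def geomean(values):
    """Geometric mean via logs, to avoid overflow on long lists."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

ga, gb = geomean(fps_card_a), geomean(fps_card_b)
print(f"Card A: {ga:.1f} fps, Card B: {gb:.1f} fps")
print(f"Card A leads by {(ga / gb - 1) * 100:.1f}%")
```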

I've got a 7900 XTX as well, and personally I think I'm just going to sit on the XTX until another card that is worth the cash comes up.
It'll be viable for a long time, especially with 24 GB of VRAM.

It sounded to me like you bought the 7900 XTX because it was a good deal and I think it was. You knew it wasn't the fastest card available - it wasn't then and it isn't now. So, I think you should be satisfied with your choice and not stress over it.
 
My prediction is that it's going to be a huge problem for both AMD's and Nvidia's cards, and for a while. In 2020-2022 it was crypto; now it's AI. And unlike crypto mining, AI is not going anywhere.
Let's be real: anyone buying a GPU primarily for AI still isn't going to go with AMD. It's great that they've improved their AI performance, but Nvidia is still well in the lead and has far better software support.

In case you missed it, Jarred tested AI (inferencing) performance here:

I'm not even going to quote any of the graphs, because Nvidia beat AMD on all of them. It's really not until you get to the non-AI workstation benchmarks that AMD has any wins. And a lot of that is just because AMD has kept a consistent focus on optimizing their workstation rendering performance.