Review AMD Radeon RX 9070 XT and RX 9070 review: An excellent value, if supply is good

Manufacturing the chips here isn't going to avoid the tariffs, because the chip packaging, GPU assembly, RAM, and motherboards are all being made in Asia. The world will go on its merry way, and US consumers won't even realize they're paying a 20%+ federal sales tax. Fabs only get built here when they're supported by the government (CHIPS Act, massive TIF and tax rebates). Have we forgotten about the MASSIVE Foxconn campus in Wisconsin already?
 
Ticks all my boxes. Massively faster in raster than my 6800 XT, even bigger improvements in RT (even over the 7900 XTX), lower power, massively more capable for AI (1554 TOPS vs. ~60-70 TOPS), lower cost ($599 vs. $699 at release), and FSR4 support. Even the 9070 makes the 7800 XT look rubbish, which it sort of is, being a 7700 XT in all but name.

I'm in no hurry to buy one, and hopefully by waiting 3-6 months drivers will only improve and supply will be decent.

Could AMD technically build a larger monolithic GPU with, say, 6,000 cores and 24GB? Hell, a 9080 XT/9090 XT would be very impressive.
 
Hindsight is always 20-20. At the time the 5090 launched, and the 5080, there was still some hope things wouldn't suck as bad as they suck. We don't actually know how many 5090 cards or 5080 cards have shipped and been sold, but it certainly feels woefully inadequate. I have talked to a few industry contacts at AIBs that say the total numbers to them seem similar to the 40-series, but that demand is just higher.

Nvidia has claimed that it's producing a lot more 5070 Ti and 5070 cards, but without hard numbers. Like for the 5070 Ti launch, someone said (paraphrasing), "Oh, we've produced a lot of cards and I think that anyone who wants a 5070 Ti and is ready to buy at launch will be able to get one at close to MSRP." Obviously, that proved to be incredibly naïve or just outright gaslighting. And the 5070 didn't do much better this morning (though I could have nabbed a card for $599 if I had wanted — it was at least still available at Newegg for maybe 5~10 minutes after launch).

It's a combination of Nvidia draining down the 40-series supply by just not making many of them last year, combined with everyone who was on the 30 series and looking for an upgrade. I myself had a 3080 from EVGA (I miss those guys) and skipped the 40 series because of the segmentation shenanigans Nvidia pulled. I was waiting for the 50 series, watched the 5090 do its halo thing, then when the 5080 landed I saw the writing on the wall and immediately grabbed the next best thing, a 7900 XTX. A couple of days later prices exploded and supply vanished as everyone else who was waiting did the exact same thing.
 
I'm hoping the next generation from AMD brings them close enough to parity with Nvidia, unless Nvidia gets back to actually catering to gamers and modest creators. This is a great stretch forward for AMD's tech and looks very promising for the future. If I saw more 4K 60 fps in the ray tracing arena from the 9070 XT, I would probably be asking the wife if I could burn $600 to upgrade my alternate PC. It's kind of compelling as it stands, compared to what Nvidia has put forward this generation, but I think I can keep gaming at 1080p for now on my alternate.
 
The AMD Radeon RX 9070 XT looks set to deliver highly competitive mainstream performance at a great price — perhaps too great. We'll need to see what retail availability looks like, but the performance in rasterization, ray tracing, and AI workloads has improved a lot, closing the gap between AMD and Nvidia GPUs.

AMD Radeon RX 9070 XT and RX 9070 review: An excellent value, if supply is good : Read more
Seems like no one wants to win this gen. Waiting for the next one and maybe it will be more than vaporware.
 
It's a combination of Nvidia draining down the 40-series supply by just not making many of them last year, combined with everyone who was on the 30 series and looking for an upgrade. I myself had a 3080 from EVGA (I miss those guys) and skipped the 40 series because of the segmentation shenanigans Nvidia pulled. I was waiting for the 50 series, watched the 5090 do its halo thing, then when the 5080 landed I saw the writing on the wall and immediately grabbed the next best thing, a 7900 XTX. A couple of days later prices exploded and supply vanished as everyone else who was waiting did the exact same thing.

I was able to get a 7900xtx for $800 from Microcenter the day after Christmas. I must have gotten one of the few they had left for that money. Pretty happy with that card.

I was watching my local Microcenter's website when the 5080 dropped, and I think a lot of other folks did what you did, because that day they suddenly ran out of the 5000-series cards and then seemed to run out of the high-end AMD cards as well. I'm assuming that people who were there and couldn't get a 5000-series card just started buying up whatever high-end GPUs they could.
 
Nvidia brags all the time about having more software engineers than hardware engineers. It's why CUDA is winning so badly right now. It's why DLSS is winning. AMD always seems to fall back to hardware first. Lisa Su was involved with the PS3, a notoriously difficult platform to program for, made so much worse by the lack of proper software support. And even now, over a decade later, when nearly everyone agrees that, yes, the PS3 was an interesting design that was ahead of its time in some ways but severely lacking in software support... Su doesn't seem to want to acknowledge that.
I do think that's a reach.

Very interesting read! Thanks for sharing! TBH, I wasn't very surprised, although I had expected software support for their flagship CDNA product would be in better shape than what we see with ROCm on consumer GPUs.

I also read part of that interview the commentator is talking about. It's quite long, though. I think the commentator has some points, but I diverge partway through their post. I think it's most likely that Lisa was being guarded so as not to give away any mea culpa quotes. I think it's fair to say that AMD underestimated and under-resourced their software efforts, but I'm sure it's not so much a lack of awareness or self-reflection.

Furthermore, it's quite obvious that AMD does care about ROCm and they do care about FSR. And I think the ROCm vs. PyTorch quote was taken out of context.

IMO, the core problem is this: why did AMD think they could beat Nvidia at their own game? This has been notoriously difficult in the 3D graphics realm, which has been a focus for far longer. And you really have to ask yourself some hard questions about your tactics, if this is going to be your approach. It's a David and Goliath battle, meaning you have to be more clever, more resourceful, and try to find every possible advantage you can. To approach it as a mere slug-fest is a sure way to lose.

The solution, IMO, is that AMD needs to hire about 10X more software people for its GPU division. I don't know, maybe only 3X would suffice, but whatever it has right now isn't enough.
10x seems way over, but I could easily see them being off by a factor of 2. Maybe more, if their strategy is really just to beat Nvidia at its own playbook.

BTW, AMD did recently open new engineering offices in Serbia. I would note that a lot of tech talent fled there from Russia over the past few years.

And even if it hired 5,000 people today as an example, it would take months to get them all up to speed. So this problem isn't going away any time soon...
I think Wall St. has figured this out, which is why we saw the correction in AMD's stock price, and it's been on a long, slow slide since last Oct.

Gee, I don't know why devs prefer CUDA to ROCm!
They always will. The fundamental problem is that ROCm's HIP will always be an off-brand CUDA wannabe. They will always be playing catchup and basically the best they can do is just be as good as CUDA - maybe a little faster.

That said, AMD isn't wrong to say that a lot of AI users don't directly use ROCm and don't really care what's under PyTorch, as long as it works and it's fast.


P.S. I think maybe AMD had the wrong strategy. Perhaps they could've sued Nvidia for using its monopoly power to lock customers out of other hardware. A victory here could've enabled AMD to integrate with Nvidia's CUDA stack at the PTX level. That gets AMD mostly out of the software race, where they've had the most trouble playing. However, I see little chance of that happening now.
 
I saw a maximum of about 90C on the GDDR6 for the 9070 XT, which should be well within spec (TjMax is typically 105C, maybe 110C). Maybe a few degrees warmer than on 7900 XTX cards? But there's definitely potential for luck of the draw. Maybe some cards have misplaced thermal pads or similar issues.
This reminds me a bit of the reasoning behind NVIDIA's decision to remove external reporting of hotspot temps. They can cause unneeded worry/concern. I don't agree with the decision - I always think more data is better than less - but I can understand it, at least.
I limit my 7900 XTX to 400W, but even so, with my tweaking, it still reaches a max hot spot and memory junction temp of 98°C, with the base GPU temperature more than 30°C cooler. I did take mine apart and put a KryoSheet in place of the core thermal paste, but even before that I saw a difference of 20°C.

I don't think memory junction temps or hot spot temps are a good indication of an issue with GPUs, at least not anymore.
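The headroom math being discussed can be sketched in a few lines. All the numbers here are illustrative assumptions based on the figures quoted in this thread (a ~90C junction reading, TjMax limits typically quoted around 105-110C), and the `headroom` helper is just my own naming, not any vendor tool:

```python
# Minimal sketch of the thermal-headroom arithmetic discussed above.
# TjMax values are the commonly quoted GDDR6 limits; observed temps
# are the readings mentioned in this thread, not fresh measurements.

def headroom(observed_c: float, tjmax_c: float = 105.0) -> float:
    """Degrees Celsius of margin left before the rated junction limit."""
    return tjmax_c - observed_c

# ~90 C memory junction on a 9070 XT vs. a 105 C TjMax:
print(headroom(90.0))          # → 15.0
# 98 C memory junction on a tuned 7900 XTX vs. a 110 C TjMax:
print(headroom(98.0, 110.0))   # → 12.0
```

Either way, double-digit margin to TjMax supports the point that these readings alone don't indicate a problem.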
 
I watched the HUB review, since it came out first. The short of it: 9070XT = 7900XT (not XTX), both in perf and power consumption. So if you think Nvidia (Huang) lied about 5070's perf, then AMD also lied.

People are still clueless about marketing. It doesn't lie. It stretches the truth (exaggerates). That's its job. Every company does it. If you're worked up over it, and think that one side lies and the other doesn't, you've been suckered.

About pricing: Now we see the real reason for the XT's $600 pricing, a climb-down from the anticipated $650-700. The XT is not as fast as the Ti, and it uses more power. A $50 price difference wouldn't have mattered, so the XT got a haircut and the difference is now $150, which theoretically allows the XT to win on bang/buck, since it can't win on the bang.

The price drop wasn't about AMD being nice to gamers. It's just competitive positioning. If you think one company cares about you and the other doesn't, that's just another lie. But this time, it's you lying to yourself.

I say "XT theoretically wins," because dollars to donuts MSRP parts will be instantly OOS just like 5070 is. If you think the 2-month "stockpile" can overcome scalpers, I envy you your optimism.

Yes, the conventional wisdom is to wait for prices to "settle" and inventory to "catch up." People have short memories, and they've forgotten how it was during the crypto boom. Inventory won't catch up. The clue is that all the alternatives, previous-gen parts, are also OOS or marked up to heaven. Demand will rise as we get into the holiday season, when people traditionally buy electronics. Your best chance is to do what scalpers do and use a buy bot, because it won't get better.
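The bang/buck argument above can be sketched numerically. The MSRPs ($599 vs. $749, a $150 gap) come from the thread; the relative-performance numbers are placeholder assumptions only (normalized so the 5070 Ti = 100, with the XT assumed a few percent slower), and `perf_per_dollar` is my own helper name:

```python
# Minimal sketch of the "bang/buck" comparison. Performance figures
# are illustrative assumptions, not benchmark results; only the MSRPs
# are taken from the discussion above.

def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Performance points per dollar of MSRP."""
    return relative_perf / price

cards = {
    "RX 9070 XT":  (95.0, 599.0),   # assumed slightly slower than the Ti
    "RTX 5070 Ti": (100.0, 749.0),
}
for name, (perf, price) in cards.items():
    print(f"{name}: {perf_per_dollar(perf, price):.3f} perf/$")
```

Under these assumed numbers the XT comes out ahead on perf/$ despite losing on raw perf, which is exactly the positioning argument: the $150 gap, not generosity, does the work.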
So the 9070 XT beats the 7900 XT in raster and RT, beats the 7900 XTX in RT while being only 5% slower in raster, and uses 20-25% less power for a lot less money, and you say AMD lied? All the pricing talk was BS speculation. AMD never climbed down from anything; they never once said what the price was going to be before Feb 27/28. Forums went into meltdown over nothing. So basically AMD cannot do anything to please you, other than give you a card for almost nothing, I guess.
 
Also my opinion: AMD just needs to focus on driver support, forget the high end, and focus on market share, and not do an Icarus, fly too close to the high end, and make a rear end of it. These two cards are a great start, and with more driver focus they can then add some features. Hopefully their team is realizing that they need to brush up on drivers and this isn't just a one-hit wonder. They desperately need something in the $100-200 area which just has to be good enough at 1080p to swing at Intel and Nvidia scraps.
 
XFX Swift, Powercolor Hellhound and Sapphire Pulse are going to be MSRP models... most likely...
What makes you think those will be available at all? The OC cards are as bad a deal as the basic 9070. If stock-priced 9070XTs make up less than 5% of production you will never see them. That will raise the 'default' 9070XT price to $700 which is too much to woo any Nvidia owners. They will just wait another month or two until Nvidia can boost production.
 
Only thing I'm worried about: AMD drivers, compatibility with old games, new games, crashing, etc. The hardware looks good, but what about the software side?
 
Only thing I'm worried about: AMD drivers, compatibility with old games, new games, crashing, etc. The hardware looks good, but what about the software side?

Other than the rare issue, AMD drivers have been fine for a long time. I bet this line just gets repeated by people who haven't used AMD in 20 years. Blackwell has driver issues, so Nvidia is not immune. When it comes to old games, it's Intel that has problems. They have half-assed everything below DX11.
 
Other than the rare issue, AMD drivers have been fine for a long time. I bet this line just gets repeated by people who haven't used AMD in 20 years. Blackwell has driver issues, so Nvidia is not immune. When it comes to old games, it's Intel that has problems. They have half-assed everything below DX11.
Nvidia fanboys love to complain about AMD drivers as if it's the last shred of dignity they have left after being shafted by Nvidia gen after gen!

My first and only Nvidia GPU was the 1070 in an Alienware Aurora R7, 10+ years ago (my first PC).

Since then I've built my own.

I have been using AMD GPUs for 8+ years now!

The first card I ever bought was the 5700 XT Red Devil, followed by the 6900 XT Red Devil, 6650 XT Sakura Hellhound, 6700 XT Spectral White Hellhound, and the 7900 XTX Red Devil, and like you said, the odd rare hiccup but otherwise spot-on drivers!

Not saying they're perfect, but Nvidia are not much better...

For the most part I think it's fanboys still hurt from when the 5000 series had some really bad drivers!
 
For the typical average gamer, AMD fixed the drivers at least 5 years ago and never looked back.
The shtick about them having bad drivers is propaganda from people trying to discredit AMD to feed their own confirmation bias that they made the correct purchase.
No one likes to feel, or be made to feel, that they made a bad buy, or not the 'best' buy. It doesn't matter what it is: automobiles, power tools, washer and dryer sets, and certainly PC hardware.
 
For the typical average gamer, AMD fixed the drivers at least 5 years ago and never looked back.
The shtick about them having bad drivers is propaganda from people trying to discredit AMD to feed their own confirmation bias that they made the correct purchase.
No one likes to feel, or be made to feel, that they made a bad buy, or not the 'best' buy. It doesn't matter what it is: automobiles, power tools, washer and dryer sets, and certainly PC hardware.

This is such a dead-horse discussion. Anyone who says AMD drivers are trash has either A) not used a 6xxx or better AMD GPU, B) issues stemming from something other than their GPU, or C) rabid fanboi-ism with severe confirmation bias.

I ran my 6600, 6600 XT, and 6800 XT in the same system, and not once did I feel that drivers made the cards inferior to their Nvidia counterparts. In fact, I had better performance from my 6600 than I did from a loaner 3060.

BTW, I am anything but an AMD fanboi; I was seriously looking at a 40-series GPU and am now weighing up the pros and cons of the 9070 XT vs. the 5070 Ti. I just get super annoyed with the driver bashing.
 
I would like to see every article about video cards be cancelled until manufacturers get their act together and provide the items at a guaranteed MSRP. We consumers have ridden this crazy train of greed and profit long enough. Stop the train. Stop the stories. Why read about a card that can't be bought, even at double the fake price?
 
This is such a dead-horse discussion. Anyone who says AMD drivers are trash has either A) not used a 6xxx or better AMD GPU, B) issues stemming from something other than their GPU, or C) rabid fanboi-ism with severe confirmation bias.

I ran my 6600, 6600 XT, and 6800 XT in the same system, and not once did I feel that drivers made the cards inferior to their Nvidia counterparts. In fact, I had better performance from my 6600 than I did from a loaner 3060.

BTW, I am anything but an AMD fanboi; I was seriously looking at a 40-series GPU and am now weighing up the pros and cons of the 9070 XT vs. the 5070 Ti. I just get super annoyed with the driver bashing.
The last time I saw a visual glitch in an AMD driver was with the 7800-series cards (HD 7800): a texture flickered when moving through the map.

Hardware-wise I have had no blue-screen crashes on Windows NT-based operating systems using ATI/AMD or Nvidia; that's Windows 2000 to Windows 11.

The Pitcairn 7800 card was partnered with a second in CrossFire; they ran near flawlessly, with the texture flicker being the only bugette.
Nvidia had a bug with the 3070 and dual monitors: it wouldn't clock down when idle. I wound back to a driver that allowed that function to work properly. No idea if/when they fixed it. In use in games it has been equal to the AMD drivers.

Neither side can risk producing bad software. Fanboys on both sides will gleefully foam at the mouth in their respective echo chambers and some of that noise will spill out into the (new) enthusiast’s consciousness.

GPU/VGA card history
1993, Cirrus on a 486 SX 25
ATi on Intel Atlantis, P120 (motherboard graphics)
ATI Rage on Tyan Titan, P233
Voodoo, GeForce 1 and 3, Via motherboard, Athlon 750 (slot)
ATI 9550, Athlon XP,
1950pro, Athlon x2
Pitcairn, Phenom 2 and 8350
R9 - 390 on 8350 and 4790k
2060 on 2700x
3070 on 3900x
7900xt on 9900x

Only once did I see a repeatable crash. The GeForce 1 was not at fault. The Athlon 750 was mounted on an Irongate (750/751 chipset) motherboard. There was a bug in the AGP timings which caused the video to repeatedly crash. AMD were helpful in identifying this, and the return of the part to the vendor was eased by their email trail.
 
XFX Swift, Powercolor Hellhound and Sapphire Pulse are going to be MSRP models... most likely...
Multiple retailers have already gone public that MSRP models will only be available in the first batch of launch cards. Whenever the first batch is gone, no more MSRP cards. Welcome to the new reality. The challenges of the current market don't apply to just Nvidia.

https://videocardz.com/newz/retaile...o-first-shipments-price-set-to-increase-later

If you look at Best Buy's website, there is one MSRP XFX card for each model and they both have a 16 hour discount counter counting down. After that the 9070 goes up $80 and the 9070XT increases $130.
 
Best Buy had a "coming soon" for the 9070 XT @ $599.99, so I was refreshing every 2-3 minutes. It went from "coming soon" to "out of stock." I even put myself on their product alert. It did not matter because they sold out in seconds. I know it will get better, but it's going to be at least 2-3 months before you can find them on Amazon.