News AMD estimates of Radeon RX 9070 XT performance leaked: 42% – 66% faster than Radeon RX 7900 GRE

If they're keen on comparing it to the 7900GRE, then does that mean it'll be hovering around the same MSRP? Would that be a reasonable assumption?

The launch MSRP of the 7900GRE, for comparison, was $550. If AMD does indeed launch the 9070XT at that price, that's a no-arguments winner this generation.

Come on AMD, you said you wanted market share. This is it.

Regards.
 
The launch MSRP of the 7900GRE, for comparison, was $550. If AMD does indeed launch the 9070XT at that price, that's a no-arguments winner this generation.
I certainly hope that is what this means. I was hoping for it to come in at $499 but I can see myself paying $550 for this performance level. I am cautiously optimistic for this release.
 
The launch MSRP of the 7900GRE, for comparison, was $550. If AMD does indeed launch the 9070XT at that price, that's a no-arguments winner this generation.

Even if it turns out that the launch price is higher than the one you mentioned, the improved ray tracing performance of this card is a good sign by itself.
 
I managed to copy the table out of their HTML source and paste it into Excel. From there, I could compute the following speedups on RT vs. non-RT games:

RX 9070 (non-XT) vs. RX 7900 GRE:

Category        1440p Ultra    4K Ultra
Raster          +17%           +19%
Raytracing      +26%           +26%


RX 9070 XT vs. RX 7900 GRE:

Category        1440p Ultra    4K Ultra
Raster          +33%           +37%
Raytracing      +50%           +53%

In both cases, we can clearly see the bigger speedup is on RT, which is in line with what has been revealed so far - that RDNA4 mostly improves in the areas of RT and AI.
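For anyone who wants to redo the arithmetic without Excel: the speedup is just the ratio of the two cards' average frame rates, minus one. A quick Python sketch; the FPS values below are placeholders chosen to illustrate the calculation, not AMD's actual per-game numbers:

```python
# Percentage speedup of a new card over a baseline, per category.
# FPS values are illustrative placeholders, NOT AMD's leaked figures.

def speedup_pct(new_fps: float, base_fps: float) -> float:
    """Return the percentage speedup of new_fps over base_fps."""
    return (new_fps / base_fps - 1.0) * 100.0

# Hypothetical average FPS per category (baseline = RX 7900 GRE)
results = {
    ("Raster",     "1440p Ultra"): (117.0, 100.0),
    ("Raster",     "4K Ultra"):    (59.5,  50.0),
    ("Raytracing", "1440p Ultra"): (63.0,  50.0),
    ("Raytracing", "4K Ultra"):    (37.8,  30.0),
}

for (category, res), (new, base) in results.items():
    print(f"{category:10s} {res:12s} +{speedup_pct(new, base):.0f}%")
```

With real per-game numbers, you'd want a geometric mean of the per-game ratios rather than a simple average, which is how most review sites aggregate.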
 
Seems a comparison to the current AMD flagship GPU (RX 7900 XTX) would have been more appropriate given a new generation of product.
But it's not meant to be a flagship-tier GPU. AMD said (much like Polaris, if you remember the RX 400 and RX 500 generations) this gen will focus only on mid-range and below.

Dr. Lisa Su ... asserting that next-gen Radeon RX 9000 GPUs will target the "highest volume portion" of the market; referring to the mid-range segment. AMD made it clear a couple of months back that it has no plans to compete with Nvidia in the high-end space with RDNA 4. Instead, the goal of this generation is to penetrate the budget market, mirroring a strategy similar to RDNA 1.

https://www.tomshardware.com/pc-com...-in-early-march-promises-4k-mainstream-gaming
 
But it's not meant to be a flagship-tier GPU. AMD said this gen will focus only on mid-range and below.
No, I get that. But when going from one generation to the next, a comparison to the top dog would seem more appropriate than to a model that is already EOL.
 
>Seems a comparison to the current AMD flagship GPU (RX 7900 XTX) would have been more appropriate given a new generation of product.

Comparison to 7900GRE looks better. Perception (read: marketing) matters.

Why not compare to the 7800XT, then? Answer: 7900 looks better than 7800.

D. Owen gives a decent first-pass take on the Videocardz leak: https://youtube.com/watch?v=h0xAqkzQ53k

BTW, go read the Videocardz piece to see the per-game breakdown results. I normally just head there first anyway.


>The Launch MSRP of the 7900GRE, for comparison, was $550. If AMD does indeed launch the 9070XT at that price, that's a no-arguments winner this generation.

Then why not $300? Surely that's an even bigger winner?

I think folks are getting their hopes way too high and setting themselves up for disappointment. If $750 was deemed by all and sundry to be a "good" price for the 5070Ti, and the 9070XT can match or slightly exceed it, then asking for a 33% discount--50% off given the Ti's going street price--falls into the wishful-thinking category.

Second, there won't be a reference card, so it's all up to AIBs to set pricing. "Winning market share" (for AMD) isn't on AIBs' menu. Given the current GPU crunch, their most likely route is the same as what they're doing for Nvidia cards. Have one "base" SKU with small allocation, and the rest being OC models with markups.

Third, indications are that a tariff surcharge is already in effect for the 5070Ti (plus an extra 10% to take advantage of the GPU crunch, so +20% altogether). So, whatever MSRP AMD puts out, it's reasonable to expect similar price inflation for RDNA4.

All that said, let's play the "guess the MSRP" game.

Going by the TPU numbers (in Owen's video) plus some guesswork, the 9070XT should slightly outperform the 5070Ti. The 9070, given its closer positioning to the 9070XT than the 5070's to the 5070Ti, should substantially outperform the 5070. Also factor in the extreme shortage of midrange GPUs at the moment.

Then, my SWAG is that 9070 will be priced on par with 5070 at $550 MSRP; 9070XT will be $650 to $700. These would give AMD substantial $/perf advantage. Given the current environment--with no Nvidia at MSRP--this should net sales wins for AMD.

Numbers above are for MSRP. As said, I expect MSRP models to be instantly OOS, and most available AIB models (in the US) to be at least 10% higher to account for tariff, and more likely closer to 20% as with 5070Ti.
 
The launch MSRP of the 7900GRE, for comparison, was $550. If AMD does indeed launch the 9070XT at that price, that's a no-arguments winner this generation.
How about joining us in reality? The launch price is irrelevant because the market will determine the price. AMD could launch these at $150 and almost no one would pay that price, because scalpers will clean house on anything perceived as a great value. The only chance gamers have of landing a good-value card is getting lucky, or buying at the end of a product's life when it is getting replaced. In the US, the 9800X3D is still selling for about $100 over MSRP.
 
I just bought a 7900xtx in late December. $800 was a great deal and it’s a great card, hopefully I don’t have buyer’s remorse if these are close for a lot less.
You're obviously going to have 50% more memory and close to that amount of additional memory bandwidth. Quite likely more raster performance, as well. As for upscaling and RT, those might be two areas where you're at a slight deficit.

Now, don't forget that you also got to use yours for > 2 months. That's worth something, as well. We all know that if you're willing to wait long enough, something better will always come along. That time is worth something.

I'd say $800 on that card is a great deal, especially when considering inflation. Plus, it was a known quantity, without the risk of launch SNAFUs, scarcity, and price gouging, like what has plagued Nvidia's launch. I wouldn't feel bad about your decision, if I were you.
 
Going by the TPU numbers (in Owen's video) plus some guesswork, 9070XT should slightly outperform 5070Ti.
Haven't we already seen leaks where it struggled against a RTX 4070 Ti? Let's try to be realistic, here. For one thing, the 5070 Ti will probably have 40% more memory bandwidth. I'd put the XT as being competitive with a (non-Ti) RTX 5070, maybe a little faster.
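For what it's worth, the ~40% figure is easy to sanity-check from the rumored memory specs: peak bandwidth is just bus width times per-pin data rate. A quick sketch; the bus widths and data rates below are assumptions based on pre-launch rumors, not confirmed specs:

```python
# Peak memory bandwidth = (bus width in bits / 8 bytes) * per-pin rate in Gbps.
# Specs below are rumored/assumed, not confirmed.

def bandwidth_gbps(bus_bits: int, pin_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * pin_rate_gbps

rtx_5070_ti = bandwidth_gbps(256, 28.0)  # GDDR7 @ 28 Gbps (assumed)
rx_9070_xt = bandwidth_gbps(256, 20.0)   # GDDR6 @ 20 Gbps (assumed)

print(f"5070 Ti: {rtx_5070_ti:.0f} GB/s")  # 896 GB/s
print(f"9070 XT: {rx_9070_xt:.0f} GB/s")   # 640 GB/s
print(f"Delta: +{(rtx_5070_ti / rx_9070_xt - 1) * 100:.0f}%")  # +40%
```

Of course, bandwidth alone doesn't decide gaming performance; cache hierarchy and compression matter too, so third-party benchmarks will settle it.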
 
The only chance gamers have of landing a good value card is getting lucky, of buying at the end of a product's life when it is getting replaced.
There were definitely points over the past couple of years when you could get an RTX 4000-series card, including even the 4090, for near MSRP.

In the US, the 9800X3D is still selling for about $100 over MSRP.
Bad example. Arrow Lake's poor gaming performance created an unprecedented surge in demand for that model. AMD wasn't ready for it.
 
If they're keen* on comparing it to the 7900GRE, then does that mean it'll be hovering around the same MSRP? Would that be a reasonable assumption?

The Launch MSRP of the 7900GRE, for comparison, was $550. If AMD does indeed launch the 9070XT at that price, that's a no-arguments winner this generation.

Come on AMD, you said you wanted market share. This is it.

Regards.
Agree 100%. AMD needs to take some of the money they're raking in on the CPU side and invest it in changing the marketplace where GPUs are concerned. Make a reasonable profit on the new parts by charging reasonable prices and gaining market share.

We've seen what NV's done with basically a monopoly on the top end, asking prices for their top card that I couldn't spend on an entire system. That's fine; obviously there's a niche group who will always pay what they ask. But for the rest of us who just want to play modern games at decent framerates without taking out a second mortgage on our homes, we need an alternative.
 
I'm tempering my expectations here due to the current state of the GPU market. However, as long as this thing comes with all the parts included, doesn't catch fire, produces a constant image on the screen, and can interface with the motherboard at the correct PCIe speed, then we may have a winner. Scalping will likely be a thing, but waiting a bit can alleviate that particular issue.
 
>Haven't we already seen leaks where it struggled against a RTX 4070 Ti?

All perf "leaks" have been squishy, especially those with unknown benchmark settings. I deem this particular leak as relatively more credible, albeit coming from AMD and thus with some inevitable cherrypicking.

Uncertainty aside, it wouldn't make sense for AMD to explicitly position 9070 series against Nvidia's x070 series and have them be worse off, especially against 4070Ti which is a known quantity.

>Let's try to be realistic, here. For one thing, the 5070 Ti will probably have 40% more memory bandwidth.

That'll have to wait for 3rd-party benchmarks. I don't think perf wins matter as much as bang/buck for midrange, as pricing is anticipated to be pegged to perf. If 9070XT is better, it'll be priced a bit higher; if not, then a bit lower.

I believe those were synthetic benchmarks like Geekbench, i.e., non-gaming loads. Other leaks of actual gaming loads indicate RTX 4080 level or better, similar to this article.
 
You're obviously going to have 50% more memory and close to that amount of additional memory bandwidth. Quite likely more raster performance, as well.

The XTX was one of those cards I didn't need (I already had the 6800XT), but I'd gotten some cash for Christmas and my local Microcenter had it for $800. I think I got one of the last ones they had the day after Christmas. I looked at the 7900XT they had for $620, which was also a great deal, but I'm glad I went for the XTX. I figure its ray tracing is at about 3080 or 4070 level. It should last a while, especially if AMD brings FSR 4 to the 7000-series cards, which wouldn't be a bad idea as goodwill for folks already invested in their cards.
 
AMD shares have been in an extended slump because of a poor outlook in the AI market. It's not rolling in dough.

https://site.financialmodelingprep....ed-as-ai-gpu-challenges-persist-shares-down-2

http://goo.gl/search?amd+stock
AMD's financials are fine.

[Chart: AMD quarterly revenue by segment]

Source: https://www.tomshardware.com/pc-com...-the-datacenter-for-the-first-time-in-q4-2024

The problem affecting investment in gaming GPUs is that their gaming products are doing poorly.

[Chart: AMD gaming segment revenue]


Without a solid potential for return-on-investment, I think it's hard for them to invest the necessary resources to truly leapfrog Nvidia. That said, they can't ignore the sector, because they need to offer solid iGPUs for laptops and next gen consoles.

Something to look forward to: with them re-merging gaming and datacenter GPUs in the upcoming UDNA architecture, hopefully that will bring more total resources and efficiency to their GPU efforts. That's still at least a couple years off, however.