News AMD estimates of Radeon RX 9070 XT performance leaked: 42% – 66% faster than Radeon RX 7900 GRE

I think the 9070 XT is going to retail for around $1K for the first few weeks. There's no reason not to ask for a high price. If the cards don't move, just lower the price until they do.
 
Umm... AMD is not replacing the 7900 XTX this generation, and they have outright said they are not making high-end GPUs this generation.

https://www.tomshardware.com/pc-com...ck-hyunh-talks-new-strategy-for-gaming-market

https://www.pcmag.com/opinions/ditc...-cards-could-give-amd-an-edge-in-the-gpu-wars

https://www.pcgamesn.com/amd/new-radeon-8000-series-strategy

The 9070s are aimed at competing with nVidia's xx70 and xx70 Ti models.
Ummm.... I'm very well aware of that
That doesn't mean they can surprise-drop an XTX, nor does it mean it can't be an upgrade to their own line without competing against Nvidia.

Besides layering your own interpretation onto what I said, it's completely off topic from my point about comparing the XT's performance to last year's GRE. Thanks for playing along, but here's your quarter back.
 
Ah. I could've sworn there was some mention. But, can't recall for certain, and can't find it now.
The way we'll know is when there's a die shot and someone counts the number of compute units. If there's not much spare capacity, then probably no 9080 XT will be forthcoming. We know the RDNA4 generation will consist of only the Navi 44 and 48 dies; Navi 41 was cancelled some time ago.

In any case, I don't think a new gen's mid-range outperforming the previous gen's halo card was the typical experience. At least, not that I can recall offhand.
Pascal was a good generational uplift, with the GTX 1070 basically equaling the GTX 980 Ti. I'm not sure what Ampere model matched the RTX 2080 Ti or what Ada matched the RTX 3090.
 
Ummm.... I'm very well aware of that
That doesn't mean they can surprise-drop an XTX, nor does it mean it can't be an upgrade to their own line without competing against Nvidia.
It takes months of validation and bug testing to get a new die out the door. Assuming AMD were not lying and did not have a 9080 XT in the works at the same time as the other 9000 series chips, the earliest we would see something like this if they changed their mind would be 9-12 months, imo.
 
Ummm.... I'm very well aware of that
That doesn't mean they can surprise-drop an XTX, nor does it mean it can't be an upgrade to their own line without competing against Nvidia.

Besides layering your own interpretation onto what I said, it's completely off topic from my point about comparing the XT's performance to last year's GRE. Thanks for playing along, but here's your quarter back.

Riiight...

Why compare the new XT, which is apparently replacing the XTX?
 
If you took a moment to stop worshipping at the altar of AMD, you would see that CPU prices are so high because AMD is the one that drove them up there, not Intel.
Oh, come on. And you don't have a big, neon-covered Intel+nVidia altar next to your bed? Please. Check my posting history and see if I have an AMD altar and if you don't have an Intel+nVidia one. I'm 150% sure I'm way more neutral than you are, any time of the day, week, month, year.

Also, bait and switch? Who is talking about CPU prices here? But hey, I'll bite: yes, AMD will abuse the market if we allow them to. As for the 9800X3D, I'd like to say it was pure market demand, as AMD has not increased the MSRP and the CPUs are being sold, as I said, close to or at MSRP in the rest of the world now. The MSRP was set before tariffs, so how do you reconcile that with the modified pricing from the supply chain (distributors)? AMD is no saint and I'm not treating them as such, so I find it strange you brought up the "but look; 9800X3D expensive! hurr durr".

If the 9070 XT (going back to the topic) is above $650 MSRP, then it's dead on arrival. We all know nVidia can get the 5070 Ti to its $750 MSRP any time it wants. Tariff shenanigans notwithstanding, I doubt nVidia wants to miss out on the increased pricing of their GPUs and give AMD a chance.

Regards.
 
Riiight...
In terms of rank relative to their own lineup. Since there is no XTX, the XT is essentially replacing the XTX as the top of the line. Who knows if the XTX will be back anytime in the near future, officially... Everyone is so literal these days. Maybe I'm just not relaying what I'm thinking in the most sensical (<- I know that's not a word) manner.
I want that quarter back, Sir/Madam lol

Moving along


Now that 5080s are apparently missing ROP units, and Nvidia is downplaying the situation by saying it's only 1% of cards: that's still 1 in 100 cards, and that's not even counting the free fire risk you get. That's a high number.

Really want to see AMD take advantage of the situation. Some '90s smack talk or something, just do anything with the cards you have.

I don't hate Nvidia's hardware, but I have come to dislike the company in recent years.

Intel has botched their GPUs. Maybe Qualcomm, despite mainly doing SoCs, could enter the arena.
 
Oh, come on. And you don't have a big, neon-covered Intel+nVidia altar next to your bed? Please. Check my posting history and see if I have an AMD altar and if you don't have an Intel+nVidia one. I'm 150% sure I'm way more neutral than you are, any time of the day, week, month, year.
Every fanboy thinks they are neutral. 100% of them are wrong.
Also, bait and switch? Who is talking about CPU prices here? But hey, I'll bite: yes, AMD will abuse the market if we allow them to. As for the 9800X3D, I'd like to say it was pure market demand, as AMD has not increased the MSRP and the CPUs are being sold, as I said, close to or at MSRP in the rest of the world now. The MSRP was set before tariffs, so how do you reconcile that with the modified pricing from the supply chain (distributors)? AMD is no saint and I'm not treating them as such, so I find it strange you brought up the "but look; 9800X3D expensive! hurr durr".
AMD stinks at GPUs, so you can't use that as an example of AMD "conditioning" customers to accept terrible pricing. 9800X3D pricing has nothing to do with politics. The pricing is where it is because availability has been terrible since launch. If you don't know what you're talking about, don't just make things up. Currently, none of the major retailers have stock; it's all third-party sellers, who love jacking up the price. Which goes back to the original point you have been ignoring: it doesn't matter who makes the product; if it is desirable, it will be scalped.

Look at AMD's x600X pricing. The 6-core 2600X was $229. The 6-core 3600X was $249. The 6-core 5600X was $299. Node shrinks made the 5600X 1/3 the size of the 3600X, yet the price increased 37%. What is going on there? By the time the 9600X launched, the 7600X was selling for under $200, because no one wants a 6-core CPU for $300. Worst case scenario, people were predicting a $250 launch price for the 9600X. Nope, it launched at $280, still $50 more than the 2600X. AMD is trying to condition their customers to accept that a 6-core should sell for over $250. Even during Intel's quad-core stagnation days, they at least kept prices relatively flat. The 2600K launched at $316. Six years later, they launched the 7700K at $330. A $14 increase in 5 generations. AMD increased prices more than that in 2 consecutive generations. Given the opportunity, AMD will crap on customers just as hard as Intel or Nvidia. You leaving AMD out of your original statement taking shots at Nvidia and Intel is evidence of your lack of neutrality.
 
Look at AMD's x600X pricing. The 6-core 2600X was $229. The 6-core 3600X was $249. The 6-core 5600X was $299. Node shrinks made the 5600X 1/3 the size of the 3600X, yet the price increased 37%. What is going on there? By the time the 9600X launched, the 7600X was selling for under $200, because no one wants a 6-core CPU for $300. Worst case scenario, people were predicting a $250 launch price for the 9600X. Nope, it launched at $280, still $50 more than the 2600X. AMD is trying to condition their customers to accept that a 6-core should sell for over $250. Even during Intel's quad-core stagnation days, they at least kept prices relatively flat. The 2600K launched at $316. Six years later, they launched the 7700K at $330. A $14 increase in 5 generations. AMD increased prices more than that in 2 consecutive generations. Given the opportunity, AMD will crap on customers just as hard as Intel or Nvidia. You leaving AMD out of your original statement taking shots at Nvidia and Intel is evidence of your lack of neutrality.
That is comparing two very different times. And for AMD, two different fabs. GF was still making the chips for the 2000 series. When they moved to TSMC they had to buy their way in, and then everyone was using TSMC aside from Intel and Samsung for the Nvidia 30 series. TSMC was able to start increasing wafer prices because they were doing new advanced nodes constantly, and they could, and still do.

On the Intel timeline, everyone was keeping prices flat, even Nvidia at the time. Intel could have raised prices in the face of no competition and having in house fabs, but they didn't, and pumped out high volume instead. AMD was struggling to sell any Bulldozer/Piledriver etc products so their CPUs were often deeply discounted after launch.

Coupled with inflation, PC part prices are weirdly about what they were before we had that amazing run of static pricing. Call that the golden age of PC gaming, because I don't think it is coming back.

i5-750: $196 (2009) -> $287 (2025, inflation-adjusted)
14600K: $319 launch price
245K: also $319
Basically, CPU prices have gone up roughly 10% overall, class for class. I understand buying power for the individual is worse, but that is the world we live in.
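
If you want to sanity-check that figure, here's a minimal Python sketch using only the numbers above (the $287 figure is this post's own inflation adjustment, not official CPI data, and the variable names are mine):

```python
# Sanity check of the "class for class" comparison, using the thread's own numbers.
i5_750_2009 = 196      # i5-750 launch price, 2009 USD
i5_750_2025 = 287      # the same price expressed in 2025 USD (per the post above)
core_245k = 319        # Core Ultra 245K launch price (the 14600K was also $319)

inflation_factor = i5_750_2025 / i5_750_2009   # ~1.46x cumulative, 2009 -> 2025
real_increase = core_245k / i5_750_2025 - 1    # class-for-class change in real terms

print(f"Cumulative inflation factor: {inflation_factor:.2f}x")
print(f"Real class-for-class increase: {real_increase:.1%}")   # ~11%, i.e. roughly 10%
```

It comes out to about 11% in inflation-adjusted terms, so "roughly 10%" holds up.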
 
On the Intel timeline, everyone was keeping prices flat, even Nvidia at the time. Intel could have raised prices in the face of no competition and having in house fabs, but they didn't, and pumped out high volume instead. AMD was struggling to sell any Bulldozer/Piledriver etc products so their CPUs were often deeply discounted after launch.
One thing to note about Intel is how their die sizes were generally decreasing during most generations of the quad-core era. They did burn increasing amounts of real estate on enlarging their iGPUs, but there was an overall downward trend. I'm sure that helped them keep prices fairly flat.

Then, once core count inflation got underway, we saw prices increase markedly. Now that core counts have plateaued, there's been some gradual deflation of prices (probably like what would've happened to Intel, had they been under competitive pressure before Zen). I think it's worth noting that Zen 5 CCDs are pretty similar in size to those of Zen 2.

Basically, CPU prices have gone up roughly 10% overall, class for class. I understand buying power for the individual is worse, but that is the world we live in.
GPUs are the real budget killer, because they depend on large dies and lots of transistors for high performance.
 
Buying or purchasing power in the US is up since 2019, at least. Here is my source for that, and then the latest update as of December 19, 2024.
Since 2019, sure, but the comparison I put out was from 2009. In 2019, pre-pandemic, the numbers were less good in terms of wages, but the market reflected that without many price increases.

The problem with median earnings is that the median is quite skewed to start with, sitting at around $70K right now. And that is household earnings, not individual. It's a little too variable region to region and person to person.

That data also just missed out on the current high interest rates, still relatively high home prices, and the current food-price issues. Fuel has been relatively stable, though a bit higher than normal.
 
Then, once core count inflation got underway, we saw prices increase markedly. Now that core counts have plateaued, there's been some gradual deflation of prices (probably like what would've happened to Intel, had they been under competitive pressure before Zen). I think it's worth noting that Zen 5 CCDs are pretty similar in size to those of Zen 2.
I know dealing with rumours and witch whispers is not everyone's cup of tea, but it seems like AMD will move to 12-core CCDs, so the top end in Zen 6 may be a 24c/48t chiplet SoC (edit: in AM5, that is).

AMD has been keeping core counts rather stagnant, but I wonder if that's because of Threadripper, or if they just wanted to wait for more bandwidth in DDR5 to be accessible in their designs? I'm guessing their new IMC should be able to offset the per-core bandwidth and latency to a somewhat decent degree compared to TR's quad- and octo-channel configs.

Still, Intel's approach to big.LITTLE seems to have backfired a bit with their implementation, and AMD's "dense" approach has reached an impasse? Perhaps? In any case, the next generation from both Intel and AMD will be rather interesting when talking core counts.

Regards.
 
I know dealing with rumours and witch whispers is not everyone's cup of tea, but it seems like AMD will move to 12-core CCDs, so the top end in Zen 6 may be a 24c/48t chiplet SoC (edit: in AM5, that is).
This could be anticipating Intel's rumored jump to 32 E-cores, or maybe just trying to cope better with core count inflation in their server CPUs.

Let's not forget that CCDs are primarily about servers. So, it perhaps anticipates a 192-core Zen 6 CPU. Zen 5C achieves that watermark using 16-core CCDs (192 ÷ 16 = 12 chiplets). With full Zen 6 cores, I think using 8-core CCDs won't scale well enough, since you'd need 24 of them.

AMD has been keeping core counts rather stagnant, but I wonder if that's because of Threadripper, or if they just wanted to wait for more bandwidth in DDR5 to be accessible in their designs?
I read an interview with one of their execs early last year (i.e., prior to the Ryzen 9000 launch), who said they were holding back on desktop core counts due to further cores being too bandwidth-starved. Perhaps CUDIMMs finally get us to a point where adding more cores makes sense.
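
For a rough sense of what "bandwidth-starved" means here, this is a back-of-envelope sketch (peak theoretical numbers, assuming bandwidth divides evenly across cores; the memory speeds are illustrative picks of mine, not official platform specs, and real-world throughput is lower):

```python
def peak_bandwidth_gbs(transfer_rate_mts: int, channels: int) -> float:
    """Peak theoretical DRAM bandwidth in GB/s: MT/s x 8 bytes per 64-bit channel."""
    return transfer_rate_mts * 8 * channels / 1000

# Illustrative configurations only.
configs = {
    "dual-channel DDR5-6000 (desktop)": (6000, 2),
    "quad-channel DDR5-5200 (Threadripper)": (5200, 4),
    "octo-channel DDR5-5200 (Threadripper Pro)": (5200, 8),
}

for name, (speed, channels) in configs.items():
    total = peak_bandwidth_gbs(speed, channels)
    per_core = total / 16   # e.g., a 16-core part
    print(f"{name}: {total:.0f} GB/s total, {per_core:.1f} GB/s per core at 16 cores")
```

At 16 cores, the desktop setup works out to about 6 GB/s per core versus roughly 21 GB/s on the octo-channel config, which is more or less the gap that exec comment is pointing at.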

I'm reminded of the massive MT boost Zen 4 achieved (~43% average) - far more than the single-threaded improvement. I'm sure moving to DDR5 had a lot to do with it.
 
Basically, CPU prices have gone up roughly 10% overall, class for class. I understand buying power for the individual is worse, but that is the world we live in.

Inflation is one of those things that people just don't realize until it's been compounded over years.

https://www.calculator.net/inflatio...1&coutyear1=2025&calctype=1&x=Calculate#uscpi

$300 in January 2004 is now $514 in January 2025. Vendors are often hesitant to raise prices year over year, as they have become "anchored" at some value. Instead, they wait for some sort of market-affecting event, then use that as an opportunity to rebaseline to expected values. This creates the sudden "sticker shock" when a price jumps a large amount. Then we get something dumb like the 5090, essentially a datacenter GPU with its ridiculous size, causing another price rebaseline for those with more money than sense.
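
To make the compounding explicit, here's a minimal sketch that reproduces the calculator's result; the ~2.6% average annual rate is backed out from those two endpoints, not an official statistic:

```python
# $300 in January 2004 -> ~$514 in January 2025, i.e. 21 years of compounding.
start, end, years = 300.0, 514.0, 21

# Average annual inflation rate implied by the two endpoints.
annual_rate = (end / start) ** (1 / years) - 1
print(f"Implied average annual inflation: {annual_rate:.2%}")   # ~2.6%

# No single year looks dramatic, but the total creeps up ~71%.
value = start
for year in range(2004, 2004 + years + 1):
    print(year, f"${value:,.2f}")
    value *= 1 + annual_rate
```

No individual ~2.6% step feels like much, which is exactly why the eventual rebaseline reads as sticker shock.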
 
Pascal was a good generational uplift, with the GTX 1070 basically equaling the GTX 980 Ti. I'm not sure what Ampere model matched the RTX 2080 Ti or what Ada matched the RTX 3090.

That may be the case, and maybe *70 is now considered mid-range because Nvidia (mostly) ditched *50 once the RTX cards came out. But in the Pascal era, I'd argue that the 1060 6GB was the mid-range card. It's weird, as *60 is now entry-level, I guess (or Nvidia hopes?). So that might put the *70 cards, at least post-Pascal, as "mid-level" just because there's (mostly) only *60, *70, and *80. Oh, and post-Turing, the *90 series, which is, I guess, Titan turned into a regular line.

The re-working and (intermittent?) loss of the budget level of cards makes the comparison a bit messier.

The more I think about this, the more I wonder what constitutes a "mid-range" card anymore.
 
That may be the case, and maybe *70 is now considered mid-range because Nvidia (mostly) ditched *50 once the RTX cards came out.
Huh? There is the RTX 3050.

I suspect they intended for the RTX 4060 to be sold as an RTX 4050, but then GPU prices collapsed and they realized how ridiculous it'd look if they tried that at the same prices they asked for the RTX 4060. This also explains why the generational uplift was so low at the bottom end of the Ada range. It would've been much stronger if you just knocked each of those cards down one tier.
 
Huh? There is the RTX 3050.

I suspect they intended for the RTX 4060 to be sold as an RTX 4050, but then GPU prices collapsed and they realized how ridiculous it'd look if they tried that at the same prices they asked for the RTX 4060. This also explains why the generational uplift was so low at the bottom end of the Ada range. It would've been much stronger if you just knocked each of those cards down one tier.
Hence the "(mostly)" I put in. And, maybe I'm reading too much into it, but given its poor value, I have this feeling that the 3050 was sort of a begrudging offer from Nvidia. They didn't even try to compete on price/performance against the RX 6600.

And, yeah, maybe just a feeling, but I agree - I think they wanted the *50 as a $300 card to be the new normal.
 
Since 2019, sure, but the comparison I put out was from 2009. In 2019, pre-pandemic, the numbers were less good in terms of wages, but the market reflected that without many price increases.

The problem with median earnings is that the median is quite skewed to start with, sitting at around $70K right now. And that is household earnings, not individual. It's a little too variable region to region and person to person.

That data also just missed out on the current high interest rates, still relatively high home prices, and the current food-price issues. Fuel has been relatively stable, though a bit higher than normal.
So, just to be clear: you are saying that in 2009, one year after the near-collapse of the global banking system that led us into the worst recession since the Great Depression, you/we had more buying power on average than in 2019? I cannot cite anything at this moment, but at first glance that seems suspect.
 
That is comparing two very different times. And for AMD, two different fabs. GF was still making the chips for the 2000 series. When they moved to TSMC they had to buy their way in, and then everyone was using TSMC aside from Intel and Samsung for the Nvidia 30 series. TSMC was able to start increasing wafer prices because they were doing new advanced nodes constantly, and they could, and still do.

Zen 2 and 3, which saw the largest price increase per core, were both TSMC.

On the Intel timeline, everyone was keeping prices flat, even Nvidia at the time. Intel could have raised prices in the face of no competition and having in house fabs, but they didn't, and pumped out high volume instead. AMD was struggling to sell any Bulldozer/Piledriver etc products so their CPUs were often deeply discounted after launch.
Single GPU halo card from Nvidia.

2010 - GTX 480 - 529mm2 - $499
2016 - Titan X (Pascal) - 471mm2 - $1199

That's flat? Are you trying to be funny?

How about we try arguing with facts instead of reinventing history in the endless effort to defend AMD at all costs?
 
Hence the "(mostly)" I put in. And, maybe I'm reading too much into it, but given its poor value, I have this feeling that the 3050 was sort of a begrudging offer from Nvidia. They didn't even try to compete on price/performance against the RX 6600.

And, yeah, maybe just a feeling, but I agree - I think they wanted the *50 as a $300 card to be the new normal.
The 3050 launched at the height of the Ethereum crypto bubble. The MSRP was completely irrelevant. From THG's review:

Nvidia's RTX 3050 delivers good performance for its theoretical $249 starting price. Unfortunately, that also means there's not a snowball's chance in hell that it won't sell for radically inflated prices. It lands between the previous-gen RTX 2060 and GTX 1660 Super, both of which currently sell for far more than Nvidia's asking price.

That said, the 3050 wasn't competing with the 6600. It launched at $249, which was less than the $279 that the 6500 XT launched at, and the 3050 trashed that card. It's up 50% in these benchmarks and MSRP'd for less. That's not trying to compete on price/performance?

(benchmark chart from Tom's Hardware's RTX 3050 review)
 
Zen 2 and 3, which saw the largest price increase per core, were both TSMC.


Single GPU halo card from Nvidia.

2010 - GTX 480 - 529mm2 - $499
2016 - Titan X (Pascal) - 471mm2 - $1199

That's flat? Are you trying to be funny?

How about we try arguing with facts instead of reinventing history in the endless effort to defend AMD at all costs?

Yeah, not a direct comparison, other than fastest card available. Granted, the launch price of the GTX 1080 was a little high at $699, but the follow-up 1080 Ti, using the same die as the Titan X, was also $699 a year later.

Oh, and Ryzen 2000 was Zen+, Zen 2 was 3000 series.
 
The 3050 launched at the height of the Ethereum crypto bubble. The MSRP was completely irrelevant. From THG's review:



That said, the 3050 wasn't competing with the 6600. It launched at $249, which was less than the $279 that the 6500 XT launched at, and the 3050 trashed that card. It's up 50% in these benchmarks and MSRP'd for less. That's not trying to compete on price/performance?

(benchmark chart from Tom's Hardware's RTX 3050 review)
So, your defense is to use the crypto-craze pricing and MSRP, neither of which I was referencing?

No.

It's that Nvidia KEPT the pricing elevated. Not crypto-elevated, obviously. And I wasn't arguing 3050 vs. 6500 XT; I was talking about 3050 vs. the 6600 non-XT, because the 3050's price remained above the 6600's despite performing below it.

And that's not even getting into the insult to the customers known as the 3050 6GB.
 