> Umm... AMD is not replacing the 7900 XTX model this generation, and they have outright said they are not making high-end GPUs this generation.
> https://www.tomshardware.com/pc-com...ck-hyunh-talks-new-strategy-for-gaming-market
> https://www.pcmag.com/opinions/ditc...-cards-could-give-amd-an-edge-in-the-gpu-wars
> https://www.pcgamesn.com/amd/new-radeon-8000-series-strategy
> The 9070s are aimed at competing with Nvidia's xx70 and xx70 Ti models.

Ummm... I'm very well aware of that.
> Ah. I could've sworn there was some mention. But, can't recall for certain, and can't find it now.

The way we'll know is when there's a die shot and someone counts the number of compute units. If there's not much spare capacity, then probably no 9080 XT will be forthcoming. We know the RDNA4 generation will consist of only the Navi 44 and 48 dies.
Pascal was a good generational uplift, with the GTX 1070 basically equaling the GTX 980 Ti. I'm not sure what Ampere model matched the RTX 2080 Ti or what Ada matched the RTX 3090.

In any case, I don't think mid-range of a new gen outperforming the halo of the previous gen was the typical experience. At least, not that I can recall offhand.
> Ummm... I'm very well aware of that. That doesn't mean they can surprise drop an XTX, nor does it mean it can't be an upgrade to their own line without competing against Nvidia.

It takes months of validation and bug testing to get a new die out the door. Assuming AMD were not lying and did not have a 9080 XT in the works at the same time as the other 9000-series chips, the earliest we would see something like this, if they changed their mind, would be 9-12 months, imo.
> Pascal was a good generational uplift, with the GTX 1070 basically equaling the GTX 980 Ti. I'm not sure what Ampere model matched the RTX 2080 Ti or what Ada matched the RTX 3090.

2070S ~= 1080 Ti
Besides being your own interpretation of what I said, it's completely off topic to my point about comparing the XT's performance to last year's GRE. Thanks for playing along, but here's your quarter back.
Why compare the new XT, which is apparently replacing the XTX?
> If you took a moment to stop worshipping at the altar of AMD, you would see that CPU prices are so high because AMD is the one that drove them up there, not Intel.

Oh, come on. And you don't have a big Intel+nVidia altar, full of neon, next to your bed? Please. Check my posting history and see if I have an AMD altar and if you don't have an Intel+nVidia one. I'm 150% sure I'm way more neutral than you are, any time of the day, week, month, year.
> Riiight...

In terms of rank relative to their own lineup. Since there is no XTX, the XT is essentially replacing the XTX as the top of the line. Who knows if the XTX will be back anytime in the near future officially... Everyone is so literal these days. Maybe I'm just not relaying what I'm thinking in the most sensical (<- I know that's not a word) manner.
> Oh, come on. And you don't have a big Intel+nVidia altar, full of neon, next to your bed? Please. Check my posting history and see if I have an AMD altar and if you don't have an Intel+nVidia one. I'm 150% sure I'm way more neutral than you are, any time of the day, week, month, year.

Every fanboy thinks they are neutral. 100% of them are wrong.
> Also, bait and switch? Who is talking about CPU prices here? But hey, I'll bite: yes, AMD will abuse the market if we allow them to. The thing with the 9800X3D, I'd like to say it was pure market demand, as AMD has not increased the MSRP and the CPUs are being sold, as I said, close to or at MSRP in the rest of the world now. The MSRP was set before tariffs, so how do you reconcile that with the modified pricing from the supply chain (distributors)? AMD is no saint and I'm not treating them as such, so I find it strange you brought up the "but look; 9800X3D expensive! hurr durr".

AMD stinks at GPUs, so you can't use that as an example of AMD "conditioning" customers to accept terrible pricing. 9800X3D pricing has nothing to do with politics. The pricing is where it is because availability has been terrible since launch. If you don't know what you're talking about, don't just make things up. Currently, none of the major retailers have stock; it's all third-party stock, who love jacking up the price. Which goes back to the original point you have been ignoring: it doesn't matter who makes the product; if it is desirable, it will be scalped.
> Look at AMD's x600X pricing. The 6-core 2600X was $229. The 6-core 3600X was $249. The 6-core 5600X was $299. Node shrinks made the 5600X a fraction of the 3600X's size, yet the price increased 20%. What is going on there? By the time the 9600X launched, the 7600X was selling for under $200, because no one wants a 6-core CPU for $300. Worst case scenario, people were predicting a $250 launch price for the 9600X. Nope, it launched at $280, still $50 more than the 2600X. AMD is trying to condition their customers to accept that a 6-core should sell for over $250. Even during Intel's quad-core stagnation days, they at least kept prices relatively flat. The 2600K launched at $316. Six years later, they launched the 7700K at $330. A $14 increase in 5 generations. AMD increased prices more than that in 2 consecutive generations. Given the opportunity, AMD will crap on customers just as hard as Intel or Nvidia. You leaving AMD out of your original statement taking shots at Nvidia and Intel is evidence of your lack of neutrality.

That is comparing two very different times. And for AMD, two different fabs. GF was still making the chips for the 2000 series. When they moved to TSMC they had to buy their way in, and then everyone was using TSMC aside from Intel and Samsung for the Nvidia 30 series. TSMC was able to start increasing wafer prices because they were doing new advanced nodes constantly, and they could, and still do.
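The percentage claims above are easy to check. A quick sketch in Python, using only the launch MSRPs quoted in the thread:

```python
# Launch MSRPs as quoted in the thread (USD)
prices = {
    "2600X": 229, "3600X": 249, "5600X": 299,  # AMD 6-core parts
    "2600K": 316, "7700K": 330,                # Intel quad cores
}

def pct_increase(old: float, new: float) -> float:
    """Percentage increase from old price to new price."""
    return (new - old) / old * 100

amd_one_gen = pct_increase(prices["3600X"], prices["5600X"])  # one AMD generation
amd_two_gen = pct_increase(prices["2600X"], prices["5600X"])  # two AMD generations
intel_five  = pct_increase(prices["2600K"], prices["7700K"])  # five Intel generations

print(f"3600X -> 5600X: +{amd_one_gen:.1f}%")  # +20.1%
print(f"2600X -> 5600X: +{amd_two_gen:.1f}%")  # +30.6%
print(f"2600K -> 7700K: +{intel_five:.1f}%")   # +4.4%
```

So AMD's two-generation jump on the 6-core tier was roughly seven times Intel's five-generation increase, which is the core of the argument being quoted.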
> On the Intel timeline, everyone was keeping prices flat, even Nvidia at the time. Intel could have raised prices in the face of no competition and having in-house fabs, but they didn't, and pumped out high volume instead. AMD was struggling to sell any Bulldozer/Piledriver etc. products, so their CPUs were often deeply discounted after launch.

One thing to note about Intel is how their die sizes were generally decreasing during most generations of the quad-core era. They did burn increasing amounts of real estate on enlarging their iGPUs, but there was an overall downward trend. I'm sure that helped them keep prices fairly flat.
> Basically CPU prices have gone up 10% overall, class for class. I understand buying power for the individual is worse, but that is the world we live in.

GPUs are the real budget killer, because they depend on large dies and lots of transistors for high performance.
> i5-750 $196 (2009) -> $287 (2025)
> 14600K launch price $319
> 245K also $319
> Basically CPU prices have gone up 10% overall, class for class. I understand buying power for the individual is worse, but that is the world we live in.

Buying or purchasing power in the US is up since 2019 at least. Here is my source for that, and then the latest update as of December 19, 2024.
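The i5-750 line above is an inflation adjustment. A minimal sketch of that arithmetic, using approximate CPI-U annual averages (the index values are my assumption for illustration; the BLS publishes the exact series):

```python
# Inflation-adjusting a 2009 launch price into recent dollars.
# CPI-U annual averages are approximate, for illustration only.
CPI_2009 = 214.5   # ~2009 annual average
CPI_2024 = 313.7   # ~2024 annual average

def adjust(price: float, cpi_from: float, cpi_to: float) -> float:
    """Scale a price by the ratio of CPI index values."""
    return price * cpi_to / cpi_from

i5_750_now = adjust(196, CPI_2009, CPI_2024)
print(f"$196 in 2009 is about ${i5_750_now:.0f} today")  # about $287, matching the post
```

That lines up with the $287 figure quoted, and puts the 14600K/245K at $319 at roughly a 10% real increase over the i5-750's class.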
> Buying or purchasing power in the US is up since 2019 at least. Here is my source for that, and then the latest update as of December 19, 2024.

Since 2019, but the comparison I put out was from 2009. By 2019 pre-pandemic numbers, things were less good in terms of wages, but the market reflected that without much price increase.
> Then, once core count inflation got underway, we saw prices increase markedly. Now that core counts have plateaued, there's been some gradual deflation of prices (probably like what would've happened to Intel, had they been under competitive pressure before Zen). I think it's worth noting that Zen 5 CCDs are pretty similar in size to those of Zen 2.

I know dealing with rumours and witch whispers is not everyone's cup of tea, but it seems like AMD will move to 12-core CCDs, so the top end in Zen 6 may be a 24c/48t chiplet SoC (edit: in AM5, that is).
> I know dealing with rumours and witch whispers is not everyone's cup of tea, but it seems like AMD will move to 12-core CCDs, so the top end in Zen 6 may be a 24c/48t chiplet SoC (edit: in AM5, that is).

This could be anticipating Intel's rumored jump to 32 E-cores, or maybe just trying to cope better with core count inflation in their server CPUs.
> AMD has been keeping core counts rather stagnant, but I wonder if that's because of Threadripper, or they just wanted to wait for more bandwidth in DDR5 to be accessible in their designs?

I read an interview with one of their execs earlier last year (i.e. prior to the Ryzen 9000 launch), which said they were holding back on desktop core counts due to further cores being too bandwidth-starved. Perhaps CUDIMMs finally get us to a point where adding more cores makes sense.
> That may be the case, and maybe *70 is now considered mid-range because Nvidia (mostly) ditched *50 once the RTX cards came out.

Huh? There is the RTX 3050.
> Huh? There is the RTX 3050.

Hence the "(mostly)" I put in. And maybe I'm reading too much into it, but given its poor value, I have this feeling that the 3050 was sort of a begrudging offer from Nvidia. They didn't even try to compete on price/performance against the RX 6600.
I suspect they intended for the RTX 4060 to be sold as an RTX 4050, but then GPU prices collapsed and they realized how ridiculous it'd look if they tried that at the prices they were asking for the RTX 4060. This also explains why the generational uplift was so low at the bottom end of the Ada range. It would've been much stronger if you just knocked each of those cards down one tier.
> Since 2019, but the comparison I put out was from 2009. By 2019 pre-pandemic numbers, things were less good in terms of wages, but the market reflected that without much price increase.

So just to be clear: you are saying that in 2009, one year after the near collapse of the global banking system that led us into the worst recession since the Great Depression, you/we had more buying power on average than in 2019? I cannot cite anything at this moment, but at first glance that seems suspect.
The problem with median earnings is that the median is quite skewed to start with, sitting at around $70k right now. And that is household earnings, not individual. A little too variable region to region and person to person.

It also just missed out on the current high interest rates, the still relatively high home prices, and the current food price issues. Fuel has been relatively stable, though a bit higher than normal.
> Hence the "(mostly)" I put in. And maybe I'm reading too much into it, but given its poor value, I have this feeling that the 3050 was sort of a begrudging offer from Nvidia. They didn't even try to compete on price/performance against the RX 6600.

The 3050 launched at the height of the ethereum crypto bubble. The MSRP was completely irrelevant. From THG's review:

> Nvidia's RTX 3050 delivers good performance for its theoretical $249 starting price. Unfortunately, that also means there's not a snowball's chance in hell that it won't sell for radically inflated prices. It lands between the previous-gen RTX 2060 and GTX 1660 Super, both of which currently sell for far more than Nvidia's asking price.

And, yeah, maybe just a feeling, but I agree - I think they wanted *50 as a $300 card to be the new normal, it seems.
Zen 2 and 3, which saw the largest price increase per core, were both TSMC.
Single GPU halo card from Nvidia:
2010 - GTX 480 - 529mm2 - $499
2016 - Titan X (Pascal) - 471mm2 - $1199
That's flat? Are you trying to be funny?
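The gap in those two halo prices can be put in numbers. A sketch using the figures from the post (the ~10% cumulative 2010-2016 US inflation factor is my assumption, not from the thread):

```python
# Halo-card data from the post: (year, die size in mm^2, launch price in USD)
gtx_480  = (2010, 529, 499)
titan_xp = (2016, 471, 1199)

def price_per_mm2(card):
    """Launch price divided by die area."""
    _, area, price = card
    return price / area

nominal = titan_xp[2] / gtx_480[2]                          # raw price ratio
per_area = price_per_mm2(titan_xp) / price_per_mm2(gtx_480) # $/mm^2 ratio

# Cumulative US CPI inflation 2010 -> 2016 was roughly 10% (assumption)
real = titan_xp[2] / (gtx_480[2] * 1.10)

print(f"nominal price ratio:      {nominal:.2f}x")   # ~2.40x
print(f"price-per-mm^2 ratio:     {per_area:.2f}x")  # ~2.70x
print(f"inflation-adjusted ratio: {real:.2f}x")      # ~2.18x
```

Even after inflation, and with a smaller die, the halo price more than doubled over six years, which is the point being made against the "everyone was keeping prices flat" claim.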
How about we try arguing with facts instead of reinventing history in the endless effort to defend AMD at all costs?
> The 3050 launched at the height of the ethereum crypto bubble. The MSRP was completely irrelevant. From THG's review:

So, your defense is to use the crypto-craze pricing and MSRP, neither of which I was referencing?
That said, the 3050 wasn't competing with the 6600. It launched at $249, which was less than the $279 that the 6500XT launched at, and the 3050 trashed that card. It's up 50% in these benchmarks and MSRP'd for less. That's not trying to compete on price/performance?