News AMD Simulated RX 7800 XT Performs Similarly to RX 6800 XT

Prices of 8GB of memory have dropped considerably. It will cost ~$30 extra for an additional 8GB. It's likely less, as AMD buys in bulk.
  1. You should separate the parts cost from the final cost to consumers.
  2. The recent article about falling GDDR6 prices looked only at 8 Gbit dies, but what you'd want is 16 Gbit dies.
  3. I think the prices already were probably in quantities of 10k. You probably don't save much more by ordering even larger volumes.
  4. I think AMD isn't buying, but rather the board makers.
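As a rough sketch of the parts-cost arithmetic in points 1 and 2, using the ~$30 figure from the post above (the per-die price is just backed out from that number, not a supplier quote):

```python
# Back-of-envelope BOM cost for an extra 8 GB of GDDR6.
# All prices are illustrative assumptions derived from the ~$30 figure
# quoted in the thread, not actual supplier pricing.

GBIT_PER_DIE = 16                 # 16 Gbit dies, as point 2 recommends
GB_PER_DIE = GBIT_PER_DIE / 8     # = 2 GB per die

extra_gb = 8
extra_dies = int(extra_gb / GB_PER_DIE)   # 4 additional dies

price_per_die = 30 / extra_dies           # implied ~$7.50 per die
print(f"{extra_dies} extra dies at ~${price_per_die:.2f} each "
      f"-> ~${extra_dies * price_per_die:.0f} added parts cost")
```

Note this is the parts cost only; as point 1 says, the final cost to consumers also carries margin at each step of the chain.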
 
New cards cannot perform the same as prior generations. This isn't just GPUs. Every CPU generation gets faster. Same with SSDs.
GPUs are different, because their opportunities for better IPC are more limited. The main way you make a faster GPU is by adding more shaders, more cache, more clock speed, etc. If they're not scaling out (e.g. 70 CUs vs. 72 in the previous gen), in order to keep prices in check, then there go your performance gains.

Again, whether it's any good or not depends entirely on the price. If the price follows the old launch price, but adjusts for inflation, then I'd agree that we should expect a rather normal generational uplift. If it's priced below that, then you shouldn't regard it as a direct successor and just have to judge it on its own merits.

I've been saying this a lot, recently: we are not well-served by looking too hard at the names of these products. The only things you should be looking at are perf/$, perf/W, and memory capacity. Everything else is a distraction.
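The comparison recommended above can be sketched as a tiny script. The card names and all figures here are made-up placeholders, not benchmark results; the point is that the model number never enters the calculation:

```python
# Compare hypothetical cards purely on perf/$, perf/W, and VRAM,
# ignoring their marketing names entirely.
# All numbers below are illustrative placeholders, not real benchmarks.

cards = {
    "Card A": {"fps": 100, "price_usd": 500, "watts": 250, "vram_gb": 16},
    "Card B": {"fps": 110, "price_usd": 650, "watts": 300, "vram_gb": 12},
}

for name, c in cards.items():
    perf_per_dollar = c["fps"] / c["price_usd"]
    perf_per_watt = c["fps"] / c["watts"]
    print(f"{name}: {perf_per_dollar:.3f} fps/$, "
          f"{perf_per_watt:.3f} fps/W, {c['vram_gb']} GB VRAM")
```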
 
Moore's law would stand on the free market, in duopoly those 2 players can do whatever they want
Moore's law doesn't anticipate wafer pricing outpacing the density or performance gains of new nodes. Think about that for a minute.

Here's a summary of TSMC wafer pricing, for a range of nodes:

Node    Price per Wafer    Year Introduced
N3      $20,000            2022
N5      $16,000            2020
N7      $10,000            2018
N10     $6,000             2016
N28     $3,000             2014
40nm    $2,600             2008
90nm    $2,000             2004

Source: https://www.tomshardware.com/news/tsmc-will-charge-20000-per-3nm-wafer
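The point about wafer prices outpacing node gains can be illustrated with a quick calculation. The wafer prices come from the table above; the relative density figures are rough assumptions for the sake of the example:

```python
# If a new node's density gain is smaller than its wafer price increase,
# cost per transistor stops falling -- the economic half of Moore's Law.
# Wafer prices from the TSMC table above; density ratios are rough
# assumptions (normalized to N7 = 1.0), not official figures.

nodes = [
    # (node, wafer_price_usd, assumed_relative_density)
    ("N7", 10_000, 1.0),
    ("N5", 16_000, 1.8),
    ("N3", 20_000, 2.5),
]

base_price, base_density = 10_000, 1.0
for node, price, density in nodes:
    # Relative cost per transistor vs. N7
    rel_cost = (price / base_price) / (density / base_density)
    print(f"{node}: ~{rel_cost:.2f}x N7's cost per transistor")
```

Under these assumed densities, cost per transistor still falls from N7 to N3, but only by ~20%, while the wafer itself doubles in price. Historically each node roughly halved cost per transistor.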

thank god GPUs are not necessary for humans to survive on this planet.
They're required for a lot of jobs that we depend on.
 
I get your point, but that is too far off.
Oh, I disagree with you 100%. I'm going by (literally) 35 years of PC-building experience and watching the market evolve over that time. The current nomenclature standard seems to have started more or less by chance, when the GeForce 6 series and Radeon X1000 series cards happened to have part numbers that matched up with each other. Since that made it easier for customers to compare them, both companies just ran with it.
The 7900XTX is better than the 4080 and 7900XT is worse.
The RX 7900 XTX is, according to the TechPowerUp GPU database, a "whopping" 2% faster than the RTX 4080. That's a slimmer margin than the RTX 3080 has over the RX 6800 XT and something that I call "a tie". The only major tech advantage that the RX 7900 XTX has over the RTX 4080 is the extra 8GB of VRAM. While that's no small thing, it won't be relevant for a very long time, as 16GB will be more than enough for high-resolution gaming for at least the next 5 years (if not more). The other major advantage that the RX 7900 XTX has over the RTX 4080 is price, but price has no bearing whatsoever when it comes to the performance tier in which a card belongs.
Dropping them to 7800/7700 is too much. Definitely not 7900, that was a serious overreach especially for 7900XT. Maybe 7800XT and 7800 instead?
That doesn't fix the problem that the RX 7900 XTX is still a level-8 card. The proof of this really came out recently when we saw the unreleased Radeon numbers like the RX 7950 XT and 7950 XTX.

DannyzReviews talks about this as well (I have it cued up to the proper part so you don't have to search the video):

We also shouldn't forget the following:
"AMD executives, David Wang and Rick Bergman, in an interview with a Japanese media outlet, IT Media, have admitted choosing not to develop an RTX 4090 competitor because they don't want to sell $1600 GPUs to consumers."

and....

"Mr. Bergman: Technically, it is possible to develop a GPU with specs that compete with theirs (NVIDIA). However, a GPU developed that way would have come to market as a 'graphics card with a TDP (thermal design power) of 600W and a reference price of $1,600 (about 219,000 yen)'. After thinking about whether general PC gaming fans would accept that, we chose not to adopt such a strategy."

Whether you believe the words of business executives (aka pathological liars) or not, the card that they were referring to, the one that didn't get released, was the level-9 card, not the card that DID get released.
Ultimately Nvidia is in the driver's seat. They have such sticky buyers that the 7900XT could have been sold for $100 and Nvidia would still sell more 4080 cards.
I agree with you there. Maybe AMD has given up trying to win on price because so many people are ignorant enough that they just want to see a green box.
 
  1. You should separate the parts cost from the final cost to consumers.
  2. The recent article about falling GDDR6 prices looked only at 8 Gbit dies, but what you'd want is 16 Gbit dies.
  3. I think the prices already were probably in quantities of 10k. You probably don't save much more by ordering even larger volumes.
  4. I think AMD isn't buying, but rather the board makers.
I'm differencing the spot-market prices of the 8 Gbit and 16 Gbit dies. $30 is the difference for packages of 8. At one time on GDDR6 it was only a $22 difference. That was for the slower speed, but the market is so horrid right now that's understandable.

AMD and Nvidia both sell chipsets. That's the main GPU and the memory together. That way they get a bulk discount and partially pass those savings on to the AIBs.
 
I agree with you there. Maybe AMD has given up trying to win on price because so many people are ignorant enough that they just want to see a green box.
That's exactly what they are doing. They are pricing their products to be just a bit cheaper than their Nvidia competitors, maybe 10-20%. They will get the customers who are looking for the best value as well as the AMD fanbois (they are out there). Bigger discounts won't get them enough incremental sales to cover the margin lost to the discount across the SKU.

And here is the bad news. Nothing will change. Nvidia's AI boards are bringing in bank so they don't need big volumes on consumer GPUs. They can keep margins very high and still sell plenty. This is how boutique / designer products work. Ferrari isn't slashing prices to boost their sales numbers. They just turn the marketing up and keep margins high.
 
They're required for a lot of jobs that we depend on.
So what? Anyone who needs one will get one anyway, and anyone who doesn't won't buy one. PC gaming might almost disappear; who cares? It's not the end of the world if the average Joe can't afford a brand-new GPU or something... that was the point.
 
Moore's law would stand on the free market, in duopoly those 2 players can do whatever they want and it's to consumers whether they tolerate it or not, thank god GPUs are not necessary for humans to survive on this planet.
It's stupid to call it a law; it's not one, and it never was. All it ever was was an observation.
 
It's stupid to call it a law; it's not one, and it never was. All it ever was was an observation.
Jim Keller has gone so far as to suggest it's an aspiration, even. In other words, what started out as an observation turned into an expectation, which everyone would work to try and meet. He even implied that once AMD decided Moore's Law was dead, they started behaving as if it was dead and that's a big part of the reason they fell so far behind Intel.

 
That's exactly what they are doing. They are pricing their products to be just a bit cheaper than their Nvidia competitors, maybe 10-20%. They will get the customers who are looking for the best value as well as the AMD fanbois (they are out there). Bigger discounts won't get them enough incremental sales to cover the margin lost to the discount across the SKU.

And here is the bad news. Nothing will change. Nvidia's AI boards are bringing in bank so they don't need big volumes on consumer GPUs. They can keep margins very high and still sell plenty. This is how boutique / designer products work. Ferrari isn't slashing prices to boost their sales numbers. They just turn the marketing up and keep margins high.
As long as people keep throwing money at nVidia (and using lame excuses to do so like RT), nothing will get better.

Oh well, at least I know that whatever nVidia does, I'm not directly impacted since I don't buy GeForce cards.
 