News: Asus Reveals Official RX 6500 XT European €299 MSRP

Why do I have a sneaking suspicion that manufacturers are engaging in just a teeny-tiny little bit of greed here?

And I thought the 1660 Super I got for £260.00 (about €320, I think?) almost exactly one year ago was lousy value for money - it was literally the only thing available that wasn't complete cr*p when my last GPU very inconveniently died at the height of the shortage/scalpedemic.
 
RX590-class performance with a cut-down frame buffer and halved PCIe bandwidth, four years later, for 30% more dollars. Performance-per-dollar is regressing at a ridiculous pace.
Hm... Not to defend AMD or anything that's going on here with the insane prices, but how does inflation stack up against these lackluster releases as of late? Context is your comment about "performance-per-dollar", which I think would be interesting to put some numbers to.

Regards.
 
AMD's own RX6500 presentation says it will only be 10-20% faster than the RX580, which is roughly RX590 territory. The 8GB RX580 could be had for $120 or less before the crypto craze flared up for the second time. Since the RX570-590 have roughly twice the VRAM bandwidth (256 bits x 8 GT/s vs 64 bits x 16-18 GT/s), the 8GB models have twice the VRAM, and all of them have a full PCIe 3.0 x16 interface, I bet there will be many scenarios where the four-year-old designs beat the massively crippled RX6500.
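To sanity-check the bandwidth claim above, here is a quick sketch using the figures quoted in the post (bus widths and data rates as stated in the thread, not verified against official spec sheets):

```python
# Peak VRAM bandwidth in GB/s = bus width (bits) x data rate (GT/s) / 8 bits per byte.
def bandwidth_gbps(bus_width_bits, data_rate_gtps):
    return bus_width_bits * data_rate_gtps / 8

rx580 = bandwidth_gbps(256, 8)    # 256-bit bus at 8 GT/s  -> 256.0 GB/s
rx6500 = bandwidth_gbps(64, 18)   # 64-bit bus at 18 GT/s  -> 144.0 GB/s
print(rx580, rx6500, rx580 / rx6500)
```

Even taking the RX6500's fastest quoted 18 GT/s bin, the RX580 ends up with roughly 1.8x the bandwidth, so "roughly twice" is in the right ballpark.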

The only good thing about the RX6500 is that it will use about half as much power.
 
As I said, I'm not defending the 6500XT; I know what an awful card it is and how bad the optics are. I'm just curious about putting some actual numbers behind the "performance-per-dollar" assessment, taking inflation into account. I could do the math myself, but I'm too lazy today, haha.

Regards.
 
There has only been 6-7% real inflation over the last five years; the rest is just everyone price-gouging everyone else because everyone else is trying to price-gouge everyone. Since the RX6500 is a significantly pared-down product relative to the RX580 and is made on a finer process with a much lower net cost per transistor than 12/16nm, the silicon should still be cheaper despite inflation and TSMC's price-gouging.
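Putting rough numbers to the inflation point, using only the figures stated in this thread (a ~$120 RX580 and ~6-7% cumulative inflation, not official CPI data):

```python
# What a given old price becomes after applying a cumulative inflation rate.
def inflation_adjusted(price, cumulative_rate):
    return round(price * (1 + cumulative_rate), 2)

old_price = 120.0  # pre-crypto-craze 8GB RX580 price quoted in the thread
print(inflation_adjusted(old_price, 0.06))  # 127.2
print(inflation_adjusted(old_price, 0.07))  # 128.4
```

Inflation alone moves a $120 card to roughly $127-128, nowhere near the €299 asking price in the headline.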

Were it not for crypto driving GPU prices to the moon and forcing budget buyers to compete for garbage like the RX6500, this would be a ~$120 GPU.