News AMD unveils promo pricing in wake of Nvidia's Super launch — cuts pricing for RX 7900 XT and 7900 GRE

No, it's not just about demand[1]:

https://bucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com/public/images/505a30e4-733d-49e4-86ea-f074a170373a_684x630.png

You want more performance? That comes mostly from more gates, and because the cost per gate has stopped falling, more gates now mean a more expensive chip. In the long run, Nvidia and AMD have no choice but to pass most of that cost on to you. Once they reach the limit of what the market will bear, performance must inevitably stagnate.
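To put rough numbers on that, here's a back-of-envelope die-cost sketch using the usual textbook dies-per-wafer and Poisson-yield approximations. The wafer price, die size and defect density are made-up placeholders, not figures taken from the charts below:

import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Standard approximation: gross dies on the wafer, minus edge loss
    radius = wafer_diameter_mm / 2
    gross = (math.pi * radius ** 2) / die_area_mm2
    edge_loss = (math.pi * wafer_diameter_mm) / math.sqrt(2 * die_area_mm2)
    return gross - edge_loss

def cost_per_good_die(wafer_cost_usd, die_area_mm2, defects_per_cm2):
    # Poisson yield model: probability that a die has zero defects
    yield_fraction = math.exp(-(die_area_mm2 / 100) * defects_per_cm2)
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * yield_fraction)

# Placeholder inputs, not real foundry data: $17k wafer, 600 mm^2 die, 0.1 defects/cm^2
print(f"${cost_per_good_die(17_000, 600, 0.1):.0f} per good die")  # roughly $340 with these inputs

The point is just that once the cost per gate stops falling, a bigger die on a newer node no longer gets cheaper over time the way it used to.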

Here's one reason for increasing costs[1]:
https://bucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com/public/images/cfa276c7-737d-4895-a03e-d41275f1fc7b_778x394.png

Design costs are also going up[2]:
20160312_TQC549.png

Here's a full cost breakdown[3]:
cost-1024x542.png

Finally, here's a history of TSMC wafer pricing, but the forum software doesn't want to inline it, I think because it's SVG[4]:


Sadly, it's not that simple. I think if Nvidia thought there wouldn't be a market for a $1.7k GPU, they simply wouldn't have made the RTX 4090. You wouldn't just magically get the same card for $1k.

This page has a cool graph showing how long it takes for a 5 nm fab to break even, based on % utilization and amount of government subsidies it receives. They claim that with 100% utilization and zero government subsidies, it would take a 5 nm fab 5 years to break even! I can't embed the image, however:
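The underlying break-even arithmetic is simple enough to sketch, though. Every figure below is a made-up placeholder chosen only so the example lands on that 5-year mark; it is not data from the page:

def breakeven_years(capex_usd, wafers_per_year, price_per_wafer, cost_per_wafer,
                    utilization=1.0, subsidy_usd=0.0):
    # Years until cumulative gross margin on wafer sales pays back the (subsidy-reduced) build cost
    annual_margin = wafers_per_year * utilization * (price_per_wafer - cost_per_wafer)
    return (capex_usd - subsidy_usd) / annual_margin

# Placeholder figures: $20B fab, 1M wafers/year, $17k selling price, $13k cost per wafer, no subsidy
print(breakeven_years(20e9, 1_000_000, 17_000, 13_000))  # -> 5.0 years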

References:
Good info. Thank you.
 
I thought AV1 was deemed "equal" among all 3 in every test I've seen? The only encoding area where AMD is behind is H264, but even in H265/HEVC, it's more or less equal.

And DLSS vs FSR2 is not so different that you could say AMD is far behind. Plus, that comparison is terribly subjective for a lot of people, and most have to really try to find differences at Balanced quality or above and at resolutions higher than 1440p. The only consistent thing every reviewer says is that 1080p is where the differences truly show.

Regards.
AV1 quality, in terms of lowest error from the source, goes Nvidia, then Intel, and lastly AMD.


https://www.tomshardware.com/news/amd-intel-nvidia-video-encoding-performance-quality-tested

Also, NVIDIA has less shimmer, plus ray reconstruction for RT. Most reviewers agree NVIDIA is slightly better.

NVIDIA's differences really show through when you're upscaling at 1080p (Alan Wake 2, for example, struggles at 1080p on mid-tier hardware). 1080p with path tracing is near impossible on AMD.
 
AV1 quality, in terms of lowest error from the source, goes Nvidia, then Intel, and lastly AMD.


https://www.tomshardware.com/news/amd-intel-nvidia-video-encoding-performance-quality-tested

Also, NVIDIA has less shimmer, plus ray reconstruction for RT. Most reviewers agree NVIDIA is slightly better.

NVIDIA's differences really show through when you're upscaling at 1080p (Alan Wake 2, for example, struggles at 1080p on mid-tier hardware). 1080p with path tracing is near impossible on AMD.
My memory wasn't wrong, even going by Jarred's charts in that review: AV1 is more or less a tie. Technically you're not wrong, but at those score percentages the differences are "eagle eye"-type findings, so I don't think my statement is incorrect. Unfortunately, Jarred didn't do a qualitative analysis of the results and only went by the hard numbers, but I remember that the score metric is logarithmic, so above 85 it's more or less "a tie" in terms of perceived quality. I remember checking his videos and doing my own testing, and I would still pick CPU over HW encoding any day anyway, but that is beside the point! 😛
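For anyone who wants to check those scores on their own clips rather than take my word (or the charts) for it, VMAF can be computed with ffmpeg's libvmaf filter. A minimal sketch, assuming an ffmpeg build with libvmaf enabled; the file names are placeholders:

import subprocess

# Score a hardware encode against the original clip (resolutions and frame rates must match).
encoded = "encode_av1_nvenc.mp4"   # placeholder file names
source = "source.y4m"
subprocess.run([
    "ffmpeg", "-i", encoded, "-i", source,   # first input = distorted, second = reference
    "-lavfi", "libvmaf=log_fmt=json:log_path=vmaf.json",
    "-f", "null", "-",
], check=True)
# vmaf.json then holds the per-frame scores plus the pooled VMAF mean.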

I think I also mentioned this in the review, but the streaming quality you upload with is still secondary to how YT, Twitch and any other streaming platform re-encodes your stream and handles quality per bandwidth. For locally recorded gameplay for YT and such, I would dare say most now use 4K (some are moving to 8K) for footage and either downscale it or let YT handle it. This is an interesting angle to the whole "encoder performance" conversation as well, since YT re-encodes content to VP9 and such.

As for the RT angle, sorry, I've never seen newer comparisons being made, but I do recall what you're talking about. I do remember AMD had lots of issues with early RT implementations, but I haven't seen any reviewer pick that up again and check how things are as of late. Happy to be corrected here, though. Until then, I'll consider them at parity from the "implementation quality" perspective and just say there's a performance delta.

And finally, I don't disagree about 1080p upscaling, but even DLSS is not really super impressive there, even if it is objectively "better". The "let's sacrifice everything else in the scene for better lighting effects" trade-off makes little to no sense to me, personally. That's why it's a noteworthy thing, but not a practical one.

EDIT: "Various encoding support within AMD Software including AVC, HEVC and AV1 codecs have undergone additional optimizations to improve video encode quality."
https://www.amd.com/en/support/kb/release-notes/rn-rad-win-24-1-1

Regards.
 
AV1 quality, in terms of lowest error from the source, goes Nvidia, then Intel, and lastly AMD.

https://www.tomshardware.com/news/amd-intel-nvidia-video-encoding-performance-quality-tested
I think you both have a point. AMD's H.265 & AV1 were way better than their H.264, but I don't have an intuitive sense of VMAF and therefore couldn't say how perceivable the remaining quality gap would be.

pyFvwSXDhXSXYfJwxAQuQT.png


vpgYnkSwfQxyEHcXrdUTZT.png


Ld6C4mJrhXjSa9edaUJxgT.png


The article also states:

"AMD has informed us that it's working with ffmpeg to get some quality improvements into the code, and we'll have to see how that goes."

As this was 10 months ago, I wonder if those efforts came to anything.
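If someone wanted to re-test with newer drivers, ffmpeg already exposes all three vendors' AV1 hardware encoders. A rough sketch of the encode loop, assuming your ffmpeg build and GPU actually support them; the bitrate and file names are arbitrary placeholders:

import subprocess

SOURCE = "gameplay_capture.y4m"   # placeholder input clip

# Vendor AV1 hardware encoders exposed by ffmpeg (driver and build support required).
for encoder in ("av1_nvenc", "av1_qsv", "av1_amf"):
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        "-c:v", encoder, "-b:v", "6M",   # arbitrary 6 Mbps target, streaming-style
        f"out_{encoder}.mp4",
    ], check=True)
# Each output could then be scored against SOURCE with the libvmaf filter mentioned earlier.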
 
As good as this is with regard to the RX 7900 XT (and it is good), I'm not sure that cutting the price of the RX 7900 GRE while leaving the RX 7800 XT's price alone is going to pay great dividends, because the GRE's availability isn't exactly widespread. Steve Walton said that dropping the RX 7800 XT's price by $50 would re-assert its position as the go-to card for that performance tier over the RTX 4070 Super, just like it was against the RTX 4070. I think that's a lot more important than dropping the price of the RX 7900 GRE.
 
The 4080S is going to STOMP all over the 7900XTX
Maybe, maybe not. The "Super" version of the RTX 4080 has pretty much the same specs as the original version. For those people who want RT performance, the RTX 4080 already has been "stomping" the RX 7900 XTX. For those who care more about getting 24GB of VRAM and higher FPS, the RX 7900 XTX will still reign supreme.

The only significant difference between the RTX 4080 and RTX 4080 Super is the price drop, and I'm pretty sure that there's plenty of room to drop the price of the XTX. Hell, I got mine for only (the equivalent of) $850 USD last August. Sure, it was a Newegg open-box, but don't kid yourself, Newegg still made a tidy profit from my purchase. This tells me that the XTX could easily be dropped in price to $850 and beyond if necessary.

It took nVidia almost a year to realise that they needed to drop their prices. AMD responds to market conditions far faster than nVidia ever did, because they need to.

I don't expect the current status quo of Radeon outselling GeForce to shift in nVidia's favour for very long. I could be wrong, but I doubt it, because if it does shift, AMD will have no choice but to respond. Radeon has started making some inroads and gaining traction with consumers as of late, and AMD wouldn't dare allow that to stop.
 
Yes and no. They don't operate in a vacuum. I used to be able to buy a top card for £500-ish. If I could buy top tier for, say, £700-ish now, I'd be happy to pay; however, it's more like £1700-£2000. So I think of the track days, weekends away, garage gear and home improvements I could buy for that money, and conclude that such a purchase will never manage to get to the top of the list.

Like I mentioned, if you want a top-tier card just because you want top tier, then yes, you will pay significantly more than you would have for the top-tier card from a few generations ago.

At the same time, you have to look at what difference you are getting. Even quad-SLI GTX 1080s, which would have cost you more than the current single top-tier card, still would not achieve the same performance as that card.

So from a performance perspective, the top-tier card is cheaper.
That is why it's sometimes better to make comparisons based on performance rather than tier, because you should purchase hardware based on your needs and the performance you require, not because it's top tier.
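If you do compare on that basis, the arithmetic is just price divided by the performance you actually measure. A tiny sketch with made-up placeholder prices and frame rates, purely to show the idea (plug in your own benchmarks at your own settings):

# Crude value metric: what you pay per unit of measured performance.
def price_per_fps(price_gbp, avg_fps):
    return price_gbp / avg_fps

# Made-up placeholder numbers, not benchmark results:
setups = {"older top-tier multi-GPU setup": (2000, 90), "current top-tier card": (1700, 140)}
for name, (price, fps) in setups.items():
    print(f"{name}: £{price_per_fps(price, fps):.2f} per average fps")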
 
No, it's not just about demand[1]:

You want more performance? That comes mostly from more gates, and because the cost per gate has stopped falling, more gates now mean a more expensive chip. In the long run, Nvidia and AMD have no choice but to pass most of that cost on to you. Once they reach the limit of what the market will bear, performance must inevitably stagnate.

[cost-per-gate, design-cost and TSMC wafer-pricing charts quoted above]

This page has a cool graph showing how long it takes for a 5 nm fab to break even, based on % utilization and the amount of government subsidies it receives. They claim that with 100% utilization and zero government subsidies, it would take a 5 nm fab 5 years to break even!

Thanks for the explanation, it's interesting.

It clearly shows I heavily overpaid for my first PC in the early '90s: a 486DX2-66, which was around $6,000? And that was on a 600 nm node and had a measly 2 MB of RAM. Should have been super low production cost according to the graphs, yet I paid significantly more for it than a current high-end setup on a 5 nm node costs.

But I paid $6,000 at the time because I wanted the best and I was willing to pay for it. The same thing happened with the memory size on my next PC, a Pentium 166: I paid $2,000 just for 64 MB of RAM. At the time, I was the first consumer who had ever ordered a computer at that store with that much memory in it, because that was typically reserved for high-end servers. Again, I paid because I wanted it.
 
When I bought my 1080ti, it was expensive at a bit over £500, but justifiable to me. £1700+ for a top tier card now?
This is some slightly revisionist history. The GTX 1080 Ti wasn't their top-tier card. That distinction was reserved for the Titan.

Since the RTX 3000 series, it seems Nvidia has decided to switch from releasing Titan cards to instead releasing the x090 tier.

It took nVidia almost a year to realise that they needed to drop their prices.
I think they've been aware that pricing was an issue, but it's really the maturing of the TSMC 4N process node which enabled them to provide more perf/$.

Similar factors enabled AMD to do mid-generation refreshes, like when they released the RX 6x50 models.
 
Thanks for the explanation, it's interesting.

It clearly shows I heavily overpaid for my first PC in the early '90s: a 486DX2-66, which was around $6,000? And that was on a 600 nm node and had a measly 2 MB of RAM.
Those are some weird numbers. When my dad bought a 386DX-25 in about 1991, it had 8 MB of RAM. At the time, the norm for PCs running Windows 3.0 was about 2 - 4 MB.

According to this, they used either 1000 nm or 800 nm process nodes:

According to this, a Compaq-brand PC featuring the i486DX2-66 launched in Aug. 1992 at a price of $2,750.


Should have been super low production cost according to the graphs,
How do you figure? They don't go back that far, and you'd need to know the number of gates it had.

I paid $6,000 at the time
Do you remember what graphics card it had or what monitor? I'll bet you got a modem and printer, too. Still hard to see exactly how you reached $6k, without being ripped off, but $1k for a big monitor wasn't unreasonable, and modem + sound card + high-end graphics card could probably account for another $1k or so.
 
Hardware in the past was stupidly expensive. A quotation for PCs from 1989, for a local college:

20 x 286 systems @ $2,895 each
• 16-bit 286 CPU @ 12.5 MHz
• 1 MB RAM, expandable to 5 MB
• EMS 4.09 supplied and supported
• Real-time clock
• 40 MB HDD
• 1 x 3.5-inch 1.44 MB floppy drive
• 1 x 5.25-inch 360 KB floppy drive
• Serial port
• Parallel port
• 80287-8 maths co-processor
• 16-bit VGA card and Tystar VGA high-res monitor
• 1 optical mouse

Special features
BMS 386SX CPU module at 16 MHz with socket for 80387SX. Price = $495
BMS i486 CPU module at 25 MHz available March 1990

Standard upgrades
From 1 MB to 1.5 MB = $145
From 1 MB to 2 MB = $290
From 1 MB to 3 MB = $495
From 1 MB to 4 MB = $990

Replace 5.25-inch 360 KB drives with 5.25-inch 1.2 MB drives = $35 per drive
Upgrade from 40 MB to 80 MB HDD = $495.00

Ethernet card
8-bit Ethernet card = $235
16-bit Ethernet card = $255

Software
MS-DOS 4.0 = $110
MS Word Education pack of 10 users = $625
MS Windows 2.11 Education pack of 10 users = $388
MS Works Education pack of 10 users = $405

And costs like these are now only matched by GPU prices and what Apple charges for its upgrades.
 
Do you remember what graphics card it had or what monitor? I'll bet you got a modem and printer, too. Still hard to see exactly how you reached $6k, without being ripped off, but $1k for a big monitor wasn't unreasonable, and modem + sound card + high-end graphics card could probably account for another $1k or so.

The modem was separate, as the internet was not yet available at that time: a US Robotics Courier. A beast at the time, at something like $800, but it was in use until well after 2006 and was always upgradable to the latest standard: 28.8k, 33.6k, V.90/56k Flex.

It came with a nice 250 MB HDD though. The space was amazing! And it did have a dedicated video card, but I forget exactly what it was. All I remember is that I later changed it to a Matrox Millennium.
 
Modem was separate as the internet was not available yet at that time. US Robotics Courier. A beast at the time for something like $800, but it was in use until well after 2006 and was always upgradable to the latest standard. 28.8K, 33.6K V90/56K flex.
Back in the 14.4k era, I remember paying a guy $200 for a used 9600 bps modem. I think my friend saw an ad on a BBS and told me about it. It was a little dramatic, as I was just a kid and we met on a historic bridge to do the deal (it was an easy landmark and this was before MapQuest and smartphones or even commodity GPS devices).
 
Like I mentioned, if you want a top-tier card just because you want top tier, then yes, you will pay significantly more than you would have for the top-tier card from a few generations ago.

At the same time, you have to look at what difference you are getting. Even quad-SLI GTX 1080s, which would have cost you more than the current single top-tier card, still would not achieve the same performance as that card.

So from a performance perspective, the top-tier card is cheaper.
That is why it's sometimes better to make comparisons based on performance rather than tier, because you should purchase hardware based on your needs and the performance you require, not because it's top tier.
Or... I buy top tier so I don't have to change it as often - I'm not sure why I'd be trying to get performance today that will be top-tier in 8 years, so your quad-sli thing... makes zero sense to me as a point of topic, sorry.

I (used to) buy top tier because I could be pretty sure it would last a very long time on performance, until requirements eventually overtook the component; I've done this all the way back to a GeForce 2 GTS. I could spend a notional £700 on a mid-range card now and, instead of it lasting 6 years, I may only get 2 out of it before its performance is an issue for me, and I'll end up buying another card for £800 and then another 2 years after that for £900. So it's shockingly poor value, and I'll probably end up dipping out of the GPU-buying market for yet another generation, or just bail on PC gaming altogether.
 
Or... I buy top tier so I don't have to change it as often - I'm not sure why I'd be trying to get performance today that will be top-tier in 8 years, so your quad-sli thing... makes zero sense to me as a point of topic, sorry.

I (used to) buy top tier because I could be pretty sure it would last a very long time on performance, until requirements eventually overtook the component; I've done this all the way back to a GeForce 2 GTS. I could spend a notional £700 on a mid-range card now and, instead of it lasting 6 years, I may only get 2 out of it before its performance is an issue for me, and I'll end up buying another card for £800 and then another 2 years after that for £900. So it's shockingly poor value, and I'll probably end up dipping out of the GPU-buying market for yet another generation, or just bail on PC gaming altogether.

I used to do the same thing, until I realized that with technologies like DLSS, FSR, RT, DX revisions and engine revisions, there's no chance a card will last that many years.
So I go for what I consider decent enough value, and pay the price.
 