News Nvidia's grasp of desktop GPU market balloons to 88% — AMD has just 12%, Intel negligible, says JPR


mangosaurus

Reputable
Mar 12, 2020
2
1
4,515
The sad part is that AMD cards are more efficient and run cooler, but until they fix their buggy software drivers no one will care.
What are you talking about? The 4080/4080 Super is the most efficient GPU on the market right now in frames/watt on average, closely followed by the 4070 Super and 4090. AMD's most efficient card this generation is the 7900 XTX which comes in just behind those 4.
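If anyone wants to sanity-check frames-per-watt claims like this, the metric is just average FPS divided by average board power. A minimal Python sketch, with made-up placeholder numbers rather than real benchmark results:

```python
# Rough frames-per-watt comparison. The FPS and wattage figures below are
# placeholders only -- substitute measured averages from reviews or your
# own logging; they are NOT real benchmark data.
cards = {
    "RTX 4080 Super": {"avg_fps": 120.0, "avg_board_power_w": 290.0},
    "RTX 4070 Super": {"avg_fps": 95.0,  "avg_board_power_w": 220.0},
    "RX 7900 XTX":    {"avg_fps": 118.0, "avg_board_power_w": 350.0},
}

# Sort by efficiency (frames per watt), highest first, and print each card.
for name, d in sorted(cards.items(),
                      key=lambda kv: kv[1]["avg_fps"] / kv[1]["avg_board_power_w"],
                      reverse=True):
    eff = d["avg_fps"] / d["avg_board_power_w"]
    print(f"{name:15s}  {eff:.3f} frames per watt")
```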
 
  • Like
Reactions: artk2219
And no, I'm using the normal AMD drivers, not some OEM custom stuff. As a matter of fact, the OEM custom stuff is the only driver that works fine; the problem is it hasn't been updated for 2 years now, so the latest games just refuse to work.
So there you go, you are using drivers that are not made for your hardware configuration...you can't complain about them not working right.
 

TheHerald

Notable
Feb 15, 2024
1,041
299
1,060
So there you go, you are using drivers that are not made for your hardware configuration...you can't complain about them not working right.
A moment ago you said the problem is the OEM drivers. Now the problem is not the OEM drivers. Rofl.

And no, I'm using AMD's drivers that are specific to my hardware. Google "G14 2022 AMD GPU drivers" and you'll find them; they are supposedly tailor-made for my laptop. They are horrible though. Only the Asus ones work fine, but they are outdated.
 
Last edited:

TheHerald

Notable
Feb 15, 2024
1,041
299
1,060
Back then, we were educated consumers.

Now we are corporation groupies approaching cult-member behavior.
Personally I don't give a damn about brands. I have my preferences, but whoever makes the better product wins my $$. AMD makes better laptop CPUs, so I have 4 of them. Intel makes better desktop CPUs, so my desktop is Intel.

If AMD makes an xx90 competitor, oh boy am I buying it.
 
  • Like
Reactions: valthuer
Why didn't you just RMA the 6700 XT? Sure, I get the lower power usage, the 4060 Ti is a newer card with a newer architecture on a newer process, and it sucks about the coil whine, but an earlier RMA could have fixed that too. Not every card is a winner; I've had to RMA cards from just about every manufacturer, including the mythical EVGA. Everyone makes a dud every now and again.

In my country they give us 12 months of warranty; beyond that, it's only crying with a lawyer...
but I'm really happy with the 4060, no complaints.
It can play Helldivers at 60 fps with 192 W at the wall.
 
  • Like
Reactions: artk2219
I don't agree with a lot of people here. I owned an ATI 1980 Pro, I think (my first PC); after that I owned a Radeon HD 4850, an HD 4890 (if I remember right), an HD 6950, a Radeon 270, an Nvidia 1070, and an Nvidia 2070 Super (it's in my daughter's PC), and I now have an AMD 6800 non-XT (with low power consumption thanks to a little undervolt). I only had a problem with drivers on a Radeon GPU once (on the HD 4890) and never again, so I don't know why people are complaining about AMD drivers. Some, I think, never owned an ATI/Radeon/AMD GPU and just talk because they heard it somewhere.
 

Johnpombrio

Distinguished
Nov 20, 2006
252
73
18,870
Driver support for AMD has GOT to be tough. Without a lot of sales, it must be difficult to pay for a large team to constantly test out all the desktop games coming out, work with the developers, and write bug free code. It only takes ONE game that has driver issues to sour a bunch of gamers on AMD drivers. Tough to do, for sure.
 
Yeah... Because, if it weren't for those influencers, everybody would be flocking to buy 7900 XTX.

Coz everyone knows it's a much better high-end performer than 4090.
Of course it isn't, but what it IS is as fast as an RTX 4080 and 4080 Super in standard rasterization, which is the most important thing for the vast majority of games ever produced, and that will be produced within the next 4 or 5 years. It's also cheaper than them and gives you 50% more memory, while being within 20% of an RTX 4090 in standard rasterization for close to half the price. Don't get me wrong: if you care about ray tracing, need CUDA, or have an application that requires an Nvidia GPU, then they are definitely your only choice; of course, that last part is on purpose given their love of proprietary standards. But for the vast majority of gamers out there, there really aren't a ton of good reasons why you shouldn't consider purchasing an RX 7900 XTX instead of an RTX 4080 (Super). Of course, if you just want that extra 20% or so of performance and the money doesn't matter to you, the RTX 4090 is still there at the top in all of its four-expansion-slot, power-cable-burning, PCB-cracking glory (to be fair, that last part isn't necessarily the GPU's fault; it's more an issue of bad packaging while shipping, and any similarly sized GPU would face the same issues).
 
Last edited:
  • Like
Reactions: NeoMorpheus

TheHerald

Notable
Feb 15, 2024
1,041
299
1,060
Of course it isn't, but what it IS is faster than an RTX 4080 and 4080 Super
No, not really.

[attached benchmark chart]

which is the most important things for the vast majority of games ever produced, and that will be produced within the next 4 or 5 years.
You are in agreement with AMD that raster is the most important thing; that's why they hit rock bottom at 12% market share. Because people think otherwise.
 
  • Like
Reactions: artk2219

TheHerald

Notable
Feb 15, 2024
1,041
299
1,060
Of course, if you just want that extra 20% or so of performance and the money doesn't matter to you, the RTX 4090 is still there at the top in all of its four-expansion-slot, power-cable-burning, PCB-cracking glory (to be fair, that last part isn't necessarily the GPU's fault; it's more an issue of bad packaging while shipping, and any similarly sized GPU would face the same issues).
Huh, but nothing about the faulty cooler on the 7900 XTX. Who would have guessed? You definitely don't sound biased.
 
  • Like
Reactions: artk2219

35below0

Respectable
Jan 3, 2024
1,727
743
2,090
In my country they give us 12 months of warranty; beyond that, it's only crying with a lawyer...
but I'm really happy with the 4060, no complaints.
It can play Helldivers at 60 fps with 192 W at the wall.
Honestly, I would think Helldivers would be too much for a 4060. I know it's enough according to the recommended requirements, but still... :)
Back then, we were educated consumers.

Now we are corporation groupies approaching cult-member behavior.
Some, I think, never owned an ATI/Radeon/AMD GPU and just talk because they heard it somewhere.
Yes, it was parroted in tech articles for years and years. AMD drivers bad.

I have had a bad experience with ATi, but every company can release a dud occasionally.
Driver support for AMD has GOT to be tough. Without a lot of sales, it must be difficult to pay for a large team to constantly test out all the desktop games coming out, work with the developers, and write bug free code. It only takes ONE game that has driver issues to sour a bunch of gamers on AMD drivers. Tough to do, for sure.
I somewhat agree. Perception is key here. Intel had a lot of issues with drivers for their Arc series, and the GPUs were not good enough for demanding gaming. And yet there is still a LOT of goodwill towards Intel (of all corps) and they are praised for improving drivers relatively quickly.

AMD couldn't catch this break because they themselves never really put their back into it and challenged gamers' perception, putting the buggy-drivers issue to rest once and for all.
Even AMD loyalists are critical of the company's dedication, even if they still like the products.

Nvidia is seen as coasting, but also as doing their utmost to iron out driver issues. AMD... just coasting on non-gaming-segment development and success.
Of course it isn't, but what it IS is faster than an RTX 4080 and 4080 Super in standard rasterization, which is the most important thing for the vast majority of games ever produced, and that will be produced within the next 4 or 5 years.
Disagree. Again, it's the perception that ray tracing and DLSS 3 are the hot new thing right now, not rasterisation. Defending AMD performance in rasterisation is defending outdated technology.

Stats may show you are correct, and frankly you are. Today. But many people are looking at 4070 Ti Supers and 4080 Supers, not buying them, just looking! For them, games like Cyberpunk and Alan Wake and the expectation of ray tracing performance are the reason they're looking at Nvidia and not AMD. And tomorrow, when they do buy something, it will have ray tracing performance out the wazoo. So, Nvidia.

AMD should have an answer, whatever the price may be. For now it's more VRAM and a lower price, BUT with the penalty of no new gimmicks, and Nvidia calmly pointing out that they beat AMD GPUs even with less VRAM.
¯\_(ツ)_/¯

Not good enough.
 

valthuer

Upstanding
Oct 26, 2023
132
141
260
Of course it isn't, but what it IS is faster than an RTX 4080 and 4080 Super in standard rasterization, which is the most important thing for the vast majority of games ever produced, and that will be produced within the next 4 or 5 years. It's also cheaper than them and gives you 50% more memory, while being within 20% of an RTX 4090 in standard rasterization for close to half the price. Don't get me wrong: if you care about ray tracing, need CUDA, or have an application that requires an Nvidia GPU, then they are definitely your only choice; of course, that last part is on purpose given their love of proprietary standards. But for the vast majority of gamers out there, there really aren't a ton of good reasons why you shouldn't consider purchasing an RX 7900 XTX instead of an RTX 4080 (Super). Of course, if you just want that extra 20% or so of performance and the money doesn't matter to you, the RTX 4090 is still there at the top in all of its four-expansion-slot, power-cable-burning, PCB-cracking glory (to be fair, that last part isn't necessarily the GPU's fault; it's more an issue of bad packaging while shipping, and any similarly sized GPU would face the same issues).

Please, don't get me wrong.

I only brought up the 7900 XTX as an example, because I didn't like the implication that Nvidia's rising market share is due to consumers being deceived by influencers who got free 4090s.

As someone who has repeatedly opted for AMD in the past, but is currently an Nvidia user, I find that insulting.

People don't buy Nvidia over AMD because they're just too stupid to realize AMD is a good alternative. They do it because there are real reasons to pick Nvidia over AMD, especially right now.
 
No, not really.

[attached benchmark chart]


You are in agreement with AMD that raster is the most important thing; that's why they hit rock bottom at 12% market share. Because people think otherwise.
Fair enough, I'll edit that; they're all statistically tied, I mean it's a four-frame difference from the "slowest" to the "fastest". But I'm going to die on that rasterization hill for the time being. Ray tracing, especially full path tracing, is most definitely the future, but there isn't a card around RIGHT NOW that will give you performance worth a darn when it actually becomes something that is more commonly used. I'll definitely change my tune in a few years when it's something that will be a must-have feature for new games, but for this generation and the next, it's a nice-to-have, not a need-to-have. By the point it is a need-to-have, most people will be upgrading from all of these cards anyway.
 
Last edited:
Huh, but nothing about the faulty cooler on the 7900 XTX. Who would have guessed? You definitely don't sound biased.
Honestly I missed that one; I'll do some research and post a link on it because it's definitely worth mentioning, it wasn't skipped on purpose. From what I'm reading, it was mainly due to some faults in the manufacturing process, it only affected some of the reference-cooler launch cards, and it has since been resolved 🤷‍♂️.

https://www.techpowerup.com/302917/...ay-feature-faulty-coolers-causing-overheating

https://www.tomshardware.com/news/amd-faulty-thermal-solution-7900-xtx-throttling

https://www.pcmag.com/news/not-enough-water-amd-identifies-cause-of-thermal-issue-on-radeon-rx-7900
 
Last edited:
  • Like
Reactions: 35below0
Please, don't get me wrong.

I only brought up the 7900 XTX as an example, because I didn't like the implication that Nvidia's rising market share is due to consumers being deceived by influencers who got free 4090s.

As someone who has repeatedly opted for AMD in the past, but is currently an Nvidia user, I find that insulting.

People don't buy Nvidia over AMD because they're just too stupid to realize AMD is a good alternative. They do it because there are real reasons to pick Nvidia over AMD, especially right now.
Oh I agree there; while influencers do have more influence than the rest of us, they don't have THAT much influence over an entire market. I flop back and forth between the two myself, mostly depending on what I can get a good deal on. My previous card was an RX 6900 XT, but before that I was using an RTX 3080; I decided I wanted more than 10GB of VRAM, and I found someone who would do a straight swap with me for it. While I definitely agree there can be some good reasons to pick Nvidia over AMD, I do think that many people have had a fear put into them about going with another option. This is also an issue for Intel, for that matter: now that they've gotten their driver issues somewhat sorted, they're still having trouble penetrating the market even though they have some decent options available. Maybe they'll have better luck with Battlemage, but really I just see them fighting with AMD in a race to the bottom where the only real winner will be Nvidia. The GPU market does not seem like it's in for a good time in the near future.
 
Last edited:

baboma

Notable
Nov 3, 2022
281
336
1,070
https://tweakers.net/nieuws/222772/...ntel-arc-gpus-verschijnen-pas-begin-2025.html

ChatGPT translation:

Next Generations of AMD Radeon and Intel Arc GPUs Will Not Appear Until Early 2025

The next generations of graphics cards from AMD and Intel will not appear until early 2025. Tweakers has verified this with multiple sources during the Computex 2024 show in Taipei. According to earlier rumors, competitor Nvidia will release a new GeForce RTX 5000 series sooner.

Officially, both manufacturers will not meet their self-imposed goals for the new graphics cards. AMD's next-generation graphics cards, which use the RDNA4 architecture and are likely to be marketed as the Radeon RX 8000 series, were supposed to be released in 2024 according to the roadmap. Intel also said at the end of last year that the new generation of Arc graphics cards, better known under the codename Battlemage, would be released this year.

It is possible that AMD and Intel will announce their new GPUs just before the end of the year to meet the previously mentioned deadlines, but an actual release at the CES show in early January is more likely. In any case, availability will only pick up in the new year.

Sources confirm to Tweakers that there will be no true high-end models in the RX 8000 series in the new AMD lineup. Last year, rumors already circulated that the Navi 4x GPU with the highest number of cores had been canceled. Shortly thereafter, the head of AMD's graphics card division left. The company no longer aims to compete with Nvidia's top models but instead wants to focus on the mainstream segment with cards like the RX 7700 XT and RX 7800 XT.

Less is known about the upcoming generation of Intel Arc graphics cards. Sources tell Tweakers that these GPUs will at least match the performance level of the GeForce RTX 4070. The top model of the current generation, the Arc A770, is just slightly slower than the GeForce RTX 4060 and Radeon RX 7600 in Tweakers' benchmarks. This would mean a generation-on-generation performance gain of more than 60 percent, but still not enough to challenge Nvidia in the top segment. More about the improvements that make such a performance leap possible can be read in our Intel Lunar Lake preview, as the iGPU in it is the first product to use the Xe2 architecture, which also underpins the Battlemage graphics cards.
 

razor512

Distinguished
Jun 16, 2007
2,154
84
19,890
AMD keeps shooting themselves in the foot when it comes to competing with Nvidia. Basically since the RX 6000 generation, they started to copy the negative aspects of Nvidia. For example, when Nvidia was price gouging during the mining craze and gamers were desperate for GPU upgrades, AMD copied Nvidia's price gouging, leaving people who were in the market to upgrade with no worthwhile options.

Then with the RX 7000 series, they did it again: Nvidia price gouged on the RTX 4000 series, and AMD copied them. Imagine if they had gone with pre-mining-craze prices; the RX 7800 XT class card would have quickly become the most popular GPU on places like the Steam hardware survey.

While AMD has charged a little less than Nvidia, they effectively became the "minus A" choice in relative thinking, the decoy that drives undecided people to Nvidia.
Think of how businesses will often place a slightly cheaper, worse option next to their most profitable plan, while the less profitable options lack a close comparison, because people like being able to compare.
AMD is unintentionally creating that scenario for Nvidia by making themselves the "minus" option.

For example, Nvidia offers more features, even if most people will not use those extra features.
If someone sees two similarly performing GPUs, but the Nvidia GPU is 10% more expensive, then an undecided user will be more likely to go with Nvidia, because they will see a bunch of extra features that they won't use, but the price is close enough that the user can justify it by thinking it is better to have it and not need it than to need it and not have it; spending a little more now hedges their bet if the need arises.

On the other hand, suppose AMD went with pre-mining-craze prices (the mining craze started in early 2017, making the mid-2016 releases the last generation whose MSRPs weren't influenced by GPU mining). Cards like the 7800 XT class card would go for $379.99 (the equivalent of upper-midrange pricing from before Ethereum mining exploded). Then a user would find it insane to go with Nvidia, unless they absolutely must have an Nvidia-specific feature for their work.
While it would mean less profit per card sold, it would allow for rapid market-share gains, as they would be targeting the largest segment of the PC gaming community: people who were priced out of the GPU market after the mining craze started.

If anything, they should consider a smaller profit margin as an investment in the future, at least to quickly bring back the people who are still making do with their GTX cards, as well as RX 5000 and RX 500 series cards. Many held on because each generation that landed within a mining craze came with steep price increases, and by the time prices started to come down, the next generation was already about to be released, only for the mining-craze gouging to start again. Just about all of those users will be itching to upgrade, but will be turned away by the current pricing, and those who finally give in and accept being ripped off will choose to get more features while being ripped off.
 
Last edited:
  • Like
Reactions: artk2219

razor512

Distinguished
Jun 16, 2007
2,154
84
19,890
Just like the 1050 Ti outselling the much better RX 580 before that. Didn't matter. Why take a hit on margins if you practically have to give them away?
Keep in mind that back in those days, AMD was also dealing with a lot of reputational damage that they needed to establish a track record to fix.

Starting with the HD xxxx series, AMD developed a bad reputation due to driver bugs, refusal to listen to the community about issues, and quickly abandoning cards. For example, cards like the HD 3850 had a few driver issues which led to poor performance in a number of OpenGL titles at the time. Since the driver never reported OpenGL texture memory availability, games and applications read the missing value as 0 MB; this meant that many OpenGL games would use system RAM for texture memory, which carried a large performance hit. The issue was especially bad in OpenGL titles that used user-generated textures, as they were far more VRAM-throughput intensive, since they could not be optimized as well as a game that did not allow them. This meant that games like Second Life would perform very poorly, especially if you were using AGP cards, which have less throughput than PCIe, making the exclusive use of system memory for textures really bad.

At the time I had an HD 3850 and experienced the issue myself. Many people, especially players of Second Life, brought the issue up on the AMD forums, and there was quite a large thread about it; when it didn't seem like the issue was going away, they banned most of the users who complained in the thread.
All of those users became Nvidia users after that, as even slower Nvidia cards were performing far better in those types of games, all because they were using VRAM instead of system memory for textures.

By the time AMD fixed that issue, the card's driver support was discontinued shortly after. The quick rate at which AMD stopped supporting older cards also drove people to Nvidia.

Eventually AMD started to support cards for a longer time and be more responsive to issues the community brought up, but it took them years to establish that new track record.
 
  • Like
Reactions: artk2219
AMD keeps shooting themselves in the foot when it comes to competing with Nvidia. Basically since the RX 6000 generation, they started to copy the negative aspects of Nvidia. For example, when Nvidia was price gouging during the mining craze and gamers were desperate for GPU upgrades, AMD copied Nvidia's price gouging, leaving people who were in the market to upgrade with no worthwhile options.

Then with the RX 7000 series, they did it again: Nvidia price gouged on the RTX 4000 series, and AMD copied them. Imagine if they had gone with pre-mining-craze prices; the RX 7800 XT class card would have quickly become the most popular GPU on places like the Steam hardware survey.

While AMD has charged a little less than Nvidia, they effectively became the "minus A" choice in relative thinking, the decoy that drives undecided people to Nvidia.
Think of how businesses will often place a slightly cheaper, worse option next to their most profitable plan, while the less profitable options lack a close comparison, because people like being able to compare.
AMD is unintentionally creating that scenario for Nvidia by making themselves the "minus" option.

For example, Nvidia offers more features, even if most people will not use those extra features.
If someone sees two similarly performing GPUs, but the Nvidia GPU is 10% more expensive, then an undecided user will be more likely to go with Nvidia, because they will see a bunch of extra features that they won't use, but the price is close enough that the user can justify it by thinking it is better to have it and not need it than to need it and not have it; spending a little more now hedges their bet if the need arises.

On the other hand, suppose AMD went with pre-mining-craze prices (the mining craze started in early 2017, making the mid-2016 releases the last generation whose MSRPs weren't influenced by GPU mining). Cards like the 7800 XT class card would go for $379.99 (the equivalent of upper-midrange pricing from before Ethereum mining exploded). Then a user would find it insane to go with Nvidia, unless they absolutely must have an Nvidia-specific feature for their work.
While it would mean less profit per card sold, it would allow for rapid market-share gains, as they would be targeting the largest segment of the PC gaming community: people who were priced out of the GPU market after the mining craze started.

If anything, they should consider a smaller profit margin as an investment in the future, at least to quickly bring back the people who are still making do with their GTX cards, as well as RX 5000 and RX 500 series cards. Many held on because each generation that landed within a mining craze came with steep price increases, and by the time prices started to come down, the next generation was already about to be released, only for the mining-craze gouging to start again. Just about all of those users will be itching to upgrade, but will be turned away by the current pricing, and those who finally give in and accept being ripped off will choose to get more features while being ripped off.
While I understand the sentiment, and if this were a world without the highest inflation we have seen since the 1970s I would agree, unfortunately that isn't the case; just about everything has gotten more expensive since mid-2016. I went ahead and used the Bureau of Labor Statistics inflation calculator and plugged in the $379.99 price you quoted for mid-2016 (I used July, but it shouldn't make much of a difference if you picked any other month in 2016): that $379.99 is now the equivalent of $495.14 today, or roughly the price that you are paying for a current RX 7800 XT. No one tries to give their stuff away without making much of a profit, and honestly everything is just more expensive, so here we are, and here we will be. We just haven't caught up to the new reality yet because it was thrown at us too quickly. That is, unless they find some amazing way to drastically cut production costs across the board. That's not even getting into the fact that current cards are MUCH more complex and difficult to manufacture than cards from 8 years ago.

Also, this just reminded me that 2016 was a weird year for AMD, similar to this next upcoming generation actually. The R9 Fury X had come out the year before, and it was a hot, power-hungry, expensive flop with less VRAM than the GTX 980 Ti that it underperformed against. So they went back to the mainstream segment and released the RX 480 (4GB was $199 and 8GB was $239) based on the Polaris architecture, which would become AMD's longest-lasting GPU architecture, and AMD released the AM4 socket, but only for OEMs and meant for use with Bristol Ridge Excavator-based CPUs. The next year they released the RX Vega 56 and 64, which, while a definite improvement over Fury that performed well against the vanilla 1070 and 1080, had its launch spoiled immediately by the launch of the 1070 Ti and the fact that it couldn't catch the 1080 Ti. It wouldn't be until RDNA and the launch of the RX 5700 series that AMD regained some of the market share that was lost during those 2016 - 2019 years. Granted, they were just fighting to stay alive from 2016 - 2018; it was still a mess all over.

https://data.bls.gov/cgi-bin/cpicalc.pl?cost1=379.99&year1=201607&year2=202404
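If you want to reproduce that number without the BLS web calculator, the adjustment is just price multiplied by the ratio of the two CPI index values. A minimal Python sketch, using approximate CPI-U figures (from memory, so double-check them against the BLS tables before relying on them):

```python
# Inflation adjustment: new_price = old_price * (CPI_target / CPI_base).
# The CPI-U index values below are approximate -- verify against
# https://data.bls.gov before relying on them.
CPI_JUL_2016 = 240.6   # approx. CPI-U, July 2016
CPI_APR_2024 = 313.5   # approx. CPI-U, April 2024

old_price = 379.99
adjusted = old_price * (CPI_APR_2024 / CPI_JUL_2016)
print(f"${old_price:.2f} in July 2016 is roughly ${adjusted:.2f} in April 2024")
# Prints roughly $495, in line with the BLS calculator result quoted above.
```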
 
Last edited:

Hotrod2go

Prominent
Jun 12, 2023
203
57
660
Been an AMD GPU user since they took over ATI way back when. Never considered the opposition, Nvidia. The AMD GPUs have always met my needs; I've had several generations since then (late 2000s) and not one RMA ever. Rock solid, and never had issues with their drivers either.
I must be an outlier with exceptional luck or just plain telling the truth... (y);)
 
Why Nvidia is better than AMD:

1 - Multi-GPU: Intel graphics plus Nvidia works; Intel plus AMD doesn't work right
2 - Power consumption: Nvidia has the crown
3 - Software: everyone loves Nvidia, and you can use your graphics card for more than gaming
4 - Price: easy to sell in the future

Here in my system I use the Intel for the primary display (it can do 1920x1080 at 165 Hz with 2-3 W) and run Wallpaper Engine without issues.
With the AMD graphics at 165 Hz, the TBP of the 6700 XT goes to 45 W, so I need to go back to 120 Hz.
My system now idles at 50 W at the wall, with a gaming max of 250 W.

Performance-wise AMD has good value, but for everything else, Nvidia all the way.
 
Driver support for AMD has GOT to be tough. Without a lot of sales, it must be difficult to pay for a large team to constantly test out all the desktop games coming out, work with the developers, and write bug free code. It only takes ONE game that has driver issues to sour a bunch of gamers on AMD drivers. Tough to do, for sure.
Yeah, but on the other hand the PS5 has RDNA 2, so all the games are coded directly for their GPUs...
That has to help a lot.
 
  • Like
Reactions: artk2219

razor512

Distinguished
Jun 16, 2007
2,154
84
19,890
While I understand the sentiment, and if this were a world without the highest inflation we have seen since the 1970s I would agree, unfortunately that isn't the case; just about everything has gotten more expensive since mid-2016. I went ahead and used the Bureau of Labor Statistics inflation calculator and plugged in the $379.99 price you quoted for mid-2016 (I used July, but it shouldn't make much of a difference if you picked any other month in 2016): that $379.99 is now the equivalent of $495.14 today, or roughly the price that you are paying for a current RX 7800 XT. No one tries to give their stuff away without making much of a profit, and honestly everything is just more expensive, so here we are, and here we will be. We just haven't caught up to the new reality yet because it was thrown at us too quickly. That is, unless they find some amazing way to drastically cut production costs across the board. That's not even getting into the fact that current cards are MUCH more complex and difficult to manufacture than cards from 8 years ago. Also, this just reminded me that 2016 was a weird year for AMD, similar to this next upcoming generation actually. The R9 Fury X had come out the year before, and it was a hot, power-hungry, expensive flop with less VRAM than the GTX 980 Ti that it underperformed against. So they went back to the mainstream segment and released the RX 480 (4GB was $199 and 8GB was $239) based on the Polaris architecture, which would become AMD's longest-lasting GPU architecture, and AMD released the AM4 socket, but only for OEMs and meant for use with Bristol Ridge Excavator-based CPUs. The next year they released the RX Vega 56 and 64, which, while a definite improvement over Fury that performed well against the vanilla 1070 and 1080, had its launch spoiled immediately by the launch of the 1070 Ti and the fact that it couldn't catch the 1080 Ti. It wouldn't be until RDNA and the launch of the RX 5700 series that AMD regained some of the market share that was lost during those 2016 - 2019 years. Granted, they were just fighting to stay alive from 2016 - 2018; it was still a mess all over.

https://data.bls.gov/cgi-bin/cpicalc.pl?cost1=379.99&year1=201607&year2=202404
While inflation is an issue, many other parts of the tech industry didn't scale in price that way, CPU prices for example: the Core i7-6700K had an MSRP of $350, while the 14700K had an MSRP of $420 and, within a month, dropped to $395.

As production efficiency and newer process nodes came out, a wide range of other components ended up coming down in price, especially SSDs and hard drives.

While cards are more complex these days, and more expensive to make, the chip production industry has always been a very high-margin business where the goal is to recoup R&D costs, often in the hundreds of millions to a billion dollars, and turn a net profit before the next generation needs to come out.
While giving more reasonable prices is not as profitable per unit sold, their current strategy of being ever so slightly cheaper than Nvidia is not working. The scenario I see with AMD and Nvidia is an almost exact match for the relative-thinking experiment referenced in the book Predictably Irrational by Dan Ariely, with the main difference being that instead of a company doing it within its own product lineup (since people are more likely to choose the better of the options that are easy to compare), it is happening between two competing companies.

Humans rarely choose things in absolute terms.
When prices are close, humans tend to focus on the relative advantage of one thing over another, and estimate value accordingly, even if a relative advantage is of no benefit to the individual in that moment.
 
  • Like
Reactions: artk2219