News: Sales of Desktop Graphics Cards Hit 20-Year Low

BTW, even at an N4 wafer cost of $20,000, the GPU die is still only ~$100 of an RTX 4080's $250-$300 total build cost, nowhere near enough to justify a $1,200 price tag if retail prices had any grounding in manufacturing costs. High-end GPU prices are primarily padded by greed.
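If anyone wants to poke at my numbers, here's a minimal sketch of the dies-per-wafer arithmetic, assuming a ~379 mm² AD103-class die on a 300 mm wafer and a placeholder defect density (none of these inputs are published Nvidia or TSMC figures, so adjust to taste):

```python
import math

WAFER_COST_USD = 20_000        # assumed N4-class wafer price from above
WAFER_DIAMETER_MM = 300.0
DIE_AREA_MM2 = 379.0           # approximate AD103 (RTX 4080) die size
DEFECT_DENSITY_PER_CM2 = 0.07  # placeholder defect density, not a TSMC figure

# Classic dies-per-wafer approximation: wafer area over die area,
# minus an edge-loss term for the partial dies around the rim.
wafer_area_mm2 = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
gross_dies = (wafer_area_mm2 / DIE_AREA_MM2
              - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * DIE_AREA_MM2))

# Simple Poisson yield model: fraction of dies with zero defects.
# Pessimistic, since many defective dies are salvaged as cut-down SKUs.
yield_fraction = math.exp(-DEFECT_DENSITY_PER_CM2 * DIE_AREA_MM2 / 100)

good_dies = gross_dies * yield_fraction
print(f"gross dies/wafer: {gross_dies:.0f}")                    # ~152
print(f"yield: {yield_fraction:.0%}")                           # ~77%
print(f"cost per good die: ${WAFER_COST_USD / good_dies:.0f}")  # ~$171
```

Vary the defect density and how much bad silicon you assume gets salvaged through binning, and the per-die figure swings from roughly $130 (everything salvageable) to $190, which is the ballpark being argued over here.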
$250 to $300 build cost sounds low, compared to cost estimates I've seen for previous-generation cards. Have you seen or written a detailed workup of this figure, somewhere?

Also, I've heard that Nvidia isn't really on N4, but some kind of special "4N" variant of N5. Not sure how much difference it makes, especially since TSMC might've charged a premium for customizing a node for them, but perhaps it at least affects the # of dies per wafer you're assuming?
 
My two cents: yes, prices are the main factor, but there's a second one, which is that games did not grow graphically as much as GPUs got more powerful. Today, a GTX 1060 (6 years old) can still play any game at 1080p just fine with everything maxed out (except RT, of course, which is still a gimmick for now anyway). Without the need for a new high-end GPU, plus prices that high, of course there won't be demand.
 
$250 to $300 build cost sounds low, compared to cost estimates I've seen for previous-generation cards. Have you seen or written a detailed workup of this figure, somewhere?
I just extrapolated from other breakdowns I've seen, though I forgot that most 4080s use the same mega-sized HSFs as the 4090s, so you can tack on an extra $30 or so for that.

My two cents: yes, prices are the main factor, but there's a second one, which is that games did not grow graphically as much as GPUs got more powerful.
Unless a game developer wants to alienate half of the potential audience for its PC game or port, it needs to optimize for the lowest common denominator it can be bothered to support. If AMD and Nvidia continue acting like infinitely greedy bastards, we could see game developers cutting whatever corners need trimming to keep their PC games playable on ancient hardware for a very long time.
 
Half the problem is the slowly increasing cost of entry into the gaming market. The most common card now is apparently a 1660 or 3060 or better, and that's what game developers will write their games for. The problem is that these cards are not cheap. YouTubers have not helped the matter: as a general rule, they will only preach the very best CPU, the very best GPU, and the very best of everything. This makes everyone crazy trying way too hard to keep up, which drives up prices and makes all of us normal people crazy too. I think AMD has taken a hit with their much-too-expensive AM5 socket. NVIDIA is taking a hit with their overpriced new 40-series. Will they learn from it and back off on the prices? Enh. I doubt it. Just look at Apple. As long as the hype train stays on the tracks, the prices stay high. It doesn't matter if you sell a few fewer units when your profit margin is through the roof.
 
Unless a game developer wants to alienate half of the potential audience for its PC game or port, it needs to optimize for the lowest common denominator it can be bothered to support. If AMD and Nvidia continue acting like infinitely greedy bastards, we could see game developers cutting whatever corners need trimming to keep their PC games playable on ancient hardware for a very long time.

Not sure if this is true, so take what I say with a grain of salt. This is just something I heard about a year ago in the fighting game community. There's a rumor that game developers are trying to make their games playable on the level of a 1050 Ti, because the more people who can play the game, the more popular it becomes. And the more popular a game is, the more people join tournaments.

The PS5 game I posted on the previous page, King of Fighters XV, has had large tournaments with big prizes.

Here's one that happened this year:

View: https://www.youtube.com/watch?v=roZV0UdaP9o&t=6459s
 
Not sure if this is true, so take what I say with a grain of salt. This is just something I heard about a year ago in the fighting game community. There's a rumor that game developers are trying to make their games playable on the level of a 1050 Ti, because the more people who can play the game, the more popular it becomes. And the more popular a game is, the more people join tournaments.
Nearly all eSports titles can run on a potato for exactly the reason you stated. The lower the minimum playable specs, the greater the pool of players of all skill levels since they don't need to break the bank on a bleeding-edge system every other year to stand a chance.
 
games did not grow graphically as much as GPUs got more powerful.
I can't comment on that, but I would note that higher-res monitors have been gaining greater market share. Those take more horsepower to drive, especially at the kinds of framerates people seem to want. DLSS and FSR help, but not for everyone or every game.

Also, now that GPUs are at least available, we can probably expect to see new games starting to ratchet up their requirements.
 
The lower the minimum playable specs, the greater the pool of players of all skill levels since they don't need to break the bank on a bleeding-edge system every other year to stand a chance.
The minimum bar is probably the iGPU. And here, Intel actually made some progress in the Xe graphics of their 11th Gen Rocket Lake CPUs, with some further tweaks in Alder Lake's version. With rumors of bigger iGPUs (2x!) destined for desktop processors in Arrow Lake, we could see even eSports games' minimum requirements move up another notch.
 
Also, now that GPUs are at least available, we can probably expect to see new games starting to ratchet up their requirements.
In a generation where the cheapest SKUs announced so far are $800, likely in a bid to let the channel clear out old stock, I would question the reasonableness of game developers raising their minimum requirements by much... unless we're talking IGP-grade minimum requirements.

Even then, you can't raise the minimum spec faster than the target audience is upgrading. Powerful IGPs possibly appearing in 2023-24 means we're still 5-6 years away from game genres targeted at today's IGPs catching up with the new stuff.
 
The minimum bar is probably the iGPU. And here, Intel actually made some progress in the Xe graphics of their 11th Gen Rocket Lake CPUs, with some further tweaks in Alder Lake's version. With rumors of bigger iGPUs (2x!) destined for desktop processors in Arrow Lake, we could see even eSports games' minimum requirements move up another notch.

Bigger iGPUs would make eSports games' minimum requirements go up a notch?? That doesn't make any sense. Sorry, I had to lol at this.

Why would game developers force eSports gamers to buy more expensive CPUs? That would just shrink the player base, which is the opposite of what they want.
 
In a generation where the cheapest SKUs announced so far are $800, likely in a bid to let the channel clear out old stock,
That's missing my point. What's different now, compared with the past couple of years, is that you can walk into a shop and see GPUs in stock, at approximately MSRP. Even if they're previous-gen, the mere fact that they can easily be found and bought at list price is certainly fueling some upgrades.

Powerful IGPs possibly appearing in 2023-24 means we're still 5-6 years away from game genres targeted at today's IGPs catching up with the new stuff.
The first big upgrade was in early 2021, with Rocket Lake. That moved from a Gen 9 (Skylake-era) 24 EU iGPU to a Gen 12 (Xe) 32 EU one. Big jump, there - more than just the increase in EUs. IIRC, we saw improvements in the realm of 50%-75%. That's not even mentioning the 64-96 EU versions in laptops.

For AMD's part, Ryzen 6000 finally moved their iGPUs beyond Vega, into the era of RDNA2. Those launched nearly 1 year ago.

So, I'd expect to see the baseline requirements beginning to shift, as a consequence of all that.
 
Why would game developers force eSports gamers to buy more expensive CPUs? That would just shrink the player base, which is the opposite of what they want.
I wasn't talking primarily about eSports. I expect all game developers to build games for the market that exists. If they see opportunities to add enhancements that truly impact the experience, and the hardware people are using has shifted from before, the potential definitely exists for people on older, lower-end hardware to be disadvantaged.
 
$250 to $300 build cost sounds low, compared to cost estimates I've seen for previous-generation cards. Have you seen or written a detailed workup of this figure, somewhere?

Also, I've heard that Nvidia isn't really on N4, but some kind of special "4N" variant of N5. Not sure how much difference it makes, especially since TSMC might've charged a premium for customizing a node for them, but perhaps it at least affects the # of dies per wafer you're assuming?

Agreed.
Not to mention that a good portion of each wafer isn't used at all. Then there's testing, cutting, and packaging, which is not an insignificant part of the price, and then there's a PCB populated with a number of active and passive components. It's really just throwing random numbers in the air 😉 and it's certainly not $300 per card, hehe. We all get the frustration, and NVIDIA is certainly making a nice profit on it, but it's nowhere near that figure. Moreover, board partners' margins are so razor-thin I often wonder why more of them haven't thrown in the towel just yet 😉
 
Lots of assumptions here that high prices are driving low volumes, but it could just as easily be the other way around: low-priced cards have far thinner margins than higher-priced ones, in both proportional and absolute terms. That means you need to sell much larger volumes of them to break even on a given design. If volumes were already trending downwards, and silicon development costs are well known to be very rapidly trending upwards, that makes lower-cost cards less and less viable.

e.g. if you have fixed costs of $10mn (and that's a low-ball estimate) to bring a given GPU design to market, a card that is distributed for $1000 with $100 of profit needs to sell 100k units before generating any profit, but a $300 card with $10 profit needs to sell 1mn units. If you don't expect 1mn sales of the lower-end card, it may make more sense to just... not develop it to start with and put the effort into the more profitable cards. Doubly so if OEM demand for low-end cards (OEMs, rather than direct consumer sales, are the majority purchasers of these SKUs) is being effectively met with previous-generation parts whose costs have already been amortised.
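To make the break-even arithmetic explicit, here's a trivial sketch using the illustrative numbers above (the $10mn, $100, and $10 figures are hypotheticals from my example, not actual financials):

```python
def breakeven_units(fixed_cost: float, unit_profit: float) -> float:
    """Units that must sell before a design recoups its fixed costs."""
    return fixed_cost / unit_profit

FIXED_COST = 10_000_000  # illustrative cost to bring a GPU design to market

print(breakeven_units(FIXED_COST, 100))  # $1000 card, $100 profit: 100,000 units
print(breakeven_units(FIXED_COST, 10))   # $300 card, $10 profit: 1,000,000 units
```

The required volume scales inversely with per-unit profit, which is why thin-margin SKUs need huge volumes before they return a single dollar on the design.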
 
For me, there are two main issues with brand-new discrete GPU sales:

1. Nvidia still remembers that, not so long ago, miners would gladly pay USD $3,000 for a top-tier card, and they (miners) were buying hundreds of them.

2. There are some people buying the RTX 4080 at the insane asking price. And there will be some who buy the RTX 4070 Ti.

Yes, factory wafer prices have gone up, which is normal as nodes become more complex and smaller, but at the same time, as soon as the process matures, you will be able to get more working GPUs per wafer. So wafer price shouldn't have a huge impact on retail prices. The C virus is still an issue where most factories are, and that affects manufacturing, of course.

But nothing justifies USD $1,200+ for the performance the RTX 4080 is showing (and the GPU tier it should represent).

And the RX 7900 XT and XTX are also overpriced. Not as badly as Nvidia's new gen, but they are.

The world will be in recession for longer than everyone was hoping, so GPU sales won't get better at these high asking prices.

Anyway, that's what I think.

Cheers
 
1. Nvidia still remembers that, not so long ago, miners would gladly pay USD $3,000 for a top-tier card, and they (miners) were buying hundreds of them.
Yes. I think Nvidia and AMD both designed flagship models that would only be profitable at higher prices, and it seemed a reasonable bet the market would continue to support such pricing.

Yes, factory wafer prices have gone up, which is normal as nodes become more complex and smaller, but at the same time, as soon as the process matures, you will be able to get more working GPUs per wafer. So wafer price shouldn't have a huge impact on retail prices.
Oops, can't agree with this. While you're right about better yields and increasing capacity leading to some cost reductions, N5 (and whatever variant Nvidia is using) involves more production steps. That means each wafer ties up the production equipment for longer, and the production equipment keeps getting more expensive for each smaller node. The result is a much higher price floor for newer nodes than we've seen in the past.

The C virus is still an issue where most factories are, and that affects manufacturing, of course.
Right now is especially bad, but I expect that to clear up in a month or so.

The world will be in recession for longer than everyone was hoping, so GPU sales won't get better at these high asking prices.
We still need to see what happens in the low/mid-range.
 
if you have fixed costs of $10mn (and that's a low-ball estimate) to bring a given GPU design to market,
Probably $1B would be closer to the mark, if you're talking about all of the one-time design, engineering, and manufacturing-related costs needed to bring a new GPU chip to market. Perhaps that's a touch high, but it's multiple hundreds of millions of dollars, for sure.

And are we including device drivers and supporting game developers? I mean, we should, right? Nvidia is paying for all of that through hardware sales...
 
I have 3 failed GeForces. At that failure rate, I will not spend mountains of money on a thing that may break with such a high probability.

Also, new games do not compel me to invest in a GPU. The quality of games has dropped enormously.

But prices will not go down significantly. I already predicted this, because the government printing money unavoidably causes inflation. People here rejected it, but it's elementary economics.

The war in Ukraine is raising public spending so much that, unavoidably, more money will be printed. Prices will keep going up.
I think you have another problem if you've had 3 failed GPUs. I've been at this for 30 years and have never had a GPU fail.
 
But prices will not go down significantly. I already predicted this, because the government printing money unavoidably causes inflation.
I don't know where "here" is for you, but most of the inflation we've been seeing globally wasn't caused by that. Certainly, there are some exceptions where runaway-inflation truly has been occurring, but they don't use the USD or Euro.

... unavoidably, more money will be printed. Prices will keep going up.
I think we're well past the era of generous pandemic relief packages, super-low interest rates, and post-pandemic stimulus. So, I don't expect to see more government-fueled inflation in most developed countries.
 
I'm in the market. But I will not spend more than $5/added frame on a GPU. Tracking performance and prices, the closest high-end GPUs are used RTX 3080s that are nearing $500, which would be my tipping point.

But the market isn't playing ball; prices are going in the opposite direction. New cards are closer to $18/frame. Knowing anything at all about personal finance and economics, and mixing that with the unbending rigor of an engineer, that's comedy. Show me the numbers or walk away, AMD and Nvidia.

Edit - Just because I'm here and newly annoyed, here are my target prices for cards (from a semi-complex formula valuing RT and rasterization performance improvements against a semi-arbitrary valuation of those benefits, relative to RX 6700 XT performance):
4090 - $800
4080 - $670
3090 Ti - $580
7900 XTX - $550
You get the idea. Laugh yourselves to sleep. And if anybody here owns one of these and spent more than this amount of money, I'll join the laughter.
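For the curious, here's a minimal sketch of the kind of $/added-frame screen I mean; the fps figures and baseline price are made-up placeholders, not benchmark results, and the $5/frame cutoff is just my personal threshold:

```python
# Price screen: cap spending at $5 per frame gained over a baseline card.
# All numbers below are hypothetical placeholders for illustration.
BASELINE_FPS = 100.0           # e.g. an RX 6700 XT in some chosen test suite
BASELINE_PRICE = 350.0         # assumed street price of that baseline card
DOLLARS_PER_ADDED_FRAME = 5.0  # my personal cutoff

def max_price(candidate_fps: float) -> float:
    """The most I'd pay: baseline price plus $5 per extra average frame."""
    return BASELINE_PRICE + DOLLARS_PER_ADDED_FRAME * (candidate_fps - BASELINE_FPS)

print(max_price(160.0))  # a card averaging 60 fps more tops out at $650 for me
```

Plug in real benchmark averages and current street prices and you get target figures like the ones above; weight RT separately if you actually value it.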
 
IMO, those are mostly true. My 27-inch monitor is 1080p, and the humble 1650 here in the house can play all the PS4 and PS5 games I own just fine. The most demanding PS5 game here is King of Fighters XV (released in 2022, so it's fairly new). Its minimum requirement is a GTX 480, so it's easy for the 1650 to run the game. But I still purchased an RTX 3050 for the purpose of ray tracing, because King of Fighters XV has a ray tracing option, which the 1650 cannot do.

The only thing the 1650 cannot do is ray tracing.

I uploaded a sample gameplay video on YouTube. King of Fighters XV is at 9:58. :)
This video is using the weakest PC in the house: a Q9500 + GTX 1650.
Somehow I got an EVGA 3060 Ti for $430 twelve months ago, I guess because I got on the waiting list. Once I received it, I went to Best Buy and purchased a 43" 4K monitor for $300. I mean, the gaming is just amazing.
I don't even understand your post, but if you aren't considering gaming at 4K, at least consider your life choices.