News Desktop Graphics Card Sales Hit New Multi-Decade Low: Report

I just sold my Series X because the one game I did play on it (MLB 23) I can stream to the PC so there was no point in having the console. I have the Switch for my retro NES/SNES gaming but prefer the PC for everything else and that will probably never change.

I had the original PS and the PS2... and have never touched a PS3, 4, or 5.

It's not so much about the high end PC gaming... it's just there's so much more you can do on a PC than on a console... and that now includes streaming console games to the PC.

Streaming MLB the Show is what pushed me to get a PS5! The latency was *just* enough that my reflexes couldn't get around on a good fastball.
 
Streaming MLB the Show is what pushed me to get a PS5! The latency was *just* enough that my reflexes couldn't get around on a good fastball.
Haha! Yeah, I've been playing the MLB The Show series for a while... the first one was back in 2008 or something, IIRC. I've been a big Braves fan since the early 1980s and I love the MLB games. I honestly don't much care for hitting though... I was a pitcher in little league and high school, so that's mainly what I do in the game... Road to the Show.

The streaming mode works surprisingly well... I'm enjoying it. I can see how latency would be an issue with hitting... so far it hasn't been a big deal when pitching.

Nice comeback win tonight for the Braves, to go with the previous two games... swept the Mets. Hate the Mets. Go Braves!
 
This is 100% expected for two reasons:
  1. The crap-tonne of cheap ex-mining cards in the used market
  2. The garbage that nVidia has released in their RTX 40-series
Sure, AMD's offerings aren't much better, but consumers have shown, in their lack of wisdom, that a Radeon could be made of pure gold and cost less than $100, and they'd still only buy cards in green boxes.

Intel isn't really showing as much interest as they should; they have a massive opening in the market to exploit and they're not doing it. This hole was created by nVidia's arrogance and AMD's stunning incompetence. What Intel needs to do to gain market penetration is just what they've always done: leverage their influence over the system OEMs.

As for AMD, well, they have their own hurdles to overcome: one small, one gigantic. That gigantic hurdle is one Intel also has to overcome to some degree, but Intel has its route through the OEMs that AMD doesn't.

Small Hurdle for AMD: Kick Sasa Marinkovic to the curb ASAP!

Gigantic Hurdle for both AMD and Intel:
Gordon from PC World shows his awesomeness here by just speaking the truth, and it's the same thing that I've been saying for years.

There's no more pathetic a gamer than one who trash-talks nVidia, their practices, and their pricing, but has only ever owned GeForce and will eventually capitulate and buy another. That's not a man, that's a mouse. When a man gets treated the way nVidia treats its customers, he's supposed to tell whoever is treating him that way where to go and take his business elsewhere. If he just bends over and takes it, the market becomes what we're currently stuck with.
 
It's certainly been a boon to the storage industry, and partially explains the plummeting prices there. Games are huge, so everyone that makes SSDs has little problem selling them. That, and the current standardization between laptop and desktop on M.2.

That is so true. I have purchased four SSDs over the past few years to increase storage space for games: one each for the two older PCs here in the house, and the other two went into the newest PC. :)

A friend in the EU told me a few days ago that the newly released Street Fighter 6 is approximately 30 GB. Many PS4/PS5 games I own are around that size or larger.
 
I wonder how much the improvement of integrated graphics has impacted this. For many years, even the "grandparents" rig required some sort of graphics card. Now both AMD and Intel have integrated graphics that handle mom-and-pop and office tasks with ease.
I think that the first IGP that could handle said mom & pop tasks with ease was the Radeon HD 6620G in the Llano APUs. I remember being able to run Skyrim at 720p with medium settings on my A8-3500M craptop.

If the mobile variant could run Skyrim, there's nothing that the desktop variant couldn't do for mom & pop! 😉
 
I think that the first IGP that could handle said mom & pop tasks with ease was the Radeon HD 6620G in the Llano APUs. I remember being able to run Skyrim at 720p with medium settings on my A8-3500M craptop.

I built an HTPC with an APU once. World of Warcraft ran pretty well at low settings on an AMD E-350 with HD 6310 graphics, at 1152x964, because that apparently wasn't any harder to drive than 720p/768p. That was just a crude benchmark though.

I used it for a lot of retro gaming until I wanted more horsepower for general browsing and moved to an i3-4130T (now a 12100F). Gave it away with an emulator package to a co-worker's kid.
 
Intel isn't really showing as much interest as they should; they have a massive opening in the market to exploit and they're not doing it. This hole was created by nVidia's arrogance and AMD's stunning incompetence. What Intel needs to do to gain market penetration is just what they've always done: leverage their influence over the system OEMs.
Intel is using TSMC for their consumer GPUs, so no matter how big the opening is, they can't make a plug big enough to fill it; there are only so many GPUs they can make.
Once they build out their own fabs, we'll see whether they show enough interest or not.
Also, I don't know if we have any numbers, but I'm pretty sure most of their Arc sales are via OEMs.
 
Haha! Yeah, I've been playing the MLB The Show series for a while... the first one was back in 2008 or something, IIRC. I've been a big Braves fan since the early 1980s and I love the MLB games. I honestly don't much care for hitting though... I was a pitcher in little league and high school, so that's mainly what I do in the game... Road to the Show.

The streaming mode works surprisingly well... I'm enjoying it. I can see how latency would be an issue with hitting... so far it hasn't been a big deal when pitching.

Nice comeback win tonight for the Braves, to go with the previous two games... swept the Mets. Hate the Mets. Go Braves!

I pitched through high school too, but knew there was zero chance past that point; I was a sidearmer throwing slop and could never consistently hit more than 70.

Thankfully, my rather obscure last name was added to the game audio six or seven years ago. Though it's actually my fault they pronounce the first syllable ZIM, as I do, rather than the more accurate SCHZIM.
 
Intel is using TSMC for their consumer GPUs, so no matter how big the opening is, they can't make a plug big enough to fill it; there are only so many GPUs they can make.
Once they build out their own fabs, we'll see whether they show enough interest or not.
Also, I don't know if we have any numbers, but I'm pretty sure most of their Arc sales are via OEMs.

Looking forward to an Intel GPU made by Intel, but I don't think that's a near-term goal. Intel is talking about becoming a foundry for others with the new fab construction, and is even farming out the iGPU tile to TSMC for their upcoming chips. If they reserve their latest nodes for CPU logic, we might be seeing TSMC-made Intel GPUs for a while.
 
Looking forward to an Intel GPU made by Intel, but I don't think that's a near-term goal. Intel is talking about becoming a foundry for others with the new fab construction, and is even farming out the iGPU tile to TSMC for their upcoming chips. If they reserve their latest nodes for CPU logic, we might be seeing TSMC-made Intel GPUs for a while.
Getting customers to fill 100% of the capacity of their new fabs would be Intel's dream come true; I think that is going to take a long time.
My guess is that they are going to use a lot of that capacity for their own products for a while.
 
For the man on the street like me, with discretionary spending virtually nonexistent, I can only upgrade to a used GPU (RTX 2080?) once it's in the $80 range, and only if I get the extra overtime at work. I also wonder what NVIDIA will be charging for their upcoming GPU generation, given what they have gotten used to getting for their products over the past few years. Reality bites!
I forget, are you in WA or OR? If you're near Seattle, maybe @TravisPNW has the inside track on some gig work or something.
 
I pitched through high school too, but knew there was zero chance past that point; I was a sidearmer throwing slop and could never consistently hit more than 70.

Thankfully, my rather obscure last name was added to the game audio six or seven years ago. Though it's actually my fault they pronounce the first syllable ZIM, as I do, rather than the more accurate SCHZIM.

Same... my velocity was in the 70s as well. Never could get much beyond that. Amazes me to see guys like Spencer Strider hitting 100 mph.

My name is a lot easier than yours... it's in the game. :)
 
GPU makers are still trying to sell cards at the same over-inflated prices as during the crypto craze. Folks are burned out and would rather just get something cheap off the resale market than deal with ridiculous prices. I know so many people who refuse to upgrade past a 10- or 20-series card just because of how insane the current prices are.

And Intel is going to take over the low-end, high-volume market once they spend another year or two getting their drivers solid.
 
GPU makers are still trying to sell cards at the same over-inflated prices as during the crypto craze. Folks are burned out and would rather just get something cheap off the resale market than deal with ridiculous prices. I know so many people who refuse to upgrade past a 10- or 20-series card just because of how insane the current prices are.

And Intel is going to take over the low-end, high-volume market once they spend another year or two getting their drivers solid.

I look at it the opposite way... I paid $1750 for the best GPU that gives amazing 4K Ultra performance... and have no GPU worries for the next 4-5 years at a minimum.
 
Late 2019 to early 2022 numbers were bloated by the crypto-mining boom. Right now it's easy to recommend used RTX 30- and RX 6000-series cards, even the ex-mining ones (just make sure to buy from a reputable seller, and that the card still has an active warranty). The newer cards are just meh; this is possibly the worst generational improvement in a decade. I mean, GTX 10 to RTX 20 and RTX 20 to RTX 30 delivered 40-60% performance increases, while the 4060 Ti is barely faster than the 3060 Ti. BTW, it's nice to see Intel's sales keep rising; I hope they won't miss this chance and will price the ACM+ G20 and G21 right. I won't buy an Intel GPU myself because I need CUDA cores, but more competition is a good thing.
 
I look at it the opposite way... I paid $1750 for the best GPU that gives amazing 4K Ultra performance... and have no GPU worries for the next 4-5 years at a minimum.

So you don't mind giving away $600+ USD, because that's what that is, and the market is showing it. Top-end cards were selling for $2K instead of MSRP during the mining craze because of insane demand with limited supply; all nVidia did was raise the MSRP to match the crypto prices, hoping buyers wouldn't notice.
 
So you don't mind giving away $600+ USD, because that's what that is, and the market is showing it. Top-end cards were selling for $2K instead of MSRP during the mining craze because of insane demand with limited supply; all nVidia did was raise the MSRP to match the crypto prices, hoping buyers wouldn't notice.

Of course I noticed... I paid $699 for a 1080 Ti in 2017.

3090 was $1499... and 4090 was $1749...

Do I like it? Of course not. Can I do anything about it? Nope.

Mining forever changed the GPU market. We'll never see a $699 flagship card again... so pay up... or run a 2nd or 3rd gen card for years.

4090 will be good for a while... at least until UE5 is mainstream.

I'm not a supporter of the current GPU prices... but Nvidia isn't going to lower them when they have no competition. Business is business.
 
Do I like it? Of course not. Can I do anything about it? Nope.
Yes, you can: DON'T buy the cards at their rip-off prices. You had a 3090; most would have kept it and skipped the 40-series, or waited and hoped prices plummet.
BTW, 4090s range from 2200 to 2900 here; those prices are at least 1000 above what they should be, if not 1500.
4090 will be good for a while... at least until UE5 is mainstream.
* waits for your post in a few months stating you just bought a 5090 for 2000+ *

but Nvidia isn't going to lower them when they have no competition.
If no one buys them and those cards sit on store shelves and in warehouses, chances are they just might lower the prices, especially if the 50-series is about to be released.

But I guess, considering you have mentioned a few times how much you have spent on your comp, I doubt you would have done any of the above.
 
* waits for your post in a few months stating you just bought a 5090 for 2000+ *

5090 won't be out till mid/late 2024... and I don't have a need for it.

But I guess, considering you have mentioned a few times how much you have spent on your comp, I doubt you would have done any of the above.

I did skip the 2000 series... because the 1080 Ti was a sick card and an upgrade wasn't necessary. The only reason I upgraded from the 3090 to the 4090 was because it is an ASTRONOMICAL improvement in performance... we are talking 75 TO 80 PERCENT in 3DMark tests.


So please stop acting like I spent $1750 on a marginal upgrade. The numbers don't lie.

But I guess, considering you have mentioned a few times how much you have spent on your comp, I doubt you would have done any of the above.

All I care about is performance. Barring a release of UE5 in the next 12-18 months I won't be upgrading from the 4090.
 
The crazy part I take away is how low AMD GPU shipments have been for the last three quarters... They don't even seem to be a viable competitor; I don't understand why they price their cards comparably to Nvidia. Console revenue will keep this division afloat, but they are in danger of Intel overtaking them in discrete volume in the near future if they don't move more inventory.
 
Yes - this is because no one has any money any more. Sometimes it's like the people writing these articles live in a bubble and aren't aware that Western countries, including the US and UK, are facing insane levels of inflation, small-business closures, and restructuring with a view to heavy job losses and the implementation of automation and AI. Most people are more concerned with simply putting food on the table than with whether they can afford an RTX 4000-series card.
 
The crazy part I take away is how low AMD GPU shipments have been for the last three quarters... They don't even seem to be a viable competitor; I don't understand why they price their cards comparably to Nvidia.
Did you see their recent pricing on the RX 6000-series? The current prices are very competitive, IMO.

A problem AMD faces is that a lot of people are probably anticipating more ray tracing in games. Or, maybe they want to do some deep learning on their GPUs, where AMD's long-standing policy of virtually ignoring compute workloads on consumer hardware has really burnt a bridge.

Whatever the reason, Nvidia's GPUs are seen as the premium solution. When given the option, most people seem to go that direction.
 
Whatever the reason, Nvidia's GPUs are seen as the premium solution. When given the option, most people seem to go that direction.

I've always considered them premium... and my first was the GeForce3 Ti 200 in 2001. Been Team Green ever since... not by choice but because I don't recall a situation where I was looking to upgrade and ATI/AMD had the better card.

Certainly isn't true today. I'd have loved to have paired the new Ryzen with an AMD GPU... but nothing they have comes close to the 4090. RT really has nothing to do with it... I'm just all about brute performance.
 
Of course I noticed... I paid $699 for a 1080 Ti in 2017.

3090 was $1499... and 4090 was $1749...

Do I like it? Of course not. Can I do anything about it? Nope.

Mining forever changed the GPU market. We'll never see a $699 flagship card again... so pay up... or run a 2nd or 3rd gen card for years.

4090 will be good for a while... at least until UE5 is mainstream.

I'm not a supporter of the current GPU prices... but Nvidia isn't going to lower them when they have no competition. Business is business.

I mean, it was $600 above where it should have been; that is the amount of profit nVidia added to their already profitable high-end cards. And you are wrong about not being able to do anything: consumers vote with their wallets and can simply choose not to pay the extortion prices. That is the entire point of the article; consumers are taking a look at the prices and just walking away, buying on the second-hand market instead. There are tons of 30-series cards available at massively discounted prices.

The entire "buy the absolute best every few years" method is exactly what nVidia wants you to follow; it means they already have a customer lined up for their 5090 and 6090 products, because they just need to convince you there is some reason you need the newer card.
Computers do not "wear out" every few years; they are not smartphones, though manufacturers are always trying to convince you otherwise. We define a set of tasks, then determine whether the current platform can perform those tasks adequately. We switch out the platform whenever it can no longer support the desired list of tasks.

"Mega Ultra" is largely a BS setting in most games, there is no higher quality then the next step down but it ends up consuming an inordinate amount of system resource to do absolutely nothing. Game manufacturers have noticed every reviewer immediately hits that and so have long since used it as a way to brag about how intensely awesome their game is by how much resource they can waste doing nothing.
 
The entire "buy the absolute best every few years" method is exactly what nVidia wants you to follow; it means they already have a customer lined up for their 5090 and 6090 products, because they just need to convince you there is some reason you need the newer card.
Computers do not "wear out" every few years; they are not smartphones, though manufacturers are always trying to convince you otherwise.

Well, barring the release of UE5, they are going to have a hard time convincing me I need anything beyond the 4090 I currently have. There comes a time when hardware reaches its peak... and IMO we are pretty close, if not there already, with the 4090.

I used to upgrade phones every year... iPhone 4, 5, 6, 7, 8... and then stopped. Phone hardware became less impressive year after year. I went with the 12 next... and just upgraded a few months ago to the 14 Plus... and I only did that because I finally went with the big screen that has been available for five years. I'm easily getting 3-4 years out of this new phone. After finally upgrading to the big screen, there's nothing else phones offer me. I don't care about phone cameras, nor do I care about OLED displays on a phone.

So anyway... how is Nvidia going to sell me on the 5090 and 6090? 4K 60 fps with RT in Ultra settings? I'm already there with plenty of room to spare with the 4090.

With the 3090 it was doable... but the 3090 still struggled in some areas. That isn't the case with the 4090. I just don't see how they are going to convince me to spend $2k on a card to achieve performance I'm already getting with the 4090.