Desktop Graphics Cards Sales Hit New Multi-Decade Low: Report

Well, barring the release of UE5, they are going to have a hard time convincing me I need anything beyond the 4090 I currently have. There comes a time when hardware reaches its peak... and IMO we are pretty close, if not there already, with the 4090.

I used to upgrade phones every year... iPhone 4,5,6,7,8... and then stopped. Phone hardware became less impressive year after year. I went with the 12 next... and just upgraded a few months ago to the 14 Plus... and I only did that because I finally went with the big screen that has been available for 5 years. I'm easily getting 3-4 years out of this new phone. After finally upgrading to the big screen there's nothing else phones offer me. I don't care about phone cameras nor do I care about OLED displays on a phone.

So anyway... how is Nvidia going to sell me on the 5090 and 6090? 4K 60 fps with RT in Ultra settings? I'm already there with plenty of room to spare with the 4090.

With the 3090 it was doable... but the 3090 still struggled in some areas. That isn't the case with the 4090. I just don't see how they are going to convince me to spend $2k on a card to achieve performance I'm already getting with the 4090.

You already gave yourself a reason at the start: there will just be another must-have game that will absolutely need "the next thing" to run at some arbitrary FPS at some arbitrary resolution on Mega Ultra settings. A good example of this in recent games is texture size. Ultra settings set that size to 4096x4096, which wastes an absolutely massive amount of memory and bandwidth only to have the game discard the texture and render a downsampled version onto 100x100 pixels or less of screen space. Dropping the textures down to 2048x2048 or even 1024x1024 gives you the same quality but with a huge performance savings.
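To put rough numbers on that, here's a quick back-of-the-envelope sketch (assuming uncompressed RGBA8 texels and a full mip chain; real games use block compression, so absolute sizes are smaller, but the ratio between resolutions holds):

```python
# Rough texture footprint: uncompressed RGBA8 (4 bytes/texel) plus a full
# mip chain (~1/3 extra). Real engines use block compression (BCn/ASTC),
# so absolute numbers are several times smaller, but the ratios hold.
def texture_mib(size, bytes_per_texel=4):
    total = 0
    while size >= 1:
        total += size * size * bytes_per_texel
        size //= 2
    return total / 2**20

for s in (4096, 2048, 1024):
    print(f"{s}x{s}: {texture_mib(s):6.1f} MiB")
# 4096x4096: ~85.3 MiB, 2048x2048: ~21.3 MiB, 1024x1024: ~5.3 MiB --
# every halving of texture resolution cuts the footprint by ~4x.
```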
 
Yes, DON'T buy the cards at their rip-off prices. You had a 3090; most would have kept it and skipped the 40 series, or waited and hoped prices would plummet.
If one plays at 4K, going from a 3090 to a 4090, which carried a minimal gen-on-gen price premium (based on retail, not scalper prices, since nobody should pay seller markups), makes complete sense. No client-level top-end card has made sense pricing-wise since Nvidia introduced the Titans, so not buying a halo card will never change the price of the halo card offerings. It's the SKUs further down the stack where an impact can be made by not buying.
I did skip the 2000 series... because the 1080 Ti was a sick card and an upgrade wasn't necessary. The only reason I upgraded to the 4090 from the 3090 was because it is an ASTRONOMICAL improvement in performance... we are talking SEVENTY FIVE TO 80 PERCENT in 3DMark tests.

To be fair, your platform upgrade has more than a little to do with that extra bump, as the two cards are closer to 60% apart. The upgrade is still obviously massive, just not quite that much.
Did you see their recent pricing on the RX 6000-series? The current prices are very competitive, IMO.
I wish the prices had dropped when I picked up my RTX 2060 as a spare, as I'd have much rather had a 6600 XT/6650 XT. Had I been able to predict the future, I'd have not bought my 3080 and gotten one of the 6950 XT cards, but when I bought my card it was the highest $/perf available, as mining prices hadn't totally gone away yet. For anything below the 3080 I wouldn't even consider ray tracing a selling point; with how many implementations there are, you cannot guarantee baseline performance.
Whatever the reason, Nvidia's GPUs are seen as the premium solution. When given the option, most people seem to go that direction.
I think at this point it's solely a mindshare thing. I was buying hardware when ATI/AMD drivers were genuinely their problem, and that stigma is something AMD has never gotten rid of. Their drivers today are no worse than Nvidia's, but if there's an issue you'll hear about it everywhere and coverage will be more negative, while if Nvidia has an issue, as long as they address it the response is more "well, this happens". I don't think the grip Nvidia has can be broken without severe price undercutting.
4090 will be good for a while... at least until UE5 is mainstream.
FWIW I highly doubt UE5 is going to change anything for you given that you're generally aiming at keeping your minimum fps at or above 4k60. Now if you were a 4k120 snob... 😀
 
Ultra settings set that size to 4096x4096, which wastes an absolutely massive amount of memory and bandwidth
It might waste capacity, but not bandwidth. Textures are accessed via MIP Maps, which means you're only accessing at the highest visible resolution. This is done both to reduce aliasing and texture bandwidth.


Dropping the textures down to 2048x2048 or even 1024x1024 gives you the same quality but with a huge performance savings.
It'd be interesting to check whether the performance savings only occur when graphics memory becomes full, resulting in "thrashing" (i.e. repeated loading) of textures.

Any further performance benefit would come at the loss of perceivable detail, because it'd mean that you're forcing the GPU to render lower-resolution textures than the finest visible details.
 
It might waste capacity, but not bandwidth. Textures are accessed via MIP Maps, which means you're only accessing at the highest visible resolution. This is done both to reduce aliasing and texture bandwidth.

Modern titles now store all textures in graphics memory whenever they load into an area. The entire texture resource needs to be in GPU memory before the game even knows which of the sizes it's going to render at. Those textures didn't just magically appear there; they had to be loaded from disk into system memory and then from system memory into GPU memory, across the PCIe bus and through the GPU memory controller. If it's a static area, then it's loaded during the loading screen; if it's a seamless world like Diablo 4, then they are constantly being transferred from system memory into GPU memory. There is a reason it has an obscene memory footprint.

It'd be interesting to check whether the performance savings only occur when graphics memory becomes full, resulting in "thrashing" (i.e. repeated loading) of textures.

Any further performance benefit would come at the loss of perceivable detail, because it'd mean that you're forcing the GPU to render lower-resolution textures than the finest visible details.

With such obscene texture sizes there is no loss in detail, which is the point I'm making. A 3840x2160 screen has only 3840 horizontal pixels and 2160 vertical pixels, making a 4096x4096 texture larger than the screen itself. Further, you are doing more than rendering just one texture at point-blank range, right? So that 4096x4096 texture is occupying a much smaller space; scenes have hundreds of textures present, pasted on all the various surfaces of the models, meaning that 4096x4096 texture is in all likelihood occupying a few hundred pixels total. A 1024x1024 or 2048x2048 texture would rasterize to the exact same couple hundred pixels as the 4096x4096 texture.
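A rough way to see the mip-level side of this argument (the real hardware calculation uses per-pixel UV derivatives, so treat this as the intuition rather than the actual algorithm):

```python
import math

# Approximate mip level sampled when a square texture covers roughly
# pixels_covered x pixels_covered of screen space. Real hardware derives
# the level from per-pixel UV gradients; this is just the intuition.
def approx_mip_level(texture_size, pixels_covered):
    return max(0, round(math.log2(texture_size / pixels_covered)))

# A 4096x4096 texture squeezed into ~200x200 pixels of screen space:
print(approx_mip_level(4096, 200))  # 4 -> the GPU samples a ~256x256 mip
print(approx_mip_level(1024, 200))  # 2 -> also ends up at a ~256x256 mip
```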


There are two situations where those massive textures mean something: the first is extreme resolutions like 7680x4320 or 15360x8640, and the second is staring at a low-texture-count scene showing off lighting effects.
 
I look at it the opposite way... I paid $1750 for the best GPU that gives amazing 4K Ultra performance... and have no GPU worries for the next 4-5 years at a minimum.
And that’s fine for you because you have the money, but most people don’t.
 
FWIW I highly doubt UE5 is going to change anything for you given that you're generally aiming at keeping your minimum fps at or above 4k60. Now if you were a 4k120 snob... 😀

I've got like 4 different UE5 demos right now and they are pretty intense. Intense enough that I'm not getting 60 fps 100% of the time. Will have to see how it plays out. Definitely not a 4K 120 snob... the 4090 doesn't even do 120 on Ultra. Best I've seen in my testing is 80-90... no big deal running at 60 because, as I said before, I personally don't see a difference between 120 Hz and 60 Hz anyway, which is not all that uncommon, it seems.

At any rate... 4090 is plenty good for 4K 60 and I can definitely see skipping the 5000 series barring widespread UE5 gaming. The 4090 sold me on the "better than the 3090 4K gaming" but the 5000 series won't be able to do that. The 3090 somewhat struggled in certain 4K situations... the 4090 doesn't.

And that’s fine for you because you have the money, but most people don’t.
I just try to be efficient... no matter how much money is involved. Spend a little more now and save a little more later is my mindset... no matter if it's $1700 or $17.
 
Yes - this is because no one has any money any more. Sometimes it's like the people writing these articles live in a bubble and aren't aware that Western countries, including the US and UK, are facing insane levels of inflation, small business closures, and restructuring with a view to heavy job losses and implementing automation and AI. Most people are more concerned with simply putting food on the table than whether they can afford an RTX 4000-series card.

These times are indeed difficult. My income is much lower than what I earned before corona became a thing. Here in my country it's worse; the number of beggars and homeless people has increased. Honestly, I feel it's a great blessing that I'm still in a warm house, playing games in front of a PC every night.

Saving small amounts at a time, I'm still able to gather enough to buy a GPU if needed, as long as it's only about 300 dollars.
________

Hmm, I do wonder if Nvidia will make an RTX 4050? Or maybe the 4060 will be their weakest 40-series card?
 
Modern titles now store all textures in graphics memory whenever they load into an area. The entire texture resource needs to be in GPU memory before the game even knows which of the sizes it's going to render at.
If they don't use Tiled Resources (first introduced in DX11.2 and as vendor extensions in OpenGL), then yes. Tiled Resources enable sparse textures and more.

Those textures didn't just magically appear there, they had to be loaded from disk into system memory and them from memory into GPU memory across the PCIe bus and through the GPU memory controller.
True. Accounting for isotropic MIP Map overhead, that'd amount to 64 MiB for a 4k x 4k texture.

However, I think current texture compression methods typically deliver about 6.75:1, reducing the actual footprint to a mere 9.5 MiB. That would take 2.3 ms to send over a PCIe 3.0 x4 interface (i.e. reading from SSD) and 0.58 ms to send over a PCIe 3.0 x16 interface. Halve those for PCIe 4.0, but then we should also account for some contention.
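For anyone who wants to redo the arithmetic, a quick sketch using the same assumptions as above (3 bytes/texel, ~4/3 mip overhead, 6.75:1 compression, theoretical peak PCIe 3.0 throughput):

```python
# Back-of-the-envelope check of the figures above. Assumptions: 3 bytes
# per texel, ~4/3 mip-chain overhead, ~6.75:1 texture compression, and
# theoretical peak PCIe 3.0 throughput (~0.985 GB/s per lane).
raw_mib = 4096 * 4096 * 3 * (4 / 3) / 2**20   # ~64 MiB with mips
compressed_mib = raw_mib / 6.75               # ~9.5 MiB

for lanes in (4, 16):
    gb_per_s = 0.985 * lanes
    ms = compressed_mib * 2**20 / (gb_per_s * 1e9) * 1e3
    print(f"PCIe 3.0 x{lanes}: ~{ms:.2f} ms per 4k texture")
# x4: ~2.5 ms, x16: ~0.6 ms -- the same ballpark as quoted above;
# halve these for PCIe 4.0.
```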

It's not nothing, but it's also not as bad as someone might expect for a 4k texture. Maybe that's why game devs have gotten complacent about using them?

Honestly, I expected the massive shader execution capacity of modern GPUs would render most static textures obsolete. That doesn't mean you don't still need plenty of GPU memory to compute things like shadow and reflection maps, though. But, if you look at the compute-to-bandwidth ratio of something like a RTX 4070 Ti, you can do 79.6 floating point operations per byte of GDDR memory it can read or write. That's not accounting for cache hits, but GPUs traditionally don't rely on cache for much other than batching semi-coherent reads or writes. Granted, you have to use some of that for lighting and any multi-texture compositing.

Another way of looking at it is 4.83 MFLOPS per pixel @ 3840x2160. If your target framerate is 120 Hz, then that still gives you 40.3k floating point ops per pixel per frame at 4K @ 120 Hz. Truly staggering, IMO. Makes me wonder just how expensive typical game-grade procedural textures really are.
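Same napkin math in code form, using the published RTX 4070 Ti figures (~40.1 TFLOPS FP32, ~504 GB/s) as the assumed inputs:

```python
# Compute-to-bandwidth napkin math for an RTX 4070 Ti, using published
# specs (~40.1 TFLOPS FP32, ~504 GB/s GDDR6X bandwidth) as inputs.
flops = 40.1e12
bandwidth_bytes = 504e9
pixels_4k = 3840 * 2160

print(f"FLOPs per byte of GDDR traffic:   {flops / bandwidth_bytes:.1f}")   # ~79.6
print(f"MFLOPS per pixel at 4K:           {flops / pixels_4k / 1e6:.2f}")   # ~4.83
print(f"FLOPs per pixel per frame @120Hz: {flops / pixels_4k / 120:,.0f}")  # ~40,300
```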

There is a reason it has an obscene memory footprint.
I take it you've compared the memory footprint with different texture resolution settings, then? Because, for an open-world game, there's a lot of geometry to manage and other state potentially to track.

A 3840x2160 screen has only 3840 horizontal pixels and 2160 vertical pixels, making a 4096x4096 texture larger than the screen itself.
The issue is just that if you're standing next to a wall, looking down its length, you wouldn't want the part closest to you to appear blurry. That doesn't mean you're seeing every pixel of the entire texture, but just the part closest to you. I know it's a silly example, but it illustrates the point that it's actually not hard to find the limits of lower-resolution textures without having to walk perpendicularly into a wall, etc.
 
If they don't use Tiled Resources (first introduced in DX11.2 and as vendor extensions in OpenGL), then yes. Tiled Resources enable sparse textures and more.

I take it you've compared the memory footprint with different texture resolution settings, then? Because, for an open-world game, there's a lot of geometry to manage and other state potentially to track.


The issue is just that if you're standing next to a wall, looking down its length, you wouldn't want the part closest to you to appear blurry. That doesn't mean you're seeing every pixel of the entire texture, but just the part closest to you. I know it's a silly example, but it illustrates the point that it's actually not hard to find the limits of lower-resolution textures without having to walk perpendicularly into a wall, etc.

Yes, it's very easy to see with Diablo 4, just turning the textures from ultra down to high saves a massive amount of system and GPU memory. Scenes have thousands of textures in use, so it starts to add up when you consider that it is competing with scene processing for GPU memory bandwidth. I'm only using that game because it's the latest hotness, everyone's recently done reviews, and it provides a perfect example of why "Ultra" settings are a waste compared to High or Custom (Ultra with a few things turned down one notch).

As for the wall example, that is why 2048x2048 is a good texture size at the modern resolutions of 1440p and 2160p. It provides plenty of detail while your nose is pressed against something, while also not being the obscene resource hog that 4096x4096 is. If/when we start approaching 4320p or higher resolutions, then 4K textures start to provide benefits.

AA is another one of those settings that is abused by Ultra, frequently without benefit. Using MSAA as an example: going from none to 2x shows a good improvement, going from 2x to 4x shows less improvement, and going from 4x to 8x shows almost no improvement, yet each increase requires even more resources than the previous one. Plus, as native resolution goes up, the improvement from any sort of anti-aliasing goes down, as lines naturally become smoother. Some modes like FXAA are essentially "free" while others like supersampling require 2x or more processing power.
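As a rough illustration of the cost side, here's the framebuffer memory alone at 4K, assuming 4 bytes of color plus 4 bytes of depth/stencil per sample and no framebuffer compression (real GPUs compress, so treat it as an upper bound; shading and bandwidth costs scale on top of this):

```python
# Rough 4K framebuffer cost of MSAA, assuming 4 bytes of color plus
# 4 bytes of depth/stencil per sample and no framebuffer compression
# (real GPUs compress, so this is an upper bound on capacity).
pixels = 3840 * 2160
for samples in (1, 2, 4, 8):
    mib = pixels * samples * (4 + 4) / 2**20
    print(f"{samples}x: ~{mib:5.0f} MiB of color+depth")
# 1x: ~63 MiB, 2x: ~127 MiB, 4x: ~253 MiB, 8x: ~506 MiB --
# the cost doubles every step while the visible improvement shrinks.
```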

The point of all this was to show that consumers are not buying the current generation of GPUs and that game titles don't even need them except when abusing obscene settings for little to no benefit.

Edit:

Got some memory utilization details off Diablo 4. With Ultra texture settings the GPU would use 10-12 GB of GPU memory. When I changed it to High, that went down to 4.5 to 9 GB of GPU memory. System memory use fluctuated, growing larger the more areas you loaded into; the game seems to use system memory as a sort of "cache" for everything it needs to send to the GPU, so changing textures from Ultra to High shaved several GB off the amount used over time.
 
Yes, it's very easy to see with Diablo 4, just turning the textures from ultra down to high saves a massive amount of system and GPU memory.

Great advice for low to mid-range systems no doubt...

[screenshot of in-game system stats while running Diablo 4 at 4K Ultra]


Just took that shot of my system stats while grinding out the last of my mount quest chain... 4K Ultra everything... with 20 tabs open in Edge and OBS recording gameplay.

Good thing is it's Blizzard... much like WoW and their other games you can run Diablo on pretty much any PC without much issue.
 
Great advice for low to mid-range systems no doubt...

Good idea on the best of the best systems too, as you're getting no benefit otherwise. Might as well have the game mining crypto on your GPU. As has been mentioned previously, even at 2160p there is zero real benefit to 4096x4096 textures and a whole lot of downside. Of course, once 4320p or 8640p become available, then 4096x4096 textures will have a benefit.

On an off-topic side note, I'm getting some real compensation vibes here.
 
So you can please stop acting like I spent $1750 on a marginal upgrade. The numbers don't lie.
I never said it was, but for most, going from a 3090 to a 4090 is not worth the $2000+ price tag that came with it. But if YOU think it is worth it and it's money well spent, then that's fine.

The only reason I upgraded to the 4090 from the 3090 was because it is an ASTRONOMICAL improvement
and with it an astronomical price too 🙂
All I care about is performance. Barring a release of UE5 in the next 12-18 months I won't be upgrading from the 4090.
and, you have paid through the nose for it :)
 
I never said it was, but for most, going from a 3090 to a 4090 is not worth the $2000+ price tag that came with it. But if YOU think it is worth it and it's money well spent, then that's fine.

Going from a 3090 to a 4090 is a 60-80% improvement in performance. That hasn't been seen since when?

Usually generational upgrades are in the 30% range.

Nvidia did a good job this gen of making the 4090 an out-of-this-world upgrade while leaving the rest of the 4000 series complete garbage.

Probably to boost 4090 sales. 🤣
 
Going from a 3090 to a 4090 is a 60-80% improvement in performance.
That may be true, but for most, the price to even get an "entry level" 4090 is way too high, and as I mentioned, the $2000+ price for the 4090 is $1000-1500 too much.

Nvidia did a good job this gen in making the 4090 an out of this world upgrade while the rest of the 4000 series complete garbage.
That might be a side effect of the crypto boom; Nvidia figured most of these cards would have gone to miners, except the bubble burst.
 
That may be true, but for most, the price to even get an "entry level" 4090 is way too high, and as I mentioned, the $2000+ price for the 4090 is $1000-1500 too much.

The 1080 Ti was $699 in 2017... that will never happen again. If the 4090 was $500 everybody would have one... 🤣 🤣 🤣

That might be a side effect of the crypto boom; Nvidia figured most of these cards would have gone to miners, except the bubble burst.

The miners are to blame for the GPU prices. They had their cake... now they can eat it.
 
I am still using an RTX 2070 Super with an otherwise very strong system (9900KS 5 GHz CPU). I try to buy the strongest GPU I can when I can afford to and am making my next major platform leap. I just don't buy a new graphics card every generation, so I guess I'm part of the drop.
Bought a 3060 Ti, then immediately returned it after coming to the conclusion that my 2060 had 2x the heatsink, was silent under load, and averaged 15C cooler.
The 3060 Ti was overheating, with fans usually spinning at 30% (yes, you can't go lower) and at jet speed under medium load....

Both were from ASUS.

It's not you, it's them.
 
The 1080 Ti was $699 in 2017... that will never happen again.
Even then, it was a surprisingly good price.

What I think happened is that Nvidia feared another situation like they had with the GTX 980 Ti vs. AMD Fury, but repeating with GTX 1080 Ti vs. Vega. Since the 1080 Ti launched first, they played their hand and it turned out that AMD was essentially bluffing. Nvidia overestimated their competitor, with Vega missing its mark and having to match up against the regular GTX 1080.

That's why I think it's a mistake to use the GTX 1080 Ti as a reference for what these cards should cost. It came in below expectations.
 
That's why I think it's a mistake to use the GTX 1080 Ti as a reference for what these cards should cost. It came in below expectations.

I can agree with that.

I am not an Nvidia shareholder but I don't think the 4090 is too terribly "overpriced" for what you are getting. You notice I got the $1750 card and not the $2000+ card. For as much as I spent on my system I still have a limit... and I do think $2000+ for a GPU is a bit much.

$1500-$1750 is IMHO a fair price for the 4090. Maybe even $1300 as a low point... but certainly not much lower than that.

Nvidia has no competition. They can charge what they want. As I said upthread the miners also played a part in the market being what it is today.
 
For as much as I spent on my system I still have a limit... and I do think $2000+ for a GPU is a bit much.
Once upon a time, I used to think it made sense to spend about the same on a CPU and GPU. I now think that's somewhat arbitrary, and won't necessarily result in a balanced system.

I would still have a really hard time paying more than $1k for a GPU, unless I had a good need for it.
 
Once upon a time, I used to think it made sense to spend about the same on a CPU and GPU. I now think that's somewhat arbitrary, and won't necessarily result in a balanced system.

I would still have a really hard time paying more than $1k for a GPU, unless I had a good need for it.

Yeah... $1750 was a hard pill to swallow but the benchmarks made it a lot easier. It really is an incredible upgrade when you look at the numbers compared to the 3090.

I spent twice on the GPU what I spent on the CPU. Did the same the last build... 3x actually... 10900k $500... 3090 $1500.

You would think the CPU should be the most expensive component but that's not the case anymore. I'm fine with $800 for the 7950x3D as well. High end... high price. Does everything the 7800x3D does in gaming... while being better in productivity and that to me was worth an extra $300. (I don't care if the 7800x3D is marginally better in gaming when I'm already capping max fps)

I can literally shoot a 20-minute gaming video in 4K, or a 6K video from my drone, and process/encode it in just a few minutes. Much slower with the 10900K, and I can remember it taking hours with my prior 7700K build.

Time is money.
 
Nvidia will probably spin off consumer graphics cards to Intel.
When Nvidia can make $1T with AI, you know that's where their efforts will go.
Market cap doesn't mean much to the company itself; it's just people gambling on the future of the company.
For Nvidia, it maybe just makes it easier to borrow more money.

Nvidia made $656 million (net income), not even a billion, in Q2 of 2023.
Their revenue was $3.81 billion for data center and $2.04 billion for gaming, so gaming is still a huge part of the money they make, and it would be the stupidest thing to sell off a well-performing part of your business for something that might (and probably will) stick around but nevertheless isn't guaranteed to have the staying power of gaming yet.
 
Market cap doesn't mean much to the company itself; it's just people gambling on the future of the company.
For Nvidia, it maybe just makes it easier to borrow more money.

Nvidia made $656 million (net income), not even a billion, in Q2 of 2023.
Their revenue was $3.81 billion for data center and $2.04 billion for gaming, so gaming is still a huge part of the money they make, and it would be the stupidest thing to sell off a well-performing part of your business for something that might (and probably will) stick around but nevertheless isn't guaranteed to have the staying power of gaming yet.
From what I gather from the article I read, the idea of NVIDIA having Intel make the consumer-end GPUs isn't to sell that business off but to 'farm' it out.
I agree with NVIDIA still wanting to hang on to that end of the market.

Intel-manufactured Nvidia GPUs could be coming soon
The tie-up with Nvidia comes at a critical time for Intel's beleaguered foundry business
By Kishalaya Kundu, May 31, 2023