Question: Should gaming GPUs cost over $1,000?

If you're a person who already has or needs to get a desktop PC for other reasons, you can just take the $500 you'd spend on a PS5 and use it on a low-end graphics card to add to your PC you either have or were going to get anyway. Doesn't seem hard to justify at all.
Sure, except that the $500 in question gets you two top-of-the-line game controllers and, in a family setting, gaming can be done while someone else uses the PC for something else. There is tremendous value in that. Not everybody lives alone and can just combine everything into one piece of hardware.
There are also a lot of people who need a powerful GPU for work they do, or money-making hobbies like making videos. Such people are naturally inclined to game on PCs they already own rather than consoles.
I couldn't disagree more, because you're using the rhetorical "a lot", which is vague and meaningless. When the term "a lot" is left undefined, it has no power whatsoever. When I say "a lot", I use it relative to something else, because that's the only way it actually means anything.

This is where your statement is patently false. The share of PC owners who have specifically chosen to use a discrete video card is probably less than 50%. The vast majority of people aren't even tech-savvy, let alone gamers, enthusiasts or graphic designers.

Remember that the term "mainstream PC use" does not refer to gaming. It refers to audio/video (HTPC), web browsing, banking, paying bills, shopping, email, office tasks (like homework), communication and social media. People who fall into this category aren't tech savvy and so they use branded desktops, all-in-one PCs and craptops. They are the reason that companies like Dell, HP, Lenovo and Acer are so huge.

Since these mainstream tasks don't require discrete video cards, these manufacturers don't have them as standard equipment. After all, if a customer can do just fine with an Intel or AMD IGP, why would any of those manufacturers include a discrete video card that would raise the price of their PC and cause them to lose a sale to their competition? Exactly, they wouldn't.

Therefore, the number of people who have chosen to use a discrete video card is already relatively small. Of that relatively small group, the overwhelming majority are gamers only. I don't know what the exact percentages are, but there are enough gamers who aren't tech-savvy enough to build their own PCs to support companies like Alienware, ASUS ROG and iBuyPower.

How do I know that the non-tech-savvy people in this category are only gamers? I simply accept that someone who does graphic design or 3D animation must be tech-savvy. The most hardcore work that you can do with a PC is coding, 3D modelling, animation or graphic design. These are tech trades.

The tools of the trade for these people are CPUs, GPUs, RAM and software. Like people in any trade, to be successful, they have to be intimately familiar with the tools that they use. Therefore, it is safe to say that these people are tech-savvy and purchase their tools accordingly.

When compared to the number of PC enthusiasts who are tech-savvy and build their own rigs for gaming only (I myself have been in that category since 1988), the number of these content creators is pitifully small, nowhere near large enough to make any kind of significant impact on the consumer marketplace. This remains true even if you include content creators on platforms like YouTube or Twitch. After all, for every one content creator, there are literally thousands of content consumers (aka viewers). Therefore, they are nothing more than a tiny minority.

And lastly, you could simply buy the expensive GPUs less often. If you upgrade every 4 years instead of every 2 years, for example, you are essentially getting a 50% discount... then that $1000 GPU could... in a way... be thought of as costing you only $500.
I agree, and that's often how I roll. However, not everyone is willing (or even able) to invest this much money at once. Some people buy a high-end video card but then become spoiled by it and are unwilling to keep using it once it can no longer provide the frame rate they want at high resolutions. For some cards, this can happen relatively quickly.
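
For what it's worth, the arithmetic behind the quoted "50% discount" argument is easy to lay out. Here's a minimal sketch; the $1,000 price and the two- versus four-year upgrade cycles are just the illustrative figures from the quote, not real market data:

```python
# Back-of-the-envelope amortization of a GPU purchase.
# The $1,000 price and the upgrade cycles are only the illustrative
# figures from the quoted post above, not real market data.

def cost_per_year(price: float, years_kept: int) -> float:
    """Effective yearly cost of a GPU kept for `years_kept` years."""
    return price / years_kept

price = 1_000.0
for years in (2, 4):
    print(f"${price:,.0f} GPU kept {years} years -> ${cost_per_year(price, years):,.0f}/year")

# Prints:
#   $1,000 GPU kept 2 years -> $500/year
#   $1,000 GPU kept 4 years -> $250/year
# Keeping the card twice as long halves the effective yearly cost,
# which is the "50% discount" the quoted post is describing.
```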

Just look at the state of the RTX 3060 Ti, RTX 3070 and RTX 3070 Ti. Only two years after their release, they're unable, in some new titles, to game at 1080p Ultra, let alone 1440p or 4K. That's because nVidia didn't give those cards enough VRAM to match the potency of their GPUs, and it's a widespread problem across that product stack. Considering the market situation since 2020, just how many people do you think paid under $1,000 USD for those cards? My guess is a lucky few who managed to get them at MSRP before the market became as crazy as a meth addict.

I was fortunate enough to realise right away that 8GB had no place on a modern high-end video card and so I chose an RX 6800 XT with 16GB of VRAM. That's because, as you quite rightly pointed out, I want the card to last for many years to come. Unfortunately, not everyone saw the trap in advance like I did.

I saw it coming in early 2021 when I built my last PC... 24GB 3090 was the only way to go IMHO which is why I only considered the 4090 when building this new PC.

16GB is the bare minimum but I personally won't go any less than 24GB ever again especially gaming in 4K. Some games are VRAM hogs.
 
I saw it coming in early 2021 when I built my last PC... 24GB 3090 was the only way to go IMHO which is why I only considered the 4090 when building this new PC.
Yup. That's why I've been telling people that the only GeForce cards that they should be considering end in "90" or "90 Ti". Anything less will be VRAM-starved.
16GB is the bare minimum but I personally won't go any less than 24GB ever again especially gaming in 4K. Some games are VRAM hogs.
Well, it depends, because what matters most is how potent the GPU is. For a GPU with the potency of an RX 6700 or 6750 XT, I believe that 16GB would be a waste, because the GPU would become too slow to use long before 16GB of VRAM stopped being enough. I think that 12GB is enough for that card's use case because, by the time 12GB isn't enough for 1440p, that GPU won't be able to handle 1440p anyway, so the 12GB won't be the limiting factor.

For cards that are meant for mainstream 1080p gaming, 8GB is just fine because it's just mainstream 1080p gaming. The RTX 3050 is fine with 8GB, as are the RX 6600/6650 (XT) cards. The problem that I had with nVidia's playbook was putting only 8GB on the higher-end cards while inexplicably putting 12GB on the RTX 3060.

As far as I'm concerned, the difference between AMD and nVidia in that respect was that AMD put the correct amount of VRAM on the cards based on their GPU horsepower. The cards with Navi 21 (RX 6800/6800 XT/6900 XT/6950 XT) got 16GB, the cards with Navi 22 got either 10GB (RX 6700) or 12GB (RX 6700 XT), the cards with Navi 23 (RX 6600/6600 XT/6650 XT) got 8GB, and that abomination called the RX 6500 XT (I don't know why it even has an XT, but whatever) got 4GB.
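
If you wanted to boil that "VRAM should match the GPU's horsepower" idea down to a rule of thumb, it would look something like the sketch below. The resolution-to-VRAM floors are just the numbers argued in this thread, not any official AMD or nVidia guidance:

```python
# Rule-of-thumb VRAM floors per target resolution, as argued in this thread.
# These thresholds come from the posts above, not from any official
# AMD or nVidia guidance.
VRAM_FLOOR_GB = {
    "1080p": 8,   # mainstream cards, e.g. the RX 6600 class
    "1440p": 12,  # RX 6700 XT class
    "2160p": 16,  # RX 6800 XT class and up; some posters want 24
}

def enough_vram(card_vram_gb: int, target_resolution: str) -> bool:
    """True if a card's VRAM meets this thread's floor for that resolution."""
    return card_vram_gb >= VRAM_FLOOR_GB[target_resolution]

# An 8GB RTX 3070 misses the 1440p floor, while a 12GB RX 6700 XT meets it,
# which is exactly the mismatch being criticised above.
print(enough_vram(8, "1440p"))   # False
print(enough_vram(12, "1440p"))  # True
```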

The RX 6500 XT wasn't even originally supposed to exist. It was a cynical move on AMD's part to just try to take advantage of the wasteland that was the GPU landscape at the time. While it's true that nVidia did the same thing with the RTX 3050 (a card that's even weaker than the GTX 1070 Ti), they weren't nearly as egregious about it because the RTX 3050 supported the entire feature set and had 8GB of VRAM while the RX 6500 XT didn't.

I saw the writing on the wall when I couldn't use the HD texture pack in Far Cry 6 because my RX 5700 XT only had 8GB of VRAM and that texture pack needed at least 11GB. When I saw that the RTX 3080 had only 10GB of VRAM, I was like "These people are paying that much for a card that potent that already can't use Far Cry 6's texture pack? Are they insane?" and as it turns out, yeah, they were.

It's ok to have a card with only 8GB on it, but, in this day and age, it's not ok to pay more than $200 for it.