If you already have, or need to get, a desktop PC for other reasons, you can just take the $500 you'd spend on a PS5 and put it toward a low-end graphics card for the PC you either have or were going to get anyway. Doesn't seem hard to justify at all.
Sure, except that the $500 in question also gets you two top-of-the-line game controllers, and, in a family setting, someone can game on the console while someone else uses the PC for something else. There is tremendous value in that. Not everybody lives alone and can just combine everything into one piece of hardware.
There are also a lot of people who need a powerful GPU for work they do, or money-making hobbies like making videos. Such people are naturally inclined to game on PCs they already own rather than consoles.
I couldn't disagree more, because you're using "a lot" rhetorically, which is vague and meaningless. Left undefined, the term has no power whatsoever. When I say "a lot", I use it as a relative term, because that actually means something.
Your statement is patently false in that respect. The proportion of PC owners who have specifically chosen to use a discrete video card is probably less than 50%. The vast majority of people aren't even tech-savvy, let alone gamers, enthusiasts or graphic designers.
Remember that the term "mainstream PC use" does not refer to gaming. It refers to audio/video (HTPC), web browsing, banking, paying bills, shopping, email, office tasks (like homework), communication and social media. People who fall into this category aren't tech savvy and so they use branded desktops, all-in-one PCs and craptops. They are the reason that companies like Dell, HP, Lenovo and Acer are so huge.
Since these mainstream tasks don't require discrete video cards, these manufacturers don't include them as standard equipment. After all, if a customer can do just fine with an Intel or AMD IGP, why would any of those manufacturers add a discrete video card that would raise the price of their PC and cost them a sale to their competition? Exactly, they wouldn't.
Therefore, the number of people who have chosen to use a discrete video card is already relatively small. And of those relatively few people, the overwhelming majority are gamers only. I don't know what the exact percentages are, but there are enough gamers who aren't tech-savvy enough to build their own PCs to support companies like Alienware, ASUS ROG and iBuyPower.
How do I know that the non-tech-savvy people in this category are only gamers? Because I take it as a given that someone who does graphic design or 3D animation must be tech-savvy. The most hardcore work you can do with a PC is coding, 3D modelling, animation or graphic design. These are tech trades.
The tools of the trade for these people are CPUs, GPUs, RAM and software. Like people in any trade, to be successful, they have to be intimately familiar with the tools that they use. Therefore, it is safe to say that these people are tech-savvy and purchase their tools accordingly.
When compared to the number of PC enthusiasts who are tech-savvy and build their own rigs solely for gaming (I've been in that category myself since 1988), the number of these content creators is pitifully small, nowhere near large enough to make any significant impact on the consumer marketplace. That remains true even if you include content creators on platforms like YouTube or Twitch. After all, for every content creator there are literally thousands of content consumers (aka viewers). They are nothing more than a tiny minority.
And lastly, you could simply buy the expensive GPUs less often. If you upgrade every 4 years instead of every 2, for example, you're effectively getting a 50% discount: spread over the same span of time, that $1,000 GPU can, in a way, be thought of as costing you only $500.
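To make that arithmetic concrete, here's a minimal sketch of the amortized cost (the $1,000 price and the 2-year and 4-year cadences are just the figures from the example above, not real pricing data):

```python
# Illustrative only: amortized GPU cost for different upgrade cadences,
# using the hypothetical $1,000 card from the example above.

GPU_PRICE = 1000  # USD, example figure

def yearly_cost(price: float, years_between_upgrades: int) -> float:
    """Spread the purchase price evenly over the years the card is kept."""
    return price / years_between_upgrades

for cadence in (2, 4):
    print(f"Upgrade every {cadence} years: ${yearly_cost(GPU_PRICE, cadence):.0f}/year")

# Output:
#   Upgrade every 2 years: $500/year
#   Upgrade every 4 years: $250/year
# Keeping the card twice as long halves the per-year cost, which is the
# "50% discount" being described.
```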
I agree, this makes sense to me because that's often how I roll. However, not everyone is willing (or even able) to invest that much money at once. Some people buy a high-end video card but then become spoiled by it and are unwilling to keep using it once it can no longer provide the frame rate they want at high resolutions. For some cards, this can happen relatively quickly.
Just look at the state of the RTX 3060 Ti, RTX 3070 and RTX 3070 Ti. Only two years after their release, they are unable, in some new titles, to game at 1080p Ultra, let alone 1440p or 4K. This is because nVidia didn't put enough VRAM on those cards to match the potency of their GPUs, but it's nevertheless a widespread problem. Considering the market situation since 2020, just how many people do you think paid under US$1,000 for those cards? My guess is a lucky few who managed to get them at MSRP before the market became as crazy as a meth addict.
I was fortunate enough to realise right away that 8GB had no place on a modern high-end video card and so I chose an RX 6800 XT with 16GB of VRAM. That's because, as you quite rightly pointed out, I want the card to last for many years to come. Unfortunately, not everyone saw the trap in advance like I did.