Edit: Btw, this isn't meant to judge people who buy them. It's meant to judge the prices of the GPUs. How do people justify spending that much? I've been there myself, but I suppose you could say I'm being turned off by the continuously high prices.

You're framing the question poorly. The way it's worded makes it feel like a loaded question, in that you're trying to make the other person feel guilty about wasting their money.
You're not wrong. I also have a Switch that I use to play old-school NES/SNES games on... and an Xbox Series X for some sports titles... the most recent purchase being MLB The Show 23... I honestly prefer everything else on PC, though. Hogwarts Legacy I just started, with plans for Elden Ring afterwards. I just can't see myself playing either on the Xbox.

Yeah, I much prefer PC gaming as well. It's just hard to rationalise it to someone who isn't already a member of the PC Master Race. I had to suffer with paying an extra $300 CAD over MSRP in Canada. I actually paid about $700 over MSRP, but I mined $400 back over six months with it.
I knew that if I was going to pay that kind of money, that card had better last for at least five years. With 16GB of VRAM, I'll play that card until it can't play anymore (even at 1080p) and I think that it will eventually be worth it.
But why? What about people like me who only want to play CoD multiplayer, and certainly not Witcher or Assassin's Creed Valhalla sort of open-world games that make no sense? Why are they so expensive? Nvidia to me looks more like a Canadian company than an American one.

Why what, exactly? I'm not sure what your point is. There are many, many cards available for less than $1000 that can give you a great experience at every resolution up to and including 4K. Nobody forces you to buy a card for more than that.
I don't know about innovation… but Nvidia knows people will pay that price, so they're probably just charging what they can get.

So does AMD.
Most of its engineers are either Chinese or Persian.

And? The point is?
Yep... PC Master Race. My brother is one of those people that is a console gamer... he also has a Series X and has 0 interest in PC gaming.

As for my PC... I paid retail $1600 for the 3090 about 2 1/2 years ago... and just sold it for $675. Not the greatest return, but within the norm on eBay at the present time. I got the 4090 at retail $1750... and I justified it because I game at 4K Ultra and this card was by far the best option, considering it is a huge generational leap over the 3090. A lot of my benchmarks have doubled and that's good enough for me.

I've pretty much reached the max for my mobo... and plan to run with what I have for the next 3/4/5 years before I build a new PC. 3 years is a good bet... but I really want 5.

Yeah, I've maxed my mobo as well. Well, not really, I only have 32GB of RAM, but it's an ASRock X570 Pro4 with an R7-5800X3D on board. I think I'll get 5 years for sure because, let's face it, my R7-1700 that I got in 2017 can still game and it wasn't made to be a gaming CPU. The R7-5800X3D, on the other hand, definitely was.
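To put rough numbers on that upgrade path, here's a quick back-of-the-envelope script using only the figures quoted above ($1600 paid for the 3090, $675 recouped, $1750 for the 4090) and taking "a lot of my benchmarks have doubled" at face value as a 2x factor; nothing here is a measured average.

```python
# Rough upgrade math using the figures quoted above (3090 -> 4090).
paid_3090 = 1600   # retail price paid for the 3090 (USD)
sold_3090 = 675    # what the 3090 resold for on eBay (USD)
paid_4090 = 1750   # retail price paid for the 4090 (USD)
years_owned = 2.5  # roughly how long the 3090 was in service
perf_gain = 2.0    # "a lot of my benchmarks have doubled" -- a rough, self-reported factor

net_upgrade_cost = paid_4090 - sold_3090   # 1075
net_3090_cost = paid_3090 - sold_3090      # 925

print(f"Net cost of the 3090 -> 4090 upgrade: ${net_upgrade_cost}")
print(f"Net cost of {years_owned} years on the 3090: ${net_3090_cost} "
      f"(about ${net_3090_cost / years_owned:.0f}/year)")
print(f"Upgrade cost per extra 1x of performance: ${net_upgrade_cost / (perf_gain - 1):.0f}")
```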
But why? What about people like me who only want to play CoD multiplayer, and certainly not Witcher or Assassin's Creed Valhalla sort of open-world games that make no sense?

People like you have it easiest, because you can play your games on almost anything. In your case, I would recommend what's undoubtedly the best bang-for-your-buck card on the market:
Compare computers of the 1980s. They came with manuals that could describe in detail how the computer works and how to use it if you wanted. But a Commodore 64 or an IBM 5150 has nowhere near the capabilities of even a potato PC today.
Let's ignore corporate greed and whatnot, because it's basically a given.

Just because it's a given doesn't mean that it can be ignored. It is the root of everything that we're seeing. Everything else is just a smokescreen to try and deflect blame from those responsible for the prices that we're seeing.
For-profit companies exist to, you know, make a profit, and publicly traded companies exist to make money for their investors, lest they potentially get sued thanks to Dodge v. Ford Motor Co. (though there's debate on how much this actually set a legal precedent).

So why are video cards getting more expensive?

- The easy target, if you're looking right now, is inflation. The problem with inflation is that once prices go up for non-commodity goods, they tend to stay up. A can of soup that now costs $1.50 when it used to be $1.25 isn't going back down to $1.25, even if the company finds a way to produce it more cheaply.
- The cost per transistor basically stopped falling at 28nm and has risen slightly over time (https://www.fabricatedknowledge.com/p/the-rising-tide-of-semiconductor). It's simply getting harder to make things smaller. (A rough, purely illustrative sketch of that arithmetic follows below.)
- Another point is that up to 28nm, manufacturers could get away with a single way of building the transistor (the planar gate). For seemingly every other process node since, they have had to come up with yet another way to physically construct the transistor, which adds to R&D costs.
- The complexity of the hardware keeps increasing. The baseline advances are hard enough on their own, and hardware makers keep adding new features on top of them.
- On top of this, it's impossible for a single person to understand the entire design and implementation of a modern GPU at an expert level. They may be an expert in one or two areas, but not in all of them. That means you have to hire more engineers to design and implement the product, and engineers aren't cheap.
- Complexity also means more time spent on other aspects of product development, such as documentation (which tends to be woefully lacking at times) and testing. No surprise: the more things you add, the more things you have to test, and that takes more time. And time is money.

Yeah, I don't really buy any of that because the exact same silicon challenges exist with CPUs and they haven't shot up in price like video cards have. There's no reason for this other than corporate greed. If CPUs had stratospheric pricing, then I'd say that your words ring true. However, that's not the case. Video cards cost what they do because these corporations think that they can get away with it and they're trying to normalise the prices paid during the mining boom/pandemic/silicon shortage.
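As referenced in the cost-per-transistor point above, here is a toy calculation of how that cost can stall or rise even while density improves. Every figure in it (wafer cost, good dies per wafer, transistors per die) is a made-up placeholder, not real foundry pricing, so only the shape of the arithmetic matters.

```python
# Illustrative only: why cost per transistor can stall even while density improves.
# Every number below is a made-up placeholder, not real foundry pricing.

def cost_per_mtransistor(wafer_cost_usd, good_dies_per_wafer, mtransistors_per_die):
    """USD per million transistors for one hypothetical die on one hypothetical node."""
    return wafer_cost_usd / (good_dies_per_wafer * mtransistors_per_die)

# Hypothetical "mature node": cheap wafers, modest density, good yield.
old = cost_per_mtransistor(wafer_cost_usd=4_000, good_dies_per_wafer=80, mtransistors_per_die=10_000)

# Hypothetical "leading-edge node": ~2x the density on the same die size,
# but wafers cost much more and early yields are worse.
new = cost_per_mtransistor(wafer_cost_usd=12_000, good_dies_per_wafer=60, mtransistors_per_die=20_000)

print(f"mature node:       ${old:.4f} per million transistors")
print(f"leading-edge node: ${new:.4f} per million transistors")  # higher, despite 2x density
```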
Compare computers of the 1980s. They came with manuals that could describe in detail how the computer works and how to use it if you wanted. But a Commodore 64 or an IBM 5150 has nowhere near the capabilities of even a potato PC today.

I remember those and yeah, the prices were through the roof but that was for a different reason. The PC manufacturing infrastructure that exists today didn't exist then. Technology begets technology and the 80s pretty much marked the transition from analogue to digital electronics.
Just because it's a given doesn't mean that it can be ignored. It is the root of everything that we're seeing. Everything else is just a smokescreen to try and deflect blame from those responsible for the prices that we're seeing.

Every computer (and many, many other technical appliances, for that matter) needs a CPU. Not every computer needs a dedicated GPU. Most work computers, for example, are completely fine without. That means CPUs can be profitable by bulk alone, which is something GPUs cannot achieve.
Don't get me wrong, the clueless noobs who buy nVidia even when they're a terrible value are every bit as responsible as Jensen Huang.
Yeah, I don't really buy any of that because the exact same silicon challenges exist with CPUs and they haven't shot up in price like video cards have. There's no reason for this other than corporate greed. If CPUs had stratospheric pricing, then I'd say that your words ring true. However, that's not the case. Video cards cost what they do because these corporations think that they can get away with it and they're trying to normalise the prices paid during the mining boom/pandemic/silicon shortage.
It's really counter-productive to make excuses for these corporations when there is no excuse. Those who make excuses for these greedy psychopaths only serve to enable this kind of behaviour and in effect, get screwed the hardest. People like me told Intel and nVidia to go pound sand years ago. People like me have the backbone and strength of will to completely ignore the existence of corporations that we hate. If everyone practiced what I do, we'd have parity in the CPU and GPU markets with lower prices and greater performance for everyone. It's really the only way to make a difference because you're actually doing something about the problem instead of just talking about it.
I remember those and yeah, the prices were through the roof but that was for a different reason. The PC manufacturing infrastructure that exists today didn't exist then. Technology begets technology and the 80s pretty much marked the transition from analogue to digital electronics.
Corporations today have no such reason for what things cost. It is nothing more than pure corporate greed.
Yeah, I don't really buy any of that because the exact same silicon challenges exist with CPUs and they haven't shot up in price like video cards have. There's no reason for this other than corporate greed. If CPUs had stratospheric pricing, then I'd say that your words ring true. However, that's not the case. Video cards cost what they do because these corporations think that they can get away with it and they're trying to normalise the prices paid during the mining boom/pandemic/silicon shortage.

Except they kinda have. The i7-6700K was $350 at launch. The i7-13700K was $450. Plus, CPU transistor counts don't increase as dramatically as GPU transistor counts tend to. For example, if the source I'm using is correct, the i7-10700K has 6.8B transistors, the i7-11700K has 8.2B, the i7-12700K has 10.2B, and the i7-13700K has 13.2B. That works out to roughly a 25% increase per generation.
| GPU | Transistor Count |
| --- | --- |
| AD-102 | 76.3B |
| Navi 31 | 58.0B |
| GA-102 | 28.3B |
| Navi 21 | 26.3B |
| TU-102 | 18.6B |
| Navi 10 | 10.3B |
| GP-102 | 12B |
| Vega 20 | 13.23B |
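For what it's worth, the generation-over-generation growth is easy to sanity-check from the numbers quoted in this post alone. A minimal Python sketch, using only the transistor counts given above (the Nvidia flagship dies are used for the GPU chain because the table lists each of them):

```python
# Sanity check of the generation-over-generation growth quoted above,
# using only the transistor counts given in this post (billions).
cpus = [("i7-10700K", 6.8), ("i7-11700K", 8.2), ("i7-12700K", 10.2), ("i7-13700K", 13.2)]
gpus = [("GP-102", 12.0), ("TU-102", 18.6), ("GA-102", 28.3), ("AD-102", 76.3)]

def growth(chips):
    # Print the percentage increase from each chip to the next in the list.
    for (prev_name, prev), (name, count) in zip(chips, chips[1:]):
        print(f"{prev_name:>10} -> {name:<10} {100 * (count / prev - 1):+6.1f}%")

growth(cpus)  # roughly +21%, +24%, +29% -- the "around 25% per generation" mentioned above
growth(gpus)  # roughly +55%, +52%, +170% -- the flagship GPU dies balloon much faster
```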
It's really counter-productive to make excuses for these corporations when there is no excuse. Those who make excuses for these greedy psychopaths only serve to enable this kind of behaviour and in effect, get screwed the hardest. People like me told Intel and nVidia to go pound sand years ago. People like me have the backbone and strength of will to completely ignore the existence of corporations that we hate. If everyone practiced what I do, we'd have parity in the CPU and GPU markets with lower prices and greater performance for everyone. It's really the only way to make a difference because you're actually doing something about the problem instead of just talking about it.

And good for you. Continue voting with your wallet.
Corporations today have no such reason for what things cost. It is nothing more than pure corporate greed.

The only time I'd ever buy the "pure" greed part is if literally nothing practical has changed. Pure greed is rearranging the content of a textbook by simply shuffling the chapters or sections around without updating the content, for the sole reason of forcing students to buy the latest edition. Pure greed is when pharmaceutical companies jack up the price of a life-saving drug 3x even though there's no real reason why (no shortage of supply, no sudden uptick in cases) and the same product is cheaper elsewhere. Pure greed is buying up a bunch of patents, not making any products out of them, and forcing others to license them or get sued.
Should it? The question is asked as though a $1000 GPU is a fundamental human right instead of a high-end luxury good. Unfortunately it is the latter, and if it is out of one's reach financially, then that is the way the cookie crumbles. If one wants to be a PC gamer, they should start with a BUDGET, and not be salivating over what some random YouTuber has.

On a more serious note, when starting a fresh gaming build, always start with monitor resolution and desired framerate in your chosen titles. Everything else follows from that choice. If the components needed don't fit that budget, move down a resolution or refresh bracket, or adjust components and/or expectations accordingly. Always have a plan. If you don't know, find out. Educate oneself and purchase smartly.

I did not ask the question in any such manner. Budget is not important to me; it's a matter of principle. My PC cost me $4000 in 2018, with a $770 GTX 1080 Ti back then, and I certainly do not want to add a $1500-$2000 GPU to keep it up to date. Hard pass. Maybe I'm just maturing and finding more important things to do with money, like keeping it in my bank account. I'll stick with a $700 GPU such as the currently priced 6950XT.
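The "start with resolution and framerate, then budget" advice above can be sketched as a tiny planning helper. This is purely illustrative: the tier names, the resolution-to-tier mapping, and the budget shares below are invented placeholders, not recommendations.

```python
# Toy version of the "start with resolution and framerate, then budget" advice above.
# All tiers, mappings, and budget shares are invented placeholders, not recommendations.

TIER = {
    ("1080p", 60): "entry",     ("1080p", 144): "mid-range",
    ("1440p", 60): "mid-range", ("1440p", 144): "upper-mid",
    ("4K", 60): "upper-mid",    ("4K", 120): "high-end",
}

GPU_SHARE = {"entry": 0.30, "mid-range": 0.35, "upper-mid": 0.40, "high-end": 0.45}

def plan(resolution: str, fps: int, total_budget: float) -> str:
    # Look up a GPU tier for the target, then reserve a rough slice of the budget for it.
    tier = TIER.get((resolution, fps))
    if tier is None:
        return "No tier defined for that target: drop a resolution/refresh bracket or adjust expectations."
    gpu_budget = total_budget * GPU_SHARE[tier]
    return (f"{resolution}@{fps}: aim for a {tier} GPU, "
            f"roughly ${gpu_budget:.0f} of a ${total_budget:.0f} build.")

print(plan("1440p", 144, 1500))
print(plan("4K", 60, 1200))
```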
The point, though, is that you don't have to spend $1000+ to keep your system up to date. There are plenty of upgrades available for your 1080 Ti that cost less. That's what people are trying to tell you here. And if you stick with a reasonably new card for $700... then this topic makes even less sense, because you already know the performance you want is available for less. Be it a 6950XT, 4070Ti, or 7900XT, they are all in the $700-900 range and all ridiculously strong, able to play any game at any resolution.