8GB of VRAM is still plenty for today's games. The vast majority of games don't need that much. Are there even any games with 12GB in their system requirements? Most are still at 4-6GB minimum and 8GB recommended. No, you won't play Hogwarts at 8K, 480FPS, max settings with RT on a 4060 Ti. But a lot of these high-VRAM games are that way because of poor optimization.
Yes, that's very true but that doesn't change the fact that it's a real trend. Remember that this has been going on for decades. I still remember my old Albatron GeForce FX-5200 with 128MB of DDR, my old XFX GeForce 6200 with 256MB of DDR, my old Palit GeForce 8500 GT with 1GB of DDR3, my old XFX Radeon HD 4870s with 1GB of GDDR5, my old Gigabyte HD 7970s with 3GB of GDDR5 each, my two Sapphire R9 Furies with 4GB of HBM each, my XFX RX 5700 XT with 8GB of GDDR6 and my RX 6800 XT with 16GB of GDDR6.
A lack of optimisation isn't the root problem; time is. Sure, bad optimisation makes it worse, but so does people not paying attention to what they're buying. You could have the best optimisation possible, but this would still keep happening. My first PC was the father of all PCs, the IBM PC Model 5150, and it topped out at 640kB of conventional RAM. To use more RAM with a 32-bit CPU like a 386, we had to use memory managers like EMM386 because MS-DOS couldn't natively address more than 640kB.
Look at us now with our 100GB+ games and 16-32GB of RAM. All that was required was the passage of time.
Hogwarts runs on a PS4 with 8GB UNIFIED. Yes, it's not the prettiest, but they made it work.
Yeah, but consoles and PCs aren't the same. PCs are optimised for multi-purpose use while consoles are designed to do one thing and one thing only (unless you count being a Blu-ray player) and that's gaming. Besides, if you use the same settings that are on the PS4 (the ones that aren't the prettiest), you can play it on PC with an 8GB card without issue. The issue that people are flipping out about is how much they paid for their 8GB cards, not the fact that 8GB cards exist.
The responsibility is on game developers to design their games properly.
Sure, but when the vast majority of your customer base are console gamers, that's all you really have to worry about. Unless a game is some marvellous spectacle that can't be played on console (like a new Crysis-type game), PC will always be an afterthought.
We shouldn't all have to be running 4090s because of poor game development.
Who said that we have to? All you need is a mid-range card with 10-12GB of VRAM on it and you'll have no problem. Who said anything about an RTX 4090? Nobody with even a 10GB card like an RX 6700 or RTX 3080 is complaining and nobody with a 12GB card like an RTX 3060 or RX 6700 XT is complaining either. We're also not hearing complaints from people who own an RX 6600/6600 XT/6650 XT because their 8GB cards didn't cost them much (comparatively speaking). For what they paid, they're perfectly happy to turn settings down to accommodate the 8GB.
The people whining are the nVidia fanboys who paid way too much for an 8GB card, specifically the RTX 3060 Ti, RTX 3070 and RTX 3070 Ti. Those people are having some serious buyer's remorse because they feel (and quite rightly) that, for the prices they paid, they shouldn't have to lower anything.
I completely agree with them, and nVidia should be panned and chided for releasing 8GB cards in that range. Having said that, the people who bought those cards mostly did so out of their own ignorance and hubris. They did no product research; they just assumed that they knew everything and bought whatever GeForce card they could afford without a second thought (like the good little sheep that they are). If they had done their research, they'd have asked some pretty serious questions before making such stupid purchases.
Nobody held a gun to their head and told them that they had to pay those exorbitant prices for cards with only 8GB. They chose to do this, even though many tech reviewers pointed out that 8GB would sooner or later be a limiting factor on these cards. Of course, they couldn't be sure just how much sooner or later, but I knew that it would be sooner because of my own experience. My R9 Furies became limited by VRAM because they only have 4GB each. Sure, 4GB of HBM can do more than 4GB of GDDR5, but it's still only 4GB. That wasn't enough for a GPU as potent as Fiji, but that's all it had, so it was prematurely crippled.
I got my RX 5700 XT with 8GB and suddenly everything was perfect, for about six months. Then I got the harbinger of things to come known as Far Cry 6. Its HD texture pack required 11GB of VRAM to use, and I remember being a bit annoyed that my relatively new video card couldn't use it. Don't get me wrong, the game ran great and I had a blast with it, but I stored that bit of information about 8GB not being enough in the back of my mind. I knew that I was going to need a card with more VRAM, and soon.
When I saw a chance to get a reference RX 6800 XT for $500 less than the going rate at the time, I pulled the trigger after deliberating with myself for about an hour. I knew it was as good a deal as I was going to get and that I'd be able to mine back a good chunk of the cost anyway by pairing it with my RX 5700 XT.
If it had been the GeForce cards that weren't short on VRAM and the Radeon cards that were, I would've broken my Radeon streak at nine cards and bought one. However, the RTX 3080 only had 10GB and cost WAY more than the RX 6800 XT. I couldn't understand why, but I wasn't going to look a gift horse in the mouth. Sure, I wasn't going to get things that are (probably) pretty cool like DLSS and NVENC, but I wasn't going to get any headaches either (and I don't care about RT, at least not yet).
I saw what was coming though. I did my due diligence and read everything I could about what was to come. I tried warning people about it, but they dismissed me as a fanboy and did what they did anyway. I feel bad for them, but you can lead a horse to water, you can't make him drink (or swim, for that matter). One thing was certain, though: I made damn sure that I got a card that could handle the increased VRAM needs. That's the reason I'm not one of the people whining about game optimisation. I knew it was coming and I was prepared. I wish more people had done the same, but you can't fix stupid.
Any game in which the dev put in the time and effort for optimization should have no problem with 8GB now and into the near future. If 16GB were the standard like a lot of people are claiming, and Nvidia actually went with that, the next XX50 card would be $899, lol.
Stop that! Jensen might be reading this and we don't need you giving him any big ideas!
It would be insane. A 60-tier card does not need 12GB. What they did on the 3060 was stupid. They even made a 2060 12GB. Stupid, and totally unnecessary. 12GB shouldn't start until XX70, and that's where we are at.
I too thought this to be completely nonsensical until someone pointed out to me that the RTX 3060 was made for miners, not gamers. Then it suddenly made all the sense in the world.
There is nothing wrong with 8GB on the 60 Ti, 60, and 50 Ti, or 6GB on the 50. VRAM isn't cheap and the cost is passed on to the consumer. You don't want more expensive GPUs, do you?
Do you work for nVidia or something? I'm sorry to say this, but that's one of the most ridiculous things that I've ever read. It sounds just dumb enough to be nVidia's company line. The amount of VRAM on a card has little to no influence on how much the card actually costs. There is no excuse for what nVidia has done, so why are you trying to make one?
This whole issue is an nVidia thing, not a "VRAM thing" or a "developer thing", and here's the proof:
Radeon RX 6600 8GB - $200
Arc A750 8GB - $250
Radeon RX 6600 XT 8GB - $254
Radeon RX 6650 XT 8GB - $260
Radeon RX 6700 10GB - $260 <- *
GeForce RTX 3050 8GB - $280
Arc A770 8GB - $330
GeForce RTX 3060 12GB - $335 <- *
Radeon RX 6700 XT 12GB - $340 <- *
Arc A770 16GB - $350 <- *Didn't you say VRAM isn't cheap?
GeForce RTX 3060 8GB - $350
Radeon RX 6750 XT 12GB - $380
GeForce RTX 3060 Ti 8GB - $385
GeForce RTX 3070 8GB - $461
Radeon RX 6800 16GB - $480
Radeon RX 6800 XT 16GB - $510
GeForce RTX 3070 Ti 8GB - $560
GeForce RTX 4070 12GB - $600
Radeon RX 6950 XT 16GB - $620
GeForce RTX 4070 Ti 12GB - $800
Radeon RX 7900 XT 20GB - $800
Radeon RX 7900 XTX 24GB - $950
GeForce RTX 4080 16GB - $1,110
GeForce RTX 4090 24GB - $1,600
Maybe you can explain just what you're talking about, because as we watch these prices rise, we can see that there's no consistent relationship between a card's price and the amount of VRAM on it.
- The 10GB RX 6700 costs less than the 8GB RTX 3050
- The 12GB RTX 3060 costs less than the 8GB RTX 3060
- The 12GB RX 6700 XT costs less than the 8GB RTX 3060
- The 16GB A770 costs the same as the 8GB RTX 3060
- The 16GB A770 costs only $20 more than the 8GB A770
- The 3060 Ti and 3070 cost more than 5 cards with >8GB
- The 16GB RX 6800 costs less than the 8GB RTX 3070 Ti
- The 16GB RX 6800 XT costs less than the 8GB RTX 3070 Ti
- The 16GB RX 6950 XT costs less than the 12GB RTX 4070 Ti
- The 20GB RX 7900 XT costs less than the 16GB RTX 4080
- The 24GB RX 7900 XTX costs less than the 16GB RTX 4080
What we can see is that Intel only charges $20 more for a 16GB A770 than for an 8GB A770, so it looks like an extra 8GB of VRAM costs about $20. I don't know about you, but that certainly fits my definition of "cheap". We also see nVidia cards costing more than Arc and Radeon cards that completely outperform them. We also see that nVidia charges roughly the same amount for an RTX 3060 regardless of whether it has 8GB or 12GB. In fact, the cheapest 8GB model that I found was more expensive than the cheapest 12GB model!
I don't know how you got the idea that nVidia would need to raise prices if they put a few more gigabytes of VRAM on their cards, because this list of cards, from top to bottom, says the exact opposite. Note that I didn't cherry-pick these. These are all the least-expensive models that I could find on pcpartpicker. Anyone can vet my list by just clicking on the link that I left ^^^ there.
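To put a rough number on "cheap", here's a minimal sketch (Python) that just reuses the prices quoted in the list above; those were the cheapest listings I found at the time, not current market data, so treat the result as an illustration rather than hard numbers. It works out the implied cost of the extra VRAM from the two same-GPU pairs on the list:

```python
# Implied cost per extra GB of VRAM, using the prices quoted in the list above.
# These are the cheapest pcpartpicker listings mentioned earlier, so this is an
# illustration of the argument, not a survey of current prices.

pairs = {
    # card: ((vram_gb, price_usd) for the smaller and larger memory versions)
    "Intel Arc A770":   ((8, 330), (16, 350)),
    "GeForce RTX 3060": ((8, 350), (12, 335)),
}

for card, ((gb_lo, usd_lo), (gb_hi, usd_hi)) in pairs.items():
    extra_gb = gb_hi - gb_lo
    extra_usd = usd_hi - usd_lo
    print(f"{card}: +{extra_gb}GB for ${extra_usd} -> ${extra_usd / extra_gb:.2f}/GB")

# Output:
# Intel Arc A770: +8GB for $20 -> $2.50/GB
# GeForce RTX 3060: +4GB for $-15 -> $-3.75/GB
```

In other words, the only same-GPU comparisons in the list put the extra VRAM at a few dollars per gigabyte at worst, and in the 3060's case the larger-VRAM model was actually the cheaper one.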
If games were properly optimized, even the latest ones would run fine on 8GB of VRAM at 1080p max settings with RT at 120FPS, with the help of DLSS. Your 60 Ti would work just fine and we could all have semi-affordable GPUs. It's optimization that's the problem.
Yeah, but then the games themselves would cost far more than they do now because the devs would have to dedicate a crap-tonne of extra man-hours to said optimisation for a segment of the gaming market that is quite small compared to the console crowd.
Added labour would cause a significant increase in the price of games, far more than if nVidia just put enough VRAM on their mid-range cards so that badly optimised games still worked. You know, like AMD and Intel did. I don't understand how you can blame the game developers when your idea would end up costing us even more because labour hours cost A CRAP-TONNE MORE than a few extra gigabytes of GDDR6 or GDDR6X.
No, you don't get it. If you already have something in your PC case that can run some types of games (like grand strategies that don't need much GPU power, most online shooters, esports titles, MMOs, ...) and you want to play some new AAA games, then the choice between a €500 GPU and a whole new console is a no-brainer; sadly, most PC users aren't able to think outside their box. Upgrading to a new €500 GPU with 8 gigs of RAM just doesn't make much sense, and no matter how PCMR fanboys try to spin it, consoles and GPUs do affect each other.
I couldn't have said it better myself.
I would be amazed if it were cheaper than $499 given Nvidia's current pricing strategy. It looks like the core clock boost will allow it to outperform the 3060 Ti despite the drop in CUDA cores etc. and the massive bandwidth drop. I suspect that it will be a narrow win of <20% at 1440p, though.
Maybe, but that's hardly worth talking about and definitely not worth the increase in price.
Not really; the xx50 is $349 at least. You might get one below $300 on offer or after it's been out for more than nine months, but I would be amazed if the 4050 were below $350 (Nvidia might even go $449 for the 4060 and $399 for the 4050 and blame inflation).
Yeah, the same inflation that affects video cards but not motherboards or CPUs even though they're all made from the same parts. Funny how that works, eh?
I agree, but the fact is nVidia don't need gamers. They have loads of AI business at massively larger margins, and they consider that they're doing gamers a favour by selling them GPUs at less than $2k. They will continue like this and maybe get even worse because, as much as we hate it, their job is to maximise profit and the best way to do that is to sell an AI card at $50k, not a GPU at $500.
Yeah, but that's true about all three corporations.
It could be said, given how cut down the 4060 is, that it's really an xx50-class card.
Yeah, this is definitely a generation to skip.
This is spot on. The sad part is when your old GPU dies and you have a CPU with no integrated graphics; then you have to shell out some $500 or more for something that's barely an upgrade over your three- to six-year-old GPU. Because yes, 10XX Nvidia cards still do the job, and do it quite well considering the asking price of anything from the past three years.
Well, this is only the case if you're unwilling to get a Radeon or Arc card. People who are stubborn and inflexible are their own worst enemies. When I hear someone crying about nVidia's pricing, I just say "So get a Radeon or Arc card! It's not that hard a concept to grasp!".
