skit75 :
Christopher Shaffer :
dovah-chan :
skit75 :
The Great VRAM Money Grab has arrived, again. Best thing to do before dropping prices. I recently had to steer the owner of the last build I did away from such cards, since he was gaming on a single 1080p monitor. While standing next to me during part procurement, he looked at the GTX 780 Ti: "this one has more memory and costs more... why don't I get this one???" I was trying to get him a 760 or 770. He ended up with an EVGA 780 3GB.
someone has read
a good article
That's a terrible article. It's full of outdated and misleading information. The gist of its comments on VRAM is "you don't need it if you don't want to run Ultra". That's hardly sound advice.
3GB of VRAM is probably plenty right now, but I have 4GB in my SLI setup and I can say for certain that the 3GB envelope is quickly being pushed.
The idea that you only "need" the bare-bones power to just barely run a game on high (or a 5-year old game on Ultra) isn't good building advice. Most people don't want to just barely be able to hit 60 or 120FPS; they want it to be a solid, Ultra-quality 60 or 120FPS.
In fact, that article comes to the conclusion that you don't need more than 1GB of VRAM. That's true only if you want to run games a year old or more on medium-high settings.
I would hardly call that article terrible. People who choose to stay informed may have already known a great deal of what was posted. Others, not so much. Obviously, a lot of time was put into that article.
Also, many who initially start off saying they want the best blah blah blah quickly begin back-tracking once they see how expensive their dreams are (the last build I did, in the example above). I would argue that spending $700.00+ on a GPU for gaming is crazy. The price you pay for that extra performance can't be rationally justified. A system at half the cost will get you 75% of that performance. You're better off investing in the core of the rig and upgrading the GPU(s) every two years or so, to whichever architecture is just behind the latest & greatest, in my opinion.
I've seen what the highest end provides in the builds I do when compared to a rig like mine. It just isn't worth it, if you have a budget. I play BF4 on High settings and have recently tried the Skyrim HD texture pack. Even on my 560Ti, both are beautiful and very playable... 48 & 28 FPS respectively, give or take.
I think we're talking about two different things.
The article in the *link* above actually suggests that a 1GB card is plenty for all current games. This ignores the fact that most games from even the last 2 years won't be playable at the target of 60FPS on max settings (even w/out MSAA, etc.) with a 1GB card.
It ignores the fact that texture bit depth affects the size of the textures (in MB/GB) and as such directly affects the amount of VRAM needed to display high-res textures.
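To put rough numbers on that point, here's a back-of-envelope sketch (in Python, with illustrative texture sizes I've picked myself, not figures from the article) of how resolution and bit depth multiply directly into VRAM cost:

```python
# Back-of-envelope: uncompressed texture footprint in VRAM.
# Width, height, and bytes-per-pixel all multiply directly into the size;
# a full mipmap chain adds roughly another third on top of the base level.

def texture_bytes(width, height, bytes_per_pixel, mipmaps=True):
    base = width * height * bytes_per_pixel
    # Mip levels sum to ~1/3 of the base level (1/4 + 1/16 + ...).
    return int(base * 4 / 3) if mipmaps else base

# A single 2048x2048 texture at 32-bit color (4 bytes/pixel):
size_mb = texture_bytes(2048, 2048, 4) / (1024 * 1024)
print(f"2K texture, 32-bit: ~{size_mb:.1f} MB")  # ~21.3 MB with mips

# Doubling the bit depth (e.g. 64-bit HDR, 8 bytes/pixel) doubles the cost:
print(f"2K texture, 64-bit: ~{texture_bytes(2048, 2048, 8) / 2**20:.1f} MB")
```

A scene with a few hundred textures like that blows past 1GB quickly, before you even count frame buffers, AA samples, or geometry. (Real games use compressed formats, so actual numbers vary; the scaling argument is the point.)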
When I read that article originally, I checked more than once to make sure it wasn't an old article. It's full of outdated advice.
What you say is correct: you don't need a $700 GPU to play a game. But that's a very loaded statement. What game? What resolution? What quality settings? With AA/AF/AO? One monitor or more? What refresh rate?
But the cost of the GPU isn't what I'm getting at. It's the attitude that you only need 1GB of VRAM that is suggested by that article. 2GB should be minimum if you want any future use out of your card(s). Notice I didn't say futureproof. This is a myth.
I can max BF4 on 1 GTX 770 4GB at ~75FPS. But I have a 144Hz monitor and really want to hit at least 120FPS, so I have 2 of them and can nail 120. The game uses over 2GB of VRAM regularly. For those who say "no way! my 2GB card never gets maxed": that's because the game knows you have just under 2GB available and as such handles textures differently (and less efficiently) than if you had a 3GB or 4GB card. Titanfall easily uses 3GB if you're maxing it.
So, in a nutshell, I think it's misleading and inaccurate to make a blanket statement saying "you don't need more than a 1GB card" to play "today's games" (when the article reviewed literally no games less than a year old) when none of the other factors are considered. It's ignorant at best.
The author even qualifies, on the same page where he says you don't need to upgrade if you have a 1GB card:
http://www.tomshardware.com/reviews/graphics-card-myths,3694-6.html - that this is "if you don't want more demanding [AA] settings", for example. But then he goes on to say an "underpowered card with 4GB of VRAM" isn't going to make a difference. That statement is so generic it misses the forest for the trees. You're not going to find a GTX 750 with 4GB of VRAM. You're not going to find a 7770 with 3GB of VRAM. They put more VRAM on more powerful cards so the memory doesn't bottleneck the GPU in the most demanding situations.
There are also use cases to consider: are you just gaming, or are you gaming and also doing rendering of some sort?
Last: if you think 48FPS in BF4 and 28FPS in Skyrim (or vice versa) is acceptable, then I'm not sure why you're even talking about VRAM. You're clearly not targeting even 60FPS or the latest graphics, so it's kind of interesting that you would be the one to say it's not needed.