Directly comparing the recent releases from AMD and NVIDIA, one big difference stands out: VRAM.
Disregarding the 3.5 GB/0.5 GB VRAM issue on the 970, I took a look at the 970, the 980, and even the 980 Ti, and noticed something.
Nvidia skimps on the VRAM. Hugely, and it's affecting my games.
I know I could've waited for a card with more VRAM, but Nvidia released these cards recently. Don't they see any value in future-proofing? (Or even present-day-proofing?)
GTA V and The Witcher 3 both saturate all four gigabytes of VRAM on my 970 (40 fps average at 1280 x 1024), and I don't understand why Nvidia went with such a low amount. The only argument I ever hear (anywhere) is that "no game will ever use all four gigs of VRAM."
Is there any specific reason Nvidia barely provides playable amounts of VRAM on ~$350 GPUs while going crazy overkill on ~$1000 GPUs?