[SOLVED] Why are tech vloggers saying 12 GB of VRAM (RTX 3060) is merely a marketing gimmick?

Status
Not open for further replies.

barkersofgeraldine

Reputable
Nov 11, 2020
I'm trying to understand JayzTwoCents's review (as well as other vloggers') of GPUs (like the 6800 XT) that have more than 8 GB of VRAM.

They mentioned that current systems (or an average consumer's system) won't utilize all 12 GB of VRAM in normal use, which doesn't quite make sense to me, since even 8 GB cards are rarely fully utilized anyway; but anyway...

So are they saying that current programs don't require 12 GB of VRAM and will need 8-10 GB at most, and therefore the unused VRAM is wasted? But then wouldn't that mean the 3060 will be in a somewhat good position in the future, when most programs can utilize 12 GB of VRAM (or at least more than 8 GB)?

I'm just confused about the part where they say 12 GB GPUs are a marketing gimmick and that the extra memory won't be used. My question is: in the future, that won't be exactly true, right? Programs and games will become more demanding, so the card, although outdated, will still be able to use 12 GB or more of VRAM?
 

InvalidError

Titan
Moderator
Well, if you have a 192-bit memory bus, you only have two logical memory sizes to pick from: 6 GB or 12 GB. 6 GB is clearly getting too tight for modern GPUs and games, so 12 GB becomes the logical next baseline step for 192-bit and 384-bit GPUs. The same goes for 256-bit, where you have to pick between 8 GB and 16 GB: 8 GB is getting kind of tight, and 16 GB is the next logical step up since DRAM dies only come in 2x density increments.

The amount of VRAM is basically a byproduct of the minimum viable amount of VRAM and the memory bus width. Upgrading from 6x1 GB to 6x2 GB GDDR6 chips is also likely cheaper than widening the GPU from 192-bit to 256-320-bit and fitting it with 2-4 extra GDDR6 packages for 8-10 GB total.
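The arithmetic above can be sketched in a few lines: each GDDR6 package has a 32-bit interface, and (in this generation) packages come in 1 GB or 2 GB densities, so the bus width alone fixes the capacity choices. A minimal sketch, assuming one 32-bit package per channel:

```python
# Sketch of the bus-width arithmetic above. Assumes one 32-bit GDDR6
# package per channel and 1 GB / 2 GB package densities (the options
# available in this GPU generation).

def vram_options(bus_width_bits, densities_gb=(1, 2)):
    """Return the possible VRAM totals (GB) for a given memory bus width."""
    chips = bus_width_bits // 32  # number of GDDR6 packages on the bus
    return [chips * d for d in densities_gb]

print(vram_options(192))  # [6, 12] -> why a 192-bit card is 6 GB or 12 GB
print(vram_options(256))  # [8, 16] -> why a 256-bit card is 8 GB or 16 GB
```

This is why a 3060's 12 GB isn't a deliberate "extra memory" decision so much as the only step up available once 6 GB on a 192-bit bus stops being enough.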
 
Solution
There are only two things where VRAM actually affects performance: how much texture data you're trying to shove into it, and what resolution you're rendering at. Pressure from texture data isn't even that significant, because textures can be constantly swapped in and out as needed. Higher resolution kills performance on lower-end cards anyway, so extra VRAM for that purpose is pointless, and even then, raising the resolution doesn't add much to VRAM usage (it's a <1 GB increase from 1080p to 4K).
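A quick back-of-envelope check of that claim: a raw 32-bit render target at 4K is only about 32 MB, so even a stack of full-resolution buffers grows by far less than 1 GB when stepping up from 1080p. A rough sketch, assuming 4 bytes per pixel (real engines mix many buffer formats):

```python
# Rough check of the claim above: raw render-target memory grows far
# slower than total VRAM when stepping from 1080p to 4K.
# Assumes 4 bytes/pixel (e.g. RGBA8); real engines mix many formats.

def target_mb(width, height, bytes_per_pixel=4):
    """Size in MB of one full-resolution render target."""
    return width * height * bytes_per_pixel / 2**20

mb_1080p = target_mb(1920, 1080)  # ~7.9 MB per buffer
mb_4k    = target_mb(3840, 2160)  # ~31.6 MB per buffer

# Even with a dozen full-resolution render targets, the 1080p -> 4K
# difference stays well under 1 GB:
delta_gb = 12 * (mb_4k - mb_1080p) / 1024
print(round(delta_gb, 2))  # ~0.28 GB
```

The bulk of VRAM usage comes from textures and geometry, which don't scale with output resolution, which is why the 1080p-to-4K jump costs so little memory in practice.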

InvalidError said:
Well, if you have a 192-bit memory bus, you only have two logical memory sizes to pick from: 6 GB or 12 GB. 6 GB is clearly getting too tight for modern GPUs and games, so 12 GB becomes the logical next baseline step for 192-bit and 384-bit GPUs. The same goes for 256-bit, where you have to pick between 8 GB and 16 GB: 8 GB is getting kind of tight, and 16 GB is the next logical step up since DRAM dies only come in 2x density increments.
They could get around this with segmented memory, but we all know how that turns out.
 