[SOLVED] Why is the 3060 a 12GB card?

Prad_Bitt

Notable
Jul 4, 2020
1,079
244
1,140
39
So Nvidia released the 3080 at $700, aimed at 4K gaming, and the 3070 and 3060 Ti at $500 and $400, aimed at 1440p; they have 10/8/8 GB of memory respectively. Then Nvidia releases the 3060 (which, according to their FPS charts, looks like 2070-level performance to me instead of the 2070 Super level I was expecting), aims it at 1080p60 with RTX on, gives it 12GB, and prices it at $330. This is so stupid. They could've just made it 8GB and kept it at $300 to be a true successor to the 1060 and 2060, instead of this weird price point like the mess Nvidia made with the 16 series (useless segmentation that doesn't help anyone: $220, $230, $280, $350, what is the point?). Why, besides marketing ("oh, this has 12GB vs 8GB, this must be faster"), does a GPU aimed at 1080p gaming have 12GB of GDDR6? Thoughts?
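For what it's worth, there's also a plain hardware constraint the question doesn't touch on: capacity is tied to bus width. A quick sketch of the arithmetic (the bus width and chip densities below are the launch-era figures for the RTX 3060 and GDDR6):

```python
# Why a 192-bit card ends up at 12GB: each GDDR6 package has a 32-bit
# interface, and at launch GDDR6 densities were 1GB (8Gb) or 2GB (16Gb).
BUS_WIDTH_BITS = 192        # RTX 3060 memory bus
CHIP_INTERFACE_BITS = 32    # per GDDR6 package

chips = BUS_WIDTH_BITS // CHIP_INTERFACE_BITS          # 6 memory packages
options_gb = [chips * density_gb for density_gb in (1, 2)]
print(chips, options_gb)    # 6 [6, 12] -> the choice was 6GB or 12GB, never 8GB
```

With six packages on a 192-bit bus, Nvidia's realistic options were 6GB or 12GB; an 8GB configuration would have meant cutting the bus down to 128-bit.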
 

Prad_Bitt

Notable
Because the 3070 and 3080 should have been at least 12 or 16GB. But the rush to market, RAM shortages, etc. I think made Nvidia put something out, and customers aren't happy, so now it's "slap more RAM on an unobtainable product."
It's still useless though, isn't it? I mean, this card can't even do 4K; even a 2080 can't. This memory would be better off used in a 3080.
 

TravisPNW

Proper
Aug 26, 2020
109
34
110
0
Because the 3070 and 3080 should have been at least 12 or 16GB. But the rush to market, RAM shortages, etc. I think made Nvidia put something out, and customers aren't happy, so now it's "slap more RAM on an unobtainable product."
I'd agree... I went with the 3090 because in addition to gaming I wanted the 24gb vram for my video work that I do. Old build was a 1080 Ti at 11gb...

... not gonna lie... going to a 3080 with 10gb was kinda feeling like a downgrade and I didn't really consider it.
 
Reactions: Prad_Bitt

Phaaze88

Champion
Ambassador
It's still useless though, isn't it? I mean, this card can't even do 4K; even a 2080 can't. This memory would be better off used in a 3080.
All according to plan. Most don't understand VRAM allocation vs. usage anyway, so Nvidia's looking to cash in on those who don't know any better.
Scaremongering Tactics 101, people.
"Yesss... Ssspend more on a card you don't actually need, and whine later when we launch a more powerful card with lesss Vram ssstill! Mwahahahah..!"
I expect that card to be one of the most popular, and yet the most pointless, in the product stack.
In the applications where that amount of VRAM would actually help, the xx60 is still quite a bit weaker than a 3080, which would benefit more from a larger buffer.

I had my eye on the RX 6800XT, and it had nothing to do with the Vram; it was a better product from my POV. Sadly, the pricing and the lack of hybrid cooled options turned me away.

Futureproofing? Don't even get me started on that folly.

With what I do, the 3080's 10GB buffer wouldn't hurt me, and would do me just fine for a few years.
I've not used more than 8GB (allocated) of VRAM on my 1080 Ti (GDDR5X) at 1440p, and I've had it for almost 4 years. GDDR6/6X is even faster and should be more effective per GB.
[sarcasm start]Games progress fast don't they?[sarcasm end]


For a while now, some people have been doing things (piling on mods, texture packs, etc.) to purposely use needless amounts of VRAM to try and justify the 'need' for more, and like...
Seriously? What's the point? Can't people just pop in a game and bloody enjoy it anymore?
The fps counters/monitors are poison enough already; now we don't have enough VRAM because we forced it all to be used up, as if that's bloody practical...
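On the allocation-vs-usage point: if you want to sanity-check what a game has allocated without leaving an overlay on, a tiny script polling nvidia-smi does the job. Note that it reports memory *allocated* on the device, not what the game actively needs. The query flags are standard nvidia-smi options; the sample line at the bottom is made-up illustrative values, since actually querying requires an Nvidia GPU.

```python
import subprocess

def parse_mem(line: str) -> tuple[int, int]:
    """Parse one 'used, total' line from nvidia-smi (values in MiB, no units)."""
    used, total = (int(x.strip()) for x in line.split(","))
    return used, total

def query_vram() -> tuple[int, int]:
    # Reports VRAM allocated on the device, not what's actively in use.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_mem(out.splitlines()[0])

# Example with a sample output line (illustrative values only):
print(parse_mem("4980, 12288"))  # (4980, 12288) -> ~4.9GB allocated of 12GB
```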
 

TravisPNW

Proper
The fps counters/monitors are poison enough already
You can say that again... it's hard to find a video on YouTube that doesn't have FPS meters all over it. I'm talking about the entire top left section of the screen with the FPS count 4 times the size of everything else... all 10-16 cores listed... all their temps, usage, voltage... etc...

I'm like "dude? really?" I use a fps meter but mine is like 2 lines... fps, gpu/cpu temps (not all 10 cores lol) and vram usage. That's it... really don't need much more than that.

I'm still waiting on the right moment to reply to the guy that says "will my CPU bottleneck this GPU? I need 144 fps!" I wanna be like "did you run it through the bottleneck calculator? Your cpu shows a 4.18% bottleneck on the 3080.... but if you upgrade and knock that bottleneck percentage down to less than 3%, you should be good to go."

sarcasm off
 
Reactions: Prad_Bitt

getochkn

Polypheme
Moderator
I'd agree... I went with the 3090 because in addition to gaming I wanted the 24gb vram for my video work that I do. Old build was a 1080 Ti at 11gb...

... not gonna lie... going to a 3080 with 10gb was kinda feeling like a downgrade and I didn't really consider it.
Ya, I do 3D rendering; 8GB is the minimum and I can max that out, no problem. I might have to go dual 3060 Ti for more VRAM. Lol
 
Reactions: TravisPNW

Phaaze88

Champion
Ambassador
You can say that again... it's hard to find a video on YouTube that doesn't have FPS meters all over it. I'm talking about the entire top left section of the screen with the FPS count 4 times the size of everything else... all 10-16 cores listed... all their temps, usage, voltage... etc...

I'm like "dude? really?" I use a fps meter but mine is like 2 lines... fps, gpu/cpu temps (not all 10 cores lol) and vram usage. That's it... really don't need much more than that.

I'm still waiting on the right moment to reply to the guy that says "will my CPU bottleneck this GPU? I need 144 fps!" I wanna be like "did you run it through the bottleneck calculator? Your cpu shows a 4.18% bottleneck on the 3080.... but if you upgrade and knock that bottleneck percentage down to less than 3%, you should be good to go."

sarcasm off
If it's a new title, I'll run it with Afterburner monitoring once or twice to make sure everything's running normally.
After that, I turn monitoring off, but leave the main HUD on the 2nd monitor and roll with it, unless something seriously goes wrong and I have to troubleshoot.

You might not have to wait long for those fps drunk folks - they're out there alright.

Bottleneck calculators? I can't take them seriously at all.
I just checked one of the popular ones out of curiosity, and...
7820X + 1080Ti
7820X + 3080
Just what...

I'd agree... I went with the 3090 because in addition to gaming I wanted the 24gb vram for my video work that I do. Old build was a 1080 Ti at 11gb...

... not gonna lie... going to a 3080 with 10gb was kinda feeling like a downgrade and I didn't really consider it.
Ya, I do 3D rendering; 8GB is the minimum and I can max that out, no problem. I might have to go dual 3060 Ti for more VRAM. Lol
This right here would be the strongest reason for the 3060's extra VRAM.
Too bad 'gamers' will snatch this one up for the wrong reasons... Oh, and can't forget about the miners too, I guess.
 

Prad_Bitt

Notable
You can say that again... it's hard to find a video on YouTube that doesn't have FPS meters all over it. I'm talking about the entire top left section of the screen with the FPS count 4 times the size of everything else... all 10-16 cores listed... all their temps, usage, voltage... etc...

I'm like "dude? really?" I use a fps meter but mine is like 2 lines... fps, gpu/cpu temps (not all 10 cores lol) and vram usage. That's it... really don't need much more than that.

I'm still waiting on the right moment to reply to the guy that says "will my CPU bottleneck this GPU? I need 144 fps!" I wanna be like "did you run it through the bottleneck calculator? Your cpu shows a 4.18% bottleneck on the 3080.... but if you upgrade and knock that bottleneck percentage down to less than 3%, you should be good to go."

sarcasm off
That stuff to me is only useful when deciding between a couple of CPUs at the same price. GPUs are usually more differentiated and easier to pick than CPUs. I was totally convinced the 10400F was the better option until I watched Gamers Nexus' video on it. Those counters are useful, but only like 0.1% of the time lol. All I have is the GeForce Experience fps counter in the bottom left, out of my vision :D
 
Reactions: TravisPNW
So Nvidia released the 3080 at $700, aimed at 4K gaming. The 3070 and 3060 Ti at $500 and $400 aimed at 1440p, and they all have 10/8/8 GB memory. Then Nvidia releases the 3060 (which according to their FPS charts, looks like 2070 level performance to me instead of the 2070 Super level I was expecting), aims it at 1080p60 RTX ON and gives it 12GB, prices it at $330. This is so stupid. They could've just made it 8GB, kept it $300 to be a true successor to the 1060 and 2060 instead of this weird price point that Nvidia messed with with the 16 series as well (the useless segmentation that doesn't help anyone, $220, $230, $280 $350 what is the point). Why, besides marketing ("oh this has 12GB vs 8GB this must be faster") , does this GPU which is aimed at 1080p gaming have 12GB GDDR6? Thoughts?
Competition, and more VRAM is what the consumer wants. That's all there is to it.
 
Reactions: Prad_Bitt
Because the 3070 and 3080 should have been at least 12 or 16GB. But the rush to market, RAM shortages, etc. I think made Nvidia put something out, and customers aren't happy, so now it's "slap more RAM on an unobtainable product."
It has nothing to do with being rushed to market or a VRAM shortage. Nvidia builds its cards with the amount of VRAM it thinks is good enough for the card; if you want more VRAM, go for a faster, more expensive model. This has been Nvidia's segmentation strategy for years. Also, Nvidia cards have handled bandwidth and VRAM pressure better than AMD cards for several generations already, and we're already seeing this effect in several new triple-A games involving 4GB cards from both companies.
 
It's still useless though, isn't it? I mean, this card can't even do 4K; even a 2080 can't. This memory would be better off used in a 3080.
Consumers will want that extra VRAM even if it's useless, just to feel better that nothing will hold them back, for the "just in case a game ends up using more VRAM even at low resolution" situation. They don't care about the technical details; even if you present them with the facts, they'll still say "what if...". Nvidia is well aware of this, so in the end they'll just follow what consumers want rather than fight them. The x70 and x80 will most likely be updated with models that have more VRAM in the future.
 
