Discussion: Why is Nvidia doing this?

jacob249358

Commendable
Sep 8, 2021
People don't seem to be talking about this, but why are the VRAM amounts so strange on the 3000 series cards? 12GB on the 3060, but then 8GB for the 3060 Ti, 3070, and 3070 Ti, then only 10GB on the 3080. Now we've got all these weird variants like the 12GB 2060, 12GB 3080, and 16GB 3070 Ti. What the hell are these numbers? Maybe learn something from AMD.
 

Ralston18

Titan
Moderator
Leaving AMD out of it.....

So is the problem simply the strangeness or weirdness you cited with respect to Nvidia 3000 Series GPUs?

Is there a specific Nvidia GPU problem that you are dealing with?

Update your post to include full system hardware specs and OS information.

What is the specific GPU/Nvidia problem? Which GPU?
 

jasonf2

Distinguished
Over the years it hasn't been that unusual to see RAM variation, especially in the mainstream SKUs. The 1000 series went as far as offering DDR4 or GDDR5 on the 1030, which made a huge difference. The RAM on the 3080 isn't really the thing here, though. The 12GB variant is more of a 3080 Super, as it is actually a more powerful chip with more cores and bandwidth than the original release. Anyone buying one would be wise to know the differences, not that you can get one anyway.
 
People don't seem to be talking about this, but why are the VRAM amounts so strange on the 3000 series cards? 12GB on the 3060, but then 8GB for the 3060 Ti, 3070, and 3070 Ti, then only 10GB on the 3080. Now we've got all these weird variants like the 12GB 2060, 12GB 3080, and 16GB 3070 Ti. What the hell are these numbers? Maybe learn something from AMD.

The 3060 has 12GB because of its 192-bit memory interface: it was either going to be 6GB or 12GB, and 6GB seemed too little, so that's probably why they went with 12GB. The 3060 Ti, 3070, and 3080 were all released before the 3060, so those cards having less VRAM isn't weird. Initially they were supposed to be updated with models carrying more VRAM, but Nvidia adapts to both market conditions and competition, not just one of them. The mining situation gave Nvidia an opportunity to not immediately respond to what the competition had, like the need to release cards with more VRAM ASAP. The mess we see right now? For the most part it doesn't really matter to consumers. Just buy whatever suits your needs and budget.
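
For anyone wondering where the "either 6GB or 12GB" constraint comes from: each GDDR6/GDDR6X package talks to a 32-bit slice of the memory bus, so the bus width fixes the chip count, and the packages only came in certain densities (1GB or 2GB at the time). A rough sketch of that arithmetic in Python (the bus widths below are the published specs, but treat the snippet as illustrative, not anything official):

Code:
# Possible VRAM sizes for a given memory bus width, assuming each
# GDDR6/GDDR6X package uses a 32-bit interface and comes in 1GB or 2GB densities.
BITS_PER_CHIP = 32
CHIP_DENSITIES_GB = (1, 2)

def vram_options(bus_width_bits):
    chips = bus_width_bits // BITS_PER_CHIP
    return [chips * density for density in CHIP_DENSITIES_GB]

for name, bus in [("RTX 3060", 192), ("RTX 3060 Ti / 3070", 256), ("RTX 3080", 320)]:
    print(name, vram_options(bus))
# RTX 3060 [6, 12]           -> 6GB looked thin, so 12GB it was
# RTX 3060 Ti / 3070 [8, 16] -> 8GB shipped
# RTX 3080 [10, 20]          -> 10GB shipped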
 

jacob249358

Commendable
Sep 8, 2021
Leaving AMD out of it.....

So is the problem simply the strangeness or weirdness you cited with respect to Nvidia 3000 Series GPUs?

Is there a specific Nvidia GPU problem that you are dealing with?

Update your post to include full system hardware specs and OS information.

What is the specific GPU/Nvidia problem? Which GPU?
I'm not having a problem with a card right now. I got a 1650 a week or so ago, but I think I'm going to get a 3060 or 3060 Ti because the YouTube channel is finally bringing in some cash, so the system would be more about video editing. I don't know if the 3060's VRAM would actually make a difference for that, and I'm just kind of venting my confusion.
 
People don't seem to be talking about this, but why are the VRAM amounts so strange on the 3000 series cards? 12GB on the 3060, but then 8GB for the 3060 Ti, 3070, and 3070 Ti, then only 10GB on the 3080. Now we've got all these weird variants like the 12GB 2060, 12GB 3080, and 16GB 3070 Ti. What the hell are these numbers? Maybe learn something from AMD.
The 3070 Ti, 3080, and 3090 cards use GDDR6X instead of vanilla GDDR6. This was for performance reasons, as those GPUs need a lot of bandwidth to keep themselves busy. At the time, the only company making GDDR6X (Micron) only had 1GB chips available, with 2GB chips on the roadmap for 2021 (and that was just to get a production run rolling, not necessarily what you'd call full steam ahead). Obviously, NVIDIA couldn't wait a year to release their flagship card, so they made do with what was available.

However, in GDDR6 land, where we have Micron, Samsung, and SK Hynix as manufacturers, 2GB chips were available. Since midrange cards aren't as data hungry, they don't need as much performance, and it's likely that 2GB chips were the more economical solution. AMD can supposedly get away with using GDDR6 in their higher-end cards thanks to Infinity Cache, but who knows how much it actually helps them.

Sometimes you can't pick and choose between what makes sense on paper and what makes sense in reality.
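
To put some rough numbers on the bandwidth side of that, here's a quick sketch using the commonly quoted launch specs (14Gbps GDDR6 on the 3070, 19Gbps GDDR6X on the 3080s); take the figures as approximate rather than gospel:

Code:
# Peak bandwidth = (bus width / 8 bits per byte) * per-pin data rate,
# capacity = (bus width / 32) chips * per-chip density.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

def capacity_gb(bus_width_bits, chip_density_gb):
    return (bus_width_bits // 32) * chip_density_gb

cards = [
    # name, bus width (bits), data rate (Gbps), chip density (GB)
    ("RTX 3070 (GDDR6)",  256, 14.0, 1),
    ("RTX 3080 (GDDR6X)", 320, 19.0, 1),  # only 1GB GDDR6X packages at launch
    ("RTX 3080 12GB",     384, 19.0, 1),
]
for name, bus, rate, density in cards:
    print(f"{name}: {capacity_gb(bus, density)}GB, ~{bandwidth_gb_s(bus, rate):.0f}GB/s")
# RTX 3070 (GDDR6): 8GB, ~448GB/s
# RTX 3080 (GDDR6X): 10GB, ~760GB/s
# RTX 3080 12GB: 12GB, ~912GB/s

So the jump to GDDR6X buys a big chunk of bandwidth, but being stuck with 1GB packages is exactly why the 3080 launched with an odd-looking number like 10GB.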
 