News: Nvidia Might Resurrect the RTX 2060 With 12GB of VRAM

This is the explanation that would have been great in the article. A question that follows: are there problems with Samsung's production of the 30xx chips, or is it just that Nvidia can't get more production out of them and is selling everything that gets made?
Another thing to keep in mind is that GPUs do not exist in a vacuum: every finished GPU coming off of the AIB manufacturer's assembly line needs a BGA substrate to put the GPU die on, a PWM controller, power stages for the various GPU supply rails, VRAM chips, a PCB, etc., so every RTX 2060 12GB coming out would be competing against the entire RTX 3000 series for GDDR6 and a bunch of support components.

As for what the problem with Samsung is, last I heard, it simply is that Samsung's 8nm isn't so good.

If they're not scalped, these might be OK entry-level 1080p cards that can handle high settings. I can see this competing in price with the 6600s.
If you have money for a 6600(XT), you can mine the sofa for an extra $80 and get a 3060Ti.
 
Weren't the Turing cards going into the CMP line? So now they are dumping them on gamers? I certainly hope it is not because they are selling Ampere to miners and then giving gamers the leftovers.

In any case, I feel this card, if it's real, is not going to be cheap. TSMC has already announced a 20% price increase for older nodes, and with GDDR6 prices where they are now, it is going to be quite a bit more expensive.
 
Yeah, to some people 12GB might be useless, but right now this 2060 12GB is not an official product from Nvidia yet. Depending on the situation early next year, Nvidia might just as well retain its 6GB configuration, just like the 3080 Ti that was initially planned with 20GB of VRAM. Some "prototypes" of the 20GB model do exist, but in the end Nvidia decided to give the card only 12GB.

Weren't the Turing cards going into the CMP line? So now they are dumping them on gamers? I certainly hope it is not because they are selling Ampere to miners and then giving gamers the leftovers.

Doesn't really matter, because the ones that were made into true gaming cards by AIBs (not CMP cards) were also being sold by AIBs directly to miners. The way I see it, to prevent this from happening Nvidia would have to cut out AIBs and distributors completely. That means Nvidia would have to create its own distribution system instead of relying on third parties, and sell its non-AIB cards through its own physical stores.
 
Another thing to keep in mind is that GPUs do not exist in a vacuum: every finished GPU coming off of the AIB manufacturer's assembly line needs a BGA substrate to put the GPU die on, a PWM controller, power stages for the various GPU supply rails, VRAM chips, a PCB, etc., so every RTX 2060 12GB coming out would be competing against the entire RTX 3000 series for GDDR6 and a bunch of support components.

As for what the problem with Samsung is, last I heard, it simply is that Samsung's 8nm isn't so good.


If you have money for a 6600(XT), you can mine the sofa for an extra $80 and get a 3060Ti.
Why would you do that? Recent tests show the 6600 XT to be as fast as the 3060 (non-Ti) for the same price, while the Ti, which is indeed faster, costs almost 300 bucks more.
Mine your sofa that much and you'll end up sitting in your basement.
 
Why would you do that? Recent tests show the 6600 XT to be as fast as the 3060 (non-Ti) for the same price, while the Ti, which is indeed faster, costs almost 300 bucks more.
Going by the lowest-priced resellers on NewEgg.com, since most models aren't in stock at any of the primary sources, the cheapest 6600 XT is $780, the cheapest RTX 3060 Ti is $860, and the cheapest RTX 3060 is $890. So, between the GPUs you can order through NewEgg right now, there really is only an $80 difference between the 6600 XT and the 3060 Ti, and the non-Ti is more expensive, at least for now.

Definitely not looking good for the 2060 super-duper being remotely affordable by normal lower-end standards.
 
Going by the lowest-priced resellers on NewEgg.com, since most models aren't in stock at any of the primary sources, the cheapest 6600 XT is $780, the cheapest RTX 3060 Ti is $860, and the cheapest RTX 3060 is $890. So, between the GPUs you can order through NewEgg right now, there really is only an $80 difference between the 6600 XT and the 3060 Ti, and the non-Ti is more expensive, at least for now.

Definitely not looking good for the 2060 super-duper being remotely affordable by normal lower-end standards.
This really must depend on the country: the cheapest 6600 XT I can find here is €500, the cheapest 3060 is €510, and the cheapest 3060 Ti is €560, and that one is an underclocked Ti (its boost clock is 14% lower). So, OK for the sofa mining, but 1) you have to actually manage to get the card (there are several 6600 XTs available for €500, only one 3060 Ti under €650, and all of them are, of course, out of stock), 2) at least at minimum spec'ed performance (a non-OC 3060 Ti is €600 at least, but there are several models), and 3) at non-scalper prices (I'm using the only website I could find that uses MSRP for its models).
 
Another thing to keep in mind is that GPUs do not exist in a vacuum: every finished GPU coming off of the AIB manufacturer's assembly line needs a BGA substrate to put the GPU die on, a PWM controller, power stages for the various GPU supply rails, VRAM chips, a PCB, etc., so every RTX 2060 12GB coming out would be competing against the entire RTX 3000 series for GDDR6 and a bunch of support components.

As for what the problem with Samsung is, last I heard, it simply is that Samsung's 8nm isn't so good.


If you have money for a 6600(XT), you can mine the sofa for an extra $80 and get a 3060Ti.

If you can find one. I've at least seen several RX 6600 XTs in the wild; I've rarely seen a 3060 Ti.

As for the 2060 12GB: if you can find it in stock for a decent price, eh, go for it if there isn't anything else available. Heck, if AMD could roll out a bunch of RX 570s, 580s or 5500 XTs at MSRP, I would be all over those for builds; at this point it's amazing if you can find anything near MSRP or around the $200 mark. Honestly, I'd even take some equally ancient GTX 1060 6GBs. They perform well enough for most games at 1080p medium, but they're still competing for resources like VRMs, display controllers, etc. with the newer cards, unfortunately; thankfully not for the RAM, though, since they're GDDR5.
 
As for the 2060 12GB: if you can find it in stock for a decent price, eh, go for it if there isn't anything else available. Heck, if AMD could roll out a bunch of RX 570s, 580s or 5500 XTs at MSRP, I would be all over those for builds; at this point it's amazing if you can find anything near MSRP or around the $200 mark.
If Nvidia really wanted to re-launch older GPUs to leverage 12nm fabs, it should have relaunched the 1650S-1660S instead... then again, that might not be possible due to memory manufacturers having migrated most of their GDDR5 production to GDDR6(X).

In any case, since anything with 6GB of VRAM is just about as good as anything else with 6GB of VRAM at mining ETC, there is basically no chance in hell of seeing anything with 6GB or more VRAM for a reasonable price until ETC goes PoS as its primary validation mechanism or something crashes the crypto market.
 
In any case, since anything with 6GB of VRAM is just about as good as anything else with 6GB of VRAM at mining ETC, there is basically no chance in hell of seeing anything with 6GB or more VRAM for a reasonable price until ETC goes PoS as its primary validation mechanism or something crashes the crypto market.
No Turing GPU is going to have a mining limiter. One way to make them less attractive to miners is to reduce profitability by adding costs that don't benefit mining. Extra memory that does nothing but use more power will also decrease efficiency.
 
No Turing GPU is going to have a mining limiter. One way to make them less attractive to miners is to reduce profitability by adding costs that don't benefit mining. Extra memory that does nothing but use more power will also decrease efficiency.
Extra memory is actually extremely useful for mining: one DAG needs about 5GB of VRAM, so a 6GB GPU can only work on one DAG at a time while a 12GB GPU can work on two. The 12GB RTX 2060 has six 32-bit memory channels, so running two DAGs concurrently would exercise two out of six channels at any one time and almost double the hash rate.
 
Extra memory is actually extremely useful for mining: one DAG needs about 5GB of VRAM, so a 6GB GPU can only work on one DAG at a time while a 12GB GPU can work on two. The 12GB RTX 2060 has six 32-bit memory channels, so running two DAGs concurrently would exercise two out of six channels at any one time and almost double the hash rate.
That's not how mining works. You can't run four Ethereum miners concurrently on a 3090 and quadruple the hash rate.
 
Persisting into 2022 and lasting till 2023 are both correct statements.

You can read more about how Timelines work at

https://en.wikipedia.org/wiki/Timeline

Yes, both statements are correct, but persisting into 2022 and lasting till mid-2023 are very different timelines. If I were to say the shortage would last beyond next week or beyond the end of the year, those statements would also be correct, but they would not be very accurate.
 
That's not how mining works. You can't run four Ethereum miners concurrently on a 3090 and quadruple the hash rate.
The memory size scaling stops once you have enough VRAM to run enough concurrent attempts to consume most VRAM bandwidth.

A 6GB GPU only has enough VRAM to run one attempt at a time (4.5GB DAG + 1GB pad = 5.5GB), which isn't enough to consistently occupy all six memory channels. Doubling the hash rate was a bit of an exaggeration, since there is bound to be some memory channel contention between concurrent attempts, but there will definitely be non-trivial gains.
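Just to put rough numbers on the two constraints being argued about, here is a quick back-of-the-envelope sketch in Python. It only uses the ~4.5GB DAG and ~1GB of working space mentioned above plus the published Ethash access pattern (64 random 128-byte DAG reads per hash); the 336 GB/s bandwidth figure is an assumption roughly matching a 192-bit, 14 Gbps GDDR6 card, not a measurement of any real board.

# Back-of-the-envelope model: how many DAG contexts fit in VRAM, and what the
# memory bandwidth would allow if every byte of it went to DAG reads.
DAG_GB = 4.5                 # approximate DAG size at the time (assumption)
PAD_GB = 1.0                 # per-instance working space quoted in the post
BYTES_PER_HASH = 64 * 128    # Ethash: 64 random 128-byte DAG reads per hash

def instances_that_fit(vram_gb):
    """How many independent DAG contexts fit in a given amount of VRAM."""
    return int(vram_gb // (DAG_GB + PAD_GB))

def bandwidth_ceiling_mhs(bandwidth_gbs):
    """Hash-rate ceiling in MH/s if all bandwidth were spent on DAG reads."""
    return bandwidth_gbs * 1e9 / BYTES_PER_HASH / 1e6

for vram_gb in (6, 12):
    print(f"{vram_gb}GB card: fits {instances_that_fit(vram_gb)} DAG instance(s), "
          f"bandwidth ceiling ~{bandwidth_ceiling_mhs(336):.0f} MH/s")

The ceiling comes out the same for both capacities, which is really the crux of the disagreement here: the extra 6GB only buys anything if a single DAG walk can't keep all six channels busy on its own.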
 
His comment was likely in relation to SSG's comment about extra VRAM being great for artists. Yes, most 'pro' GPUs will game fine, though people paying 2X, 3X, 5X or even 10X as much for those driver certifications and extra VRAM usually don't have gaming as a primary concern.

Not all 3D artists are working at Industrial Light & Magic. Most of us aren't. We are running bottom-end hardware and software: Blender, Daz Studio, Poser, etc. Those of us at the hobbyist end aren't investing in Quadros. I found it easier (and cheaper) to build out my own personal render farm. For the price of a 1080, I got four HP Z210s, added an SSD, maxed out the RAM, and away I went.
 
Waste of good RAM...

It makes no sense to release a 6GB card today. That's why they released the 3060 with 12GB instead of 6. The price of 12GB is probably not a lot higher than 6GB anyway, if you can even still get memory chips with a density that low. They may not be able to get the chips for the 6GB configuration anymore. They may also be using the 3060 board design to take advantage of economies of scale: if they already have a bunch of 3060 boards with 12GB mounted, they just have to find a way to make the 2060 chip fit, instead of having to go back to the old board design. Another possibility is that these could be binned Ampere chips with so many defects that they are effectively down to a 2060 level of performance.
 
The price of 12GB is probably not a lot higher than 6GB anyway, if you can even still get memory chips with a density that low.
1GB of GDDR6 costs $12-15, so an extra 6GB adds $70-90 to the manufacturing cost, and 1GB chips (6x1GB for 6GB) are still a very common size, as that is how you get 8GB on a 256-bit bus or 12GB on a 384-bit bus. For a GPU that is allegedly intended to sell for around $300, that is a pretty large chunk of cost.
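For anyone who wants to sanity-check the arithmetic, here is a minimal sketch in Python. It uses the $12-15/GB figure quoted above (a rough estimate, not an official price list), a hypothetical $300 target price, and the one-chip-per-32-bit-channel rule behind the 8GB/256-bit and 12GB/384-bit examples.

# Sanity check of the VRAM cost argument, using the post's rough numbers.
GDDR6_COST_PER_GB = (12, 15)   # assumed low/high $ per GB of GDDR6
EXTRA_GB = 6                   # going from a 6GB to a 12GB configuration
TARGET_PRICE = 300             # rumored retail target (assumption)

low, high = (EXTRA_GB * cost for cost in GDDR6_COST_PER_GB)
print(f"Extra VRAM: ${low}-${high}, i.e. {low / TARGET_PRICE:.0%}-"
      f"{high / TARGET_PRICE:.0%} of a ${TARGET_PRICE} card")

# Channel math from the same paragraph: one 32-bit channel per GDDR6 chip.
for bus_bits in (256, 384):
    chips = bus_bits // 32
    print(f"{bus_bits}-bit bus -> {chips} x 1GB chips = {chips}GB")

That works out to $72-90 (the "$70-90" range above, rounded off), which is roughly a quarter or more of the whole $300 target.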

There is no such thing as GPU manufacturers sitting on stacks of PCBs with 12GB already on them. The GPU, VRAM and all the other SMD components on a board get placed and soldered at the same time; they don't do partial assembly. Also, the RTX 2000 series is on 12nm TSMC while the RTX 3000 series is on 8nm Samsung, so the GPUs' electrical characteristics are likely too different for them to be drop-in replacements even if Nvidia decided to use the same pinout.