If they're not scalped, these might be OK entry-level 1080p cards that can run games at high settings. I can see them competing on price with the 6600s.
> Another thing to keep in mind is that GPUs do not exist in a vacuum: every finished GPU coming off the AiB manufacturer's assembly line needs a BGA substrate to put the GPU die on, a PWM controller, power stages for the various GPU supply rails, VRAM chips, a PCB, etc., so every RTX 2060 12GB coming out would be competing against the entire RTX 3000 series for GDDR6 and a bunch of support components.

This is the explanation that would have been great in the article. A question that follows is: are there problems with Samsung's production of the 30xx chips, or is it just that Nvidia can't get more production out of them but is selling everything that gets made?
> If they're not scalped, these might be OK entry-level 1080p cards that can run games at high settings. I can see them competing on price with the 6600s.

If you have money for a 6600(XT), you can mine the sofa for an extra $80 and get a 3060Ti.
Weren't the Turing cards going into the CMP line? So now they are dumping them on gamers? I certainly hope it is not because they are selling Ampere to miners and then giving gamers the leftovers.
> If you have money for a 6600(XT), you can mine the sofa for an extra $80 and get a 3060Ti.

Why would you do that? Recent tests show the 6600XT to be as fast as the 3060 (non-Ti) for the same price, while the Ti, which is indeed faster, costs almost 300 bucks more.
As for what the problem with Samsung is: last I heard, it's simply that Samsung's 8nm process isn't that good.
> Why would you do that? Recent tests show the 6600XT to be as fast as the 3060 (non-Ti) for the same price, while the Ti, which is indeed faster, costs almost 300 bucks more.

Going by the lowest-priced resellers on NewEgg.com, since most models aren't in stock at any of the primary sources: the cheapest 6600XT is $780, the cheapest RTX 3060 Ti is $860, and the cheapest RTX 3060 is $890. So, among the GPUs you can order through NewEgg right now, there really is only an $80 difference between the 6600XT and the 3060 Ti, and the non-Ti is more expensive, at least for now.
> Going by the lowest-priced resellers on NewEgg.com, since most models aren't in stock at any of the primary sources: the cheapest 6600XT is $780, the cheapest RTX 3060 Ti is $860, and the cheapest RTX 3060 is $890. So, among the GPUs you can order through NewEgg right now, there really is only an $80 difference between the 6600XT and the 3060 Ti, and the non-Ti is more expensive, at least for now.

This really must depend on the country: the cheapest 6600XT I can find here is €500, the cheapest 3060 is €510, and the cheapest 3060 Ti is €560, and that one is an underclocked Ti (its boost clock is about 14% lower). So, OK for the sofa mining, but 1) you have to actually manage to get the card (there are several 6600XTs available for €500 and only one 3060 Ti under €650, all of them, of course, out of stock), 2) at least at minimum spec'ed performance (a 3060 Ti at stock clocks is €600 at least, but there are several models), and 3) at non-scalper prices (I'm using the only website I could find that sticks to MSRP for its models).
Definitely not looking good for the 2060 super-duper being remotely affordable by normal lower-end standards.
> Heck, if AMD could roll out a bunch of RX 570's, 580's or 5500XT's at MSRP I would be all over those for builds…

AMD is still producing 580's, but they are going to miners.
> As for the 2060 12GB, if you can find it in stock for a decent price, eh, go for it if there isn't anything else available. Heck, if AMD could roll out a bunch of RX 570's, 580's or 5500XT's at MSRP I would be all over those for builds; at this point it's amazing if you can find anything near MSRP or around the $200 mark.

If Nvidia really wanted to re-launch older GPUs to leverage 12nm fabs, it should have relaunched the 1650S/1660S instead... then again, that might not be possible due to memory manufacturers having migrated most of their GDDR5 production to GDDR6(X).
> In any case, since anything with 6GB of VRAM is just about as good as anything else with 6GB of VRAM at mining ETC, there is basically no chance in hell of seeing anything with 6GB or more of VRAM for a reasonable price until ETC goes PoS as its primary validation mechanism or something crashes the crypto market.

Any Turing GPU isn't going to have a mining limiter. One way to make them less attractive to miners is to reduce profitability by adding costs that don't benefit mining; extra memory that does nothing but draw more power will also decrease mining efficiency.
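To make that profitability argument concrete, here is a minimal sketch of the payback-time math miners tend to use; the card price and the $3/day earnings figure are made-up placeholders for illustration, not market data:

```python
# Miners judge a card by payback time: purchase cost / daily net profit.
# Dollars spent on parts that add no hash rate stretch payback directly.

def payback_days(card_cost: float, daily_profit: float) -> float:
    """Days until the card has paid for itself."""
    return card_cost / daily_profit

# Hypothetical figures, for illustration only.
base = payback_days(300, 3.0)    # $300 card earning $3/day -> 100 days
padded = payback_days(380, 3.0)  # same card + $80 of extra VRAM -> ~127 days

print(f"payback without extra VRAM: {base:.0f} days")
print(f"payback with $80 of extra VRAM: {padded:.0f} days")
# Same hash rate, ~27% longer payback, plus slightly higher power draw.
```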
> Any Turing GPU isn't going to have a mining limiter. One way to make them less attractive to miners is to reduce profitability by adding costs that don't benefit mining; extra memory that does nothing but draw more power will also decrease mining efficiency.

Extra memory is actually extremely useful for mining: one DAG needs about 5GB of VRAM, so a 6GB GPU can only work on one DAG at a time while a 12GB GPU can work on two. The 12GB RTX 2060 has 6×32-bit memory channels, so running two DAGs concurrently would exercise two of the six channels at any one time and almost double the hash rate.
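For anyone who wants to sanity-check that claim, a quick back-of-the-envelope sketch; the ~5GB DAG size and the six-channel layout are taken from the comment above, and, as the replies below argue, the doubling conclusion itself is disputed:

```python
# Sanity check of the "two DAGs in 12GB" arithmetic from the comment above.
# Assumed figures, from the comment rather than from measurement:
DAG_GB = 5.0      # approximate Ethash DAG size
VRAM_GB = 12.0    # RTX 2060 12GB
CHANNELS = 6      # 192-bit bus = 6 x 32-bit channels

copies = int(VRAM_GB // DAG_GB)
print(f"DAG copies that fit: {copies}")               # 2

# The comment's model: each DAG walk touches ~1 channel at a time,
# so two concurrent walks keep 2 of the 6 channels busy.
busy = min(copies, CHANNELS)
print(f"channels claimed busy: {busy} of {CHANNELS}")
# The replies below dispute this model: Ethash reads are scattered
# across the whole bus, so one miner already saturates every channel.
```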
> Extra memory is actually extremely useful for mining: one DAG needs about 5GB of VRAM, so a 6GB GPU can only work on one DAG at a time while a 12GB GPU can work on two. The 12GB RTX 2060 has 6×32-bit memory channels, so running two DAGs concurrently would exercise two of the six channels at any one time and almost double the hash rate.

That's not how mining works. You can't run four Ethereum miners concurrently on a 3090 and quadruple the hash rate.
Persisting into 2022 and lasting till 2023 are both correct statements.
You can read more about how Timelines work at https://en.wikipedia.org/wiki/Timeline
> That's not how mining works. You can't run four Ethereum miners concurrently on a 3090 and quadruple the hash rate.

The memory-size scaling stops once you have enough VRAM to run enough concurrent attempts to consume most of the VRAM bandwidth.
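That bandwidth ceiling is easy to estimate: each Ethash hash does 64 random DAG fetches of 128 bytes, roughly 8KB of DRAM traffic per hash, so the hash rate tops out near memory bandwidth divided by 8KB. A sketch using spec-sheet bandwidth figures:

```python
# Ethash is bandwidth-bound: ~64 random 128-byte DAG reads per hash.
BYTES_PER_HASH = 64 * 128   # ~8 KiB of DRAM traffic per hash

def ethash_ceiling_mhs(bandwidth_gb_s: float) -> float:
    """Theoretical Ethash hash-rate ceiling in MH/s for a given bandwidth."""
    return bandwidth_gb_s * 1e9 / BYTES_PER_HASH / 1e6

print(f"RTX 3090 (936 GB/s): ~{ethash_ceiling_mhs(936):.0f} MH/s")  # ~114
print(f"RTX 2060 (336 GB/s): ~{ethash_ceiling_mhs(336):.0f} MH/s")  # ~41
# The 2060's ceiling is set by its 192-bit bus, so the 12GB model
# hashes no faster than the 6GB one once the DAG fits in VRAM.
```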
His comment was likely in relation to SSG's comment about extra VRAM being great for artists. Yes, most 'pro' GPUs will game fine, though people paying 2X, 3X, 5X or even 10X as much for those driver certifications and extra VRAM usually don't have gaming as a primary concern.
waste of good RAM....
> The price of 12 is probably not a lot higher than 6, anyway, if you can even still get memory chips with a density that low.

1GB of GDDR6 costs $12-15, so an extra 6GB adds $70-90 to the manufacturing cost, and 1GB is still a very common chip density, since that is how you get 8GB on a 256-bit bus or 12GB on a 384-bit one. For a GPU that is allegedly intended to sell for around $300, that is a pretty large chunk of cost.
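To put numbers on both halves of that: capacity is pinned to bus width (one GDDR6 chip per 32-bit channel), and the extra chips carry real BOM cost. A quick sketch using the per-GB price quoted above; the 6×2GB layout for the 12GB card is an assumption that follows from the 192-bit bus:

```python
# GDDR6 capacity follows bus width: one chip per 32-bit channel.
def capacity_gb(bus_bits: int, chip_gb: int) -> int:
    """Total VRAM for a given bus width and per-chip density."""
    return (bus_bits // 32) * chip_gb

print(capacity_gb(192, 1))  # 6  -> original RTX 2060 (6 x 1GB)
print(capacity_gb(192, 2))  # 12 -> RTX 2060 12GB (6 x 2GB, assumed)
print(capacity_gb(256, 1))  # 8  -> 8GB on a 256-bit bus
print(capacity_gb(384, 1))  # 12 -> 12GB on a 384-bit bus

# Added BOM cost of the extra 6GB at the quoted $12-15 per GB:
print(f"extra memory: ${6 * 12}-${6 * 15}")  # $72-$90
```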