News: Nvidia Confirms RTX 2060 12GB, Expected Launch Imminent

This could really help the market, since it's made with different chips that aren't in such high demand, but all people want to do is complain. What a shame.
I'm just more cynical than you in thinking that NVIDIA and AMD are NOT doing all they can to increase output to meet demand.
From a greed/profit standpoint, why would they? Everything they're currently producing (under already-inked yield contracts) is being bought by AIB partners at (most likely) increased prices.
 
I'm just more cynical than you in thinking that NVIDIA and AMD are NOT doing all they can to increase output to meet demand.
From a greed/profit standpoint, why would they? Everything they're currently producing (under already-inked yield contracts) is being bought by AIB partners at (most likely) increased prices.
I don't think they're doing "everything" they possibly can to increase output, because neither wants to end up in a situation like AMD had with Polaris 20 GPUs. RX 570/580 prices spiked to more than $500, and all the cards were sold out by mid-2017. Nvidia maybe increased production but didn't go whole hog. AMD put in a massive order for more GPUs. In the interim, crypto mining profitability tanked, and by the time AMD got all those Polaris 20 GPUs back and into the AIB partners' hands, round about mid-2018, no one really wanted them any more. The result was that RX 570 4GB cards were going for $120-$140 for the next two years. No other budget GPU could hope to compete, because AMD and its partners were essentially forced to sell at break-even margins just to get rid of the inventory.

Ironically, just about the time RX 570/580 stock finally got cleared out, we had the most recent crypto boom. Bitcoin reached its previous high in late November 2020, and then proceeded to triple that. If AMD and/or its partners still had Polaris 20 GPUs lying around, they could now sell the 8GB models for $300-$500 again. (The current average price for a 570 8GB on eBay is $410, and a 580 8GB is $450.)
 

TJ Hooker

Yeah, based on the gap between retail pricing and street pricing, it looks like AMD/Nvidia could increase their supply a fair bit without having to worry about lowering their prices. If that's the case, any increase in supply they could muster would mean increased profits for them. So I doubt they're deliberately passing up opportunities to produce more GPUs just for the sake of maintaining the current supply/demand imbalance.
 

jacob249358

I'm just more cynical than you in thinking that NVIDIA and AMD are NOT doing all they can to increase output to meet demand.
From a greed/profit standpoint, why would they? Everything they're currently producing (under already-inked yield contracts) is being bought by AIB partners at (most likely) increased prices.
Of course they aren't doing all they can, because it's a business. It's not about being nice, it's about making money. So why sell 1,000 GPUs at $500 each when you can make the same $500,000 selling just 500 GPUs at $1,000 each? (Not real numbers, obviously.) It's actually a rare business opportunity, because there really isn't any competition.
 
I don't think they're doing "everything" they possibly can to increase output, because neither wants to end up in a situation like AMD had with Polaris 20 GPUs. RX 570/580 prices spiked to more than $500, and all the cards were sold out by mid-2017. Nvidia maybe increased production but didn't go whole hog. AMD put in a massive order for more GPUs. In the interim, crypto mining profitability tanked, and by the time AMD got all those Polaris 20 GPUs back and into the AIB partners' hands, round about mid-2018, no one really wanted them any more. The result was that RX 570 4GB cards were going for $120-$140 for the next two years. No other budget GPU could hope to compete, because AMD and its partners were essentially forced to sell at break-even margins just to get rid of the inventory.

Ironically, just about the time RX 570/580 stock finally got cleared out, we had the most recent crypto boom. Bitcoin reached its previous high in late November 2020, and then proceeded to triple that. If AMD and/or its partners still had Polaris 20 GPUs lying around, they could now sell the 8GB models for $300-$500 again. (The current average price for a 570 8GB on eBay is $410, and a 580 8GB is $450.)
Yup.
They're being risk-averse because they are making money hand over fist as it is.
 
Of course they aren't doing all they can, because it's a business. It's not about being nice, it's about making money. So why sell 1,000 GPUs at $500 each when you can make the same $500,000 selling just 500 GPUs at $1,000 each? (Not real numbers, obviously.) It's actually a rare business opportunity, because there really isn't any competition.
Would be great if there were room for a middle ground, no? Where investors could be happy, C-level managers could be happy, retailers could be happy, and end customers could be happy - all at the same time.
 

jacob249358

Would be great if there were room for a middle ground, no? Where investors could be happy, C-level managers could be happy, retailers could be happy, and end customers could be happy - all at the same time.
Yeah, right now everyone is happy except us. I'm just excited about this because I want a cheap RTX card just for some Minecraft with RTX on, but I don't want to spend $600 for a 2060.
 
How many games, out now, have this feature? I don't think the performance benefit comes close to negating the lack of rasterization power, but I'd like to read more on it.
It's hard to say, because some games are explicit about it with an actual option, while others just do it because they can. Also, a cursory look suggests the option I was looking into for Call of Duty ("fill remaining memory") may actually cause issues, but it's hard to say whether this caching is the culprit. However, there's another option called "Shader Preload" which reportedly helps reduce stuttering.

There was also a test I did with GTA V to study VRAM consumption behavior. With all of the VRAM-consuming options turned down, i.e., with the settings screen reporting the lowest amount the game should consume, it still ended up consuming significantly more VRAM after running the benchmark. You can't really call this a "memory leak", because repeated tests afterwards don't cause VRAM consumption to keep increasing. So I can only guess that GTA V will happily keep things in VRAM if there's enough space to do so.
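
If anyone wants to repeat that kind of check, here's a rough sketch of how VRAM usage could be logged around a benchmark run (assuming an Nvidia card and the pynvml bindings; launching the game itself is left as a placeholder):

```python
# Rough sketch: log GPU VRAM usage before and after a benchmark run.
# Assumes an Nvidia GPU and the pynvml bindings (pip install nvidia-ml-py).
import pynvml

def vram_used_mib(handle):
    """Return current VRAM usage in MiB for the given device handle."""
    info = pynvml.nvmlDeviceGetMemoryInfo(handle)
    return info.used / (1024 ** 2)

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

print(f"Before benchmark: {vram_used_mib(gpu):.0f} MiB used")
input("Run the in-game benchmark now, then press Enter... ")  # placeholder for the actual game run
print(f"After benchmark:  {vram_used_mib(gpu):.0f} MiB used")

pynvml.nvmlShutdown()
```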

Do you have any links detailing testing done with this VRAM caching enabled vs. not enabled?
Sometimes you're given no choice in the matter. Just because the option isn't there to toggle it doesn't mean the game isn't actually doing it. And besides, caching results is a common way to speed up calculation-heavy workloads.

EDIT: Any option that deals with caching is likely not going to significantly improve performance; if it did, it wouldn't be optional in the first place. Such options may be there simply to allow people with hardware that can support it to enjoy a subtly better experience.
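
To illustrate what "caching results" means here, a minimal, generic memoization sketch (nothing game-specific, just the general idea):

```python
# Minimal memoization sketch: cache the results of an expensive computation
# so repeated calls with the same inputs skip the recalculation entirely.
from functools import lru_cache

@lru_cache(maxsize=None)
def expensive_term(x: int, y: int) -> int:
    # Stand-in for any calculation-heavy work (shading, physics, etc.).
    total = 0
    for i in range(1_000_000):
        total += ((x * i) ^ y) % 97
    return total

expensive_term(3, 7)  # computed the slow way the first time
expensive_term(3, 7)  # returned instantly from the cache
```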
 
It's hard to say, because some games are explicit about it with an actual option, while others just do it because they can. Also, a cursory look suggests the option I was looking into for Call of Duty ("fill remaining memory") may actually cause issues, but it's hard to say whether this caching is the culprit. However, there's another option called "Shader Preload" which reportedly helps reduce stuttering.

There was also a test I did with GTA V to study VRAM consumption behavior. With all of the VRAM-consuming options turned down, i.e., with the settings screen reporting the lowest amount the game should consume, it still ended up consuming significantly more VRAM after running the benchmark. You can't really call this a "memory leak", because repeated tests afterwards don't cause VRAM consumption to keep increasing. So I can only guess that GTA V will happily keep things in VRAM if there's enough space to do so.


Sometimes you're given no choice in the matter. Just because the option isn't there to toggle it doesn't mean the game isn't actually doing it. And besides, caching results is a common way to speed up calculation-heavy workloads.
Understood. I was just looking for actual gaming tests that show a difference between these options enabled and disabled (a YouTube video or detailed article).

I still don't think that extra caching would cause much of an increase in performance. The RTX 2060 12GB will be starved for memory bandwidth as it is, with its 192-bit interface. Trying to cache more into memory may even slow performance!
 
I still don't think that extra caching would cause much of an increase in performance. The RTX 2060 12GB will be starved for memory bandwidth as it is, with its 192-bit interface. Trying to cache more into memory may even slow performance!
I wouldn't really say that. Caching in this context isn't like, say, L1 or L2 cache in a CPU. It's meant to be "oh, I already did this, so I don't have to calculate the results again." It's going to use the bandwidth whether or not it has to calculate a result.
 

spongiemaster

Of course they aren't doing all they can, because it's a business. It's not about being nice, it's about making money. So why sell 1,000 GPUs at $500 each when you can make the same $500,000 selling just 500 GPUs at $1,000 each? (Not real numbers, obviously.) It's actually a rare business opportunity, because there really isn't any competition.
Nvidia doesn't sell every GPU to the highest bidder. They have contracts with AIBs that don't allow them to raise prices whenever they feel like it just because the market decides to go crazy. The only way for them to make more money is to increase production.

PC GPU market shipments decreased 18.2% sequentially from last quarter and increased 12% year over year.

Overall GPU unit shipments decreased 18.2% from last quarter: AMD shipments decreased 11.4%, Intel's shipments decreased 25.6%, and Nvidia's shipments increased 8.0%.

This was released just a couple of weeks ago and pretty much confirms what has been assumed all along. Intel was prepping for Alder Lake's release, which helped push CPU shipments down 23.1% for the quarter, and almost all of Intel's GPU shipments are iGPUs. AMD, the People's Champion, has given all of you the middle finger and shifted most of its production to consoles and high-end CPUs, because of contracts and because that's where the money is. Good luck finding an AMD GPU in the US that isn't in a prebuilt or console. Nvidia, whose only income source is GPU production, actually managed to increase production 8% for the quarter, because that's what they have to do to make more money.

As Jarred said above, not doing everything possible to maximize production isn't about trying to maintain elevated prices (it's the middlemen and resellers that are truly benefiting here, not the AIBs and manufacturers); it's about making sure they don't end up with warehouses full of unwanted inventory if the mining market crashes. Which is exactly what happened the last time there was a mining bubble.
 
Good luck finding an AMD GPU in the US that isn't in a prebuilt or console. Nvidia, whose only income source is GPU production, actually managed to increase production 8% for the quarter, because that's what they have to do to make more money.
I was at Microcenter three weeks ago and their AMD cabinets were full, but the NVIDIA ones weren't, save for one full of GT 1030s. Though the prices on the AMD GPUs were still at least 50% higher than what they should be.

That might just be a regional thing though.
 

spongiemaster

I was at Microcenter three weeks ago and their AMD cabinets were full, but the NVIDIA ones weren't, save for one full of GT 1030s. Though the prices on the AMD GPUs were still at least 50% higher than what they should be.

That might just be a regional thing though.
Either they got a shipment the day you walked in, or they were way more than 50% over MSRP. The 6800 XT's MSRP is $650, and they're selling for about $1,400-$1,500 on eBay. If they were on sale for under $1,000 as you claim, that's $400 of instant profit on every card. There's no way nobody living near that Microcenter would take advantage of that and let a cabinet full of them just sit there.
 
Either they got a shipment the day you walked in, or they were way more than 50% over MSRP. The 6800 XT's MSRP is $650, and they're selling for about $1,400-$1,500 on eBay. If they were on sale for under $1,000 as you claim, that's $400 of instant profit on every card. There's no way nobody living near that Microcenter would take advantage of that and let a cabinet full of them just sit there.
Yeah, now that I look at it, the 6600s were around 50% over while everything else was around 100% over. Either way, the online store (assuming the listing is still accurate within the past 48 hours) claims they have an assortment of RX 6000s in stock, but no recent NVIDIA video cards are even listed. The latest they have is a 1660.
 
I wouldn't really say that. Caching in this context isn't like, say, L1 or L2 cache in a CPU. It's meant to be "oh, I already did this, so I don't have to calculate the results again." It's going to use the bandwidth whether or not it has to calculate a result.
Maybe.
I'd still like to see real world gaming examples of the performance gain though.
 
As Jarred said above, not doing everything possible to maximize production isn't about trying to maintain elevated prices (it's the middlemen and resellers that are truly benefiting here, not the AIBs and manufacturers); it's about making sure they don't end up with warehouses full of unwanted inventory if the mining market crashes. Which is exactly what happened the last time there was a mining bubble.
The manufacturers and AIBs are most assuredly benefiting from the current supply-demand situation. Maybe not as much as those sly middlemen, but they are benefiting.
 

Eximo

Is it updated with HDMI 2.1 for max-bandwidth 4K HDR? If so, this would be great for an HTPC.

A little overkill, but it wouldn't be the first time. I've used a 750 Ti and a 950, but downgraded as they released new GPUs. Currently running a GT 1030, and that will stay until the next x30 model from Nvidia, or until my great hope of Intel releasing a crippled Arc card with all the latest connectivity comes true: 128 EUs (four times the UHD 770) and 6GB. I still want a DG1, but they only work with certain BIOSes.
 
Either they got a shipment the day you walked in, or they were way more than 50% over MSRP. The 6800 XT's MSRP is $650, and they're selling for about $1,400-$1,500 on eBay. If they were on sale for under $1,000 as you claim, that's $400 of instant profit on every card. There's no way nobody living near that Microcenter would take advantage of that and let a cabinet full of them just sit there.
I have been to the Micro Center in Tustin about 7 times in the last couple of months, and they always have dozens of 6900 XTs and countless 6700 XTs. Just about every time I've been there, there were more than a handful of 6600 XTs and a few 6800 XTs. As for prices, I don't remember specifics, but they were all way over MSRP.
 

Joseph_138

The core is actually going to be the 2060 Super core. I will definitely grab one if I see one for sale anywhere. The 2060 Super/2070 have base performance comparable to my GTX 1080, but can do ray tracing better and use DLSS in supported games to boost frame rates higher. It will also have the added bonuses of reduced heat, a smaller footprint in the case, and lower power consumption.
 
The core is actually going to be the 2060 Super core.
Not exactly.
It will share some of the specs of the Super, but one of the most important, memory bus width, will still be 192-bit. The extra 6GB of memory will barely be utilized; memory bandwidth and the rasterization power of the card will be the limiting factors on performance. The best guess is that the extra 6GB is only there for marketing - to make it look like a new/different card.
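
For a rough sense of the bandwidth ceiling, here's a quick back-of-the-envelope calculation (assuming the same 14 Gbps GDDR6 as the original 2060; treat it as an illustration, not a spec sheet):

```python
# Back-of-the-envelope peak memory bandwidth for a 192-bit GDDR6 card.
# Assumption: 14 Gbps effective data rate per pin, as on the original RTX 2060.
bus_width_bits = 192   # memory interface width
data_rate_gbps = 14    # effective data rate per pin

bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8  # bits -> bytes
print(f"Peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # ~336 GB/s
```

Under that assumption, it's the same roughly 336 GB/s as the original 2060, so the extra capacity doesn't come with any extra bandwidth to feed it.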

Check out the video link I posted above. Steve explains it nicely at around the 9-minute mark.
 
