News Nvidia Reveals The GeForce RTX 2060 12GB GPU's Specifications

I seriously doubt that Nvidia or AMD are paying anywhere near $150 for 12GB of GDDR6. That might be the open market price, but mass buyers probably pay half that.
Well, all of the newest estimates I have seen say the current prices for GDDR6 are in the $13-16/GB range. My $150 figure was based on contract prices (you usually need a large sustained volume to qualify for those) dating back to 2019, when it was $12/GB.
 
Why the 2060 and not something half good like the 2070 super.
TU104 (2070S): 545sqmm
TU106 (2060S): 445sqmm

The 2070S is 40+% more expensive to produce at the wafer level, since there is a 20+% higher chance of a defect per chip and 20+% fewer possible chips per wafer.
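To put rough numbers on that, here is a back-of-the-envelope sketch. The 300mm wafer size and the 0.1 defects/cm² density are my assumptions for illustration, not published figures, and the Poisson yield model is one common approximation among several:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Rough gross-die estimate for a circular wafer (ignores scribe lines)."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2: float, defect_density_per_cm2: float = 0.1) -> float:
    """Poisson yield model: fraction of dies with zero defects."""
    return math.exp(-defect_density_per_cm2 * die_area_mm2 / 100)

for name, area in [("TU106 (445 mm^2)", 445), ("TU104 (545 mm^2)", 545)]:
    good = dies_per_wafer(area) * yield_rate(area)
    print(f"{name}: ~{dies_per_wafer(area)} gross dies, ~{good:.0f} good dies per wafer")
```

With these assumed inputs the smaller die comes out around 40% cheaper per good chip, which is roughly the gap described above.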

Since the 2060 re-launch is intended to appease people looking for something they can buy for a remotely reasonable price, it makes sense to go with the lowest-end part that is still relevant and the 2060 is the lowest-end part in the 2000 family.
 
Well, all of the newest estimates I have seen say the current prices for GDDR6 are in the $13-16/GB range. My $150 figure was based on contract prices (you usually need a large sustained volume to qualify for those) dating back to 2019, when it was $12/GB.
But that's what I'm saying: I think $12/GB or $13-$16 per GB are really just for "large enough" contract orders. Nvidia is the only company using GDDR6X, so everything Micron makes goes to Nvidia. They almost certainly hashed out a deal where Nvidia only pays about 10% over cost for millions of GDDR6X chips. Between AMD (which includes Xbox/PS5), Nvidia, and Intel, that's basically all of the GDDR6 allocation as well, and again I'd expect they pay just a bit more than cost. The $12/GB price is likely for what's left over for other companies to use -- maybe someone making a new Bitcoin or Crypto ASIC, or an FPGA, or whatever.

I don't think the cost of making GDDR6 is fundamentally much higher than making DDR4 or DDR5 memory. They're all variants of DRAM. I'd be very surprised if Nvidia/AMD paid more than about $75 for 12GB of GDDR6, but of course neither they nor their suppliers would ever divulge that information. Spot prices (e.g. at places like Digi-Key) are for the open market, and they're far more impacted by short-term supply and demand. Right now, Digi-Key shows a price of $26.22 for a 16Gb GDDR6 chip, but only 1260 are available. That's enough for just 210 RTX 2060 12GB, RTX 3060, or RX 6700 XT cards, and would put the memory cost at $157.32 per card.
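For anyone checking the arithmetic, the figures above follow directly from the quoted Digi-Key listing (6 × 2GB chips per 12GB card):

```python
# Spot-market sanity check on the numbers above (Digi-Key listing as quoted).
price_per_chip = 26.22   # 16Gb (i.e. 2GB) GDDR6 chip, spot price in USD
chips_in_stock = 1260
chips_per_card = 6       # a 12GB card needs 6 x 2GB chips

cards_possible = chips_in_stock // chips_per_card
memory_cost_per_card = price_per_chip * chips_per_card

print(f"Cards possible from stock: {cards_possible}")             # 210
print(f"GDDR6 cost per 12GB card:  ${memory_cost_per_card:.2f}")  # $157.32
```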

If a third-party AIB is desperate for more GDDR6, and it intends to sell its cards at 50% or more over the nominal MSRP, sure, it might pay that much. But Nvidia has likely shipped several million RTX 30-series GPUs in the past year. That's several orders of magnitude more volume than what's currently listed at Digi-Key, and agreements would have been hashed out a year or more in advance. It's like AMD's supply of Xbox Series S/X and PlayStation 5 SoCs and components. The total retail cost of the console is basically the same as the actual bill of materials, because Sony and Microsoft plan to make up the difference on software. But AMD knows it will be providing tens of millions of those chips over the lifetime of the consoles, so it can make a deal for 10% over cost.
 
The $12/GB price is likely for what's left over for other companies to use
Contract pricing is when you want to secure a steady bulk supply at a steady price. You are bidding directly against AMD and Nvidia for a chunk of the manufacturers' output, similar to how AMD, Nvidia, Intel, Apple, etc. bid against each other for TSMC wafers.

The leftovers are what lands on the spot market where prices can actually be lower than contract since spot-market prices come with no guarantees on pricing and availability.
 
The short-run side effect of mining demand for cards is higher prices and less supply for gamers, sure. The long-run effect is exactly the opposite: the increased demand and associated revenue allow AMD and Nvidia to amortize costs, particularly fixed costs, across a larger customer base, and thus ultimately result in faster cards at lower prices for everyone.
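The amortization point is easy to see with a toy model. Every number here is invented purely for illustration; actual R&D budgets and marginal costs are not public:

```python
# Toy illustration of fixed-cost amortization; all figures are made up.
FIXED_COSTS = 2_000_000_000   # architecture R&D, mask sets, driver work, etc.
MARGINAL_COST = 150           # per-card manufacturing cost

def break_even_price(units_sold: int) -> float:
    """Price per card needed to recover fixed plus marginal costs."""
    return FIXED_COSTS / units_sold + MARGINAL_COST

for units in (5_000_000, 20_000_000):
    print(f"{units:>10,} cards -> ${break_even_price(units):,.0f} per card to break even")
```

Quadrupling the customer base cuts the fixed-cost share per card by 4x, which is the mechanism the paragraph above describes.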

As for scalpers, every card they sell is purchased by someone who will presumably use it for the same purposes they would have, had it been purchased directly, so I'm not sure why you believe they're impacting the penetration of raytracing into the gaming market.
My reply to you from the other article, which got closed just as I pressed post; it is still relevant to this discussion.

To be clear, I may disagree with your views on the economics of consumer graphics cards, but it is not as if I do not believe in fundamental free-market ideals. I do not consider a consumer dGPU a luxury good when our lives have become so intertwined with technology. I consider electricity, at this point, a public utility, something even the poorest people should have the ability to access. The ability to access and pay for a PC has become essential in society. If you look at schooling outcomes for people who do and do not have a PC in the household, you will find that those with access to a PC get better, higher-paying jobs, graduate high school at a higher rate, and are more likely to go to and finish college. Given this as a base, I believe that a dGPU is not a luxury item meant only for people with disposable income. As you have implied in previous posts that I and others are just emotional bleeding hearts, please try to at least gain perspective from our views.

I also want to touch on the issue of how the unintended consequences of scarcity of supply for current generation cards affect all graphics cards and their prices. Before the 3000 and 6000 series graphics cards came out there were still plenty of cards on the market whilst in the pandemic and crypto was still booming, but you could still obtain a 2000 or 5000 series card for at or below MSRP. So what happened? Well, from my perspective, there were a few things that led to the market we see now.

Firstly, a mostly paper launch of both the 3000 and 6000 series cards, before the major component shortages really set in, with a promised once-in-a-few-generations performance uplift at a reasonable MSRP, really hyped up the market for the products. Secondly, when a new line of graphics cards comes out, production of the previous cards ceases, because companies do not want to cannibalize their new product launches. Here is the thing about that, though: if there are no 3000 series cards to buy because it is a paper launch, then you are leaving untold thousands of people waiting to finish a PC build, which explodes demand. A lot of people waited months only to find out that they were likely not going to get a new card, so they started to gobble up the limited stock of previous-generation cards, generally in order from most performant to least. Thirdly, crypto miners were extremely enthused about the performance per watt of the new cards for mining, adding to the strained supply. At first Nvidia and AMD did not care, because they were selling everything they could ship, until they realized that end users were not able to get the cards anymore. Botting of cards at launch went largely unmitigated for months, and the people getting the cards to sell for a profit were largely selling to miners in bulk, because the miners knew they would get their money back eventually and saw grossly overpriced cards as a long-term investment rather than a liability.

In conclusion, I am sure I missed a point or two, but my main point is there. The industry's negative feedback loops have led to nearly impossible-to-get new-generation cards, which drove demand onto previous-generation cards that were under-manufactured and are now also impossible to find at a decent price relative to their age and performance. And this does not just affect the prices of the new and prior generations of cards, as we can see: literally anything with a display output is selling at outrageous prices compared to its performance and age. This is bolstered even further by the fact that AMD's 5000 series CPUs, which were/are industry-leading performance monsters, do not have an iGPU. I am sorry for the wall of text, but I hope this leads you and others to believe that I, at least, have a more nuanced opinion on the matter than what I believe I previously displayed. I am sure I missed a few poignant points to support my opinions, but alas, I am not trying to become a writer. Oh well.
 
I do not consider a consumer GPU a luxury good when our lives have become so intertwined with technology. I consider electricity, at this point, a public utility, something even the poorest people should have the ability to access. The ability to access and pay for a PC has become essential in society. If you look at schooling outcomes for people who do and do not have a PC in the household, you will find that those with access to a PC get better, higher-paying jobs, graduate high school at a higher rate, and are more likely to go to and finish college. Given this as a base, I believe that a GPU is not a luxury item meant only for people with disposable income. As you have implied in previous posts that I and others are just emotional bleeding hearts, please try to at least gain perspective from our views.
It seems you're conflating "GPU" with "discrete, gaming graphics card". Finding a CPU/computer with integrated graphics at a reasonable price doesn't seem to be an issue.

Before the 3000 and 6000 series graphics cards came out there were still plenty of cards on the market whilst in the pandemic and crypto was still booming, but you could still obtain a 2000 or 5000 series card for at or below MSRP.
Crypto got a small bump around August 2020, but only really took off near the end of the year, after the Geforce 30/RX 6000 releases.
 
Before the 3000 and 6000 series graphics cards came out there were still plenty of cards on the market whilst in the pandemic and crypto was still booming, but you could still obtain a 2000 or 5000 series card for at or below MSRP. So what happened? Well, from my perspective, there were a few things that led to the market we see now.
The main things that happened there are that Nvidia was too quick to discontinue the 16xx/20xx to clear space for the 3000 series, made on Samsung's not-quite-ready 8nm, while AMD had to discontinue the RX 5000 and Ryzen 3000 series because it needed all of the 7nm wafers it had available for the quadruple whammy of Ryzen 5xxx, RX 6xxx, Xbox Series X, and PS5.

And then you throw the pandemic and crypto resurgence on top of that a couple of months later while everything was already in short supply.
 
It seems you're conflating "GPU" with "discrete, gaming graphics card". Finding a CPU/computer with integrated graphics at a reasonable price doesn't seem to be an issue.


Crypto got a small bump around August 2020, but only really took off near the end of the year, after the Geforce 30/RX 6000 releases.
Firstly, that is semantics. GPU and graphics card can be used interchangeably, even though I agree that the hardline definition separates the two. Also, gaming is only one of the functions of a dGPU; professional workloads et cetera come into the picture, as do multi-monitor setups with high-refresh-rate panels. Secondly, crypto got a bump around August and didn't take off until after the launches because that was when more cards became available to mine on. Correct me if my logic is wrong.
 
Battlefield 2042, 1080p ultra with ray tracing. Not happening on the 6GB card.


I would think that card is hitting other limitations first, before its VRAM is legitimately maxed out on ultra settings in a brand-new AAA title, especially with RT on. With 12GB, it will certainly hit its rasterization limitations before even coming close to utilizing all of that memory. Seems pretty overkill.
 
Firstly, that is semantics. GPU and graphics card can be used interchangeably, even though I agree that the hardline definition separates the two. Also, gaming is only one of the functions of a dGPU; professional workloads et cetera come into the picture, as do multi-monitor setups with high-refresh-rate panels.
Your statements about people not being able to get a PC implies there is a shortage of all GPUs. This is false, as there is no shortage of iGPUs (as far as I can tell, please correct me if I'm wrong). As such, there is no issue getting a PC in general, as dGPUs are not required for a PC. Meaning the impacts of not having a PC available growing up/while in school is moot in this context.

Saying someone needs a dGPU for professional work, or for multi-monitor/high-refresh-rate setups, is very different from saying someone needs a dGPU for education/childhood development (which is what I was replying to).

Crypto got a bump around August and didn't take off until after the launches because that was when more cards became available to mine on. Correct me if my logic is wrong.
Crypto price determines mining, not the other way around. The massive price increase is what drove the massive mining increase, and that only happened towards the end of the year. Even if the new cards hadn't come out yet, miners would have been buying up every card they could to mine, likely resulting in a shortage regardless. But my point was that the period "before the 3000 and 6000 series graphics cards came out there were still plenty of cards on the market whilst in the pandemic and crypto was still booming" didn't really exist, because crypto wasn't booming until after the release of those cards.
 
I would think that card is hitting other limitations first, before its VRAM is legitimately maxed out on ultra settings in a brand-new AAA title, especially with RT on. With 12GB, it will certainly hit its rasterization limitations before even coming close to utilizing all of that memory. Seems pretty overkill.
https://www.tomshardware.com/news/battlefield-2042-pc-performance-benchmarks-settings

Based on my testing, at 1080p ultra, the RTX 3060 gets 106 fps and the RTX 2060 gets 95 fps. Turn on DXR and the 3060 performance drops 17% to 89 fps. The RTX 2060 drops nearly 80% to 20.5 fps. Based on the other GPUs, if it had more VRAM, it could still do about 75-80 fps, but because it has 6GB it tanks. Not even DLSS can help, because the problem is all the RT stuff consumes memory along with the textures.

Now, I'm not saying the RTAO in BF2042 is the best example of a game that benefits from ray tracing, but it could have been a different game where it would have mattered more. And it's not just RTAO.

BF2042 runs at 80fps at 1440p ultra on the 3060, but 10 fps on the 2060. Normally, when VRAM capacity isn't the bottleneck, the 3060 is only about 20% faster than the RTX 2060, meaning 1440p with DLSS Quality should have allowed the 2060 to run 1440p ultra at 75-80 fps, but instead it dropped to just 45 fps.

And again, I'm not just pointing at Battlefield 2042. There are other games that exhibit similar behavior (Godfall immediately springs to mind), and with the latest consoles having more RAM, the number of games that will push well beyond 8GB is only going to grow. A 20% drop in performance, from a 3060 to a 2060 12GB, is not insurmountable. A 50-80% drop, though? Yeah, you'll need to do a lot more than just tweak a couple of settings. The RTX 2060 is actually still a very competent gaming GPU, and with 12GB it eliminates its biggest bottleneck. It won't ever be faster than an RTX 3060, but I will be surprised if an 8GB RTX 3050 will be able to keep up (assuming a 128-bit memory interface).
 
Your statements about people not being able to get a PC implies there is a shortage of all GPUs. This is false, as there is no shortage of iGPUs. As such, there is no issue getting a PC in general, as dGPUs are not required for a PC. Meaning the impacts of not having a PC available growing up/while in school is moot in this context.

Saying someone needs a dGPU for professional work, or for multi-monitor/high-refresh-rate setups, is very different from saying someone needs a dGPU for education/childhood development (which is what I was replying to).


Crypto price determines mining, not the other way around. The massive price increase is what drove the massive mining increase, and that only happened towards the end of the year. Even if the new cards hadn't come out yet, miners would have been buying up every card they could to mine, likely resulting in a shortage regardless. But my point was that the period "before the 3000 and 6000 series graphics cards came out there were still plenty of cards on the market whilst in the pandemic and crypto was still booming" didn't really exist, because crypto wasn't booming until after the release of those cards.
I have since corrected my original writing, as my original intent was to say dGPU but I said GPU. That is what I meant by saying it was semantics. Have you seen some of the educational games they have kids playing at school? Normal games have been scientifically shown to develop brains in many ways not otherwise possible. To your point on crypto, prices of crypto always explode after major GPU releases; please look at a long-term graph for bitcoin and compare it to the release dates for the RTX 2000 series. That coincides almost exactly with what happened with the 3000 series launch.
 
https://www.tomshardware.com/news/battlefield-2042-pc-performance-benchmarks-settings

Based on my testing, at 1080p ultra, the RTX 3060 gets 106 fps and the RTX 2060 gets 95 fps. Turn on DXR and the 3060 performance drops 17% to 89 fps. The RTX 2060 drops nearly 80% to 20.5 fps. Based on the other GPUs, if it had more VRAM, it could still do about 75-80 fps, but because it has 6GB it tanks. Not even DLSS can help, because the problem is all the RT stuff consumes memory along with the textures.

Now, I'm not saying the RTAO in BF2042 is the best example of a game that benefits from ray tracing, but it could have been a different game where it would have mattered more. And it's not just RTAO.

BF2042 runs at 80fps at 1440p ultra on the 3060, but 10 fps on the 2060. Normally, when VRAM capacity isn't the bottleneck, the 3060 is only about 20% faster than the RTX 2060, meaning 1440p with DLSS Quality should have allowed the 2060 to run 1440p ultra at 75-80 fps, but instead it dropped to just 45 fps.

And again, I'm not just pointing at Battlefield 2042. There are other games that exhibit similar behavior (Godfall immediately springs to mind), and with the latest consoles having more RAM, the number of games that will push well beyond 8GB is only going to grow. A 20% drop in performance, from a 3060 to a 2060 12GB, is not insurmountable. A 50-80% drop, though? Yeah, you'll need to do a lot more than just tweak a couple of settings. The RTX 2060 is actually still a very competent gaming GPU, and with 12GB it eliminates its biggest bottleneck. It won't ever be faster than an RTX 3060, but I will be surprised if an 8GB RTX 3050 will be able to keep up (assuming a 128-bit memory interface).
I felt the 3060 was very anemic for the price at MSRP on launch. I guess I got too used to the xx70 cards costing around $330-350 and performing around 15% better than the equivalent xx60.
 
To your point on crypto, prices of crypto always explode after major GPU releases; please look at a long-term graph for bitcoin and compare it to the release dates for the RTX 2000 series. That coincides almost exactly with what happened with the 3000 series launch.
That's merely correlation, not causation. Bitcoin has been around since 2009, with plenty knowing about it after 2011. Only the RTX 30 and GTX 10 series launches had price spikes — RTX 20 series did not.
 
I felt the 3060 was very anemic for the price at MSRP on launch. I guess I got too used to the xx70 cards costing around $330-350 and performing around 15% better than the equivalent xx60.
Nvidia muddies the waters a lot with more GPU models in some series. Not counting OEM models and overseas stuff, here are the past several generations of GPUs. I'm estimating performance in some of the older cases (700 series), but mostly using the data that I see in the benchmarks I've run for the GPU Hierarchy. I've listed launch prices just for kicks.

700-series, six major GTX models:
GTX 750, $119
GTX 750 Ti (20~25% faster), $149
GTX 760 (50~70% faster), $249
GTX 770 (25~35% faster), $399
GTX 780 (15~25% faster), $649
GTX 780 Ti (20~25% faster), $699

900-series, five major GTX models:
GTX 950 (50~75% faster than GTX 750), $159
GTX 960 (15~25% faster), $199
GTX 970 (50~60% faster), $329
GTX 980 (15~25% faster), $549
GTX 980 Ti (20~30% faster), $649

10-series, eight major GTX models:
GTX 1050 (0~5% faster than GTX 950), $109
GTX 1050 Ti (20~30% faster), $139
GTX 1060 3GB (40~50% faster), $199
GTX 1060 6GB (15~20% faster), $249
GTX 1070 (35~45% faster), $379
GTX 1070 Ti (10~15% faster), $449
GTX 1080 (5~10% faster), $599
GTX 1080 Ti (25~35% faster), $699

GTX 16-series, four major GTX models (1660 Ti ~= 1660 Super):
GTX 1650 (65~75% faster than GTX 1050), $149
GTX 1650 Super (30~40% faster), $159
GTX 1660 GDDR5 (10~20% faster), $219
GTX 1660 Super (10~20% faster), $229

RTX 20-series, seven major RTX models:
RTX 2060 (15~25% faster than GTX 1660 Super), $349
RTX 2060 Super (10~15% faster), $399
RTX 2070 (~5% faster), $499
RTX 2070 Super (10~15% faster), $499
RTX 2080 (~5% faster), $699
RTX 2080 Super (5~10% faster), $699
RTX 2080 Ti (15~20% faster), $1,199 (the $999 models were mostly unobtainium)

RTX 30-series, seven major RTX models:
RTX 3060 (20~25% faster than RTX 2060), $329
RTX 3060 Ti (25~30% faster), $399
RTX 3070 (~10% faster), $499
RTX 3070 Ti (5~10% faster), $599
RTX 3080 (10~20% faster), $699
RTX 3080 Ti (5~10% faster), $1,199
RTX 3090 (0~5% faster), $1,499

There's been an upward creep in pricing on some of the GPU tiers: the 700-series, which launched in 2013, had the 760 at $250, while the 3060 is now a $329 part. But when you factor in performance, most of the upgrades up to the xx70/xx70 Ti models can be reasonably justified.
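As a side note, those "X% faster than the previous entry" figures chain multiplicatively, so the gap across a whole stack is bigger than it first looks. A quick sketch using the midpoints of the 30-series ranges listed above (my arithmetic, not additional benchmark data):

```python
# Chain the per-step uplifts for the RTX 30 stack (RTX 3060 = 1.0x baseline).
# Each value is the midpoint of the "~X% faster" range quoted in the list above.
UPLIFTS = [("RTX 3060 Ti", 0.275), ("RTX 3070", 0.10), ("RTX 3070 Ti", 0.075),
           ("RTX 3080", 0.15), ("RTX 3080 Ti", 0.075), ("RTX 3090", 0.025)]

def chained_performance(uplifts):
    """Return each card's cumulative performance relative to the baseline."""
    perf, out = 1.0, {}
    for model, gain in uplifts:
        perf *= 1 + gain
        out[model] = perf
    return out

for model, rel in chained_performance(UPLIFTS).items():
    print(f"{model}: ~{rel:.2f}x an RTX 3060")
```

By this chaining, a 3090 lands at roughly 1.9x a 3060, which lines up with the per-step percentages even though no single step exceeds ~30%.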

What's crazy to me is that, right now, the average eBay price on a GTX 950 2GB card is only slightly below the launch price — $130 vs. $160, after six years. The GTX 970, with an average price of just $210, is a much better option, or the GTX 980 for $250. That's still about the same performance as the GTX 1060 6GB... except when memory capacity becomes a problem, which it certainly can in newer releases.
 
Firstly, that is semantical. GPU and graphics card can be used interchangeably
In general, yes. But we weren't speaking of GPUs in general, but rather the high-end bleeding-edge cards which are presently being scalped. Such cards are indeed luxury goods.

Also gaming is one of the functions of a dGPU. Professional workloads et cetera come into the picture as well as multi-monitor setups.
Professional users tend to use professional cards. And even when they use consumer-grade hardware, they are typically far less sensitive to price. As for multi-monitor setups, those are best served by (very cheap) multi-port 2D cards, which can easily drive three, four, or more monitors.

Have you seen some of the educational games they have kids playing at school? Normal games have been scientifically shown to develop brains in many ways normally not possible without.
I don't wish to put words in your mouth, but are you suggesting that GPU shortages and scalpers' prices are causing long-term brain dysfunction among the youth of the nation?
 
It's not even correlation, but rather simple coincidence. The old "correlation is not causation" saw typically applies to situations in which a strong correlation signal exists, but due to confounding factors, rather than a direct causal relationship.
Keep looking into GPU releases relative to bitcoin's value; it's clearly linked. Cards release, then the value of bitcoin peaks within 2-3 months of the release. It happened with the 3000, 2000, 1000, and 900 series launches. Even for a coincidence, that would be very interesting.
 
Keep looking into GPU releases relative to bitcoin's value; it's clearly linked. Cards release, then the value of bitcoin peaks within 2-3 months of the release. It happened with the 3000, 2000, 1000, and 900 series launches.
Out of morbid curiosity, I checked June 2016, the GTX 1000 series launch window. Bitcoin's price then was ~$550, and it didn't experience a significant pullback until December 2017, at which point it was over $15,000.

If you're going to count insignificant dips of 2-3% as "peaking", then Bitcoin, and essentially every stock, bond, and commodity in existence, "peaks" several times per year.
 
There are reports that Nvidia has cancelled the Founders Edition of the 2060. Another thing is that the release is supposed to be some time this week, but I have yet to see it listed anywhere.
 
Out of morbid curiosity, I checked June 2016, the GTX 1000 series launch date. Bitcoin's price then was ~$550, and it didn't experience a significant pullback for 18 months, Dec 2017, at which point it was over $15,000.

If you're going to count insignificant dips of 2-3% as "peaking", then Bitcoin, and essentially every stock, bond, and commodity in existence, "peaks" several times per year.
The 1000 series cards came out May 27, 2016, when bitcoin was worth approximately 500 dollars. In the following 18.5 months its value sharply increased to over 16,000. I have no idea where you got this information. Immediately within 1 month of the release it went up 1000 dollars. As the more efficient cards released in the following months, the value continued to go up. My personal belief is that graphics card launches bring e-coins into public conversation, thus increasing interest, which is what bitcoin is basically propped up on, just as stocks are propped up by the general consensus of speculation, up or down.

edited because I made a dumb mistake on the date.
 