News Nvidia Reveals The GeForce RTX 2060 12GB GPU's Specifications

... but it will help in modern titles that are pushing the memory envelope beyond 8GB even at 1080p.
I'm not the hardcore gamer some of you are: are there indeed games consuming more memory than this at 1080p resolutions? Obviously memory requirements for textures and models don't scale linearly with resolution, but it would imply that the same game at 4K would require 16GB, or even more.
 
Unless this can be built using a different fab / schedule or they have a crapload of old 20XX GPUs sitting around to make these cards I don't get the point. They really don't need a "new" product, they need production. They are getting ready to announce the 40XX GPU when you still can't reasonably buy a 30XX GPU.
 
I'm not the hardcore gamer some of you are: are there indeed games consuming more memory than this at 1080p resolutions?
Even if games don't need 8GB for themselves, applications open in the background consume VRAM too, so extra VRAM can be useful for them as well: if I let Chrome and Firefox use GPU acceleration, they can each consume over 1GB of VRAM. With no GPU hogs open, I believe I've seen Firefox gobble up to 1.7GB of my GTX 1050's 2GB. Now I'm forcing Chrome and Firefox to use the GT 730 IGP instead to conserve VRAM, so I don't get massive artifacting in games due to low VRAM.

Unless this can be built using a different fab / schedule or they have a crapload of old 20XX GPUs sitting around to make these cards I don't get the point. They really don't need a "new" product, they need production. They are getting ready to announce the 40XX GPU when you still can't reasonably buy a 30XX GPU.
And production is exactly why the RTX2060 is getting resurrected: the RTX2060 is made on TSMC's 12nm, so it doesn't compete with the RTX30xx on Samsung 8nm and everything else on TSMC 7nm. It also has the "benefit" of being less power-efficient, which should help make it somewhat less desirable for mining.
 
They will be using a 12nm fab process, which will not take away from their RTX 3000 or future RTX 4000 series production. This will allow them to sell even more product, since demand will not go away any time soon even if they sell mountains of these cards. I expect most future triple-A games to target RTX 2060 performance at 1080p High detail and 60 frames per second without ray tracing. Hardcore gamers with money, and miners, will still buy all the high-end RTX 3000 and 4000 series cards almost as fast as they can make them unless cryptocurrency crashes. However, as long as the world's governments keep printing money without any restraint, cryptocurrency will continue to gain momentum since, unlike our current system, it can't be printed out of thin air and must be authenticated on the blockchain.
 
Will the MSRP be $300? And everywhere on Earth?
And sold on the market at $250?

Nvidia's own MSRP is most likely around $350 to $400. I think an Nvidia rep said this is more like a premium version of the original RTX 2060, so the base price will reflect that. Some people might say such pricing doesn't make sense since the 3060's MSRP starts at $330, but when Nvidia reintroduced the 2060 6GB and 2060 Super 8GB earlier this year, both cards retained their Turing-era pricing ($300 and $400 respectively) instead of being adjusted by Nvidia to fit the 30-series pricing.
 
Will the MSRP be $300? And everywhere on Earth?
And sold on the market at $250?
The 12GB of VRAM alone costs ~$150 and TSMC jacked up its 12nm wafer prices by 20%. I expect the MSRPs to be $70-100 higher than the original RTX 2060's, mainly to cover increased costs, and street prices will be whatever miners are willing to pay for 'em, as usual.

It doesn't make sense to lower MSRPs when everything that is currently coming out of fabs ends up on the scalper market at 1.5-2X the price.
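The pricing argument above is simple arithmetic; as a sketch, here it is spelled out. The ~$12.50/GB GDDR6 figure is only implied by the "~$150 for 12GB" claim, and the $349 original RTX 2060 MSRP is an added assumption, not something stated in the post:

```python
# Rough bill-of-materials sketch of the poster's pricing argument.
# ~$12.50/GB is an assumption backed out of the "~$150 for 12GB" claim.
def vram_cost(capacity_gb, price_per_gb=12.5):
    """Estimated memory cost in dollars."""
    return capacity_gb * price_per_gb

def msrp_estimate(old_msrp=349, added_low=70, added_high=100):
    """Expected MSRP range: original RTX 2060 launch price plus $70-100."""
    return (old_msrp + added_low, old_msrp + added_high)

print(vram_cost(12))    # 150.0 -> matches the "~$150" memory estimate
print(msrp_estimate())  # (419, 449) -> the expected MSRP window
```

This is only a back-of-the-envelope check of the post's numbers, not a claim about Nvidia's actual component costs.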
 
I'm not the hardcore gamer some of you are: are there indeed games consuming more memory than this at 1080p resolutions? Obviously memory requirements for textures and models don't scale linearly with resolution, but it would imply that the same game at 4K would require 16GB, or even more.
Lol, no. I don't know what the author is on about; I'd love to see some examples of that claim. A lot of games will allocate quite a lot, but actually require it? At 1080p? Not that I'm aware of.
 
12GB on a 2060 is stupid.

They should have just put the 12GB on the 2070. That makes way more sense.

12GB of VRAM on a 2070 makes zero sense. Nvidia would have to cut the memory bus down to 192-bit from 256-bit in order to put 12GB on it. I'm pretty sure the silicon doesn't have a 320-bit bus that gets pared down to 256-bit, and even if it did, that card would begin to encroach on 3060 Ti performance.
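The bus-width argument can be checked with a quick sketch: each GDDR6 package sits on a 32-bit slice of the memory bus, so bus width fixes the chip count, and total capacity is chip count times chip density (1GB and 2GB are the common densities; clamshell configurations are ignored here):

```python
# Each GDDR6 chip occupies a 32-bit slice of the bus, so the number of
# chips -- and hence total capacity -- is fixed by the bus width.
def vram_capacity_gb(bus_width_bits, chip_density_gb):
    chips = bus_width_bits // 32
    return chips * chip_density_gb

print(vram_capacity_gb(192, 2))  # 12 -> 192-bit bus with 2GB chips: 12GB
print(vram_capacity_gb(256, 1))  # 8  -> 256-bit bus gives 8GB...
print(vram_capacity_gb(256, 2))  # 16 -> ...or 16GB, but never 12GB
```

This is why a 12GB card implies a 192-bit bus: on a 256-bit bus the capacity options jump straight from 8GB to 16GB.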
 
The subtitle "RTX. It's on again." is a bit ironic, because the 2060's performance was never quite strong enough to turn on ray tracing and deliver decent performance. Browsing through the Steam survey, the number of gamers with truly RTX-capable cards (2070 or above, including all 3xxx cards) is still remarkably low - only about 10% of gamers. It's no wonder, then, that developers have been slow to implement ray tracing in games, even years after its introduction. The side effect of scoring big on crypto miners and scalpers (to the detriment of gamers) is that Nvidia failed to establish RT gaming in any kind of meaningful way. This re-release of the 2060 really doesn't change any of that.
 
The side effect of scoring big on crypto miners and scalpers (to the detriment of gamers) is that Nvidia failed to establish RT gaming in any kind of meaningful way
The short-run side effect of mining demand for cards means higher prices and less supply for gamers, sure. The long-run effect is exactly the opposite: the increased demand and associated revenue allows AMD and NVidia to amortize costs, particularly fixed costs, across a larger customer base, and thus ultimately results in faster cards at lower prices for everyone.

As for scalpers, every card they sell is purchased by someone who will presumably use it for the same purposes they would have, had it been purchased directly, so I'm not sure why you believe they're impacting the penetration of raytracing into the gaming market.
 
No HDMI 2.1 is just dumb. This could have been the ultimate HTPC GPU while everyone else gravitates to the 3000 series. I really don't see any reason to buy this for any usage. Just be patient and grab a 3060: sign up for alerts at everybody's site and wait it out.
 
The short-run side effect of mining demand for cards means higher prices and less supply for gamers, sure. The long-run effect is exactly the opposite: the increased demand and associated revenue allows AMD and NVidia to amortize costs, particularly fixed costs, across a larger customer base, and thus ultimately results in faster cards at lower prices for everyone.

As for scalpers, every card they sell is purchased by someone who will presumably use it for the same purposes they would have, had it been purchased directly, so I'm not sure why you believe they're impacting the penetration of raytracing into the gaming market.

Faster cards at lower prices? Huh? When has this ever happened? MSRP, if you can even find MSRP, has only been steadily rising. Your theoretical prediction has no correlation at all with reality.

The Steam survey results I mentioned indicate massively low uptake of 3xxx series cards in gaming. I invite you to actually look at the Steam survey results to ground your statement in fact rather than whimsical conjecture.

https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
 
Faster cards at lower prices? Huh? When has this ever happened?
Every year since the dawn of the GPU. Each generation is significantly faster than the one before, and every generation gives a lower cost per performance-unit.

The R&D costs -- much less the foundry costs -- of developing a modern GPU are now well above what can be afforded by the gaming community itself. The majority of those costs are now borne by customers other than gamers: scientific and engineering users, data centers, and yes, crypto miners. Without that revenue, gamers would not be seeing the same steady increases in both speed and performance/dollar.
 
The subtitle "RTX. It's on again." is a bit ironic, because the 2060's performance was never quite strong enough to turn on ray tracing and deliver decent performance. Browsing through the Steam survey, the number of gamers with truly RTX-capable cards (2070 or above, including all 3xxx cards) is still remarkably low - only about 10% of gamers. It's no wonder, then, that developers have been slow to implement ray tracing in games, even years after its introduction. The side effect of scoring big on crypto miners and scalpers (to the detriment of gamers) is that Nvidia failed to establish RT gaming in any kind of meaningful way. This re-release of the 2060 really doesn't change any of that.

Forget about RT (which was introduced in 2018 with DXR); even DX12 adoption is still considered slow, and that API is already more than six years old at this point. At a similar point in DX11's life, the majority of game developers (including indies) had already migrated to the API.
 
Every year since the dawn of the GPU. Each generation is significantly faster than the one before, and every generation gives a lower cost per performance-unit.

The R&D costs -- much less the foundry costs -- of developing a modern GPU are now well above what can be afforded by the gaming community itself. The majority of those costs are now borne by customers other than gamers: scientific and engineering users, data centers, and yes, crypto miners. Without that revenue, gamers would not be seeing the same steady increases in both speed and performance/dollar.

I think you're severely out of touch with the realities of the market over the last few years. I have been buying GPUs since the days of VoodooFX, and never more than now have I felt I'm getting less progress and value for my money. Never mind the scalping and unavailability; even based on MSRP, performance/dollar has dropped, and I'm not the first one to notice it. People are frustrated. Outfits like GN have produced videos about it. Yet you're going to sit here and try to convince me all this profiteering is to our benefit, based on what seems to be a hill of hypothetical economic claptrap.
 
Forget about RT (which was introduced in 2018 with DXR); even DX12 adoption is still considered slow, and that API is already more than six years old at this point. At a similar point in DX11's life, the majority of game developers (including indies) had already migrated to the API.

In my experience, smaller and smaller graphical improvements over time, coupled with higher costs, slow uptake. It's certainly made me less likely to upgrade components. Not that I could even do so in the current market.
 
I have been buying GPUs since the days of VoodooFX, and never more than now have I felt I'm getting less progress and value for my money
Yes, human psychology is quite illogical at times. The original Voodoo1 debuted for $299. Even ignoring entirely its lack of 2D functionality, it provided less than 1/5000 the 3D performance of, say, an RTX 3070. At the same cost/performance ratio, a 3070 should thus cost more than $1.5 million.

Too far back for you? How about the GTX 660, launched 10 years ago at a price of $229. It provides roughly 1/12 the performance of a 3070, which would price the 3070 at $2,800. Even paying scalper prices, you're getting today's cards at a steal. Or how about the GTX 980, launched a little more than 5 years ago at a price of $549. Performance-wise, it's 1/3 of a 3070, which would price the 3070 at $1,650: roughly what you'd pay a scalper, while the card's actual MSRP is only 1/3 of that.
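The implied prices above come from holding an older card's performance/dollar ratio constant; as a sketch, note that the performance ratios (12x over a GTX 660, 3x over a GTX 980) are the poster's rough estimates, not measured benchmarks:

```python
# Implied price of a newer card at an older card's performance/dollar ratio.
# The 12x and 3x performance ratios are the poster's rough figures.
def implied_price(old_msrp, perf_ratio):
    return old_msrp * perf_ratio

print(implied_price(229, 12))  # 2748 -> the "~$2,800" GTX 660 comparison
print(implied_price(549, 3))   # 1647 -> the "~$1,650" GTX 980 comparison
```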

Conclusion: even during the worst pandemic and associated supply-side crunch in over one full century, you're still getting a better performance/dollar ratio than at any other time in history. Logic and facts trump emotional hysteria. Is the pace of that price/performance ratio improvement slowing? Of course. Semiconductors and GPUs are a more mature industry. Take a look at how little autos, airlines, or air conditioners have improved over the past 20 years. The semi industry is following the same curve. They're not all out to get you.
 
Conclusion: even during the worst pandemic and associated supply-side crunch in over one full century, you're still getting a better performance/dollar ratio than at any other time in history.
While overall performance per dollar may be better than at any previous time in history, the fact is that 20 years ago, you could get a cheaper GPU that was twice as fast every other year and now you have to wait 3-5 years to double the performance for 40-100% higher prices. For the most part, price-performance progression has slowed down drastically from what it used to be.

Also, you used to be able to game decently well on a $150 GPU whereas today, you cannot get anything new worth buying under $300 and those aren't even twice as fast as $150 GPUs from 3-4 years ago. Major regression at the lower-end.
 
While overall performance per dollar may be better than at any previous time in history, the fact is that 20 years ago, you could get a cheaper GPU that was twice as fast every other year and now you have to wait 3-5 years
Of course. That's the traditional curve of a maturing industry. When I first began buying CPUs 35 years ago, you could buy one twice as fast for less money in less than one year's time. In the early years of the auto industry, horsepower and top speed doubled on models every 2-3 years as well, and airplanes followed a similar curve. It's a temporary phenomenon, though for those who have lived their entire lives in this era, seeing it come to a close may be an earth-shattering event.
 
It's a temporary phenomenon, though for those who have lived their entire lives in this era, seeing it come to a close may be an earth-shattering event.
While people like to blame "rising costs from covid" for the price increases, much of the increase actually comes from a lack of effective competition: for the nearly 10 years that AMD was nowhere near competitive, we got incremental price increases from Intel for little to no performance gain. Then Ryzen arrived, and Intel's entire product stack shifted one tier down each year, with the mainstream (i5, $250-300) going from 4C/4T to 6C/6T, 6C/12T, and now 10C/16T: 3-4X the performance in the span of four years at mostly unchanged prices.

In a healthy competitive market, you cannot have a net profit margin much over 10% before competitors jump in for a slice of the pie. Due to their effective monopoly positions, AMD, Intel, Nvidia, etc. are aiming for 30-40% net income - that is after R&D expenses and taxes.
 
Lol no.. I don't know what the author is on about. Would love to see some examples of that claim. A lot of games will allocate quite a lot but actually require? At 1080p? Not that I'm aware of.
Battlefield 2042, 1080p ultra with ray tracing. Not happening on the 6GB card.
The 12GB of VRAM alone costs ~$150 and TSMC jacked up its 12nm wafer prices by 20%...
I seriously doubt that Nvidia or AMD are paying anywhere near $150 for 12GB of GDDR6. That might be the open market price, but mass buyers probably pay half that.