News Nvidia Confirms RTX 2060 12GB, Expected Launch Imminent

This is so stupid. Why are we doing this? Any game that will legit use 12GB of VRAM will be bottlenecked by the 2060's little baby core. Now the top model of the 20 series has less VRAM than the base model. There's a reason the original 2060 only had 6GB. Yay, more VRAM for 4K textures, so we can play 4K games at 3 FPS lol
 
This is so stupid. Why are we doing this? Any game that will legit use 12GB of VRAM will be bottlenecked by the 2060's little baby core. Now the top model of the 20 series has less VRAM than the base model. There's a reason the original 2060 only had 6GB. Yay, more VRAM for 4K textures, so we can play 4K games at 3 FPS lol
It's not just 4K gaming, though. Some games are pushing beyond 8GB of VRAM use even at 1080p. Sure, the RTX 2060 won't do 60+ fps in most of those, but with more VRAM it could potentially do more than 30 fps in a lot of games. Pricing is going to be a key consideration. Real price, not MSRP. If Nvidia can provide enough GPUs to keep prices below $400, that would be great.
 
This is so stupid. Why are we doing this? Any game that will legit use 12GB of VRAM will be bottlenecked by the 2060's little baby core. Now the top model of the 20 series has less VRAM than the base model. There's a reason the original 2060 only had 6GB. Yay, more VRAM for 4K textures, so we can play 4K games at 3 FPS lol
It may be cheaper and easier on the supply chain to source 2GB GDDR6 chips, especially since the 3060 and 3050 already use 2GB chips.

Consider this: if an AIB is planning to make this 2060 and is already making 3060s and 3050s, it can use its existing stock of 2GB GDDR6 chips rather than place a new order for 1GB chips, whose supply has probably long been exhausted.
 
This is so stupid. Why are we doing this? Any game that will legit use 12GB of VRAM will be bottlenecked by the 2060's little baby core. Now the top model of the 20 series has less VRAM than the base model. There's a reason the original 2060 only had 6GB. Yay, more VRAM for 4K textures, so we can play 4K games at 3 FPS lol

This may come as a great shock to you, but video cards can be used for more than mining and gaming.

I do 3D art, and this is exactly the kind of card I can use.
 
"For now, all we have is an official confirmation of the card, with no word yet from Nvidia on the actual specifications. Obviously, it will feature double the memory of the original RTX 2060 that launched several years ago and remains as the lowest tier of the RTX series (unless you could the mobile RTX 3050 and 3050 Ti). We're hoping to get a card for testing, to see what the added memory (and potentially other changed specs) do for its standing in our GPU benchmarks hierarchy. "

Perhaps it was meant to say: unless you count......
 
Could be good, assuming the GDDR6 shortage has been fixed; otherwise it's not going to help much.
One thing I was wondering about with the increased memory: would they also increase the width of the memory bus, so they could use slower memory but still deliver the same bandwidth? That could let them use memory that has little other use than in this product.

Edit: This is either going to be great or disappointing, guess we may find out on the 7th (or not) 🤔
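The bandwidth trade-off raised above is easy to sanity-check with arithmetic: peak bandwidth is (bus width in bits / 8) × effective data rate. A minimal sketch, where the wider-but-slower configuration is a hypothetical example, not a leaked spec:

```python
def gddr6_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return bus_width_bits / 8 * data_rate_gbps

# Original RTX 2060: 192-bit bus with 14 Gbps GDDR6.
print(gddr6_bandwidth_gbs(192, 14.0))   # 336.0 GB/s

# Hypothetical wider bus with slower chips, delivering the same bandwidth:
print(gddr6_bandwidth_gbs(256, 10.5))   # 336.0 GB/s
```

So in principle a wider bus paired with slower (perhaps otherwise unwanted) chips could match the original card's bandwidth, at the cost of a bigger memory controller and more board complexity.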
 
This is Nvidia's latest attempt to squeeze every last cent from the gaming market. This card is for the truly desperate. So glad I have two working video cards that I will hold on to till this GPU price gouging of the gaming market stops.
 
Great. Just what we need. A half-baked solution whose only purpose is to keep RTX 30 series cards overpriced 🙁
 
This is so stupid. Why are we doing this? Any game that will legit use 12GB of VRAM will be bottlenecked by the 2060's little baby core. Now the top model of the 20 series has less VRAM than the base model. There's a reason the original 2060 only had 6GB. Yay, more VRAM for 4K textures, so we can play 4K games at 3 FPS lol

Wake up, see the world...

It's not stupid. The stock 2060 has just 6GB, and that's insufficient for many modern games even at 1080p.
 
Wake up, see the world...

It's not stupid. The stock 2060 has just 6GB, and that's insufficient for many modern games even at 1080p.
It all depends on how much you're willing to compromise between FPS and details. Even 4GB is still workable in most games at low/lowest details.

That said, an RTX 2060 is powerful enough to make reasonable use of at least 8GB, and 8GB wouldn't be a typical option on a 192-bit bus.
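The 192-bit constraint mentioned above comes from how GDDR6 chips attach to the bus: each chip presents a 32-bit interface, so a 192-bit bus takes six chips, and with 1GB or 2GB chips the natural capacities are 6GB or 12GB; 8GB doesn't divide evenly. A rough sketch (ignoring clamshell/mixed-density layouts):

```python
def capacity_options_gb(bus_width_bits: int, chip_sizes_gb=(1, 2)):
    """Each GDDR6 chip has a 32-bit interface; total VRAM = chip count * chip size."""
    chips = bus_width_bits // 32
    return [chips * size for size in chip_sizes_gb]

print(capacity_options_gb(192))  # [6, 12] -- no clean 8GB option on 192 bits
print(capacity_options_gb(256))  # [8, 16]
```

Which is why the choice for a 192-bit card is essentially 6GB or 12GB, nothing in between.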
 
The thing about VRAM is that it's not JUST used as a framebuffer; it also stores in-use assets, plus assets not currently in use but kept around for quick loading later. If more room for the framebuffer is required, currently unused assets are paged to RAM (if space is available) and then to the page file on the hard drive, which increases loading time when those assets are needed again.

Also, as newer, more computationally efficient ray tracing algorithms are developed and become accessible to lower-end hardware without DLSS, the extra VRAM will give these cards the space to use those effects, since they're quite the VRAM hog.
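The paging behaviour described above can be pictured as a simple LRU cache: assets stay resident in VRAM until space runs out, then the least-recently-used ones get evicted to slower storage and cost a transfer to bring back. A toy model (sizes and asset names are made up for the sketch):

```python
from collections import OrderedDict

class AssetCache:
    """Toy model of VRAM asset residency with LRU eviction to system RAM."""

    def __init__(self, vram_mb: int):
        self.capacity = vram_mb
        self.used = 0
        self.resident = OrderedDict()  # asset name -> size in MB, oldest first
        self.evicted = []              # paged out; reloading costs a transfer

    def load(self, name: str, size_mb: int):
        if name in self.resident:            # already resident: mark recently used
            self.resident.move_to_end(name)
            return
        # Evict least-recently-used assets until the new one fits.
        while self.used + size_mb > self.capacity and self.resident:
            victim, vsize = self.resident.popitem(last=False)
            self.used -= vsize
            self.evicted.append(victim)
        self.resident[name] = size_mb
        self.used += size_mb

cache = AssetCache(vram_mb=6)
for asset, size in [("terrain", 3), ("trees", 2), ("boss", 2)]:
    cache.load(asset, size)
print(list(cache.resident))  # ['trees', 'boss'] -- 'terrain' was paged out
```

More VRAM simply means the eviction loop fires less often, so fewer assets take the slow round-trip through system RAM or the page file.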
 
This is so stupid. Why are we doing this? Any game that will legit use 12GB of VRAM will be bottlenecked by the 2060's little baby core. Now the top model of the 20 series has less VRAM than the base model. There's a reason the original 2060 only had 6GB. Yay, more VRAM for 4K textures, so we can play 4K games at 3 FPS lol

Consumers asked for it, haha. Anyway, I think Nvidia's initial plan was to give all the 30 series cards more VRAM; the existence of a 3080 Ti 20GB prototype is direct evidence of this. But then mining happened. Since the GPUs would sell out anyway, Nvidia changed its plans. Next year the demand from mining might soften a bit, so they'll probably put more VRAM on the 30 series refresh that was supposed to happen right after AMD launched their GPUs; hence we're hearing about a 3070 Ti 16GB and a 3080 12GB. As for why the 3080 only gets a 12GB upgrade while the slower 3070 Ti gets 16GB: the 3080 is based on the GA102 chip, and rather than turning GA102 dies into 3080s, Nvidia probably prefers to use the majority of them for the more expensive 3090 (or the upcoming 3090 Ti).
 
Wake up, see the world...

It's not stupid. The stock 2060 has just 6GB, and that's insufficient for many modern games even at 1080p.

If you want to push everything to ultra, then yes, 6GB is not enough. But realistically, the majority of games already look good enough even at medium settings; it's not like you're going to see a night-and-day difference between medium and ultra. In the end, PC gaming is all about flexibility.
 
Ridiculous GPU release, even worse than the 12GB RTX 3060.
The rasterization power of this RTX 2060 will be used up well before you get close to using all 12GB of VRAM.
Developers are looking at caching results, which can boost performance. For example, one of the features of Unreal Engine 5 is that it can cache indirect lighting results, which helps with global illumination. Someone also mentioned that this could benefit ray tracing by caching BVH structures.

Lots of games do this already. I mean heck, for a while (I'm not sure if it's still there), Call of Duty had an option literally called "Fill remaining VRAM."
 
Developers are looking at caching results, which can boost performance. For example, one of the features of Unreal Engine 5 is that it can cache indirect lighting results, which helps with global illumination. Someone also mentioned that this could benefit ray tracing by caching BVH structures.

Lots of games do this already. I mean heck, for a while (I'm not sure if it's still there), Call of Duty had an option literally called "Fill remaining VRAM."
How many games, out now, have this feature? I don't think the performance benefit comes close to negating the lack of rasterization power, but I'd like to read more on it.
Do you have any links detailing testing done with this VRAM caching enabled vs. not enabled?
 
How many games, out now, have this feature? I don't think the performance benefit comes close to negating the lack of rasterization power, but I'd like to read more on it.
Do you have any links detailing testing done with this VRAM caching enabled vs. not enabled?
It will definitely come down to price vs. the regular RTX 2060 (and RTX 2060 Super), but assuming there are more than a handful of GPUs using the 12GB configuration, having another option isn't bad. Rasterization and general compute will definitely be a concern, but performance should land somewhere between the RTX 2060 and RTX 2060 Super, and lack of VRAM should never be a concern given the other specs. Basically, we're looking at a GPU that likely isn't far off the RX 6600 in terms of performance -- maybe 2-4% slower overall.