[SOLVED] 20 GB VRAM 3080's?


farmfowls

Distinguished
Jul 23, 2014
So I know there have been rumors/confirmations(?) that a 20 GB VRAM SKU for the 3080 exists but am trying to understand something. As far as I am aware (could be wrong), games aren't using even 10 GB VRAM (isn't it around 6?), so why would NVIDIA make a 20 GB version? Is it just for the "big numbers" to lure people in? Is there actual reason to wait for a 20 GB card vs trying to get one's hands on the 10 GB cards that are almost impossible to get now anyway?

I didn't think that having more VRAM (past what you need) improved performance or offered anything extra in terms of gaming. Has this changed? Or is it just for people who hold onto cards for a few years and maybe in that time there will be need for VRAM past 10 GB? I for one am looking to upgrade before I head back to school and won't be able to upgrade again for a while after this. In this case, is waiting for a 20 GB 3080 the smarter choice vs trying to get a 10 GB one now?

I seem to remember a Gigabyte (I think it was them) leak where something was said that maybe the 20 GB VRAM cards could be clocked higher? I'm not sure so correct me if I'm wrong. I would assume that if a 20 GB card would come, it'd be in a year from now, so is it worth waiting?

Thoughts?
 
Solution
Here is what Tom's Hardware said about the 10GB VRAM in its 3080 review:

"
GeForce RTX 3080: Is 10GB VRAM Enough?

In the past few weeks since the RTX 30-series announcement, there have been quite a few discussions about whether the 3080 has enough memory. Take a look at the previous generation with 11GB, or the RTX 3090 with 24GB, and 10GB seems like it's maybe too little. Let's clear up a few things.
There are ways to exceed using 10GB of VRAM, but it's mostly via mods and questionably coded games — or running a 5K or 8K display. The problem is that a lot of gamers use utilities that measure allocated memory rather than actively used memory (e.g., MSI Afterburner), and they see all of their VRAM being sucked...
Something related to the topic at hand: if there is going to be a 20 GB version of the 3080, it likely isn't going to be the current 3080 with 20GB. Keep in mind there are 14 SMs between the 3080 and 3090. There's plenty of wiggle room for an updated 3080. If it happens, it'll likely be within a year.

So that means it does in fact use the allocated memory? That's contradictory to what is being said.
Apps have to first ask the OS for memory. Almost every memory allocation system will give the app more memory than it requested because of the bookkeeping overhead to mark sections of memory as "owned" by an app. So rather than give the app just enough memory, which it will likely blow through, memory allocation systems will give the app a lot more than needed. That doesn't mean however that the app will actually use it.

To give a real-world example, let's say you're planning a gathering at a restaurant. You invite 20 people: 10 say they're definitely showing up, 5 say they can't make it, and the rest say they're not sure. When you call the restaurant to make a reservation, it's logical to ask for a table for 15 people, and the restaurant will reserve a spot for that many.

When the day comes, those 10 people who said they'd show up do. One person on the "maybe" list shows up, but the rest bail. You still have 15 seats at the restaurant, even though you're only using 11. If the restaurant starts to fill up to capacity, they may take away those 4 empty seats; otherwise, they'll leave them for you.
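The same over-allocation happens in ordinary software. As a minimal sketch (using CPython's list growth as a stand-in for a VRAM allocator — not how GPU drivers actually work), you can watch the reported allocation stay ahead of what's actually needed:

```python
import sys

# CPython lists over-allocate: capacity grows in chunks, so the memory
# "owned" by the list is usually larger than what its elements need.
base = sys.getsizeof([])          # bookkeeping overhead of an empty list
lst = []
sizes = []
for i in range(100):
    lst.append(i)
    sizes.append(sys.getsizeof(lst))

# The reported size stays flat between growth steps: the allocator handed
# the list more slots than it asked for, like the reserved restaurant seats.
distinct_sizes = sorted(set(sizes))
print(f"appends: 100, distinct allocation sizes: {len(distinct_sizes)}")
```

A monitoring tool that only reads the "reserved" number (here, `sys.getsizeof`) would report far more than the elements actually in use, which is the same trap as reading allocated VRAM in Afterburner.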

See, and that is where the benefits of higher VRAM speed come into play. The game can switch textures much faster and make quicker changes to the data currently held in the allocated VRAM space, without causing delays that would show up as lower performance.
A lot of that is streamed in and out of VRAM anyway. Games have ways to continue running more or less smoothly even under less-than-ideal conditions. GTAV for instance can continue to run even if assets don't load in fast enough, though you may have oddities like chunks of the world not being rendered.
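A toy sketch of that streaming idea, assuming a simple LRU eviction policy with made-up texture names and sizes (no real engine works exactly like this):

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU cache: evicts least-recently-used textures when a budget
    (a stand-in for VRAM capacity) would be exceeded. Hypothetical sketch,
    not how any real engine manages VRAM."""

    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.used = 0
        self.textures = OrderedDict()  # name -> size_mb, oldest first

    def request(self, name, size_mb):
        if name in self.textures:
            self.textures.move_to_end(name)   # mark as recently used
            return "hit"
        # Stream out the least recently used textures until the new one fits.
        while self.used + size_mb > self.budget and self.textures:
            _, evicted_size = self.textures.popitem(last=False)
            self.used -= evicted_size
        self.textures[name] = size_mb
        self.used += size_mb
        return "miss"

cache = TextureCache(budget_mb=100)
cache.request("rock", 40)
cache.request("tree", 40)
cache.request("rock", 40)          # hit, refreshes "rock"
cache.request("building", 40)      # evicts "tree", the least recently used
print(sorted(cache.textures))
```

The point is that the working set, not the total asset size, is what has to fit; when it doesn't, something gets evicted and you see pop-in rather than a crash.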
 

mjbn1977

Distinguished
Is there any software that can accurately measure VRAM being utilized?

Nothing easy to use comes to mind. I dabble a lot in Unreal Engine, and within the engine you have tools that give you all kinds of stats (literally hundreds) for debugging and optimization, including detailed info about VRAM use and allocation. By the way, it's easy to fill the game world with too many objects real quick and kill performance due to too many draw calls and high-res textures. But that is what optimization is for. I'm working on a small open fantasy world using mostly 4K textures, and my problem is mostly the draw call count for vegetation. I don't run into too many VRAM issues, at least not with 8GB, and my world isn't much optimized at all.....
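As a rough illustration of why mostly-4K textures don't necessarily blow an 8GB budget, here's some back-of-envelope arithmetic (uncompressed RGBA8 assumed; real engines use block compression and mip streaming, so treat this as a ceiling, not a measurement):

```python
def texture_mb(size, bytes_per_texel=4, mipmaps=True):
    """Uncompressed footprint of one square texture in MB (RGBA8 = 4
    bytes/texel); a full mip chain adds roughly 1/3 on top.
    Illustrative only -- ignores compression and streaming."""
    base = size * size * bytes_per_texel / (1024 ** 2)
    return base * 4 / 3 if mipmaps else base

# One raw 4K texture with mips is ~85 MB; block compression (BC formats)
# typically cuts that by 4-8x, and engines only keep the mips they need.
print(f"one 4K texture, raw RGBA8 + mips: {texture_mb(4096):.1f} MB")
```

So the raw number looks scary, but the resident VRAM cost per texture is usually a fraction of it, which matches the "not much VRAM trouble at 8GB" experience above.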
 

Higher VRAM is for CUDA/AI. Training a neural network uses much more memory than running one. In the past, the reason the "Titan" series existed with more VRAM was not video games; those cards were aimed at less expensive CUDA/AI training. A big reason for the higher-end cards (e.g., Tesla, Quadro) is either CAD or AI. The Titans are far less expensive than a Tesla or Quadro, but they have enough extra VRAM to work for some training situations that a normal GPU can't handle. CUDA/AI training requires physical address space/VRAM, and virtually swapped VRAM will not work for that situation, so it suddenly becomes worth the money to get higher VRAM.

EDIT: Selling higher-VRAM models to gamers is probably just marketing, but the guy who mentioned 8K is also probably on the right track.
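A back-of-envelope sketch of why training needs so much more VRAM than inference, assuming fp32 weights and Adam-style optimizer state (the ~16-bytes-per-parameter figure is a common rule of thumb, not a measurement, and it ignores activations):

```python
def training_vram_gb(params_millions):
    """Rough fp32 estimate: weights + gradients + two Adam moment buffers
    = ~16 bytes per parameter, before activation memory. Illustrative only."""
    bytes_per_param = 4 + 4 + 8      # weights + gradients + optimizer state
    return params_millions * 1e6 * bytes_per_param / 1e9

def inference_vram_gb(params_millions):
    """Inference only needs the weights (~4 bytes/param in fp32)."""
    return params_millions * 1e6 * 4 / 1e9

# A hypothetical 1-billion-parameter model: ~4 GB just to run, ~16 GB
# before activations to train -- which is why extra VRAM matters for AI work.
print(f"inference: {inference_vram_gb(1000):.0f} GB, "
      f"training: {training_vram_gb(1000):.0f} GB")
```

That 4x gap (and more, once activations are counted) is the reason the Titan-class cards with extra VRAM made sense for training even when games never touched it.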
 

mjbn1977

Distinguished
Another example where 20 GB would be nice is rendering work in a game engine. If you're pre-rendering big, detailed shadow maps, for example, you can do it either on the CPU, which takes long (up to an hour or so in my case), or on the GPU, which can do it much faster with the CUDA cores. But in that case you need to load all the objects of the scene into VRAM, and that can be a lot of stuff (the same is true for other 3D modeling software).
So, something like the 3090 or a 3080 20GB model is awesome for either people who want the best and don't care about money, or gamers who also do game development, 3D art creation, animation, etc. If you're just a gamer it's overkill, unless you are an enthusiast with big pockets....
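Some quick arithmetic shows how fast baked shadow maps alone can eat VRAM (assuming one 32-bit depth value per texel; the resolutions are illustrative):

```python
def shadow_map_mb(resolution, bytes_per_texel=4):
    """Memory for one square shadow map in MB, e.g. 32-bit depth per texel.
    Illustrative only -- real bakes vary by format and cascade count."""
    return resolution * resolution * bytes_per_texel / (1024 ** 2)

# A single high-res map is already a quarter of a gigabyte, and a GPU bake
# needs the whole scene's geometry and textures resident at the same time.
for res in (2048, 4096, 8192):
    print(f"{res}x{res}: {shadow_map_mb(res):.0f} MB")
```

Stack a few of those on top of the full scene's assets and a 10GB card gets tight for content-creation work, even though a game at runtime never holds all of it at once.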
 
I'd argue if you're doing that for your job, you're probably not going to settle for anything other than the best anyway.
 
