I noticed it's only getting blasted by Youtubers and folks who like Youtubers telling them how to think. I think the card is just fine and wouldn't need to upgrade again for at least the next five years.
12GB of VRAM won't last 5 years.
The only way 12GB of VRAM is gonna make it 5 years is if you play at 1080p, and even that is questionable, the way games are getting more demanding.
I don't know, I have been playing at 1440p and 4K with my 3080 10GB with no issues.
Yeah, you are on the jagged edge today, considering 8GB is starting to become a problem for some games, even at 1080p. Five years from now, I don't see it being enough, unless you aren't playing AAA titles.
I have played a few triple-A games recently with seemingly no VRAM-related issues, the Monster Hunter Wilds benchmark being the latest.
What he actually means is, to stay under 12GB of VRAM, either:
1) Run lower resolutions
2) Run lower settings (esp. don't max out textures or shadows)
3) Don't expect new demanding game X to run 60 FPS at max or even high settings
4) Potentially look at using upscaling and/or framegen
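To put some rough numbers behind items 1 and 4 above: the screen-sized render targets an engine keeps around scale directly with pixel count, so lowering the output resolution (or the internal resolution via upscaling) shrinks that slice of VRAM. A quick sketch of the arithmetic; the per-pixel layout is hypothetical, since real engines vary a lot:

```python
# Rough arithmetic only: how screen-sized render targets scale with resolution.
# The layout assumed here (four RGBA16F G-buffer targets, a 32-bit depth buffer,
# one RGBA16F output buffer) is hypothetical; the scaling is the point.
RESOLUTIONS = {"4K": (3840, 2160), "1440p": (2560, 1440), "1080p": (1920, 1080)}
BYTES_PER_PIXEL = 4 * 8 + 4 + 8   # = 44 bytes per pixel across those targets

for name, (w, h) in RESOLUTIONS.items():
    mib = w * h * BYTES_PER_PIXEL / 2**20
    print(f"{name:>6}: ~{mib:.0f} MiB in screen-sized render targets")
```

Texture pools are usually the bigger share of VRAM and do not shrink with output resolution, which is why texture quality gets its own line item in that list.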
Is this not also a bit of a misconception with how games say they will "use" X amount of vRAM, but in reality that is just "allocated" vRAM? If I remember correctly, there is no good way of knowing how much vRAM is actually being used to render a scene except getting texture artifacts/unexplained stuttering.
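One way to picture the allocated-versus-used distinction: many engines treat VRAM as a cache they deliberately keep full, so the number an overlay reports stays high even when a single frame only touches a fraction of it. A bare-bones sketch of the idea; the budget, asset names, and sizes are all made up:

```python
# Toy model of "VRAM as a cache": assets stay resident until the budget is hit,
# then the least recently used ones are evicted. A monitoring tool would report
# the total resident (allocated) size, not the handful of assets a frame touches.
from collections import OrderedDict

class VramCache:
    def __init__(self, budget_mib):
        self.budget = budget_mib
        self.resident = OrderedDict()   # asset name -> size in MiB, oldest first

    def touch(self, name, size_mib):
        if name in self.resident:       # already resident: just mark as recently used
            self.resident.move_to_end(name)
            return
        while self.resident and sum(self.resident.values()) + size_mib > self.budget:
            self.resident.popitem(last=False)   # evict least recently used asset
        self.resident[name] = size_mib

cache = VramCache(budget_mib=7000)      # e.g. leave some headroom on an 8 GB card
frames = [["rocks", "trees"], ["rocks", "water"], ["cliff", "rocks"]]
for assets_this_frame in frames:
    for asset in assets_this_frame:
        cache.touch(asset, size_mib=1500)
print(f"resident after 3 frames: {sum(cache.resident.values())} MiB across "
      f"{len(cache.resident)} assets, even though each frame used only 2 of them")
```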
In HUB's 8GB vs. 16GB VRAM videos, there were definitely rendering issues when the available VRAM was less than the game was demanding. MSI Afterburner's OSD seemed to catch the usage pretty well.
MSI's OSD does not track usage, just what the game has allocated. Games do not use all of the memory they allocate, especially not all of the time.
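For what it's worth, overlays like Afterburner read the driver's memory counters, and on NVIDIA cards those are exposed through NVML; they report how much memory is allocated on the device or by a process, not what a frame actually touches. A minimal sketch using the pynvml module (installable as nvidia-ml-py), assuming an NVIDIA GPU and recent drivers:

```python
# Minimal sketch: read the same allocation counters the overlay tools surface.
# These say how much VRAM is reserved, not how much a game touches per frame.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)

    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"device: {mem.used / 2**30:.1f} GiB allocated of {mem.total / 2**30:.1f} GiB")

    # Per-process view; usedGpuMemory can be None if the driver won't report it.
    for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
        used = proc.usedGpuMemory
        label = f"{used / 2**30:.1f} GiB" if used is not None else "n/a"
        print(f"pid {proc.pid}: {label} allocated")
finally:
    pynvml.nvmlShutdown()
```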
Whenever the OSD showed a game bumping up against that 8GB limit and the game obviously wanted more, there were issues with things rendering properly on the 3070, issues the RX 6800 (and, in a later video, an RTX A4000) did not suffer from; both of those cards would show usage over 8GB. Even if the OSD isn't perfect, you could see a correlation between the displayed VRAM usage and rendering quality.
I am mostly referencing this:
8GB VRAM vs. 16GB VRAM: RTX 3070 vs. Radeon 6800 (www.techspot.com)
The frame time graphs also showed a difference when bumping up against that 8GB buffer.
The RX 6800 was even winning in RT, because the 3070 just didn't have the VRAM resources needed to stretch its legs.
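The frame time data is the more telling signal here: when a game overruns its VRAM budget, the average FPS often holds up while the 1% lows and occasional long hitches give it away. As a rough illustration (the numbers below are invented, not taken from the HUB or TechSpot data), this is the kind of summary those graphs boil down to:

```python
# Hedged sketch: summarize a list of frame times in milliseconds, the raw data
# behind frame time graphs. VRAM pressure tends to show up as a gap between
# average FPS and 1% lows, plus isolated long hitches, not a lower average.
import statistics

def summarize(frametimes_ms):
    ordered = sorted(frametimes_ms)
    avg_fps = 1000 / statistics.mean(ordered)
    worst_1pct = ordered[int(len(ordered) * 0.99):]    # slowest 1% of frames
    low_1pct_fps = 1000 / statistics.mean(worst_1pct)
    hitches = sum(1 for t in ordered if t > 100)       # frames slower than 100 ms
    return avg_fps, low_1pct_fps, hitches

# Made-up capture: mostly ~16.7 ms frames with a few hitches, roughly what an
# out-of-VRAM run can look like. Real data would come from your capture tool.
sample = [16.7] * 500 + [45.0, 95.0, 120.0, 210.0]
avg, low, hitches = summarize(sample)
print(f"avg: {avg:.1f} FPS, 1% low: {low:.1f} FPS, frames over 100 ms: {hitches}")
```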
It's not the best test for the subject at hand, in my opinion, because the 3070 and 6800 are different architectures, with different drivers, different optimizations for any particular game, more or less CPU overhead for the driver, et cetera. It would be interesting to have someone test specifically with the 4060 Ti 8GB and 16GB, because that takes away all of that guesswork for a perfect apples-to-apples test.
The problem is that you can't find out how much VRAM games really need based just on how much VRAM they allocate. We can definitely find a lot of games that exceed 8GB of VRAM these days. That's not difficult. We can even find games that exceed 12GB, and in a few cases, 16GB. But if you want to know "how much VRAM is truly needed to run [Game X] at [Settings Y]," all you can really do is try to test it with GPUs that have less VRAM. So 16GB works fine, 12GB is fine, 10GB struggles... or maybe 10GB is fine and 8GB struggles. And even then, there may be other explanations.
One example of the wonkiness that can happen is Far Cry 6, which at Ultra with the HD texture pack can definitely clear 8GB of VRAM use. But the game behaves differently on various 8GB GPUs. If I run an RTX 3070 as an example, 1080p will be fine, 1440p will also be fine... and 4K will waffle between runs of as low as 5 FPS and as high as 40 FPS (or something similar). Do the same test on an RX 6650 XT and you get similar oddities at 4K. But the last time I tested an Arc A750, it ran 4K just fine, every time. Or maybe it was just more consistently good, I don't recall 100%. The point is that the game was exceeding 8GB of "needed" memory but there was some interaction between drivers and the game where it wouldn't free up assets properly on some GPUs.
TLDR: It's easy to tell if a game is exceeding a certain VRAM threshold. It's harder to tell what amount is needed to satisfy the game's VRAM requirement for any specific setting.
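In other words, the requirement is inferred behaviorally: run the same scene and settings on cards with progressively less VRAM and look for the tier where the 1% lows (or the image quality) fall off a cliff. A toy sketch of that comparison logic, with invented numbers purely for illustration:

```python
# Toy illustration of "test with less VRAM and see where it breaks".
# The figures are invented; only the comparison logic matters.
results = {   # VRAM in GB -> (average FPS, 1% low FPS) for one game/setting combo
    16: (84, 66),
    12: (83, 64),
    10: (81, 31),   # average barely moves, 1% lows collapse: likely VRAM-limited
    8:  (52, 9),
}

baseline_low = results[max(results)][1]   # 1% lows on the card with the most VRAM
for vram in sorted(results, reverse=True):
    avg, low = results[vram]
    verdict = "fine" if low >= 0.8 * baseline_low else "struggling"
    print(f"{vram:>2} GB: {avg} FPS avg, {low} FPS 1% low -> {verdict}")
```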
The only game I know of that wants more than my 3080's 10GB of vRAM is Cyberpunk at 4K with certain settings. Depending on various vRAM-dependent settings I can get 40-50 fps or 10-20 fps, and this is excluding RTX settings.
4K ultra at native resolution, or sometimes even 1440p native (alternatively 4K with DLSS quality upscaling), can exceed 12GB of VRAM in Indiana Jones and the Great Circle, I believe. Or maybe it's only if you try to enable path tracing? If you do PT plus DLSS plus framegen it will exceed 16GB (though obviously the 3080 doesn't have DLSS framegen). And that's not even trying for the "Very Ultra" or "Extreme" presets.
If you don't want to take my word for it, here is an excerpt from TPU's conclusion section for the Avatar game on the subject:
"Our VRAM testing would suggest that Avatar is a VRAM hog, but that's not exactly true. While we measured over 15 GB at 4K "Ultra" and even 1080p "Low" is a hard hitter with 11 GB, you have to consider that these numbers are allocations, not "usage in each frame." The Snowdrop engine is optimized to use as much VRAM as possible and only evict assets from VRAM once that is getting full. That's why we're seeing these numbers during testing with the 24 GB RTX 4090. It makes a lot of sense, because unused VRAM doesn't do anything for you, so it's better to keep stuff on the GPU, once it's loaded. Our performance results show that there is no significant performance difference between RTX 4060 Ti 8 GB and 16 GB, which means that 8 GB of VRAM is perfectly fine, even at 4K. I've tested several cards with 8 GB and there is no stuttering or similar, just some objects coming in from a distance will have a little bit more texture pop-in, which is an acceptable compromise in my opinion."
I have been reading stuff about this from as early as 2013, about how game engines will allocate much more vRAM than they actually need. Steve from Gamers Nexus has talked about this at length more than a few times over the years. The earliest video I can find on the subject that supports what I am saying is from 2015.
Here you go, and I could come up with several other examples (this is ultra, no extra RT (the game requires RT, period), and no PT):
I don't get it. nVidia puts out a thousand cards and the reviewer is expected to account for every $50/5 FPS increment? I don't think that is fair.
I don't expect "thousands" of cards.
Surprised to already see some on newegg for $609.
Still more than I'm willing to pay, but trending down toward MSRP.
Yeah, it's been flopping around. It's been on Newegg for $549 a few times; $600~$650 has been pretty common. Obviously not for the higher-end, triple-fan models with lots of bling, but the base models are pretty much just as fast (within a few percent).
MSI has had their Ventus and Shadow models in stock several times at $550 over the months as well. Funnily enough, Walmart has had them in stock in the past for $550 too. Gotta look, but you can get MSRP cards eventually.