News Nvidia RTX 5060 Ti 8GB struggles due to lack of VRAM — and not just at 4K ultra

That was a miscommunication on your part; it was phrased as if the 5700-XT wasn't enough for RT.

Like I said, most people do not understand how VRAM works, so here is a very quick class that explains the results everyone gets, including the discrepancies, and why "8GB is THE DEVIL" is very wrong-headed.


In modern OS/GPU architecture your VRAM can largely be split into two categories: nondiscretionary and discretionary. Nondiscretionary covers things that absolutely must be inside VRAM (buffers and working area), while discretionary is everything else, mostly resources like textures and models. The difference is that discretionary data can be dynamically loaded from system RAM, turning your VRAM into a cache. The more nondiscretionary allocations you have, the less space you have for discretionary resources, i.e. the smaller that cache is. Things like multiple monitors and borderless window mode consume about 400~500MB of nondiscretionary space, for example, while a single monitor and fullscreen exclusive mode would free that up.

Now the big eater of nondiscretionary space is buffers: DLSS alone requires 1.5GB worth of buffers and scratch space, and that's all nondiscretionary. Things like MFG and RT also require additional buffer space, further limiting the size of the VRAM resource cache. The smaller that VRAM cache, the higher the probability of a cache miss in the middle of a frame, meaning rendering has to pause while the missing resource is fetched from system RAM. Some games kinda cheat by delaying that render work and pressing forward, resulting in missing textures and weird effects for a few seconds.
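To put rough numbers on that, here is a minimal sketch of the budget split described above; the DLSS and monitor figures are the estimates from this post, and the RT/MFG figure is purely my own placeholder.

```python
# Sketch of the discretionary/nondiscretionary VRAM split described above.
# All figures are illustrative estimates, not values queried from a driver.

GiB = 1024**3

def discretionary_vram(total_vram, overheads):
    """VRAM left over to act as a resource cache once the
    nondiscretionary allocations are subtracted."""
    return total_vram - sum(overheads)

# Hypothetical 8GB card running DLSS + RT on a multi-monitor desktop:
overheads = [
    int(1.5 * GiB),  # DLSS buffers and scratch space (figure from this post)
    int(0.5 * GiB),  # multi-monitor / borderless-window overhead (estimate)
    int(1.0 * GiB),  # RT/MFG buffers (placeholder, not a measured number)
]

cache = discretionary_vram(8 * GiB, overheads)
print(f"Discretionary cache: {cache / GiB:.1f} GiB of 8 GiB")  # -> 5.0 GiB
```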

What this boils down to is that entry level 60 class cards (really 50 class, but that's a different discussion) should not be expected to run DLSS, MFG, and RT at ultra settings. This is a failure in expectation management fueled by outrage-farming content creators. 8GB has proven to be sufficient for rasterization as long as you're not using extremely high settings, and even then the entry level cards don't generally have the compute to run those settings in a playable way.

If someone said "8GB is not enough for mid range cards", I would absolutely agree with them. It belongs in the entry tier, right next to the 128-bit bus (four chips).
There is a lot of egotism here. People don't need to look further than the VRAM/process monitor and the actual VRAM utilization in a program like MSI Afterburner to know their GPU is out of VRAM: there is major stutter, textures don't load in properly, and the power draw for the video card is nowhere close to where it should be.

By the way.

"Ehh just not being able to turn on RT with good textures in FC6 a long time ago told me 8GB was done and that was back in 2022 just after I replaced the dodgy 5700-XT with the RTX 3070."


That reads exactly as saying the 5700-XT was replaced with an RT-capable RTX 3070, and that card did not have enough VRAM.

It's like having to explain the intricate details of a car crash when we can verifiably see it has crashed.

Oh but technically it was designed to crash, so it's not really a crash.


Like the R9 Fury and its 4GB: yeah, it's HBM, but it's still 4GB.


The 8GB 4060 Ti runs out of VRAM before the 16GB does. You said it's not unplayable, but that's based on a bar graph from a fixed part of a game, not the overall experience; how can you come to this conclusion?
 
There is a lot of egotism here. People don't need to look further than the VRAM/process monitor and the actual VRAM utilization in a program like MSI Afterburner to know their GPU is out of VRAM: there is major stutter, textures don't load in properly, and the power draw for the video card is nowhere close to where it should be.

Nuance matters a lot, especially with outrage-farming titles like "Nvidia RTX 5060 Ti 8GB struggles due to lack of VRAM — and not just at 4K ultra". Giving details and nuance does not generate as much traffic as outrage farming and egging on a lynch mob.

Knowing the details is important because it explains why the whole 8 vs 16GB debate on an entry class card doesn't matter and is rather silly. We should start from a user's specific needs and situation, then determine which product best matches them; the situations where a 16GB 4060/5060 is worthwhile over an 8GB 4060/5060 are rather limited.

Now having said that, the entire 5060 class has a terrible value proposition. It's better to just go to a 5070, or really a 9070. The last two generations of 60 class cards are really 50 class cards, the kind they stick in OEM boxes from Dell; slapping more memory onto that relatively anemic compute unit won't do much.

The 8GB 4060 Ti runs out of VRAM before the 16GB does. You said it's not unplayable, but that's based on a bar graph from a fixed part of a game, not the overall experience; how can you come to this conclusion?

This is why I like Jarred's graphs and numbers over some outrage-farming YT video: they're easy to reference, unbiased, and not trying to push an angle. "Running out of VRAM" is a very easy thing to notice, because essentially you never "run out of graphics memory". About half of your total system RAM is available as "display memory", and that gets added onto your VRAM for total graphics memory. At 32GB of system RAM, those 8GB cards have 24GB worth of display memory available to them.

It's up to the graphics framework and drivers (they work together) to predict when a resource is needed and move it from system RAM into GPU VRAM ahead of time. That is where the whole discretionary/nondiscretionary thing comes into play: the more discretionary space available, the bigger the window for stuff to be loaded and the less chance of something being missed. So while you're playing a game, there is a bunch of work happening in the background moving graphics resources into that window. An occasional miss is fine; it's a micro stutter that lasts a few ms and chances are you won't even notice.
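As a sanity check on those numbers, a minimal sketch, assuming the convention described above of roughly half of system RAM being exposed as shared graphics memory:

```python
# Sketch of the "display memory" pool described above: roughly half of
# system RAM is exposed as shared graphics memory on top of dedicated VRAM.

GiB = 1024**3

def total_graphics_memory(vram, system_ram):
    shared = system_ram // 2  # roughly half of system RAM, per this post
    return vram + shared

# The example from this post: an 8GB card in a 32GB system.
total = total_graphics_memory(8 * GiB, 32 * GiB)
print(f"Total graphics memory: {total / GiB:.0f} GiB")  # -> 24 GiB
```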

Of course, if the scene you're rendering requires more simultaneous resources than your discretionary space can hold, then you will have non-stop micro-stutter, and this shows up as significantly affected 1% lows. This is the point where the user really should turn down texture sizes; going from "ultra" to "high/very high" is somewhere around a ~50% reduction in texture size, depending on the game. But before we even get to that point we need the GPU compute to be pushing frames at an acceptable rate to begin with, generally accepted to be 60fps, with sub-30 being "unplayable" and 30~60 being "kinda acceptable".
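To show why texture settings dominate the budget, here is a quick sketch of how texture bytes scale with resolution. It assumes BC7-style block compression at 1 byte per texel and a full mip chain (roughly an extra third on top of the base level); note a full halving of resolution quarters the bytes, so the exact savings depend on how a given game steps its quality tiers.

```python
# Sketch: texture memory vs. resolution, assuming 1 byte/texel (BC7-style
# block compression) and a full mip chain (~4/3 of the base level).

MiB = 1024**2

def texture_bytes(width, height, bytes_per_texel=1.0, mips=True):
    base = width * height * bytes_per_texel
    return base * 4 / 3 if mips else base

ultra = texture_bytes(4096, 4096)  # hypothetical "ultra" 4K texture
high  = texture_bytes(2048, 2048)  # hypothetical "high" 2K texture

print(f"ultra: {ultra / MiB:.1f} MiB, high: {high / MiB:.1f} MiB")
# -> ultra: 21.3 MiB, high: 5.3 MiB; one full resolution step quarters the bytes
```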

So the situation we're looking for is when a GPU has enough compute to push 60+ FPS but the discretionary space isn't enough to hold the required resources for those detail settings. That is not a common situation, as demonstrated by Tom's various graphs. It does happen occasionally, and the outrage farmers key in on those cases, magnify them, and misrepresent them as common everyday occurrences. That last part is what I have a problem with and what I seek to correct and explain.
 
The 5060 Ti does struggle due to lack of VRAM; I think you picked the wrong argument.

Nah the data is in on that one.

And really... you could not have proven my point more strongly than with a video named "don't get ripped off".

Talk about emotional manipulation... and outrage farming. Anything that uses emotional language should immediately be taken with a mountain's worth of salt and disbelieved.
 
Like I said, for now I'm totally okay with my 1660 Super. Plus, I'm poor, and the US government isn't handing out stimulus checks like candy anymore the way they were when I first built my PC and later when I bought the GPU. 🤷‍♂️ Maybe if its price comes down in a few years.
okay, so your actual position is "It doesn't matter if the GPU offered 10x the performance, even at $250 because I can't afford it anyways." 🙄
 
okay, so your actual position is "It doesn't matter if the GPU offered 10x the performance, even at $250 because I can't afford it anyways." 🙄
My actual position is that I personally would be fine with owning a 5060 Ti if it weren't overpriced and were something I could afford, and that a B580 would also be a fine upgrade—but I'm not looking to replace my GPU at the moment and can't afford to anyway. My actual position is that the hoopla over 8 GB being insufficient is a tad excessive, but so is the price of the 5060 Ti 8 GB.

I didn't start a thread asking "What GPU should I upgrade to?" only to turn around and say that actually I'm not intending to upgrade. I just voiced opinions about the subject at hand.

Footnote: I don't think the B580 is actually available at $250 anywhere online, but I could be mistaken.
 
Nah the data is in on that one.

And really... you could not have proven my point more strongly than with a video named "don't get ripped off".

Talk about emotional manipulation... and outrage farming. Anything that uses emotional language should immediately be taken with a mountain's worth of salt and disbelieved.
You must be anti-human then, as all language is emotional: it is a description made up by human perception, which constructs and manipulates everything outside its own perception in its own mind.

Therefore, emotionally manipulative is a misnomer.
 
Nah the data is in on that one.

And really... you could not have proven my point more strongly than with a video named "don't get ripped off".

Talk about emotional manipulation... and outrage farming. Anything that uses emotional language should immediately be taken with a mountain's worth of salt and disbelieved.
Not sure why you're so quick to defend Nvidia and dismiss the results of very reputable sites. I've been on Tom's Hardware since the beginning (this is not my original account), and it's gone downhill badly over the last 10 years. I trust GN and HUB more than you guys with your AI news. The 5060 Ti 8GB is an unbalanced card: it does not have enough VRAM to balance its GPU power. You shouldn't have to turn down the settings to fit within the VRAM limitations when you still have plenty of GPU power to keep the FPS playable. HUB backed their statements with facts and data, and I'm sure plenty of other sites will do their own tests and find the same results. Oh, and running 1080p medium is kind of a joke for a 60 class card at its price point.
 
A 5060 Ti with 8GB of VRAM is a joke that should not even exist. Nvidia was withholding it from being reviewed; that should tell you something about how much confidence they have in their own product. They know this card should not exist in 2025.

The 5060 Ti should just sell as one unified model with 12GB of VRAM and a 192-bit memory bus. (Basically, Nvidia should have named the RTX 5070 correctly and released it as the 5060 Ti for $350, and the 5060 should be a cut-down version of the same card with some cores/ROPs disabled, selling for $250-300.)

8GB cards with such a small PCB and a 128-bit memory bus, like the current 5060/5060 Ti, should be relegated to 5050/5050 Ti and sell for $200-250.
 
I agree that they shouldn't be using a 128-bit memory bus for 60 class cards anymore, but for whatever reason they seem dead set on tiering cards by memory bus width and insisting that 60 is at the bottom instead of 50. Having said that, they did this with the 40 series too, so I'm not sure why people are surprised or outraged.

GDDR7 really upped the RAM bandwidth despite the 128-bit bus. Compared to the 4060 Ti it's a big jump, and it's on par with the bandwidth of the 3060 Ti, which had a 256-bit bus, IIRC from Jay's review. I do agree that we shouldn't be using 128-bit for a 60 series though.
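The arithmetic behind that parity is easy to check. A quick sketch; the per-pin data rates are quoted from memory, so treat them as approximate:

```python
# Back-of-envelope memory bandwidth: bus width (bits) x data rate (Gbps) / 8.
# Spec numbers below are from memory and may be slightly off.

def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

cards = {
    "RTX 3060 Ti (GDDR6)": (256, 14),
    "RTX 4060 Ti (GDDR6)": (128, 18),
    "RTX 5060 Ti (GDDR7)": (128, 28),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")
# -> 448, 288, 448 GB/s: GDDR7 on 128-bit matches GDDR6 on 256-bit
```

Bandwidth parity doesn't add capacity, though; the 128-bit bus still caps the card at four chips per side.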
 
The RTX 5060 Ti 8GB suffers from decreased performance, texture pop-ins, stutters, and crashes across all resolutions, compared to its 16GB counterpart.

Nvidia RTX 5060 Ti 8GB struggles due to lack of VRAM — and not just at 4K ultra

There is also something I find extremely strange about the 8 GB of VRAM in the RTX 50 series: AI.

Basically, from what I've heard, NVIDIA is one of the most advanced organisations in artificial intelligence, and of course they want us to use it. A very simple example is Frame Generation, which is now bundled with DLSS 4.

However, when you have a look at the NVIDIA app's homepage, you'll notice that two of the AI features NVIDIA has released are already VRAM-hungry:

- ChatRTX requires 8 GB of VRAM -> the RTX 5060 Ti 8 GB barely supports that feature.
- G-Assist requires 12 GB of VRAM -> the RTX 5060 Ti 8 GB's AI capabilities can't shine here.

We are more or less re-experiencing the situation of the laptop RTX 3050 (Ti) with its 4 GB of VRAM (though the RTX 5060 Ti 8 GB in 2025 performs way better than the RTX 3050 did in 2021): just enough VRAM to do some stuff, not enough for the future, which is where we are at the moment of writing. In reaction, NVIDIA used DLSS as an argument.

From a personal point of view, I don't really want to buy an RTX 50 series GPU, mostly with many people saying that performance only increases once you "smash" AI into your games. And when we find out that the RTX 5060 Ti's 8 GB of VRAM doesn't help for neural operations, that's even less reassuring.

If you want a new GPU, 12 GB of VRAM is the minimum if you don't want it to become obsolete too quickly.
 
Remember, guys: the GTX 1070 8GB (2016).
I remember reviewers claiming the card didn't have enough VRAM at its release, just for it to carry me through the entire GPU mining crisis just fine. This card is literally why I don't give a hoot about that particular opinion. It's also basically the best card I ever owned, looking at its ROI.
 
I remember reviewers claiming the card didn't have enough VRAM at its release, just for it to carry me through the entire GPU mining crisis just fine. This card is literally why I don't give a hoot about that particular opinion. It's also basically the best card I ever owned, looking at its ROI.

HUB said the 3070 would age poorly due to 8GB, and later follow-up videos proved them right, as the RX 6800 ended up performing better in more VRAM-hungry titles, even with RT enabled. 8GB is fine if all you do is play esports or older games, but for the latest AAA titles, it simply isn't enough anymore.
 
HUB said the 3070 would age poorly due to 8GB, and later follow-up videos proved them right, as the RX 6800 ended up performing better in more VRAM-hungry titles, even with RT enabled. 8GB is fine if all you do is play esports or older games, but for the latest AAA titles, it simply isn't enough anymore.
Why are you talking about a card half a decade younger than the one I'm talking about, as if that proves your point? They aren't really comparable, and neither are their situations. My point is that just because reviewers claim something doesn't make it true, so I stopped listening to their doomsaying long ago and see for myself. If you claim something for a decade and only after that decade does it become true, in a constantly developing market like PC parts, you don't get to claim "told you so!" when it finally happens. A decade is a very long time in this field. Nowadays, morons have already started claiming 12 or 16GB isn't enough for anything. That doesn't make it true, though.

I had two paragraphs typed out about what was wrong with quite a few of the HUB tests you are referring to, but I don't think it would even matter, so I won't post them. I'm too tired for this.

Btw, my laptop's 4060 is still playing everything I throw at it at the 1080p resolution it is made for. I'm pretty sure the low FPS in certain newer games is due to GPU limitations, not VRAM, seeing how the card still runs at basically full core utilization. But what do I know, when the grand hardware gurus know it actually MUST be the VRAM; there is no other way.
 
News flash: the 5060 Ti is only about 10% faster than a 3070, so it is still quite relevant to the discussion. The difference is with regard to VRAM. The 3070's workstation card equivalent had 16GB of VRAM and didn't suffer from the issues the 3070 did. The bottleneck isn't GPU compute, as you claim.

The 5060 Ti 8GB should not exist, period.
 
Hey guys it's just GPU limitation!!!!!


Ahem, people forget that Tom's tested a modded 16GB version of the 3070, and it had much better results due to the added VRAM.

https://www.tomshardware.com/news/3070-16gb-mod



The only reason people argue otherwise is that they don't want to buy another GPU, or they're delusional.
 
Still rocking my attractive-looking 2060. No reason to buy anything for the next 4-5 years.

Stop buying new hardware every 2 years and you won't be talking about this techy stuff every month; it ain't rocket science. Don't watch NVIDIA trailers and learn to hold your tech urge. Use your dollars properly. No offense.
 
GDDR7 really upped the RAM bandwidth despite the 128-bit bus. Compared to the 4060 Ti it's a big jump, and it's on par with the bandwidth of the 3060 Ti, which had a 256-bit bus, IIRC from Jay's review. I do agree that we shouldn't be using 128-bit for a 60 series though.

Yes, it started with how they "readjusted" their lineup with the 40 series. The 3050 was 128-bit, the 3060 was 192, the 3070 was 256 (with the 3060 Ti being a binned 3070), the 3080 was 320-bit, and the 3090 was 384. Then the 40 series happened and everything got knocked down an entire category. I said then that it was outrageous, but that's just how nVidia is going to roll.


8GB is fine if all you do is play esports or older games, but for the latest AAA titles, it simply isn't enough anymore.

I think this is close but not entirely accurate; it really depends on the title and, most importantly, the settings. Mostly it's texture sizes: trying to load ultra textures into an 8GB card might cause issues, and turning on DLSS and running at higher resolutions with ultra textures is definitely going to cause issues. I guess because I see the 4060/5060 as an entry tier card similar to the 3050, my expectation is that someone is going to be doing medium/high at 1080p, or 1440p at most. If you were to append "at 1440p ultra" to the AAA titles claim, then it would be representative. A mid range or enthusiast card should absolutely have more than 8GB of VRAM; if anything I think 12GB is too low for the xx70 line, but nVidia is nVidia.

On the whole crashing thing, that's a game engine bug, not a VRAM problem. A 32GB system with an 8GB video card has 24GB worth of display memory; a 16GB system with an 8GB video card would have 16GB. In both situations a portion of that display memory is quite slow, but it still exists and will hold resources. Games crashing due to insufficient VRAM was something from the DX9-era driver model, when games would manage their own graphics memory space. DX11/12/Vulkan with WDDM 2 abstracts that layer and instead treats display memory as one giant pool. Of course, if there isn't sufficient VRAM to load all the buffers and required resources for a scene, what ends up happening is very similar to swap file thrashing: incredibly bad and inconsistent performance. You don't have to guess it; you know when you've run into VRAM issues.
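Tying that back to the earlier point about spotting this in a monitoring overlay, here is a purely illustrative heuristic; every threshold is my own placeholder, not something a real tool exposes:

```python
# Hypothetical rule of thumb for the "you know it when you see it" point:
# dedicated VRAM pinned near the cap, 1% lows collapsing relative to the
# average, and GPU power draw well under its limit together suggest
# resource thrashing rather than a pure compute bottleneck.

def looks_like_vram_thrashing(vram_used_gb, vram_total_gb,
                              low_1pct_fps, avg_fps,
                              power_w, power_limit_w):
    near_cap   = vram_used_gb >= 0.95 * vram_total_gb  # cache has no headroom
    bad_lows   = low_1pct_fps < 0.5 * avg_fps          # constant micro-stutter
    underdrawn = power_w < 0.7 * power_limit_w         # GPU waiting, not working
    return near_cap and bad_lows and underdrawn

# Example readings one might pull from an overlay like MSI Afterburner:
print(looks_like_vram_thrashing(7.8, 8.0, 22, 70, 110, 180))  # -> True
```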
 