News Nvidia RTX 5060 Ti 8GB struggles due to lack of VRAM — and not just at 4K ultra

It should exist... for $150 or less.
It shouldn't. 8GB ends up in a landfill sooner rather than later, given how much VRAM usage has ballooned over the past few years.
8GB is too small for gaming.
It's too low for productivity (i.e., someone just trying to do work in Adobe/Blender/etc.).

8GB shouldn't exist in 2025. If 8GB is all you need, buy used, as what you need isn't worth the price of a new GPU.
 
If I ever buy a new graphics card, it'll probably be one made by Intel, assuming they're still making them by then and still represent as good a value proposition as they do now (theoretically, at MSRP). I prefer not to buy used components, and a 1080 Ti is also too power-hungry and doesn't quite represent the >=2x rasterization performance upgrade I'd be looking for in a replacement.

Right now (and for a good while longer, I expect) I'm still fine with my 1660 Super. Plus it's not a good time for anyone to upgrade because of the state of the market.
So why don't you get an Arc B580? It has roughly 2x the performance of a 1660S.
 
It shouldn't. 8GB ends up in a landfill sooner rather than later, given how much VRAM usage has ballooned over the past few years.
8GB is too small for gaming.
It's too low for productivity (i.e., someone just trying to do work in Adobe/Blender/etc.).

8GB shouldn't exist in 2025. If 8GB is all you need, buy used, as what you need isn't worth the price of a new GPU.
A lot of games are fine with 8 GB at 1080p. No, not all of them, as textures are pushing past that limit in newer games.

You could buy an RX 580 8GB for under $100, but there aren't any 75W 8 GB cards, which is what I would want to buy to put into refurb office PCs.

RTX 3050 6GB low profile cards sell for around $200 right now. So a hypothetical 8 GB RX 9040 at 75W based on Navi 44 would actually be an upgrade from that card. Just make that, and I'll buy one, maybe two if they were cheap on sale. Will it happen? I hope so.

Meanwhile, many people will be buying 8 GB Nvidia cards in the $300-400 range this year. Hopefully, they know what they are getting into, but some of them won't and will get burnt exactly like the Hardware Unboxed video shows.
 
Actually, Nvidia claimed that 12 GB is expensive... when using 3 GB modules on a 128-bit bus. Because the memory is new.
The world in which 12GB of GDDR7 (3GB modules) costs more than 16GB of GDDR7 (2GB modules in a clamshell design) is not a real one. They simply bought up the vast majority of the initial runs of GDDR7 and are saving the 3GB ICs for their highest-margin parts. Publicly admitting you're optimizing for profit at the cost of consumer product quality isn't something companies like to do, even when everyone knows it's the truth.
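For reference, the capacity options follow directly from the bus width: each GDDR6/GDDR7 chip sits on a 32-bit channel, so a 128-bit bus takes four chips, and capacity is chips × module size (doubled in clamshell mode, where two chips share a channel). A quick sketch of the arithmetic:

```python
# Possible VRAM capacities for a given bus width, given that each
# GDDR6/GDDR7 chip occupies a 32-bit channel.
def vram_options(bus_bits, module_gb, clamshell=False):
    chips = bus_bits // 32          # one chip per 32-bit channel
    if clamshell:
        chips *= 2                  # clamshell: two chips share each channel
    return chips * module_gb

# 128-bit bus (5060 Ti class):
print(vram_options(128, 2))                  # 4 x 2GB -> 8 GB
print(vram_options(128, 3))                  # 4 x 3GB -> 12 GB
print(vram_options(128, 2, clamshell=True))  # 8 x 2GB -> 16 GB (clamshell)
```

So 12GB on 128-bit needs the new 3GB modules, while 16GB only needs doubling up the mature 2GB parts.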
 
I saw it over on TPU, less so here thus far, and the number of people not bothered by the 8GB 5060 Ti really makes me scratch my head. I fully agree that 8GB VRAM is fine... if the price is right. Intel released the B580 with 12GB VRAM at a $250 MSRP, which should be everyone's bar for the lower end of midrange pricing. Selling 8GB VRAM cards for $300+ should simply be seen for what it is: an anti-consumer money grab.

The 5060 Ti is not a bad GPU by any stretch and can easily handle higher-than-1080p gaming with additional VRAM-eating features enabled. The 8GB model is that, with asterisks, and quite frankly nobody buying this level of performance should be hamstrung by VRAM. This is not something that will get better over time, and I wish people wouldn't stand for it.
 
You realize they took down that video, right?

Tom's already demonstrated the difference between 8GB and 16GB in a previous review.

https://www.tomshardware.com/pc-components/gpus/nvidia-geforce-rtx-5060-ti-16gb-review/5

Relative results will be the same regardless of model. DLSS/MFG eats VRAM like no tomorrow, so 8GB definitely won't be enough. After that it's all down to texture sizes, with Ultra in modern games usually pushing past what 8GB can do; high or very high, OTOH, fits.
 
Soooooo, don't set textures to very high? Problem solved, eh?

For the price it costs, it absolutely should have more VRAM. But this is cherry-picking workloads that maximize the amount of VRAM needed in order to make 8 GB look bad.
8 GB makes 8 GB look bad, lol. True, it's essentially cherry-picking, but we're also talking about games TODAY, not later this year, not 2026, 2027, and so on; in other words, it's only downhill from here.

To be fair, this isn't unique to Nvidia when AMD also releases >$300 8 GB GPUs in 2025. Consumer expectations are what they are, but both companies still make money on this class of cards (and probably sell quite a few just because they're the most affordable, ignoring the buyers who were upsold to a higher model).
 
You realize they took down that video, right?

Tom's already demonstrated the difference between 8GB and 16GB in a previous review.

https://www.tomshardware.com/pc-components/gpus/nvidia-geforce-rtx-5060-ti-16gb-review/5

Relative results will be the same regardless of model. DLSS/MFG eats VRAM like no tomorrow, so 8GB definitely won't be enough. After that it's all down to texture sizes, with Ultra in modern games usually pushing past what 8GB can do; high or very high, OTOH, fits.
The video linked in the article? It's still there.
 
Nvidia is full of it. They could have done 12GB, they just chose not to. They could have done a 192-bit bus with GDDR6 or 6X, and it would have been just fine.

I agree that they shouldn't be using a 128-bit memory bus for 60-class cards anymore, but for whatever reason they seem dead set on tiering cards by memory bus width and insisting that 60 sits at the bottom instead of 50. Having said that, they did this with the 40 series, so I'm not sure why people are surprised or outraged.
 
The video linked in the article? It's still there.

View: https://www.youtube.com/watch?v=AdZoa6Gzl6


It comes up with "This video isn't available anymore."

The point being, I do not trust any clickbait YT video that relies on AdSense revenue. Even the more reputable sources end up milking outrage for revenue.

I'm sure Jarred will eventually do a review here, and we can get real side-by-side graphs to see exactly where the falloff is, just like he did in the last 5060 Ti review.
 
Just check that link, because that is what was reviewed. The YT link only changes when someone takes down a video and reuploads it, usually after editing something.
I did, and it still works just fine. As for HUB, GN, and others, they raise most of their own funds through subscriptions and merchandise/events. That lets them buy review hardware without worrying about losing access, and without NVIDIA, Intel, AMD, or board partners attaching strings. They DO get review hardware, but in this specific case they purchased the 8GB card themselves. You'll also find that their testing regimes are quite rigorous, and the size of the database they maintain and update is something Tom's just cannot match due to the cost. I get my information from multiple sources; it's much easier to spot and avoid bias that way. Now, that said, I'm not a huge HUB fan, or even GN really, but the data they present is worth the watch (usually muted) for me.
 
Now, that said, I'm not a huge HUB fan, or even GN really, but the data they present is worth the watch (usually muted) for me.

Data can be made to say anything you want just by altering how it's presented. Plus most people, including reviewers, do not understand how VRAM works in the first place and end up jumping to wrong conclusions. Viewers, also not understanding, eat up those conclusions, and thus internet memes and myths are born.

Going to compare the 4060 Ti 8GB vs 16GB because it's a perfect sample case. Both cards have identical GPUs with identical compute, power, everything. The only difference is that the 8GB card runs four VRAM chips, each on its own full 32-bit channel, while the 16GB card runs eight chips in clamshell mode, two sharing each channel.

Here is Stalker 2 at 1080p ultra/epic

[chart image]


Same game at 2160p ultra/epic/whatever

[chart image]


That is what a VRAM failure looks like. Those 8GB cards have to flush their VRAM resource cache to load new resources from system RAM mid-frame, resulting in performance falling off a cliff. Of course, we're looking at sub-30FPS play here on these cards, so it doesn't really matter.

Tom's take on The Last of Us, 1440p ultra

[chart image]


Same average, but notice the 8GB card has slightly lower lows; that's because it occasionally has to flush the VRAM resource cache, not often, but enough to affect the chart.
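The shape of that chart can be sketched numerically (illustrative numbers only, not measurements): even if only a couple percent of frames stall on a mid-frame cache reload, the average barely moves while the 1% lows crater.

```python
# Illustrative only: why rare mid-frame VRAM cache reloads hit 1% lows
# far harder than the average FPS. All numbers are made up, not measured.
frame_ms = [16.7] * 1000            # a steady ~60 FPS baseline
for i in range(0, 1000, 50):        # 2% of frames stall on a reload
    frame_ms[i] += 30.0             # assumed ~30 ms fetch from system RAM

avg_fps = 1000.0 / (sum(frame_ms) / len(frame_ms))
worst_1pct = sorted(frame_ms)[-10:]              # slowest 1% of frames
low_1pct_fps = 1000.0 / (sum(worst_1pct) / len(worst_1pct))

print(round(avg_fps, 1))       # ~57.8: the average barely moves
print(round(low_1pct_fps, 1))  # ~21.4: the 1% lows crater
```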

Same game but at 2160p

[chart image]


Now we're starting to see VRAM matter; performance isn't crashing hard, but the game has to flush resources often enough to actually make a difference. Of course, I wouldn't call low-20s FPS "enjoyable" anyway, and the 5060 Ti 16GB isn't looking much better at sub-30FPS. Like Stalker 2, this setting isn't playable on the 16GB cards either.

And just for reference, Jarred was kind enough to do a 1080p medium on all these games.

[chart image]


Wow, actually playable on a 60-class card. This tells me we could likely have gone to 1440p high and been OK on both the 8GB and 16GB cards.


This type of analysis is what all the outrage-farming reviewers are missing. There are very few situations where a card on a 128-bit memory bus is going to be "good" at 16GB but "bad" at 8GB. The situations that are playable on a 128-bit card are ones where the extra VRAM doesn't make much difference in the first place, and the situations that need more than 8GB aren't playable on a 128-bit card to begin with. The only two choices are changing the situation (reducing settings) or upgrading to a proper 192-bit-and-above card.

Seriously, the GeForce4 Ti 4200 AGP I use in my retro gaming rig has a 128-bit memory bus. Anytime you see a 128-bit bus, immediately assume that card is bottom tier and should be avoided if at all possible.

Here are all the charts Jarred did; they paint a very good picture.
https://www.tomshardware.com/pc-components/gpus/nvidia-geforce-rtx-5060-ti-16gb-review/5
 
Ehh, just not being able to turn on RT with good textures in FC6 a long time ago told me 8GB was done, and that was back in 2022, just after I replaced the dodgy 5700 XT with the RTX 3070.
IDK, don't label GPUs as 1440p-capable next time, Nvidia.

The graphs presented only show a fixed snapshot of somewhere in a video game, not the overall experience.



Next time don't..... oh...

[screenshot]
 
RTX 3050 6GB low profile cards sell for around $200 right now. So a hypothetical 8 GB RX 9040 at 75W based on Navi 44 would actually be an upgrade from that card. Just make that, and I'll buy one, maybe two if they were cheap on sale. Will it happen? I hope so.

I actually have one of those running in my HTPC downstairs, connected to a 3840x2160 (4K) HDTV. The case is a Silverstone ML09, which is absolutely gorgeous but rather limited in what it accepts. Great for light gaming on nondemanding titles and watching TV shows.

https://www.gigabyte.com/Graphics-Card/GV-N3050OC-6GL#kf
 
Ehh, just not being able to turn on RT with good textures in FC6 a long time ago told me 8GB was done, and that was back in 2022, just after I replaced the dodgy 5700 XT with the RTX 3070.
IDK, don't label GPUs as 1440p-capable next time, Nvidia.

The graphs presented only show a fixed snapshot of somewhere in a video game, not the overall experience.

That wasn't VRAM size but the GPU; the only AMD GPUs capable of real RT are the recent 90-series cards. You could have had 1 PB of VRAM on that 5700 XT and it would still have choked on RT. I say this as someone with a monster 7900 XTX inside a custom WC loop; that card, with its ridiculous specs, can't really do RT either.

And no, those charts are very good at comparing those two cards. The number to look at is the lows, as mid-frame cache reloading is where that will show up. Like I said, most people have no clue how VRAM works nowadays.
 
That wasn't VRAM size but the GPU; the only AMD GPUs capable of real RT are the recent 90-series cards. You could have had 1 PB of VRAM on that 5700 XT and it would still have choked on RT.

And no, those charts are very good at comparing those two cards. The number to look at is the lows, as mid-frame cache reloading is where that will show up. Like I said, most people have no clue how VRAM works nowadays.
Not what I stated. I said I replaced the 5700 XT with a 3070, only to find its 8GB is not enough at 1440p, years ago.

You say people have no clue, yet you only presented a fixed bar chart of somewhere in a video game.
 
Data can be made to say anything you want just by altering how it's presented. Plus most people, including reviewers, do not understand how VRAM works in the first place and end up jumping to wrong conclusions. Viewers, also not understanding, eat up those conclusions, and thus internet memes and myths are born.

Going to compare the 4060 Ti 8GB vs 16GB because it's a perfect sample case. Both cards have identical GPUs with identical compute, power, everything. The only difference is that the 8GB card runs four VRAM chips, each on its own full 32-bit channel, while the 16GB card runs eight chips in clamshell mode, two sharing each channel.

Here is Stalker 2 at 1080p ultra/epic

[chart image]


Same game at 2160p ultra/epic/whatever

[chart image]


That is what a VRAM failure looks like. Those 8GB cards have to flush their VRAM resource cache to load new resources from system RAM mid-frame, resulting in performance falling off a cliff. Of course, we're looking at sub-30FPS play here on these cards, so it doesn't really matter.

Tom's take on The Last of Us, 1440p ultra

[chart image]


Same average, but notice the 8GB card has slightly lower lows; that's because it occasionally has to flush the VRAM resource cache, not often, but enough to affect the chart.

Same game but at 2160p

[chart image]


Now we're starting to see VRAM matter; performance isn't crashing hard, but the game has to flush resources often enough to actually make a difference. Of course, I wouldn't call low-20s FPS "enjoyable" anyway, and the 5060 Ti 16GB isn't looking much better at sub-30FPS. Like Stalker 2, this setting isn't playable on the 16GB cards either.

And just for reference, Jarred was kind enough to do a 1080p medium on all these games.

[chart image]


Wow, actually playable on a 60-class card. This tells me we could likely have gone to 1440p high and been OK on both the 8GB and 16GB cards.


This type of analysis is what all the outrage-farming reviewers are missing. There are very few situations where a card on a 128-bit memory bus is going to be "good" at 16GB but "bad" at 8GB. The situations that are playable on a 128-bit card are ones where the extra VRAM doesn't make much difference in the first place, and the situations that need more than 8GB aren't playable on a 128-bit card to begin with. The only two choices are changing the situation (reducing settings) or upgrading to a proper 192-bit-and-above card.

Seriously, the GeForce4 Ti 4200 AGP I use in my retro gaming rig has a 128-bit memory bus. Anytime you see a 128-bit bus, immediately assume that card is bottom tier and should be avoided if at all possible.

Here are all the charts Jarred did; they paint a very good picture.
https://www.tomshardware.com/pc-components/gpus/nvidia-geforce-rtx-5060-ti-16gb-review/5
Friend, I just suggested watching the video. If you do not wish to, or cannot, then fine.
 
View: https://www.youtube.com/watch?v=AdZoa6Gzl6


It comes up with "This video isn't available anymore."

The point being, I do not trust any clickbait YT video that relies on AdSense revenue. Even the more reputable sources end up milking outrage for revenue.

I'm sure Jarred will eventually do a review here, and we can get real side-by-side graphs to see exactly where the falloff is, just like he did in the last 5060 Ti review.
I've heard that when you delete the last character off a link, it tends not to work. c/d?
 
I agree that they shouldn't be using a 128-bit memory bus for 60-class cards anymore, but for whatever reason they seem dead set on tiering cards by memory bus width and insisting that 60 sits at the bottom instead of 50. Having said that, they did this with the 40 series, so I'm not sure why people are surprised or outraged.
You still seem to think bus width matters; that's fun. The 5060 Ti has ~55% more memory bandwidth than the 4060 Ti with the same bus width. It's almost like memory bandwidth, not bus width, is the important part.
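That uplift falls straight out of the per-pin data rates (taking the commonly cited spec-sheet figures of 18 Gbps GDDR6 on the 4060 Ti vs 28 Gbps GDDR7 on the 5060 Ti, both on 128-bit):

```python
# Memory bandwidth (GB/s) = bus width in bits / 8 * per-pin data rate in Gbps.
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

gb4060ti = bandwidth_gbs(128, 18)   # 288 GB/s (GDDR6 @ 18 Gbps)
gb5060ti = bandwidth_gbs(128, 28)   # 448 GB/s (GDDR7 @ 28 Gbps)
print(gb4060ti, gb5060ti)
print(round((gb5060ti / gb4060ti - 1) * 100))  # ~56% uplift on the same bus
```

Same 128-bit bus, wildly different bandwidth, which is why bus width alone tells you little.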
 
So why don't you get an Arc B580? It has roughly 2x the performance of a 1660S.
Like I said, for now I'm totally okay with my 1660 Super. Plus, I'm poor, and the US government isn't handing out stimulus checks like candy anymore, the way they were when I first built my PC and later bought the GPU. 🤷‍♂️ Maybe if its price comes down in a few years.
 
Not what I stated. I said I replaced the 5700 XT with a 3070, only to find its 8GB is not enough at 1440p, years ago.

You say people have no clue, yet you only presented a fixed bar chart of somewhere in a video game.

That was a communication failure on your part; it was phrased such that the 5700 XT wasn't enough for RT.

Like I said, most people do not understand how VRAM works, so here is a very quick class explaining the results everyone gets, including the discrepancies and why "8GB is THE DEVIL" is very wrong-headed.


In modern OS/GPU architecture, your VRAM can largely be split into two categories: nondiscretionary and discretionary. Nondiscretionary is what absolutely must be inside VRAM (buffers and working area), while discretionary is everything else, mostly resources like textures and models. The difference is that discretionary data can be dynamically loaded from system RAM, effectively turning your VRAM into a cache. The more nondiscretionary space you use, the less space you have for discretionary resources, i.e., the smaller that cache is. Things like multiple monitors and borderless window mode consume roughly 400-500MB of nondiscretionary space, for example, while a single monitor and fullscreen exclusive mode free that up.

Now, the big eater of nondiscretionary space is buffers: DLSS alone requires 1.5GB worth of buffers and scratch space, all of it nondiscretionary. Things like MFG and RT also require additional buffer space, further shrinking the VRAM resource cache. The smaller that cache, the higher the probability of a cache miss in the middle of a frame, meaning rendering has to pause while the missing resource is fetched from system RAM. Though some games kinda cheat by delaying the rendering of that thread and pressing forward, resulting in missing textures and weird effects for a few seconds.
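The budget math above can be sketched in a few lines (sizes in GB; the DLSS and monitor figures are the ones quoted above, the rest are assumed round numbers for illustration, not measurements):

```python
# Back-of-envelope VRAM budget: whatever is left after the nondiscretionary
# allocations is the discretionary resource cache for textures and models.
def discretionary_cache(total_vram, nondiscretionary):
    return total_vram - sum(nondiscretionary.values())

nondiscretionary = {
    "game frame/working buffers": 1.0,   # assumed baseline, illustrative
    "borderless + 2nd monitor": 0.5,     # the ~400-500MB figure above
    "DLSS buffers/scratch": 1.5,         # the 1.5GB figure above
    "RT + frame-gen buffers": 1.0,       # assumed extra, illustrative
}

print(discretionary_cache(8, nondiscretionary))   # 4.0 GB left for the cache
print(discretionary_cache(16, nondiscretionary))  # 12.0 GB left for the cache
```

With every feature enabled, the 8GB card is caching resources in half its VRAM while the 16GB card barely notices, which is exactly where the mid-frame cache misses come from.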

What this boils down to is that entry-level 60-class cards (really 50-class, but that's a different discussion) should not be expected to render with DLSS, MFG, and RT at ultra settings. This is a failure in expectation management, fueled by outrage-farming content creators. 8GB has proven sufficient for rasterization as long as you're not using extremely high settings, and even then, entry-level cards generally don't have the compute to run those settings in a playable way.

If someone said "8GB is not enough for midrange cards," I would absolutely agree. It belongs in the entry tier, right next to the 128-bit bus (four chips).
 