GeForce RTX 5060 Ti 8GB vs RTX 5060 Ti 16GB GPU face-off: how much difference does twice the memory make for the same chip?

Also, do not forget nVidia is giving you a crippled card on purpose with the 8GB model, evidenced by the simple fact that the 16GB performs like a completely different card at more demanding settings, with no hardware difference other than VRAM capacity. And that's not even getting into the fact that the extra 8GB doesn't even add $100 to the BOM.

Same logic for AMD, mind you. The 9060 XT 8GB should not exist either. This tier does not need two VRAM configs; it just needs the better config.

Both AMD and nVidia could even save a bit of money on marketing by just doing right by the consumer and getting free goodwill, but alas. Corps be doin' Corpo things.

Regards.
 
Despite having a $130 higher MSRP, the RTX 5060 Ti 16GB provides a better price-to-performance ratio than its 8GB counterpart, even at 1080p. At 1080p ultra, the 16GB provides 7% greater FPS per dollar
Assuming you can find an RTX 5060 Ti 16GB at its $429 MSRP, there's only a $50 difference between that card and the 8GB version.
This math seems fishy.

MSRP: $379 (8 GB), $429 (16 GB)
Street: Newegg is actually throwing in a $20 "rebate card" (never heard of that) alongside a $10 promo code, so you could get the 8 GB for about $350. The 16 GB model can be had for $450.

So straight up, it's not a $130 higher MSRP, it's $50 as stated later. The street pricing gap isn't $130 either. I'm not sure where that line came from. Edit: It was probably supposed to be "13%" instead of "$130".

The 16 GB model is 13% more expensive at MSRP, or about 18-29% more expensive at street prices: 18% if you get the 8 GB for $380, 29% if you get it at $350.

1080p ultra
21 game geomean = +12% avg, +16% lows
15 game raster geomean = +5% avg, +8% lows
6 game raytracing geomean = +37% avg, +44% lows

So with the MSRPs and the 21 game geomean average (and 1080p ultra is a better case for the 16 GB than 1080p medium), the 5060 Ti 16 GB's performance uplift doesn't quite cover the price difference (+13% money for +12% performance).
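As a quick Python sketch of the arithmetic in this post (using the MSRPs and street prices quoted above, which will shift over time, and the 21 game 1080p ultra geomean as the performance number):

msrp_8, msrp_16 = 379, 429
street_8_low, street_8, street_16 = 350, 380, 450
print(f"MSRP premium:   {msrp_16 / msrp_8 - 1:.1%}")  # ~13.2%
print(f"Street premium: {street_16 / street_8 - 1:.1%} to {street_16 / street_8_low - 1:.1%}")  # ~18.4% to ~28.6%
perf_uplift = 0.12  # 21 game geomean average at 1080p ultra, from above
print(f"FPS per dollar at MSRP: {(1 + perf_uplift) * msrp_8 / msrp_16 - 1:+.1%}")  # ~-1.1%

By these numbers the 16 GB actually delivers slightly fewer frames per dollar at MSRP, which is presumably part of why the article's "7% greater FPS per dollar" line looks fishy.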

Raytracing on the 8 GB model is being dragged down significantly by the Indiana Jones result. It could be argued that's fair, as it indicates that the 8 GB is not enough. It's a small sample size though.

If you are willing to turn down settings (e.g. 1080p medium), the 8 GB card copes better. Performance is being left on the table at higher resolutions, but again, Indiana Jones is dragging it down. For example, 1440p raster average is only 11% higher (however, lows are 21% higher).
 
This math seems fishy. ... So straight up, it's not a $130 higher MSRP, it's $50 as stated later. The street pricing gap isn't $130 either. I'm not sure where that line came from.
I sent Paul/Aaron a note (from the great beyond). I don't know if this was just outdated text based on real-world pricing (at one point the 16GB card was trending closer to $500 than $430), but these days you can find the 8GB card starting at $370, or $350 if you want to try to get Newegg's mail-in rebate. The 16GB can be had for $430, straight up from Amazon and Newegg with no rebates.
 
After reviewing my comment, I think the "$130 higher MSRP" was almost certainly a typo for "13%". 429 / 379 = 1.13192...
 
In January I upgraded from an Nvidia GeForce GTX 690 to an Nvidia GeForce RTX 3050 6GB ($180).

I run my PC at 4k, 10bit, 144Hz

I really see no reason to buy a more expensive card.

The only glitch I had in upgrading was that Samsung packed an HDMI 2.0 cable with my Samsung Odyssey G7 28” 4K UHD IPS AMD FreeSync Premium Pro & G-Sync Compatible Smart 144Hz 1ms Gaming Monitor.

It was causing an occasional 2 to 4 second black screen - maybe 2 or 3 times per day.

When I finally figured out that Samsung had sent me an HDMI 2.0 cable to use with my HDMI 2.1 monitor, I upgraded the cable and the problem was solved.

The other thing that took me a while to figure out is that Windows 11 defaults to 150% display scaling if you are using a 4K monitor. That makes things bigger but a little bit fuzzy. I changed the Windows 11 scaling to 100%, and now I see a lot more (but obviously smaller), and everything is sharper and just as easy to see.

I'm guessing running 4k vs 1080p saves me at least an hour per day of scrolling.

Oh - by the way - I hate Samsung "Smart Screen" - but, so does everyone else. When I bought the monitor in January at Best Buy (after-Christmas returns everywhere) - there were huge stacks of "open box" smart Samsung monitors for sale. So I got a $1,200 monitor for $380. I hate "smart monitors"; they are soooo annoying.
 
The 8GB card is worse for sure, but it could still be a valid choice if you plan on using a 1080p screen and don’t care about raytracing.

If you want 1440p or more AND raytracing, then the 16GB card doesn't deliver playable framerates either; it's really only good for 1440p without RT or 1080p with RT.

Both are useless for 4K gaming in my opinion.

At least this test shows where more VRAM makes a difference and where it's pretty much useless, so you can make an informed choice. The difference in price is not much, but it's not nothing, so why give it to NVidia if you won't need the extra VRAM anyway?
 
They almost had it, but the bias leaked through when they mixed RT and non-RT.
The writer already had a conclusion and just wrote material to support that conclusion.


8GB is sufficient for 1080p/1440p up to High / Very High. Ultra on modern games simply has texture sizes that are too large, which honestly do not belong at resolutions like 1920x1080 or 2560x1440, but do belong at 3840x2160, 7680x4320 or the various ultrawide desktop displays. 8GB is for entry-level 128-bit cards like the xx50 (now nVidia calls them xx60). No enthusiast should be buying those cards.
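For a rough sense of why ultra texture packs eat VRAM, here is a back-of-the-envelope Python sketch (assuming BC7 block compression at 1 byte per texel and a full mip chain; real games vary a lot):

# Hypothetical estimate: VRAM for a single BC7-compressed color texture.
def texture_mib(resolution_px, bytes_per_texel=1.0):
    base = resolution_px * resolution_px * bytes_per_texel  # top mip level
    return base * 4 / 3 / 2**20  # mip chain adds ~1/3; convert to MiB
for res in (2048, 4096, 8192):
    print(f"{res}x{res}: ~{texture_mib(res):.0f} MiB")
# 2048x2048: ~5 MiB, 4096x4096: ~21 MiB, 8192x8192: ~85 MiB

A few hundred unique 4K-class textures can therefore plausibly eat several GB on their own, before the framebuffer, geometry and any RT structures.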

IMHO the 4060 Ti / 5060 Ti should never have existed as 128-bit cards and should instead have been cut-down versions of the higher-tier 192-bit cards. But nVidia is playing games with model numbers to justify ridiculous per-chip prices.
 
Let me fix your conclusion:

No, 8GB is not enough unless you play indie games, older games, or you like upscaling 720p to 1080p.

People who buy the "budget" card generally do so with long-term usage of said card in mind... if 8GB isn't usable today, then it only gets worse as time goes on.
The resale value of these is trash.

You are literally better off not getting a GPU for a month or two more and going for a 16GB card, as it can actually play stuff.
 
Same logic for AMD, mind you. The 9060 XT 8GB should not exist either. This tier does not need two VRAM configs; it just needs the better config.
While I'm not super thrilled about it, I think AMD can be afforded quite a bit more slack in this regard, for two reasons:

1 - At this point the 9060 XT is, particularly with discounts (albeit in the form of a Newegg gift card), as low as $250.

2 - More importantly, there was a video, I think it was Hardware Unboxed, that did a comparison between the 9060 XT 8GB and the 5060 Ti 8GB, and found that the AMD card suffers notably less from the VRAM limitation than does the Nvidia card.
 
If you have an 8GB card, just play on "High" textures instead of "Ultra".

I wouldn't say the difference is merely "unnoticeable"... it's more that your sub-8K monitor is not technically capable of displaying a difference. Not unless the developer seriously screwed up how they used and optimized their textures, and you probably aren't playing those games.

But most modern games look pretty good, even on low. And you aren't even going to play the game for very long before its generic live service fails and gets shut down, so it's not a big deal. But then again, you probably just paid $70+ upfront for a 3-month ticket to play a free-to-play game that is less fun than the popular actually-free-to-play games it was ripping off, which are also playable on integrated graphics and phones. So you don't care what the game looks like, you just looove wasting money.
Play that slot machine, cha-CHING baby. Get those imaginary high heels!
 
2 - More importantly, there was a video, I think it was Hardware Unboxed, that did a comparison between the 9060 XT 8GB and the 5060 Ti 8GB, and found that the AMD card suffers notably less from the VRAM limitation than does the Nvidia card.
Yeah, it was a HWU video.
It boils down to the PCIe 5.0 x8 interface on the 5060 Ti vs x16 on the 9060 XT. The 5060 Ti chokes on PCIe bandwidth when it runs out of VRAM and has to spill over the bus.
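For scale, here is the standard PCIe 5.0 link math as a small Python sketch (raw per-direction figures, ignoring protocol overhead; not numbers from the video):

def pcie5_gb_s(lanes):
    per_lane = 32e9 * 128 / 130 / 8 / 1e9  # 32 GT/s with 128b/130b encoding, ~3.94 GB/s per lane
    return lanes * per_lane
print(f"x8:  ~{pcie5_gb_s(8):.0f} GB/s")   # RTX 5060 Ti
print(f"x16: ~{pcie5_gb_s(16):.0f} GB/s")  # RX 9060 XT

So once an 8GB card starts spilling textures into system RAM, the 5060 Ti has roughly half the bus bandwidth of the 9060 XT to fall back on.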
 
I play lots of games at 4K or 2560x1440 with the 4060... Some games run into VRAM faults.
Using the Intel UHD 770 as the primary graphics adapter and the RTX 4060 as an accelerator can improve some of those VRAM and FPS problems, because Windows will always use the UHD 770 for its VRAM needs.
Some programs like Discord, Steam and browsers have hardware acceleration on by default, and that impacts the poor RTX 4060.
 
8GB would be fine in a $200 card...but absolutely, COMPLETELY unacceptable in a $400 one! It's not enough for modern games, even at lower resolutions, and WILL cripple performance. It's not like the extra 8GB is expensive, either, it's what? An extra $20 at the price points the manufacturers pay? 8GB as a minimum should have died LAST gen, and this gen, it's obnoxious that they are still being sold!
 
I understand that 8gb of vram isn't enough for a lot of people, and there is a borderline absurd amount of discourse surrounding technologies like DLSS and frame gen, but I've been getting by on a laptop with a 4060, and it's been perfectly fine. Better than fine, actually. Cyberpunk and Elden Ring run phenomenally well at 1600p on max settings. TLOU2, God Of War, Ghost Of Tsushima. It handles everything I throw at it better than I think it will. All of the older games that I spend 90% of my gaming time playing run perfectly. Obviously if it weren't for upscaling we would be having a different conversation, but I'm not what one would call a "pixel peeper" so it doesn't bother me at all. As long as I can get over 60 fps I'm happy. However, looking at a game like the Witcher 4 makes me think I might have some issues in the near future. We shall see. To be frank I think a lot of people just like complaining, or nothing is ever good enough, or they get caught up in the cycle of negativity and discourse or whatever. But for now, yes, I'd say 8gb of vram is passable.
 
>I understand that 8gb of vram isn't enough for a lot of people

It depends on the context. 8GB of VRAM isn't enough for a lot of people *HERE*. If you widen the context to be representative of the wider gaming population, e.g. those in the Steam survey, the statement is no longer true.

>but I've been getting by on a laptop with a 4060, and it's been perfectly fine. Better than fine, actually.

This forum is an echo chamber of like-minded gamers, and their sources of info are tech tubers and reviewers who all say "8GB isn't enough." In this environment, such a claim easily becomes the group consensus, and goes on to be a "truth." But the "truth" is conditional, contingent on the environment (clique) you're in.

>However, looking at a game like the Witcher 4 makes me think I might have some issues in the near future. We shall see.

Again, it depends on your mindset and expectations. Say that you've never heard of the "8GB isn't enough" claim. Then, when you encounter a game that hiccups because of the VRAM limit, you would do what every gamer does: lower your settings until the hiccup stops. As reported by those same tech tubers, 1080p at medium settings is fine for the vast majority of games, including AAA games.

But if your expectations have been set by the above "truth" that 1080p medium isn't acceptable, then dissatisfaction ensues, and you become an adherent of said truth.

>To be frank I think a lot of people just like complaining, or nothing is ever good enough, or they get caught up in the cycle of negativity and discourse or whatever.

Ranting is the favorite online pastime. It's normal behavior. It's just noise that you learn to tune out once you've been online long enough.

>But for now, yes, I'd say 8gb of vram is passable.

I think what's important is that you set your own standards of what is "good enough," and not be swayed by what others think. When you're in an echo chamber like this one, it's easy to be coaxed by "truths" that don't align with your value system.
 
I think what's important is that you set your own standards of what is "good enough," and not be swayed by what others think. When you're in an echo chamber like this one, it's easy to be coaxed by "truths" that don't align with your value system.

What I've always done is break things down by use case and who would be in that use case. Entry-level cards are perfectly fine for the vast majority of the world's consumers, and this includes the majority of PC gamers.

We can debate chip costs being absurd, but the market for these entry level 8GB cards exists. This makes all the people saying "8GB cards are dead / garbage / insert popular insult here" factually wrong. They are being sold in large quantities, just not to enthusiasts.

Enthusiasts will be buying enthusiast-class cards like the 70 or 80 models. Those are the cards that meet the use cases of that group. Demanding that an entry-level card be equipped like an enthusiast card is asinine.

I believed those tech YTers were deliberately misleading viewers in the hopes of impacting sales as a way to punish nVidia for their absurd chip prices and malicious segmentation. It didn't work, as the primary consumers of those cards don't even know their names.
 