GeForce RTX 5060 Ti 8GB vs RTX 5060 Ti 16GB GPU face-off: how much difference does twice the memory make for the same chip?

No one buys a 4060 or something while expecting stellar performance.
Hell, 4K ultra? If that's a concern, then surely you put some money on the table to make it possible by buying a more capable GPU instead.
For 1080p, though? With ray tracing and DLSS, 8GB does the job just fine unless you're trying to play Alan Wake 2, which is a beast of its own (I played it on a 2060, though).
Just like my VRAM, my budget is limited, so I can't just dump 100€ more to gain 5 fps at a higher resolution. What I mean is that the people buying these GPUs already operate on a limited budget with limited expectations. You can keep telling them that DLSS looks awful at 1080p or that frame gen is unacceptable, but we have different standards for what is or isn't good.
Now that doesn't mean I have no standards, I always try and aim for 120 fps if I can, or 60 at the absolute lowest for really heavy games, and outside of AW2 that hasn't stopped me from turning on ray tracing in any game.
 
I sent Paul/Aaron a note (from the great beyond). I don't know if this was just outdated text based on real-world pricing (at one point the 16GB card was trending closer to $500 than $430), but these days you can find the 8GB card starting at $370, or $350 if you want to try to get Newegg's mail-in rebate. The 16GB can be had for $430, straight up from Amazon and Newegg with no rebates.
Nice to see your writing again!
 
I understand that 8GB of VRAM isn't enough for a lot of people, and there is a borderline absurd amount of discourse surrounding technologies like DLSS and frame gen, but I've been getting by on a laptop with a 4060, and it's been perfectly fine. Better than fine, actually. Cyberpunk and Elden Ring run phenomenally well at 1600p on max settings. TLOU2, God of War, Ghost of Tsushima: it handles everything I throw at it better than I think it will. All of the older games that I spend 90% of my gaming time playing run perfectly. Obviously if it weren't for upscaling we would be having a different conversation, but I'm not what one would call a "pixel peeper", so it doesn't bother me at all. As long as I can get over 60 fps, I'm happy. However, looking at a game like The Witcher 4 makes me think I might have some issues in the near future. We shall see. To be frank, I think a lot of people just like complaining, or nothing is ever good enough, or they get caught up in the cycle of negativity and discourse or whatever. But for now, yes, I'd say 8GB of VRAM is passable.
Where is THE DISLIKE BUTTON?
 
Say that you've never heard of the "8GB isn't enough" claim. What happens then, when you encounter a game that hiccups because of the VRAM limit, is that you do what every gamer would do: lower your settings until the hiccups stop. As reported by those same tech tubers, 1080p at medium settings would be fine for the vast majority of games, including AAA games.
Say that you've never heard of texture quality and a lot of other graphics settings that mostly depend on VRAM, which is not expensive, but which have a great impact on the quality of what you see.
Say that you don't know the resolution of your own monitor (which most probably is higher than 1920x1080).
 
The issue becomes that the basic gaming card is still $400. That's a chunk for a lot of people.
This, actually. It's kind of ridiculous that a brand-new $400 GPU can't do 1080p ultra. I'm kind of fine with them selling the halo product at a sky-high price, but you had better offer something reasonably decent at 4K ultra for most current games at a reasonable price, plus a "student" kind of entry gaming card where the 60 series used to be, at something like $200 for 1080p high settings. A halo product is 50% show-off and future-proofing, but buying a $400 card and having to live with 1080p and compromised settings is... kind of too greedy.
 
Hell, 4K ultra? If that's a concern, then surely you put some money on the table to make it possible by buying a more capable GPU instead.
For 1080p, though? With ray tracing and DLSS, 8GB does the job just fine unless you're trying to play Alan Wake 2, which is a beast of its own (I played it on a 2060, though).
It's funny how some people say something like this. Are all monitors 1080p or 4K? Do you have to put every setting at max when you have a 4K monitor?
No! Most monitors sit somewhere between those two (2560x1080, 2560x1440, 3440x1440), and at those resolutions 8GB of VRAM is just not enough.
The answer to the second question is no as well! Resolution and graphics settings are mostly independent, and there are a ton of different graphics settings, both in the graphics card's control panel and in games themselves; you can tweak every single one of them or choose between a number of presets.
 
While I'm not super thrilled about it, I think AMD can be afforded quite a bit more slack in this regard, for two reasons:

1 - at this point the 9060 XT is, particularly with discounts (albeit in the form of a Newegg gift card), as low as $250

2 - More importantly, there was a video, I think it was Hardware Unboxed, that did a comparison between the 9060 XT 8GB and the 5060 Ti 8GB, and found that the AMD card suffers notably less from the VRAM limitation than does the Nvidia card.
I wouldn't say this makes it a "good vs bad" choice, but rather "bad vs worse".

You're not wrong in mentioning it, but don't give AMD a pass either.

Neither 8GB model has any reason to exist.

Also, I'm glad there's a lot of "it works on my machine!" sentiment. Be happy and I hope reality doesn't slap you in the face very suddenly one day.

Regards.
 
Let's take a walk. An 11% difference is barely perceptible to human senses; around 10% is where you might just start to notice.

None of those frame rates are bad. The lure of "it has to be 3000 fps!" is crap. From the first home video game consoles in the 70s, we all played at roughly 240 lines, 60Hz, on interlaced TVs. It didn't matter how fast the console was, you were seeing interlaced 60Hz output that looked more or less like 30fps. Consoles were often advertised as more, but you wouldn't have seen it. Some TVs and VHS recorders could go up to ~400 lines with special tapes; after trying that a few times, suffice it to say that any difference was hard to see, if it was even compatible at all. That was around 30 years of effectively 30fps until 720p over HDMI, and then another 6-8+ months before consoles supported it.

These aren't 1440p cards. If you want that at this price point, it's a B580. If you wanna pay the Nvidia tax, value isn't of interest; same for AMD, to a lesser extent.

What I've seen over the last few years is YouTube and website techsperts who get their hardware for free and get products in advance, but are paid to promote them. I saw a few "trustworthy" YouTubers push a mini NAS as awesome because the company was giving them $31 for each unit sold through their affiliate links. They've therefore pushed a sell-up message for Nvidia, AMD and Intel. The device has loads of performance and thermal issues that only a FEW YTers covered.

And then Nvidia didn't send them free 5050s or 5060s, and shipped them the day before a convention they were all going to, so they couldn't do day-one coverage. One YT channel did it overnight with a bought card and was quite pissed at everything about it, because it didn't fit with his channel's sell-up and affiliate deals.
 
They almost had it, but the bias leaked through when they mixed RT and non-RT.
The writer already had a conclusion and just wrote material to support that conclusion.


8GB is sufficient for 1080p/1440p up to High / Very High. Ultra on modern games simply has texture sizes too large, which honestly do not belong at resolutions like 1920x1080 or 2560x1440, but do belong at 3840x2160, 7680x4320 or the various ultrawide desktop displays. 8GB is for entry-level 128-bit cards like the xx50 (which nVidia now calls xx60). No enthusiast should be buying those cards.

IMHO the 4060 Ti / 5060 Ti should never have existed as 128-bit cards and instead should have been cut-down versions of the higher-tier 192-bit cards. But nVidia is playing games with model numbers to justify ridiculous per-chip prices.
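For a sense of scale on the texture-size point, here's a rough, purely illustrative sketch (the 4096x4096 size and the format choices are my assumptions, not figures from the article):

```python
# Rough memory footprint of a single texture: width * height * bytes per texel.
# Purely illustrative; real games stream mip levels and mix compression formats.
def texture_mib(width: int, height: int, bytes_per_texel: float) -> float:
    return width * height * bytes_per_texel / (1024 ** 2)

# A single 4096x4096 "ultra" texture:
print(f"Uncompressed RGBA8: {texture_mib(4096, 4096, 4):.0f} MiB")                    # 64 MiB
print(f"BC7 block-compressed (1 byte/texel): {texture_mib(4096, 4096, 1):.0f} MiB")   # 16 MiB
```

A few hundred textures like that resident at once, plus framebuffers and geometry, is how an ultra preset climbs past an 8GB budget.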
Not sure what the "bias" is here? That if you're spending $350 or more, 8GB is a bad investment? The problem is that $50 more gives you so much more potential over the long haul. If this was a $200~$250 card versus a $250~$300 card, you might be able to argue that the extra $50 for doubling the memory is a bit of a tough sell. Bumping the price up to $380 vs. $430, though?

Nvidia shouldn't have created the 5060 Ti 8GB. Period. The regular 5060 already covers the same market for less money, and yes it's a bit slower... but if you don't need 16GB, then you sure as hell don't need the extra performance of the 5060 Ti vs the 5060.

You might think RT is stupid (and it often is!), but it's also a good proxy for more demanding rasterization games. If today's games without RT are right at the limit of 8GB, and well beyond it with RT enabled, then tomorrow's games without RT will also be beyond the limit of 8GB. And yes, lowering details and settings reduces VRAM requirements, but then you might as well get a lower tier card to go with it.

I still think the potential RTX 5060 Ti Super with 12GB (4x3GB GDDR7) will also be good when/if it arrives. 12GB for this tier is at least acceptable though not awesome. (It's a much harder sell on the 5070!) But having both the 8GB and 16GB 5060 Ti (and 9060 XT!) at launch just feels so disingenuous in 2025.
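To put rough numbers on that $380 vs $430 point (street prices mentioned in this thread, treated as assumptions rather than official MSRPs), a quick sketch:

```python
# Back-of-the-envelope value comparison for the two 5060 Ti variants.
# Prices are street prices cited in this thread, not official MSRPs.
price_8gb, vram_8gb = 380, 8      # USD, GB
price_16gb, vram_16gb = 430, 16   # USD, GB

premium_pct = (price_16gb - price_8gb) / price_8gb * 100
print(f"Price premium for the 16GB card: {premium_pct:.0f}%")        # ~13% more money for 2x VRAM
print(f"Dollars per GB of VRAM: {price_8gb / vram_8gb:.1f} (8GB) vs "
      f"{price_16gb / vram_16gb:.1f} (16GB)")                         # 47.5 vs ~26.9
```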
 
If you widen the context to be representative of the wider gaming population, eg those on Steam survey, the statement is no longer true.
Dude, the Steam survey is a HORRIBLE metric to use. It has tons of problems, not the least of which is that it relies on users reporting what they use. If you look at it now, it shows a tiny fraction of AMD cards being used, but SALES data suggest the percentage should be FAR higher than what it is on there!
 
Not sure what the "bias" is here? That if you're spending $350 or more, 8GB is a bad investment? The problem is that $50 more gives you so much more potential over the long haul. If this was a $200~$250 card versus a $250~$300 card, you might be able to argue that the extra $50 for doubling the memory is a bit of a tough sell. Bumping the price up to $380 vs. $430, though?

This is a value judgement, which is 100% up to the consumer. I happen to agree that all the nVidia chips are outrageously priced but I'm not arrogant enough to assume my opinion is the only valid opinion.

Nvidia shouldn't have created the 5060 Ti 8GB. Period. The regular 5060 already covers the same market for less money, and yes it's a bit slower... but if you don't need 16GB, then you sure as hell don't need the extra performance of the 5060 Ti vs the 5060.

This is a value judgement, and while we can rant and rave, the market has spoken. I happen to personally think the entire 40 and 50 series are bad value. The 5060 is just a binned 5060 Ti (GB206); the GB206 has a 128-bit memory bus, placing it solidly in the entry-tier category. Putting "Ti" next to its name to justify charging more is the real problem. It should have been a down-binned GB205 (5070) with the 192-bit memory bus and therefore 12GB of VRAM. Selling regular people memory running at half speed is just WTF levels of abuse. Clamshell mode on a consumer card is legit an abomination; that is a feature made for workstation / datacenter cards, where the engineers involved already accept the tradeoffs.

The point of my post was that the "winner" was determined ahead of time and then everything was written to support that conclusion. Instead of an unbiased pro/con of each product, we got the usual biased "no 8GB cards in 2025" spiel. Mixing RT and rasterization, and sticking with medium vs ultra (skipping high/very high), was a very open attempt at biasing the final data, as those are all separate use cases. Anyone wanting RT isn't going to be buying an entry-level card; it's going to suck and they will not enjoy it. They really should be buying the mainstream or enthusiast tier cards, which will give them a good experience. Anyone buying into these entry-level cards should already know they are going to be compromising on RT and other advanced features.
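As a rough illustration of why the 128-bit vs 192-bit bus distinction above matters, here's a minimal sketch of peak theoretical bandwidth, assuming 28 Gbps GDDR7 (the rate commonly quoted for the 5060 Ti; treat it as an assumption):

```python
# Peak theoretical memory bandwidth: (bus width in bits / 8 bits per byte) * per-pin data rate.
# 28 Gbps GDDR7 is assumed here purely for illustration.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

for bus in (128, 192):
    print(f"{bus}-bit bus: {bandwidth_gb_s(bus, 28):.0f} GB/s")
# 128-bit -> 448 GB/s (5060 Ti class), 192-bit -> 672 GB/s (5070 class)
```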
 
Dude, the Steam survey is a HORRIBLE metric to use. It has tons of problems, not the least of which is that it relies on users reporting what they use. If you look at it now, it shows a tiny fraction of AMD cards being used, but SALES data suggest the percentage should be FAR higher than what it is on there!

It has been shown to be highly accurate as a representation of the PC gaming market as a whole.

https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

It's not just the enthusiast "PC Master Race" crowd who treat their gaming rigs' specs as a badge of honor or bragging rights. Steam is the de facto PC gaming distribution platform; any PC gamer who doesn't at least have it installed would be the definition of a statistical outlier.

Random sampling over large numbers is kind of how science works; using mathematical models you can figure out your confidence level. For Valve it's 2~3 sigma, which is 95~99% confidence, good enough for business and marketing data. Scientific data usually requires 4~6 sigma (99.994% ~ 99.99966%), which is where the myth of "Steam is unreliable" comes from.
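A minimal sketch of what that confidence math looks like, assuming a hypothetical monthly sample size (Valve doesn't publish the real one):

```python
import math

# Margin of error for a surveyed proportion p with sample size n:
#   moe = z * sqrt(p * (1 - p) / n)
# The sample size below is a made-up illustration, not Valve's actual figure.
def margin_of_error(p: float, n: int, z: float) -> float:
    return z * math.sqrt(p * (1 - p) / n)

n = 500_000   # hypothetical surveyed machines per month
p = 0.10      # e.g. a GPU family with ~10% share
for label, z in (("95% (~2 sigma)", 1.96), ("99.7% (~3 sigma)", 3.0)):
    print(f"{label}: +/- {margin_of_error(p, n, z) * 100:.2f} percentage points")
```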
 
It has been shown to be highly accurate as a representation of the PC gaming market as a whole.

https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

It's not just the enthusiast "PC Master Race" crowd who treat their gaming rigs' specs as a badge of honor or bragging rights. Steam is the de facto PC gaming distribution platform; any PC gamer who doesn't at least have it installed would be the definition of a statistical outlier.

Random sampling over large numbers is kind of how science works; using mathematical models you can figure out your confidence level. For Valve it's 2~3 sigma, which is 95~99% confidence, good enough for business and marketing data. Scientific data usually requires 4~6 sigma (99.994% ~ 99.99966%), which is where the myth of "Steam is unreliable" comes from.
It's been a bad joke for YEARS:

Btw, quoting an article from Steam themselves doesn't really make a good example, bub.
 
It's been a bad joke for YEARS:

Btw, quoting an article from Steam themselves doesn't really make a good example, bub.

Umm ... I didn't quote anything .... talk about a self own...

Every time someone complains about the Steam HW survey, I peel back the onion to see it's just their feelings. They look at the numbers and think "that doesn't FEEL right", basically just self-selecting echo chambering. When pushed for details, it ultimately comes down to someone posting a link to a forum post or editorial where the error bar for the Steam survey is figured to be 0.5~1%, aka above 2 but less than 3 sigma, plus some gaslighting of the readers about the importance of sigma in statistics. Under 3 sigma is trash for scientific studies because they demand 99.994% (4 sigma) or higher accuracy as part of the publishing criteria. Over 2 sigma (95%) is considered really good for marketing data as part of a business decision or study.

Anyhow, the Steam HW Survey is very accurate as a proxy for assessing the PC gaming market, within 2~3 sigma.
 
>Not sure what the "bias" is here? That if you're spending $350 or more, 8GB is a bad investment? The problem is that $50 more gives you so much more potential over the long haul.

I agree that if the choice were between a 5060 Ti 16GB and the 8GB version, the 16GB is the better buy hands down. The bias is that it's mostly DIYers (enthusiasts) who buy dGPUs as a separate purchase. Regular gamers tend to buy a pre-built system or laptop, and jumping to the next tier to get 16GB of VRAM would likely involve a larger price delta than just the difference between the dGPU variants.

When you say that card A is better than card B, your audience is perforce limited to the enthusiast/DIY set. That's the built-in bias.

>Nvidia shouldn't have created the 5060 Ti 8GB. Period.

Your statement is presumptuous. You have no insight into what companies should or shouldn't sell. Technical merit is just one of the many factors that go into product decision-making.

It's the same malarkey that techtubers say, that GPU X should cost only Y dollars. Tech reviewers aren't business analysts, and shouldn't pretend to be.