Nvidia GeForce RTX 4060 Ti 16GB Review: Does More VRAM Help?

I just want to clarify something here: The score is a result of pricing as well as performance and features, and I took another look at our About page and the scores breakdown. This is most definitely a "Meh" product right now. Some of my previous reviews may have been half a point (star) higher than warranted. I've opted to "correct" my scale, and thus this lands at the 2.5-star mark.

I do feel our descriptions of some of the other scores are way too close together. My previous reviews were based more on my past experience and an internal ranking that's perhaps not what the TH text would suggest. Here's how I'd break it down:

5 = Practically perfect
4.5 = Excellent
4 = Good
3.5 = Okay, possibly a bad price
3 = Okay but with serious caveats (pricing, performance, and/or other factors)
2.5 = Meh, niche use cases
...

The bottom four categories are still basically fine as described. But the TH text has pretty much everything from 3 stars to 5 stars as "recommended," and that doesn't really jibe with me. 🤷‍♂️

This would have been great as a 3060 Ti replacement if it had 12GB and a 192-bit bus with a $399 price point. Then the 4060 Ti 8GB could have been a 3060 replacement with 8GB and a 128-bit bus at the $329 price point. And RTX 4060 would have been a 3050 replacement at $249.

Fundamentally, this is clearly a worse value and spec proposition than the RTX 4060 Ti 8GB and the RTX 4070. It's way too close to the former and not close enough to the latter to warrant the $499 price tag.

All of the RTX 40-series cards have generally been a case of "good in theory, priced too high." Everything from the 4080 down to the 4060 so far got a score of 3.5 stars from me. There's definitely wiggle room, and the text is more important than just that one final score. In retrospect, I still waffle on how the various parts actually rank.

Here's an alternate ranking, based on retrospect and the other parts that have come out:

4090: 4.5-star. It's an excellent halo part that gives you basically everything. Expensive, yes, but not really any worse than the previous-gen 3090 / 3090 Ti, and it's actually justifiable.

4080: 3-star. It's fine on performance, but the generational price increase was just too much. 3080 Ti should have been a $999 (at most) part, and this should be $999 or less.

4070 Ti: 3-star. Basically the same story as the 4080: performance is fine, but it's priced way too high generationally.

4070: 3.5-star. Still higher price than I'd like, but the overall performance story is much better.

4060 Ti 16GB: 2.5-star. Clearly a problem child, and there's a reason it wasn't sampled by Nvidia or its partners. (The review would have been done a week ago, but I had a scheduled vacation.) This one is already scored on "Jarred's adjusted ranking."

4060 Ti 8GB: 3-star. Okay, but still a higher price than we'd like, and the 128-bit interface is an issue.

4060: 3.5-star. This isn't an amazing GPU, but it's cheaper than the 3060's launch price, which mostly makes up for the 128-bit interface, 8GB of VRAM, and 24MB of L2. Generally a better overall pick than many of the other 40-series GPUs.

AMD's RX 7000-series parts are a similar story. I think at current prices the 7900 XTX works as a $949 part and warrants its 4-star score. The 7900 XT has dropped to $759 and also warrants the 4-star score, maybe. The 7600 at $259 is still a 3.5-star part. So, like I said, there's wiggle room. I don't think any of the charts or text are fundamentally out of line, and a half-star adjustment is easily justifiable on almost any review I've done.
 

Lord_Moonub

Commendable
Nov 25, 2021
12
11
1,515
Jarred, thanks for this review. I do wonder if there is more silver lining on this card we might be missing, though. Could it act as a good budget 4K card? What happens if users dial back settings slightly at 4K (e.g. no ray tracing, no bleeding-edge ultra features) and then make the most of DLSS 3 and the extra 16GB of VRAM? I wonder if users might get something visually close to a top-line experience at a much lower price.
 
If you do those things, the 4060 Ti 8GB will be just as fast. Basically, dialing back settings to make this run better means dialing back settings so that more than 8GB isn't needed.
 

Greg7579

Commendable
Jun 11, 2022
61
29
1,560
Jarred, I'm building with the 4090 but love reading your GPU reviews, even the ones that are far below what I would build with because I learn something every time.
I am not a gamer but a GFX medium format photographer, and I have multiple TB of high-res 200MB raw files that I work with extensively in Lightroom and Photoshop. I build every 4 years and update as I go. I build the absolute top end of the PC arena, which is way overkill, but I do it anyway.
As you know, Lightroom has many amazing new AI masking and noise reduction features that are like magic, but so many users (photographers) are now grinding to a halt on their old rigs and laptops. Photographers tend to be behind the gamers on PC / laptop power. It is common knowledge on the photo and Adobe forums that these new AI capabilities eat VRAM like Skittles and lean heavily on the GPU for the grind. (Adobe LR & PS were always behind on using the GPU alongside the CPU for editing and export tasks, but now they're going at it with gusto.) When I run AI DeNoise on a big 200MB GFX file, my old rig with the 3080 (I'm building again soon with the 4090) takes about 12 seconds to grind out the task. Other rigs photographers use take several minutes or just crash. The Adobe and Lightroom forums are full of howling and gnashing of teeth about this. I tell them to start upgrading, but here is my question.... I can't wait to see what the 4090 will do with these photography-related workflow tasks in LR.
Can you comment on this and tell me if this new Lightroom AI masking and DeNoise (which is a miracle for photographers) is indeed so VRAM intensive that doubling the VRAM on a card like this would really help a lot? Isn't it true that Nvidia made some decisions 3 years ago that resulted in not having enough (now far cheaper) VRAM in the 40-series? It should be double or triple what it is, right? Anything you can teach me about increased GPU power and VRAM in Adobe LR for us photographers?
 
  • Like
Reactions: voyteck

atomicWAR

Glorious
Ambassador
Some of my previous reviews may have been half a point (star) higher than warranted. I've opted to "correct" my scale and thus this lands at the 2.5-star mark.

Thank you for listening, Jarred. I was one of those claiming on multiple recent GPU reviews that your scores were about a half star off, and I wasn't alone in that sentiment either. I was also quick to defend you from trolls, though, as you clearly were not shilling for Nvidia. This post proves my faith in you was well placed. Thank you for being a straight arrow!
 
Greg7579 said: Can you comment on this and tell me if this new Lightroom AI masking and DeNoise is indeed so VRAM intensive that doubling the VRAM on a card like this would really help a lot? ...
Honestly, I never use Lightroom so it's not really on my radar. For better or worse, I started with Photoshop back in the aughts and learned that well enough that I never tried Lightroom. Which is funny because Lightroom ostensibly does more of what I need (i.e. cleaning up photos of products) than Photoshop.

I do have access to Lightroom (Adobe CC), so in theory I could try and test this. But the problem is knowing what I really need to test / how to test it. I rarely do anything complex enough that Photoshop bogs down, but I'm pretty much confined to 14MP photos taken on my rather old (but still works fine) DSLR — Nikon D3100 if you're wondering.

Drop me an email / PM / link to a photo that you work with that bogs down, and we can have a discussion of what I'd need to do to test some of these AI workloads. I'm certainly interested myself in how they scale across different GPUs, particularly stuff like the 4060 Ti 8GB/16GB.
 
Thanks for the review Jarred and also for covering the "pro" side of things! The correction and clarification on the scoring system is also most welcome.

The proof is always in the puddin', for sure!

Super niche card that gamers should just stay the hell away from, unless they have one of those super niche use cases to cover.

So, cynical old me can absolutely see the redemption arc: Nvidia cuts prices $50 across the board next generation with a 10% generational uplift, and people will love it.

Regards.
 

PEnns

Reputable
Apr 25, 2020
702
747
5,770
"... and the RX 6800 XT, which only costs a bit more at $520, delivers superior performance."

Yep. A card that's almost 3 years older still wipes the floor with anything from Nvidia in most categories!!
 

Eximo

Titan
Ambassador
So odd. 16GB would have made more sense on the 4070 / 4070 Ti cards instead of 12GB (at their launch prices).

Yes, and that's how it should have been: 4090 at 384-bit, 4080 at 320-bit (20GB), 4070 Ti at 256-bit (16GB), 4070 at 16GB, 4060 Ti at 16GB or 12GB depending on how the yields went, and 4060 at 192-bit (12GB).

But instead they didn't make a 3080 or 3060 equivalent, pushed the 4070-class GPU as a 4080, and tried to market the "4080 12GB" while they were at it. Just as InvalidError stated.
 

abufrejoval

Reputable
Jun 19, 2020
601
435
5,260
It's a niche card, but wherever a market is large enough, niches are, too.

I'm doing far more CUDA with my Nvidia cards than gaming: that's usually what my kids will do with them, after I'm done, so I keep an eye on both.

And with CUDA, especially with machine learning on CUDA, RAM is everything.

The most wonderful thing about an RTX 3090 or 4090 was those 24GB, which let you fit the 30B Llama model; that has zero practical use in any game I know. Even the RTX 4090 still struggles with ARK Survival Evolved at 4K and eye-candy settings (the game doesn't support DLSS or anything similar), so 8K is out of the question. 24GB is about as much use for gaming as 128GB is on the CPU side: it's really about workstation use cases at PC prices. And from that perspective, 16-core Ryzens and RTX x090 cards are crazy economical.

Too bad Meta chose not to release a 30B model for Llama 2, which means the 70B won't fit and for the 13B I don't need the big card.

If you think that this card is overpriced, try running a 70B model: 48 or 80GB of GPU RAM will cost you a kidney in today's market.

Even with 16GB of VRAM you won't have a lot of fun running LLMs, because they need cores, too. But when it comes to doing feasibility studies, or if your CS department has to make do with less, it can be the difference between looking on and being at least in a minor-league game.
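
To put rough numbers on that, here's the back-of-envelope estimate I use (my own rule of thumb, not an official sizing tool): weights take roughly parameter count times bytes per parameter, plus some headroom for activations and the KV cache.

Code:
# Rough VRAM estimate for LLM inference: my own rule of thumb, not an official tool.
def estimate_vram_gb(params_billion: float, bytes_per_param: float, overhead: float = 1.2) -> float:
    """Weights = params x bytes/param, times ~20% headroom for activations and KV cache."""
    return params_billion * bytes_per_param * overhead  # 1e9 params * 1 byte is ~1 GB

for name, params in [("Llama 30B", 30), ("Llama 2 13B", 13), ("Llama 2 70B", 70)]:
    fp16 = estimate_vram_gb(params, 2.0)  # 16-bit weights
    q4 = estimate_vram_gb(params, 0.5)    # ~4-bit quantized weights
    print(f"{name}: ~{fp16:.0f} GB at fp16, ~{q4:.0f} GB at ~4-bit")

# 30B at ~4-bit lands around 18 GB, which is why it fits on a 24GB card;
# 70B needs roughly 42 GB even quantized, hence the 48/80GB class of hardware.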

And with some of the LLM models you can actually run them on multiple GPUs with only a PCIe fabric in between, because they were designed on vast scale-out GPU networks where the fabric isn't always NVLink either: they do much better than your own run-of-the-mill-and-out-of-RAM model.
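
For anyone who wants to try the multi-GPU route, a minimal sketch with Hugging Face transformers/accelerate looks something like the below. I'm assuming you have the (gated) Llama 2 weights and enough combined VRAM; the model ID and the 4-bit setting are just illustrative.

Code:
# Minimal sketch: shard one model across whatever GPUs are visible, plain PCIe in between.
# Assumes transformers + accelerate + bitsandbytes are installed and that you have access
# to the (gated) Llama 2 weights; the model ID is just an example.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Llama-2-13b-hf"  # swap in whatever fits your combined VRAM

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # accelerate spreads the layers across all visible GPUs
    quantization_config=BitsAndBytesConfig(load_in_4bit=True),  # ~0.5 bytes per weight
)

inputs = tokenizer("The RTX 4060 Ti 16GB is", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=40)[0]))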

For that case this 2.5-slot design is obviously no good. I've gone with PNY for both my RTX 4090, because it's a triple-slot card where most are 3.5, and for my 4070, because the PNY Verto is a dual-slot design.

There are used EPYCs out on eBay that would let you run a quad of dual-slot GPUs for 48GB of VRAM, better than a single RTX 3090 or 4090 if the model isn't too highly connected.

The 4070 is only 12GB, but would I pay a premium for 24GB? Bet you I would! Not for gaming, but for ML it's a no-brainer!
Ditto the 4090: getting to 48GB means crossing the €8500 price gap between the RTX 4090 and the A100.

Of course it's HBM vs. GDDR6 and some other tiny details, but at, say, €2000 instead of €1500 for 48GB of GDDR6X I'd jump very quickly and probably purchase two, because that couldn't last.

Nvidia made an interesting decision not to lock their consumer cards out of ML, unlike previous generations, where they culled FP64 support to discourage HPC [ab]use of consumer hardware.

Putting out a 16GB entry-level model may well be a carefully aimed move to avoid AMD or even Intel gaining a lead at the entry level of ML.

And those who consciously choose this card precisely for its ML capabilities know they got off cheap--relatively speaking.
 
  • Like
Reactions: voyteck

abufrejoval

Reputable
Jun 19, 2020
601
435
5,260
On another note: the free-fall prices on Enthusiast NUCs sent me on a buying spree of two, one each of a Phantom Canyon (NUC11PHKi7C, Tiger Lake i7-1165G7 quad-core + RTX 2060 mobile 6GB) and a Serpent Canyon (NUC12SNKi72, Alder Lake i7-12700H + Arc A770M 16GB).

The former currently sells for €450 including VAT and the latter for around €600: that's somewhere between 1/3 and 1/2 of their MSRPs, and really excellent hardware for a price that's 'rather nice', to put it bluntly.

For €50 over current listings of an RTX 4060 Ti you get a full barebone with a rather potent CPU that gets very near desktop speeds while staying just below where the CMOS knee starts jerking, plus a plethora of ports, a well-built chassis, etc., and those magic 16GB of 256-bit VRAM on a GPU that rasters faster than the 4060 but currently supersamples somewhat less intelligently.

The gen 11 is truly adorable: the quietest NUC I've ever known under FurMark + Prime95, small and beautiful, and it handles every game I've thrown at it with ultra settings at FHD, as long as DLSS 2.0 is supported.

The gen 12 struggles to stay as quiet because the same fan setup needs to deal with 100 more peak watts, but for anything up to 2560x1440 at 60-144Hz I've not seen it falter, with XeSS adding around 30% where supported.

So if gaming economy is your thing, please have a look at these truly wonderful Enthusiast NUCs. Too bad they're not sustainable at current prices, but as crazy as they were at the original MSRP, they're the sanest choice at the current sell-out prices.

And really solid, lovingly made hardware with excellent BIOS control over all TDP and fan aspects.

Should I mention they work just as well on Linux?

No, it doesn't run CUDA. But Stable Diffusion seems oneAPI closer...

If you're looking for console performance, quality, and economy with PC utility, I honestly can't think of anything easier to recommend (I'm certainly biased, but I spent my own hard-earned money on these).
 

abufrejoval

Reputable
Jun 19, 2020
601
435
5,260
So odd. 16GB would have made more sense on the 4070 / 4070 Ti cards instead of 12GB (at their launch prices).
I'm sure, but not all powers of 2 align perfectly with sense or demand.

These days there are headlines around non-binary DRAM capacities, but it's just as bad or worse with GDDR, I believe.

GPU designers can't just freely choose any permutation of RAM capacity, bus width, and GPU size.
And unfortunately, "VRAM DIMM slots" stopped working a long time ago.

I'd easily pay €900 for a 24GB RTX 4070 and €2000 for a 48GB RTX 4090.
Instead I had to fork out €700 for the 12GB RTX 4070 and €1600 for the 24GB RTX 4090.

But with a quad RTX 4060 Ti setup I might be able to run a 70B Llama 2 instead of paying €8,000 for a 48GB A100... not the same speed, sure, but the same model...

Life is full of hard choices!
 

InvalidError

Titan
Moderator
I'm sure, but not all powers of 2 align perfectly with sense or demand.
It would have aligned perfectly if Nvidia had gone 192 bits / 12GB for the 4060 (Ti) and up 32 bits / 4GB per tier from there. If Nvidia learned its lesson from how the 3000-series made most of the 4000-series irrelevant due to VRAM size and bandwidth problems, that is what the 5000-series should be aiming for.
 

Sleepy_Hollowed

Distinguished
Jan 1, 2017
536
237
19,270
This is a massively hard card to recommend, like the 4060, unless it's the only thing you can get that checks your boxes.

The prices of the higher cards make them hard to recommend as well, so I'm recommending that my friends skip this generation unless there's a real need.

What a terrible generation of cards.