Review Nvidia GeForce RTX 4060 Ti Review: 1080p Gaming for $399

You say that while posting a graph of 54 fps avg. performance in 4K (and 97 fps @ 2K from the same review). Both are extremely playable. Nor am I aware of any statement from Nvidia that claims this card is only useful at 1080p. Are you?

And I'll kindly ask you to moderate your tone. Everyone who disagrees with you isn't an evil bloodsucking Nazi. I, for instance, am a bloodsucking capitalist.
I can use my Vega64 to play all my games in 4K, so the Vega64 should make a comeback!

Regards.

Note: I get ~30 FPS with everything on medium using FSR2 in Witcher 3!

EDIT: https://www.techpowerup.com/review/amd-radeon-rx-vega-64/27.html
I may be under-performing due to capping the power at 250W, but hey, still over 30 FPS with the new changes in Witcher 3.
 
While I agree with those who say Tom's was too positive about the 4060 Ti, at least they did not embarrass themselves as JayzTwoCents did. Geesh, their "review" was such a shill piece ("New Mid-Range King") that they actually wound up pulling it from YouTube after a ton of consumer backlash. They put yet another "not an apology," just-"explaining" post in its place.

The most recent blunder from a source with a history of bad calls. Contrast that with Gamers Nexus' flat-out "Do Not Buy" review title, or Hardware Unboxed's "Laughably Bad at $400." At this point, JayzTwoCents' history of bad advice to consumers is too much to warrant any confidence in their content.
 
The one where the reviewer claimed the card was utterly unfit for anything other than 1080p ... while standing in front of a graph showing the card running a 4K game at 117 fps?

Anything for a click eh?
You mean Rainbow Six Siege? A 7-year-old game?
It runs 40 fps in Cyberpunk at 4K, and at 1080p merely 106 (only 10 above last gen).

Steve's point was that Nvidia is selling a refresh-tier improvement as a next-gen card. You might as well save money and get a 3060 Ti, as you're getting basically the same performance.

And he even says WHY it's so bad: because Nvidia shafted the bus and relies on DLSS 3 & frame gen to get the generational improvement one expects.

This is the first GPU (at least in the past decade) that regularly performs worse than the same tier from last gen.

Usually you go up a half or a full tier of performance... not nearly the same or worse.

THAT'S why it's bad.

The GPU is good tech standalone (any modern GPU is). It's just that once you factor in price & performance against the alternatives (AMD, Intel, & last-gen Nvidia), you see how "bad" the card is.

And this is even ignoring that the GPU only has 8 PCIe lanes, so if you run it in a Gen 3 slot, you're going to lose ~10% just from that alone vs. putting it in a Gen 4 slot.
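For context on the x8 point, here's a rough sketch of the theoretical link bandwidth, using the PCIe spec rates (8 GT/s with 128b/130b encoding for Gen 3, 16 GT/s for Gen 4); the ~10% FPS figure is the poster's estimate, not something derived here:

```python
# Theoretical one-direction PCIe bandwidth for a given link.
def pcie_bandwidth_gbps(transfer_rate_gt: float, lanes: int,
                        encoding: float = 128 / 130) -> float:
    """GB/s = transfer rate (GT/s) x encoding efficiency x lanes / 8 bits per byte."""
    return transfer_rate_gt * encoding * lanes / 8

gen3_x8 = pcie_bandwidth_gbps(8.0, 8)    # ~7.9 GB/s
gen4_x8 = pcie_bandwidth_gbps(16.0, 8)   # ~15.8 GB/s
print(f"Gen 3 x8: {gen3_x8:.1f} GB/s, Gen 4 x8: {gen4_x8:.1f} GB/s")
```

So an x8 card in a Gen 3 slot gets half the bus bandwidth it was designed for, which matters most when data spills out of VRAM.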

The "bad" part of this GPU is ENTIRELY because of bad choices made by Nvidia.

They chose to go all-in on DLSS 3.0 when very little supports it (the same reason people hated on the 20-series early on: RTX, its key feature over the 10-series, barely had support).

When a lot more has DLSS 3 support? Sure, it might become good... but that's not why you buy a GPU; you need it now, not in possible futures.
 
Cut the shame crap, first off. The headline is just a headline, not the full story. Others think I scored it too high, some think it deserves a 5-star review, whatever.

Depending on the game, resolution, settings, and desired FPS, the 4060 Ti can handle 1440p. My point (in the headline) is that it's more of a 1080p gaming card. And it still costs $399. Nvidia said as much, and I quote from an email:

"I think the sweet spot for this card is 1080p. Abd that is what A LOT of people play at."

I even left in the typo. That was in response to an email I sent the other day saying:

"Fundamentally, I think the 8GB VRAM capacity and 128-bit bus are going to be a bit of an issue at 1440p in some modern games. DLSS can help out, but really I can't help but feel that the 30-series memory interfaces were better sized than the 40-series (except for the 4090). I'm not alone in that, I know, but it would have been better from a features standpoint to have 384-bit and 24GB on the 4090, then 320-bit and 20GB on 4080, 256-bit and 16GB on the 4070-class, 192-bit and 12GB on the 4060-class, and presumably 128-bit and 8GB on the 4050-class. Obviously that ship has long since left port, and the 4060 Ti 16GB will try to overcome the VRAM limitation to some extent (when it ships), but it's still stuck with a 128-bit interface."
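To put rough numbers on the bus-width point: peak memory bandwidth is just bus width times effective data rate. A quick sketch using Nvidia's published memory specs (14 Gbps GDDR6 on the 3060 Ti, 18 Gbps on the 4060 Ti):

```python
# Peak memory bandwidth: (bus width in bits / 8 bits per byte) * data rate in Gbps.
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

rtx_3060_ti = mem_bandwidth_gbs(256, 14.0)  # 448 GB/s
rtx_4060_ti = mem_bandwidth_gbs(128, 18.0)  # 288 GB/s
print(f"3060 Ti: {rtx_3060_ti:.0f} GB/s vs 4060 Ti: {rtx_4060_ti:.0f} GB/s")
```

The 4060 Ti's much larger L2 cache claws some of that back, but the raw number still regresses a full generation, which is the crux of the 1440p concern.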
I play at 1080p60, but would never buy a $400 card for that. It's too expensive. Besides, that price is without tax, so it's closer to $500 in the EU. I ain't paying that for the dubious pleasure of playing new games.
 
The one where the reviewer claimed the card was utterly unfit for anything other than 1080p ... while standing in front of a graph showing the card running a 4K game at 117 fps?

Anything for a click eh?
You're lucky if you hit 100 fps with the 4060 Ti at 1080p. Many games barely hit 60 fps.

There is a 2.5-year gap between the 3060 Ti and 4060 Ti... and it is, at best, 15% faster.

That's only about a 6% improvement a year. Not sure what TSMC keeps claiming about their new nodes, but Moore's Law is clearly coming to a screeching halt. Most of that 15% doesn't even seem to be coming from the newer node, but from the larger cache.
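The per-year figure follows from compounding the 15% total uplift over the 2.5-year gap (both numbers taken from the claim above):

```python
# Compound annual improvement implied by a total uplift over a time span.
def annualized_gain(total_uplift: float, years: float) -> float:
    return (1 + total_uplift) ** (1 / years) - 1

rate = annualized_gain(0.15, 2.5)
print(f"{rate:.1%} per year")  # ~5.8% per year
```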

4060 Ti... 1080p... just enough to maintain 60 fps in newer games. It cannot maintain 60 fps in Elden Ring, for example. That's basically zero headroom for new games.

Yes, it can hit 200 fps in some shooters, but those are the kind that look like they could run on a potato GT 1030 too.

 
The problem with doing DLSS / FSR2 everywhere is just time. So my standard, like with overclocking, is to show just the card under review with a reference point.
I understand, but I had a very specific request of a single case to add to your DLSS performance analysis: the card's immediate predecessor. I would hope to see 1 or 2 additional bars on those charts, showing the RTX 3060 Ti with DLSS (and optionally without).

Where frame-generation is involved, I don't think it's very fair to compare it with cards not doing frame-generation.
 
Perhaps I'm only speaking for myself, but I suspect a lot of people interested in upgrading to the 4000 series (especially the entry-to-mid range) aren't as interested in comparisons with the previous gen (since they don't own those cards), but rather the gen before that (look at the Steam hardware survey to see how many 2000-series boards are out there vs. 3000-series). I'd like to see benchmarks comparing these to the 2060 and 2060 Super, especially with regard to ray-tracing performance, which is virtually useless on those cards at any resolution.
 
While I agree with those who say Tom's was too positive about the 4060 Ti, at least they did not embarrass themselves as JayzTwoCents did. Geesh, their "review" was such a shill piece ("New Mid-Range King") that they actually wound up pulling it from YouTube after a ton of consumer backlash. They put yet another "not an apology," just-"explaining" post in its place.

Hahah!!!! The irony! You mentioned him last week when I said something about his 4M subscribers, which is twice what GN has. Obviously due to all the "subscribe for entry into the contest to win PC parts!" 🤣

Well to be truthful, I did not forget JayzTwoCents as much as ignore him. The guy has a history of making angry bombastic pronouncements which were flat out factually wrong.

👍

The most recent blunder from a source with a history of bad calls. Contrast that with Gamers Nexus' flat-out "Do Not Buy" review title, or Hardware Unboxed's "Laughably Bad at $400." At this point, JayzTwoCents' history of bad advice to consumers is too much to warrant any confidence in their content.

Gotta agree! ... and in case anyone missed it I just stumbled across the video.

View: https://www.youtube.com/watch?v=okS5qnMMcjs


People are praising him for it too. 🤣 Apparently though he's recovering from surgery and wasn't directly involved in the video? Still... bad advice is bad.
 
There is a 2.5-year gap between the 3060 Ti and 4060 Ti... and it is, at best, 15% faster.

That's only about a 6% improvement a year.
That's my point entirely. This is the new reality for the semi industry: a small increase in logic performance, much less in SRAM, and none whatsoever for analog. A 15% performance uptick at a 20% (?) power drop is right in line with expectations. And N3/N2 are even worse; there are configurations in those nodes that, for the first time in history, actually have a higher per-transistor cost than the prior generation.
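Taking those quoted figures at face value (a 15% performance uptick at a 20% power drop, both assumptions from the posts above rather than measured numbers), the efficiency gain is the one metric that still moves:

```python
# Perf-per-watt change: performance ratio divided by power ratio.
perf_ratio = 1.15   # assumed +15% performance vs last gen
power_ratio = 0.80  # assumed -20% power vs last gen
perf_per_watt = perf_ratio / power_ratio
print(f"{perf_per_watt - 1:.0%} better perf/watt")  # ~44%
```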
 
I wouldn't have any issues either IF I were using a 4090 too.

Nvidia can keep their cards. HAVING to resort to DLSS and fake frames is insulting at the prices they are charging.

Maybe AMD will take this opportunity to put out better cards. Nvidia can charge whatever they want when there is little to no competition on the top tier cards.

I've been running Nvidia cards since the GeForce3 but it has nothing to do with loyalty... it's because every time I was in the market for a card Nvidia's cards were better than what ATI/AMD had to offer.

I don't care who makes my hardware as long as it doesn't suck.
 
Maybe AMD will take this opportunity to put out better cards. Nvidia can charge whatever they want when there is little to no competition on the top tier cards.

I've been running Nvidia cards since the GeForce3 but it has nothing to do with loyalty... it's because every time I was in the market for a card Nvidia's cards were better than what ATI/AMD had to offer.

I don't care who makes my hardware as long as it doesn't suck.

I just found it funny for you to say that when you're running a top-of-the-line card.

Maybe, who knows, but the bottom line is this: Nvidia is at the point now where it's almost a requirement to use DLSS to get the native performance we should be getting on a card that isn't a 90-series card, and that is BS. As someone else mentioned, Nvidia looks to be in the same place Intel was when AMD had Bulldozer. DLSS "should" be a feature someone uses to extend the life of their card by 6 months, maybe more, until they can either afford a new card or one comes out that they feel is worth their money, NOT something to be relied on from day one.
 
I just found it funny for you to say that when you're running a top-of-the-line card.

Maybe, who knows, but the bottom line is this: Nvidia is at the point now where it's almost a requirement to use DLSS to get the native performance we should be getting on a card that isn't a 90-series card, and that is BS. As someone else mentioned, Nvidia looks to be in the same place Intel was when AMD had Bulldozer. DLSS "should" be a feature someone uses to extend the life of their card by 6 months, maybe more, until they can either afford a new card or one comes out that they feel is worth their money, NOT something to be relied on from day one.

I don't disagree. It is BS. I've already said the 4090 is the only 4000 series card worth buying if you can swallow the price. It's a huge generational leap over the 3090. Why aren't the rest of the 4000 series showing the same performance improvements over last gen?

I also mentioned earlier that DLSS was supposed to be for added performance... but it's obvious that developers are cheesing optimization on games because DLSS will make up for the shortcomings. Then we end up with thousands of reviews saying lackluster cards are giving lackluster performance.

I don't know what is worse... the current state of the market or the market back when you couldn't find a card anywhere except from scalpers for 3x the MSRP.

Anyway... AMD has a golden opportunity to grab some market share right now.
 
Jarred, you absolutely called Nvidia out, BUT I think your star rating needs a tweak even by your own standards/definitions. When a new product launches and it is inferior in some ways (memory bus, lower VRAM than users would like, etc.) to the last-gen part, and noticeably falls short because of this, even if it can grab some wins at 1080p, that doesn't translate to a C grade. It's C- at best, if not a D, IMHO. I think that's where your critics have a point, even if it is splitting hairs to a degree. Objectively speaking, this was a 3-star product by your definition and by what I have seen in other reviews. And while your testing did show it trading blows with a 3070, unfortunately there are a lot of reviews out there showing it not beating, or only tying/beating (single-digit percentage gains), the 3060 Ti under too many conditions. I don't know whether that needs some addressing on your part or theirs (notably Hardware Unboxed and Gamers Nexus showed very poor 1440p numbers compared to the 3060 Ti, though it did beat it). I know the games tested can matter, because I have no reason to distrust your data points.

People implying you're a sellout or simping for Nvidia are just plain wrong. You're clearly not happy with the product for how much it costs and what we get. People getting super bent out of shape over what ends up being a half star is a tad much, IMHO. I do think, however, the 4060 Ti needs to be dug into more thoroughly to figure out the dependencies between outlets.

Also, I am a bit ashamed of the amount of hate you're getting in this forum (criticism is fine). I may not agree with your review 100%, but there's no call for all that either.
I think there's way too much sensationalism these days within the YouTuber and journalism realms. Everyone wants a huge <Mod Edit> rather than a reasonable take. I disagree, vehemently, with that approach.

RTX 4060 Ti is not a bad card. It's not awesome, but to suggest it's a failure is hyperbole of the worst form. I wouldn't suggest everyone rush out to buy it, but I would absolutely recommend buying a 4060 Ti for $400 over a previous generation card with similar performance for the same price. Now, what about a "lightly used" RTX 3060 Ti for $300 or less? That's where it gets fuzzy.

I know there are select workloads where the 4060 Ti can perform a bit worse than a 3060 Ti, but my experience is that those are far from the norm. And while I don't love DLSS 3, it's not actually horrible. Calling it that is just overstating things, for lack of a better way of putting it.

As hotaru251 puts it, the idea that we should get 30 to 50 percent more performance each generation thanks to Moore's Law has long since died. TSMC N4 is expensive. Only Nvidia and TSMC know for certain exactly how much it costs for the GPUs, and Nvidia also has all the other components that go into the card to consider as well. Meanwhile, the economy is doing very poorly right now.

Nvidia could theoretically charge less money for the RTX 40-series cards and still earn money, but that's only if you look at hardware costs. What about all the R&D going on? What about previous generation GPUs that still need to be sold? I don't believe for a moment that Nvidia's CEO and executive team are blind to everything that's going on. Quite the opposite. I think they know far better than anyone on the web exactly what their portfolio looks like, how much inventory they have, how much they should charge, etc. Heck, they're probably running financial models that are smarter than all of us on their supercomputer to optimize profits.

This all reminds me a bit of my 13-year-old: when he wants something, that's all that matters. Gamers want a faster, cheaper, better graphics card. Great! Wanting that and a company actually managing to create it are completely different things.

If you want another example... well, maybe check back in the morning. But AMD, the proverbial champion of the budget gamer, has abandoned the budget sector just as much as Nvidia. It's not making much money on previous generation RX 6000-series parts, RX 7000-series are only at the very top (and soon bottom) of the performance ladder, and pricing is basically right in line with what Nvidia is doing. There are tons of unsold last-gen GPUs as well, which is why AMD isn't pushing out RX 7800/7700 yet and instead is pointing at RX 6900/6800/6700-class GPUs.
 
RTX 4060 Ti is not a bad card. It's not awesome, but to suggest it's a failure is hyperbole of the worst form. I wouldn't suggest everyone rush out to buy it, but I would absolutely recommend buying a 4060 Ti for $400 over a previous generation card with similar performance for the same price. Now, what about a "lightly used" RTX 3060 Ti? That's where it gets fuzzy.

Hahah... funny you mention that. I literally just read this over on Reddit... a thread talking about the deleted Jayz video.

The 1060 launched at - like - 200+ dollars and the 4060 is launching at just shy of 400

Even in terms of inflation - tf were they on??? xD

The real issue is that 1060 was 70% faster than 960 whereas 4060ti is 5-10% faster than 3060ti.
 
>I think we need some more activism and agitation in the tech review space. Big companies like Nvidia and Intel don't listen to what individual users want. They do listen to what influential individuals or publications want.

As mentioned, I empathize with the notion, but I don't agree with it.

The reviewer's duty, first and foremost, is to be _accurate_ with his review. That is, he should review the product for what it is, not what he thinks it should be. It is not accurate to say "the 4060 Ti is terrible," because it is still better than a 3060 Ti (which was selling for almost the same price last week), and nobody thinks the 3060 Ti is terrible.

It is altogether different to say "4060Ti is terrible because a $400 card in 2023 should have more VRAM." That is not a review of what the card is, but an opinion of what it should be. The two are separate and distinct, and shouldn't be commingled.

There is something to be said for reviewers as influencers, who can ostensibly mobilize public opinion against what they see as anti-consumer practices. To which I say, great, have at it. Just pen/shoot it in a separate piece and don't couch it as a review.
 
>What led nVidia to make these calls on the configuration?

I think the answer is straightforward. Jarred already said this (somewhere) so I'll just use his words: Nvidia does it (slow-walk the upgrade cadence) because it can. That's it.

Every company does it, as well they should. Intel did it, when AMD was way behind. AMD did it with Ryzen, when Intel was behind. Nvidia is doing it now, because AMD is not only behind, but is content in following Nvidia's lead.

People always get their feathers ruffled when this happens, and understandably so. But from the company's perspective, it's just good business sense. When there's no competition, why compete against yourself? Pretty dumb, right?

So, there is a clear business incentive to slow-walk the pace of upgrades when the situation allows for it (ie lack of competition). You or I would do the exact same thing, were we in the company's shoes. Companies SHOULD do this.

Companies aren't in business to make customers happy. They're in business to maximize their profits, and that entails making customers "happy enough but not too much." Translated, it means the customer can be "unhappy" some of the time.

To be sure, it's a balancing act. That's why Nvidia jacked up pricing on high and high-mid cards, but toed the line on midrange cards. Mainstream consumers are sensitive to price increases more than anything else. Can't make them too unhappy. Of course, the trade-off is that products get nerfed.

The question is, are there enough unhappy folks aside from the handful who hang out on HW sites? Do the activist YouTubers have enough reach to sway opinion beyond this small crowd?

The answer to that lies in changes to Nvidia's market share and stock price. If NV's market share takes a hit, or if its stock price tumbles because of the lackluster 4000 lineup, then for sure NV will pivot. But NV's market share isn't in danger, and its stock is buoyed by the AI craze, so my SWAG is NV is safe.

Which is all the more reason why NV should slow-walk the upgrade pace. And it's not just NV. AMD is doing the exact same thing.
 
While I agree with those who say Tom's was too positive about the 4060 Ti, at least they did not embarrass themselves as JayzTwoCents did. Geesh, their "review" was such a shill piece ("New Mid-Range King") that they actually wound up pulling it from YouTube after a ton of consumer backlash. They put yet another "not an apology," just-"explaining" post in its place.

The most recent blunder from a source with a history of bad calls. Contrast that with Gamers Nexus' flat-out "Do Not Buy" review title, or Hardware Unboxed's "Laughably Bad at $400." At this point, JayzTwoCents' history of bad advice to consumers is too much to warrant any confidence in their content.
Gamers Nexus and Hardware Unboxed are so full of BS. "DO NOT BUY IT!" There's a YouTuber headline if ever I saw one! "Laughably bad" because that will get clicks and views! Then Steve from GN gets on HUB's video and kisses butt about how hard it is to come up with suitably bad descriptions that will get clicks. And then 2500 people upvote that comment because being part of the echo chamber is so much fun!

Negativity sells, in other words. Maybe Jarred and Tom's should learn from that. But then they'd be just as blatantly bad. These video outlets are all in a race to see who can be the most biased and cause the most outrage, because that will get them more views and more money.

I can't wait to see what GN and HUB have to say about RX 7600 tomorrow! Will they slam it as hard as they're bashing on Nvidia? Somehow I doubt that, but we'll wait until the reviews are out. AMD dropped the price at the last minute to try and make it not look like a dud. "Here comes the new RX 6650 XT, with almost no new features, for a higher price!" If you want AV1 encoding that bad, just go buy the $200 Arc A750.
 
Every company does it, as well they should. Intel did it, when AMD was way behind. AMD did it with Ryzen, when Intel was behind. Nvidia is doing it now, because AMD is not only behind, but is content in following Nvidia's lead.

People always get their feathers ruffled when this happens, and understandably so. But from the company's perspective, it's just good business sense. When there's no competition, why compete against yourself? Pretty dumb, right?

Spot on... what I was just talking about. If only AMD put out a better card I'd buy it because I have no loyalty to Nvidia... or Intel for that matter. I just built a Ryzen PC which is my first AMD processor since the Athlon XP 1800+ in 2001.

What Nvidia is doing is what any company would do in the same situation.
 
Hahah... funny you mention that. I literally just read this over on Reddit... a thread talking about the deleted Jayz video.

The 1060 launched at - like - 200+ dollars and the 4060 is launching at just shy of 400

Even in terms of inflation - tf were they on??? xD

The real issue is that 1060 was 70% faster than 960 whereas 4060ti is 5-10% faster than 3060ti.
The 960 was sort of a dud. So was the 950. I know why people bought them (they were inexpensive!), but they weren't very good. At the same time, the GTX 1050 Ti was nearly identical performance to GTX 960 4GB, and GTX 1050 was nearly identical to GTX 950. And they had basically the same price.

The GTX 1060 wasn't a $200 part, either, unless you're talking about the 3GB model. The 1060 6GB Founders Edition was $300. Most 1060 6GB cards were $270-ish for a while. And you know how fast the 1060 6GB was? About the same as a GTX 980 — slower at times, but faster in games that needed more than 4GB. It was about 10% faster than a GTX 970... which was selling for about $260 by the time the 1060 arrived.

But 2016 was different times! Not just in graphics hardware but in what was happening in the process technology space. TSMC was stuck at 28nm for way too long, and then finally we had 16nm, 12nm, and Samsung 14nm options. And TSMC wasn't as big then so it couldn't jack up prices. Basically, the modern TSMC was built on the success of those 14nm-class nodes!

Once it moved beyond 12/16nm FinFET, it had grown and could charge a lot more. Apple helped, all the smartphone stuff helped, AMD and Nvidia helped, even Intel helped! Actually, Intel continues to help TSMC by not successfully executing on its process technology roadmaps IMO. It's still behind and may not catch up until 2025... except it might not catch up then, either!

There are lots of companies using TSMC now, for AI, crypto, smartphones, etc. It's now in the pole position and it's charging appropriately for the chance to use its tech. That's eating into margins on GPUs and CPUs for sure.
 
Who's actually gonna buy this? The Radeon RX 6750 XT beats this thing in nearly every way, and Nvidia used a freaking 128-bit bus. Why????? I miss the good old days when GTX cards improved by 30% every generation.
 
I think there's way too much sensationalism these days within the YouTuber and journalism realms. Everyone wants a huge <Mod Edit> rather than a reasonable take. I disagree, vehemently, with that approach.
Yeah, you nailed it. I guess that is all-but-necessary for career Youtubers trying to deal with the mystical algorithm and viewers who are looking for... well, what you said.

Don't get me wrong - I do love watching some of the tech-tubers. As entertainment. I like Jay for entertainment, but I think a lot of his info is way off and unreliable, and his opinions seem bizarre at times. (I did watch the video this morning of Phil reviewing it, before it was pulled. I feel bad for the guy. He's just an excited and positive dude, and I thought he did the best he could for a guy who has reviewed like one product ever. It's not his area, and he suddenly had the responsibility dumped on him.) I really like GN, but I disagree with their GPU review conclusions pretty much every time. What I don't understand is why viewers/readers get so pissed off at reviewers. If you get angry because a reviewer's opinion doesn't match your own, why are you checking reviews at all? You've already made up your mind before seeing them.

I check a combination of Youtube and written articles. Written articles allow much more depth. Yours are my go-to reviews that I check first when a new GPU launches. You seem like one of the most level-headed, and don't really get heated (I think your 3.5 rating seems perfectly fine.) I love your analyses, both in articles and here in the forum. Props for the insane amount of benchmarking you always do. Great review.

From every benchmark and bit of info I've seen today, I'm just kinda dumbfounded... I didn't think the 4060 Ti would be quite this close to the 3060 Ti. This is what the GPU market has come to: Skylake-like incremental upgrades. This is sucky-AF. I'm holding off upgrading for the moment due to funds, and am hoping that Nvidia does a Super refresh in the meantime. I dunno how likely that is. But with sales so low, they have to change something, right? Right? ... *tumbleweed*
 
Not sure what TSMC keeps claiming about their new nodes, but Moore's Law is clearly coming to a screeching halt. Most of that 15% doesn't even seem to be coming from a newer node, but from the higher cache.
It's not the node (the 4090 is proof the 40-series and the node are fine).

Again, the 4060 Ti's issues are entirely down to bad choices by Nvidia.

They wanted to save a few bucks and not give it the memory bus it needed. This is provable by the fact that with DLSS 3 it "does" hit the increase Nvidia states it does.

The spec sheet (for both the 4060 Ti & non-Ti) shows nearly entirely worse hardware except for clock speed & memory speed (which is shafted by the memory bus they gave it, outside of DLSS 3-supported stuff), oh, and its TFLOPS, I guess.

Everything else? It's got fewer total GPU cores, SMs, tensor cores, RT cores, etc. than last gen.

This is also likely due to the increased performance of DLSS 3 over 2 (as we saw a similar increase going from DLSS 1 to 2 in the past).

That could (likely) have been enabled on the 30-series if they chose. (They won't.)

Blame Nvidia, not the TSMC node.
 