Nvidia GeForce RTX 4060 Ti Review: 1080p Gaming for $399

Cut the shame crap, first off. The headline is just a headline, not the full story. Some think I scored it too high; others think it deserves a 5-star review. Whatever.

Depending on the game, resolution, settings, and desired FPS, the 4060 Ti can handle 1440p. My point (in the headline) is that it's more of a 1080p gaming card. And it still costs $399. Nvidia said as much, and I quote from an email:

"I think the sweet spot for this card is 1080p. Abd that is what A LOT of people play at."

I even left in the typo. That was in response to an email I sent the other day saying:

"Fundamentally, I think the 8GB VRAM capacity and 128-bit bus are going to be a bit of an issue at 1440p in some modern games. DLSS can help out, but really I can't help but feel that the 30-series memory interfaces were better sized than the 40-series (except for the 4090). I'm not alone in that, I know, but it would have been better from a features standpoint to have 384-bit and 24GB on the 4090, then 320-bit and 20GB on 4080, 256-bit and 16GB on the 4070-class, 192-bit and 12GB on the 4060-class, and presumably 128-bit and 8GB on the 4050-class. Obviously that ship has long since left port, and the 4060 Ti 16GB will try to overcome the VRAM limitation to some extent (when it ships), but it's still stuck with a 128-bit interface."

This thing is a turd sandwich! No way anyone said it deserves 5/5. It is being panned by everyone and you gave it 3.5/5. You are no longer objective.

EDIT

Just for funsies: 8GB was already a negative back in 2021. Now, at $400, it should be a travesty.
 
You have no idea what you are talking about. 58 vs 60 fps means tearing, which looks terrible and causes me nausea. 67% vs 100% is not being pedantic; you just got called on your BS. Just because you're fine with garbage visuals doesn't mean the rest of us are.
Solution: G-Sync / Adaptive Sync. Been around for about eight years now. If you're using an older fixed refresh rate monitor, then the 60 fps threshold is more critical, yes.
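If it helps, here's a toy model (illustrative numbers only, not a real swap-chain simulation) of why 58 fps stutters on a fixed 60 Hz panel but not on a VRR one:

```python
import math

REFRESH_HZ = 60           # fixed-refresh panel
FPS = 58                  # GPU render rate
TICK = 1000 / REFRESH_HZ  # 16.67 ms per refresh

# With vsync on a fixed panel, each finished frame is shown at the next
# refresh boundary. At 58 fps the stream slips against the 60 Hz clock,
# so roughly every 29 frames a frame gets held for two refreshes: a
# visible 33 ms hitch. Toy model over two seconds of frames:
shown = [math.ceil(k * 1000 / FPS / TICK) * TICK for k in range(1, 2 * FPS + 1)]
gaps = [round(b - a, 2) for a, b in zip(shown, shown[1:])]
print(sorted(set(gaps)))   # [16.67, 33.33]
print(gaps.count(33.33), "long gaps")  # several ~33 ms hitches in the window
# Without vsync you get tearing instead; with VRR the panel simply waits
# ~17.2 ms and refreshes when the frame is ready: no tear, no judder.
```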
 
This thing is a turd sandwich! No way anyone said it deserves 5/5. It is being panned by everyone and you gave it 3.5/5. You are no longer objective.
And yet the comment I replied to was calling my review "shameful" because I apparently didn't like this enough.

3.5 is not a great score. It's not the end of the world. It's "okay" and that's what most GPUs are: okay. Downright awful would be either horribly broken drivers, a non-functional device, or worse performance with a higher price. This has slightly better performance than its predecessor with the same price. That's okay, end of story.

If you want to talk about being objective, maybe check yourself. What do you want me as a reviewer to do? 70% (3.5-star) is a passing grade, C's get degrees. 80% (4-star) is a decent product, with minor flaws/concerns. 90% would be a great product. Pricing can adjust those up/down a bit.

Looking at the market, I don't know that we'll see anything clearly superior in every way at the $400 price point this generation. AMD may have an RX 7700 card in the future that costs $400, offers better rasterization performance, runs worse in ray tracing and AI, and has 12GB. That will probably be in the same boat, maybe a 4-star if it surprises me. But, spoiler alert, I know what AMD is doing in the $300 price bracket. Projecting that to the next tier doesn't leave me a lot of hope. Best-case, maybe we see $350 for a 7700-class card with 12GB that ends up delivering ~6800 XT levels of performance.
 
It's just to answer a hypothetical: Nvidia might be counting on improved DLSS performance to make up for the lack of raw speed improvements.

Whatever your personal opinions of DLSS, I think you can probably set them aside for long enough to hopefully understand what Nvidia might or might not have been thinking.

BTW, I did say exclusive of frame generation. That's a whole other topic and not the simple apples-to-apples comparison I'd want to see.
I'm glad Jarred didn't include DLSS benchmarks; this is a BS crutch so that Nvidia can sweep everything under the rug and pretend their limp $%^& new-gen GPU is anything but a failure to improve on its predecessor.
 
And yet the comment I replied to was calling my review "shameful" because I apparently didn't like this enough.

3.5 is not a great score. It's not the end of the world. It's "okay" and that's what most GPUs are: okay. Downright awful would be either horribly broken drivers, a non-functional device, or worse performance with a higher price. This has slightly better performance than its predecessor with the same price. That's okay, end of story.

If you want to talk about being objective, maybe check yourself. What do you want me as a reviewer to do? 70% (3.5-star) is a passing grade, C's get degrees. 80% (4-star) is a decent product, with minor flaws/concerns. 90% would be a great product. Pricing can adjust those up/down a bit.

Looking at the market, I don't know that we'll see anything clearly superior in every way at the $400 price point this generation. AMD may have an RX 7700 card in the future that costs $400, offers better rasterization performance, runs worse in ray tracing and AI, and has 12GB. That will probably be in the same boat, maybe a 4-star if it surprises me. But, spoiler alert, I know what AMD is doing in the $300 price bracket. Projecting that to the next tier doesn't leave me a lot of hope. Best-case, maybe we see $350 for a 7700-class card with 12GB that ends up delivering ~6800 XT levels of performance.
I think your review is on-point; I care more for your words than the 5-star system, which in my opinion fails to achieve what it's supposedly there to do.
The only thing that doesn't make sense to me is why your test system uses a 13900K instead of an X3D CPU.
 
>But, spoiler alert, I know what AMD is doing in the $300 price bracket.

@jarred, so is the 7600 write-up ready to go for tomorrow? Not a 5/5, I take it? :)

Let me say that I appreciate your participation in this forum, and answering some of the questions here. That's above and beyond the norm. But I suggest just ignoring the hubbub. People just want to vent, and your piece is a convenient punching bag.

Here's a quote from the Ars Technica reviewer's response to some of the boobirds there, which I find fitting:

"But the incredulous, outraged tone is just not how I've ever preferred to approach things (and I think this is pretty consistent across products and companies over a lot of years). "Waste of sand" is a good pithy quote, can't deny that. But to put it in context: is it mostly better than the 3060 Ti that Nvidia and partners were still selling for close to $400 a week ago? Yeah. Is there a card from AMD or Intel that is clearly, unambiguously better for around the same price? Not really, though there's certainly an opening for a great RX 7700 XT card (or something) in this price range now if AMD can get it together."
 

Cons

  • Only manages to trade blows with ~~RTX 3070~~ RTX 3060 Ti

There, I fixed that for you...

Anyway, I understand their marketing towards 1080p, since 64.5% of people on Steam play at that resolution, but do they want to play at that resolution on an entry-level GPU with 8GB for $400?

In addition, many players have been playing at 1080p for over a decade, and I bet the majority of consumers on the market don't like stagnation at the same (or higher) price level. And what resolution will the 4060 non-Ti target? And the 4050? Shops in my country still sell brand-new 1650/4GB (around 170€) and 1660 SUPER/6GB (around 250€). Are these still produced? What resolutions are these cards for? Are they just for retro 10+ year-old games or something? I just don't get this marketing.
 
>But, spoiler alert, I know what AMD is doing in the $300 price bracket.

@jarred, so is the 7600 write-up ready to go for tomorrow? Not a 5/5, I take it? :)

Let me say that I appreciate your participation in this forum, and answering some of the questions here. That's above and beyond the norm. But I suggest just ignoring the hubbub. People just want to vent, and your piece is a convenient punching bag.

Here's a quote from the Ars Technica reviewer's response to some of the boobirds there, which I find fitting:

"But the incredulous, outraged tone is just not how I've ever preferred to approach things (and I think this is pretty consistent across products and companies over a lot of years). "Waste of sand" is a good pithy quote, can't deny that. But to put it in context: is it mostly better than the 3060 Ti that Nvidia and partners were still selling for close to $400 a week ago? Yeah. Is there a card from AMD or Intel that is clearly, unambiguously better for around the same price? Not really, though there's certainly an opening for a great RX 7700 XT card (or something) in this price range now if AMD can get it together."
This is why I appreciate Ars. The YouTuber nonsense that goes on where you have to create a damning title with an angry face and then rip on a product for not being a loss leader just irks me. It's like we're trying to train all PC enthusiasts to believe that there are only 5-star products and 0-star products and nothing in between. "The 4060 Ti isn't perfect? OMG! Let's make an outraged video to get clicks!"
 

Cons

  • Only manages to trade blows with ~~RTX 3070~~ RTX 3060 Ti

There, I fixed that for you...

Anyway, I understand their marketing towards 1080p, since 64.5% of people on Steam play at that resolution, but do they want to play at that resolution on an entry-level GPU with 8GB for $400?

In addition, many players have been playing at 1080p for over a decade, and I bet the majority of consumers on the market don't like stagnation at the same (or higher) price level. And what resolution will the 4060 non-Ti target? And the 4050? Shops in my country still sell brand-new 1650/4GB (around 170€) and 1660 SUPER/6GB (around 250€). Are these still produced? What resolutions are these cards for? Are they just for retro 10+ year-old games or something? I just don't get this marketing.
Nope. Look at the charts. The 4060 Ti does indeed trade blows with the 3070. And by "trades blows" I mean there's no clear winner when it comes to performance.
[Attached chart: RTX 4060 Ti vs. RTX 3070 benchmark comparison]
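
For what it's worth, "trades blows" is easy to make concrete: count per-game wins and compare geometric means. A minimal sketch with made-up placeholder fps values (not the actual review data):

```python
from statistics import geometric_mean

# Hypothetical per-game fps (placeholders, NOT the review's measurements),
# just to show how "trades blows" can be quantified.
fps = {
    #  game:       (4060 Ti, 3070)
    "Game A": (112.0, 118.0),
    "Game B": ( 96.0,  91.0),
    "Game C": ( 74.0,  77.0),
    "Game D": ( 88.0,  84.0),
}

wins_4060ti = sum(a > b for a, b in fps.values())
gm_4060ti = geometric_mean([a for a, _ in fps.values()])
gm_3070 = geometric_mean([b for _, b in fps.values()])

print(f"4060 Ti wins {wins_4060ti} of {len(fps)} games")
print(f"geomean: 4060 Ti {gm_4060ti:.1f} vs 3070 {gm_3070:.1f} "
      f"({100 * (gm_4060ti / gm_3070 - 1):+.1f}%)")
# "Trades blows" = the wins split roughly evenly and the geomean gap is small.
```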
 
Solution: G-Sync / Adaptive Sync. Been around for about eight years now. If you're using an older fixed refresh rate monitor, then the 60 fps threshold is more critical, yes.
I use a 5-year-old TV with no VRR, good try. I'd rather spend money on a faster GPU than a new TV, at least until one worth buying becomes available.
 
The Hardware Unboxed video review is pretty informative, showing video clips of the 4060 Ti at high fps but failing to load textures, with the games looking like trash. Low-end Nvidia GPUs no longer actually render full frames and look glitchy, but still manage to fake out high FPS metrics.
 
Cut the shame crap, first off. The headline is just a headline, not the full story.
Absolutely, unequivocally false. What you're doing in journalistic terms is "burying the lede", and it is one of the first things students are taught to avoid in Journalism 101. The issue here is entirely distinct from how accurate your overall analysis is (which, for the record, I think is quite solid); it's the discrepancy between the lede and the body.

As for the article itself, my takeaway is that, if you wish to spend $400 on a 1440p card, this is the best you can get. Certainly, the price/performance increment is disappointing, but it's still there. And the power reduction is a strong selling point that is being somewhat overlooked.
 
And yet the comment I replied to was calling my review "shameful" because I apparently didn't like this enough.

3.5 is not a great score. It's not the end of the world. It's "okay" and that's what most GPUs are: okay. Downright awful would be either horribly broken drivers, a non-functional device, or worse performance with a higher price. This has slightly better performance than its predecessor with the same price. That's okay, end of story.

If you want to talk about being objective, maybe check yourself. What do you want me as a reviewer to do? 70% (3.5-star) is a passing grade, C's get degrees. 80% (4-star) is a decent product, with minor flaws/concerns. 90% would be a great product. Pricing can adjust those up/down a bit.

Looking at the market, I don't know that we'll see anything clearly superior in every way at the $400 price point this generation. AMD may have an RX 7700 card in the future that costs $400, offers better rasterization performance, runs worse in ray tracing and AI, and has 12GB. That will probably be in the same boat, maybe a 4-star if it surprises me. But, spoiler alert, I know what AMD is doing in the $300 price bracket. Projecting that to the next tier doesn't leave me a lot of hope. Best-case, maybe we see $350 for a 7700-class card with 12GB that ends up delivering ~6800 XT levels of performance.
Jarred, you absolutely called Nvidia out, BUT I think your star rating needs a tweak even by your own standards/definition. When a new product launches and it is inferior in some ways (memory bus, lower VRAM than users would like, etc.) to the last-gen part, and it noticeably falls short because of this even if it can grab some wins at 1080p, that doesn't translate to a C grade. It's a C- at best, if not a D, IMHO. I think that's where your critics have a point, even if it is splitting hairs to a degree. Objectively speaking, this was a 3-star product by your definition and by what I have seen in other reviews. And while your testing did show it trading blows with a 3070, unfortunately there are a lot of reviews out there showing it not beating, or only tying/barely beating (single-digit percentage gains), the 3060 Ti under too many conditions. I don't know whether that needs some addressing on your part or theirs (notably, Hardware Unboxed and GamersNexus showed very poor 1440p numbers compared to the 3060 Ti, though it did beat it). I know the games tested can matter, because I have no reason to distrust your data points.

People implying you're a sellout or simping for Nvidia are just plain wrong. You're clearly not happy with the product for how much it costs and what we get. People getting super bent out of shape over what ends up being a half star is a tad much, IMHO. I do think, however, that the 4060 Ti needs to be dug into more thoroughly to figure out the discrepancies between outlets.

Also, I am a bit ashamed of the amount of hate you're getting in this forum (criticism is fine). I may not agree with your review 100%, but there is no call for all that either.
 
Nope. Look at the charts. The 4060 Ti does indeed trade blows with the 3070. And by "trades blows" I mean there's no clear winner when it comes to performance.
[Attached chart: RTX 4060 Ti vs. RTX 3070 benchmark comparison]
Anyone can watch dozens of other benchmarks on YouTube, footage from games, etc., and see. When it's better than the 3060 Ti in some 7-8 games at 1080p, then worse or on par at higher resolutions, then all mixed around in another 7-8 games, and on average across all games maybe 8-10% better, sure, technically it's the better card overall. This is just marketing semantics, though; it's a personal choice in this case whether one sides with consumers or producers.
 
Again, HUB's take isn't wrong, but as Steve admitted himself in the piece, he's taking on a more activist or agitator role in forcing Nvidia/AMD to push the VRAM allotment higher. IMO, that's above and beyond the role of a reviewer, but I empathize with both sides of the coin.
I think we need some more activism and agitation in the tech review space. Big companies like Nvidia and Intel don't listen to what individual users want. They do listen to what influential individuals or publications want. Having the tech reviewers act as intermediaries between the users and the companies establishes a feedback channel that is harder for companies to ignore.

I see Nvidia slipping into that complacent state Intel was in during the AMD Bulldozer years. Intel had everyone locked up on the idea of 4 cores and incremental improvement for many generations of desktop and mobile chips. Meanwhile the ARM space went crazy with huge performance jumps, many more cores, big and small cores, and cores for different specific tasks. If tech reviewers had been more active and agitated Intel, then maybe we wouldn't have had to wait for AMD's Ryzen success to see the x86 market return to big jumps in performance instead of incremental bumps.

I want to see Nvidia agitated. I want to see them sent a clear message of how they're creating a stagnant graphics market. I want to see Nvidia regard game consoles as serious competition again. The RTX 4060 and RTX 4060 Ti miss the mark. Many people commenting on this thread want the same. Nvidia won't read our comments, but Nvidia will read a review that elevates those concerns.
 
The one where the reviewer claimed the card was utterly unfit for anything other than 1080p ... while standing in front of a graph showing the card running a 4K game at 117 fps?

Anything for a click, eh?
You mean that almost 8-year-old game? And every other game was below 75 or something; even 10-year-old GTA 5 was at like 70, and some games were even below the 3060 Ti. You can go argue with the original author of those benchmarks; arguing here is useless, anyone can watch that video.
 
Some of these visuals I could hardly give a duck about. "Ooh, the shadows! Ooh, the reflections!", etc. Meanwhile, I'm more concerned about gameplay, or a good story.
The large majority of gamers don't seem to care about graphical prowess.

The Nintendo Switch userbase is larger than PC+PS5+Xbox combined.

This focus on high-end hardware on PC might be helping Nvidia and AMD sell GPUs, but it certainly isn't helping game developers sell more games.

 
The one where the reviewer claimed the card was utterly unfit for anything other than 1080p ... while standing in front of a graph showing the card running a 4K game at 117 fps?

Anything for a click, eh?

Is that part about 77% of gamers still gaming at 1080p even remotely accurate?

If so, that is really mind-blowing... I mean, 1080p TVs were a thing when HDTV first came out... but doesn't everyone at least have a $299 4K TV from Best Buy now?

Maybe I'm speaking for the minority because I game in 4K just like I watch my home theater movies in 4K. It costs more... and is worth the cost.

The large majority of gamers don't seem to care about graphical prowess.

The Nintendo Switch userbase is larger than PC+PS5+Xbox combined.

If the above reference is true I would have to agree. The only console I have is a Switch... and that's for portability and for the fact I can easily play all my NES/SNES favorites from when I was a kid. I don't care about today's games... but I love the old school stuff.
 
I think we need some more activism and agitation in the tech review space. Big companies like Nvidia and Intel don't listen to that individual users want. They do listen to what influential individuals or publications want. Having the tech reviewers act as intermediaries between the users and the companies establishes a feedback channel that is harder for companies to ignore.

I see Nvidia slipping into that complacent state Intel was in during the AMD Bulldozer years. Intel had everyone locked up on the idea of 4 cores and incremental improvement for many generations of desktop and mobile chips. Meanwhile the ARM space went crazy with huge performance jumps, many more cores, big and small cores, and cores for different specific tasks. If tech reviewers had been more active and agitated Intel, then maybe we wouldn't have had to wait for AMD's Ryzen success to see the x86 market return to big jumps in performance instead of incremental bumps.

I want to see Nvidia agitated. I want to see them sent a clear message of how they're creating a stagnant graphics market. I want to see Nvidia regard game consoles as serious competition again. The RTX 4060 and RTX 4060 TI miss the mark. Many people commenting on this thread want the same. Nvidia won't read our comments, but Nvidia will read a review that elevates those concerns.
Well said! And I actually had hexus.net (RIP) do that for me once with Zotac, after I got one of their bogus GTX 780 BIOSes with much lower clock frequencies (they didn't have the BIOS on their webpage, which is why I needed help)... But I agree this would be a great way to handle these situations. All these reviewers claim to be users (I believe them, FYI), so who better to act as our emissaries?
 
I think we need some more activism and agitation in the tech review space.
Agitate all you wish; you're not going to overrule the laws of physics and economics. Moore's Law is nearly dead for logic (and long-decayed for analog), and inflation is substantial enough that the cost increase between card generations is now significant. As much as you miss those heady days when each and every card in a new generation contained astronomical advances in price/performance, they're still not returning. And if you think it's bad now, wait until the N2 node comes along. Some configurations there are showing actual declines in price/performance (though still with perf/power gains).
 
After listening to The Full Nerd, I think Mr. Gordon raises really good questions and comments about the 4060 Ti:

- What led nVidia to make these calls on the configuration? An absolutely interesting question that, I think, nVidia will never give an honest answer to.
- The performance delta in favor of the 4060 Ti on a PCIe 3.0 bus will just disappear, and the 3060 Ti will end up being better all the time thanks to being x16 and not x8 like the 4060 Ti, crushing it above 1080p and leaving frame generation (fake frames), AV1 encoding, and lower power consumption as the only redeeming features. Which, I think, is spot on (see the quick bandwidth math after this list). This card can't be recommended as an upgrade over the 3060 Ti to anyone still on anything that sports PCIe 3.0. This is a hard catch-22 from nVidia.
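
To put rough numbers on the bus point, here's a back-of-the-envelope sketch using the published per-lane PCIe rates (the real-world impact varies by game, and mostly bites when the 8GB card overflows VRAM and has to stream across the bus):

```python
# Peak theoretical PCIe bandwidth per direction, in GB/s.
# Gen3 signals at 8 GT/s per lane, Gen4 at 16 GT/s, both with 128b/130b encoding.
def pcie_bw_gbs(gen: int, lanes: int) -> float:
    gts_per_lane = {3: 8, 4: 16}[gen]
    return gts_per_lane * (128 / 130) / 8 * lanes  # GT/s -> GB/s, times lanes

print(f"3060 Ti on a Gen3 board (x16 @ Gen3): {pcie_bw_gbs(3, 16):.2f} GB/s")
print(f"4060 Ti on a Gen4 board (x8  @ Gen4): {pcie_bw_gbs(4, 8):.2f} GB/s")
print(f"4060 Ti on a Gen3 board (x8  @ Gen3): {pcie_bw_gbs(3, 8):.2f} GB/s")
# ~15.75, ~15.75, ~7.88: on a PCIe 3.0 system the x8 4060 Ti gets half the
# link bandwidth of the x16 3060 Ti.
```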

To put it in nVidia's own words: "this is a great product with a terrible name (and price!)". Seriously, all of the data, specs, and impressions of the card make it more of a successor to the 3060 than to the 3060 Ti. Hell, even the 3060 has a full x16 PCIe 4.0 interface; only at the 3050 level did they cut PCIe down to x8.

The more data is put forward, the worse this card gets at that price.

Overall, given everything I've read, watched and mulled over: if you can wait, just wait for either price drops or just the new generation at this point.

Regards.
 
Nvidia themselves are touting this as a 1080p card, shill much?
You say that while posting a graph of 54 fps average performance in 4K (and 97 fps at 2K from the same review). Both are extremely playable. Nor am I aware of any statement from Nvidia claiming this card is only useful at 1080p. Are you?

And I'll kindly ask you to moderate your tone. Everyone who disagrees with you isn't an evil bloodsucking Nazi. I, for instance, am a bloodsucking capitalist.