Nvidia GeForce RTX 4060 Ti Review: 1080p Gaming for $399

You probably know this, but if you have v-sync on and don't have a FreeSync/G-Sync monitor, there is a big difference between 58 and 60 fps: it will drop you to 30 fps and create input lag and other issues. But if you're buying a new video card and pairing it with a non-FreeSync/G-Sync monitor, I'd suggest you get a new monitor too.
Still, I agree that arguing over a 58 vs. 60 fps average isn't that big a deal when most games have video quality and resolution scaling options that can make 60 fps achievable.
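For anyone wondering why the drop is to 30 rather than 58, here's a rough sketch of the math (assuming a 60 Hz display and classic double-buffered v-sync with steady frame times; triple buffering and adaptive sync behave differently):

```python
import math

# Classic double-buffered v-sync: a finished frame can only be shown on a
# refresh boundary, so the effective rate snaps to 60, 30, 20, 15... fps.
REFRESH_HZ = 60.0
INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms between refreshes

def vsync_fps(render_ms: float) -> float:
    """Effective fps when every frame takes render_ms to draw."""
    intervals_waited = math.ceil(render_ms / INTERVAL_MS)
    return REFRESH_HZ / intervals_waited

for ms in (15.0, 16.0, 17.2, 25.0):
    print(f"{ms:4.1f} ms/frame -> {vsync_fps(ms):4.1f} fps")
# A card that would average ~58 fps (17.2 ms/frame) gets pinned at 30 fps.
```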
No this person doesn't know that, at one point they argued 24 fps is fine because that's what movies are shot in.
 
No this person doesn't know that, at one point they argued 24 fps is fine because that's what movies are shot in.
Well, to be honest, I'm not 100% sure things drop to 30 anymore. I know they used to, and in those days I'd just turn v-sync off. With higher refresh rate monitors I notice the tearing a lot less. I think some game engines have frame generation/duplication to keep it from dropping to 30, and I think Nvidia and AMD might have driver options that do the same thing. It hasn't really been an issue for me, but I also target a lot more than 60 fps. I consider that more of an acceptable minimum than an acceptable average. Average implies you could be dipping into the 40s in intense action, which isn't at all what you want.
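To put rough numbers on that average-versus-minimum distinction (made-up per-second fps samples, purely for illustration):

```python
# Ten seconds of made-up fps samples from an "action spike" scenario.
samples = [72, 70, 68, 65, 60, 55, 48, 44, 58, 60]

avg = sum(samples) / len(samples)
worst = min(samples)
print(f"average: {avg:.0f} fps, worst second: {worst} fps")
# average: 60 fps, worst second: 44 fps. A 60 fps minimum is a much
# stricter target than a 60 fps average, which can hide dips like this.
```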
 
The 4090 does it by using as much juice as an oven.

Well at least my CPU helps offset the monthly light bill expenses. 🤣

When I was parts-picking a graphics card for my current PC, I knew I wanted a strong 4K card. The 4090 was certainly that, but the power consumption was daunting, and the price/availability was terrible - over $2,000 was just a bridge too far!

Despite the reviews wailing about the 4080's value, I selected it because it met my goals better than the alternatives, and I got it at Nvidia MSRP (a rarity at the time). ATM you can get a 4090 for $1,599.99. If I were buying now, I would have to chew on it, but I would be tempted. Meanwhile, I am happy chugging along with good frame rates in AAA titles at 4K.

My personal take on an acceptable 1440p card (what I would call midrange) is one that will run high settings at that resolution with fps headroom to spare, for a reasonable degree of future-proofing. We all know the 4060 Ti is not that; it is a 1080p card (lower tier). Unfortunately, it is not all that even at 1080p. PCWorld posed the question: "Are you willing to pay $400 for a 1080p graphics card with 8GB of memory in the year of our lord 2023?"

They went on to say:

"The decision to outfit this card with 8GB of memory means it won’t be able to run all games with eye candy maxed out even at 1080p, by Nvidia’s own admission...

In Nvidia’s own early RTX 4060 Ti performance teasers, it had to reduce graphics settings in A Plague Tale: Requiem and Resident Evil Remake even at 1080p, while recent games like The Last of Us and Hogwarts Legacy also struggle with 8GB at the lower resolution unless you drop settings.

Frame Generation is pretty damned cool, but Nvidia’s vaunted DLSS upsampling loses quite a bit of its luster at 1080p (along with AMD’s rival FSR) because of the limited amount of pixels available at this resolution."


They summed up by saying "look elsewhere" and suggested:

"If you have to buy right now, I’d rather spend my money on a Radeon RX 6600 for around $200 if I was gaming on a 1080p/60Hz monitor, or opt for the Radeon RX 6700 XT and its 12GB of memory on a 256-bit bus for $350...

That said, I’d hold my horses if I could. Nvidia already teased a $299 RTX 4060 with DLSS 3, AV1, and extreme power efficiency for July. Plus, the rumor mill is screaming that AMD could launch a $300 Radeon RX 7600 any minute now. That price point is a lot more palatable for 1080p gaming on 8GB if you don’t need Nvidia’s deep feature set."


Now that seems like good advice.
 
When I was parts-picking a graphics card for my current PC, I knew I wanted a strong 4K card. The 4090 was certainly that, but the power consumption was daunting, and the price/availability was terrible - over $2,000 was just a bridge too far!

Despite the reviews wailing about the 4080's value, I selected it because it met my goals better than the alternatives, and I got it at Nvidia MSRP (a rarity at the time). ATM you can get a 4090 for $1,599.99. If I were buying now, I would have to chew on it, but I would be tempted. Meanwhile, I am happy chugging along with good frame rates in AAA titles at 4K.

As long as you're getting the performance you want, that's all that matters. I had a 3090 Founders that I sold on eBay for $700, which made swallowing the price of the 4090 a bit easier.

There were no 4090 Founders in stock that I could find, so I went with the Gigabyte Aero for $1,749, and it hasn't disappointed me. No way would I have pulled the trigger on one of those $2,000+ cards; I just didn't see the additional value in it.

Quite happy with the overall 4K performance. 👍
 
I agree. People place too much hope in AMD. I support them because it's good to have options, but their recent choices are solely about profit and have nothing to do with satisfying consumer needs. The RX 5500 was the last budget card from AMD that made sense, and it's hard to find even a used one at a reasonable price. I bought a used mining RX 570, and the way things are, it might be best to wait for a Zen 4 APU rather than buy a new GPU.

When you mention other reviews and their goal of getting the most views... it's really hard to reach a conclusion about the RTX 4060 Ti. The sites I trust all have different reviews contradicting each other, so the only conclusion I came to:
"Can I afford the card?" - No.
"Am I willing to save money to change old RX570?" - Again, no.
I bought a bunch of RX 580s, and I feel like they were good cards and good value. I haven't seen anything from AMD since that matches the value of the 580s. I also had some 1070/1070 Tis and basically feel the same way about Nvidia, though.

So... I did buy a 3070 Ti (pandemic) and a 4070 Ti (a few months ago). I really wish the 3070 Ti had more VRAM, but it was hard to be picky when you couldn't buy a GPU at all. With the 4070 Ti I also wanted a bit more VRAM, but I didn't want a monster card either, because 95% of the time it's doing office work. I kind of regret both the 3070 Ti and the 4070 Ti because of the low VRAM, but AMD's offerings sounded like they would never clock down with my monitor configuration, even when just doing office work, and I didn't want that. Also, AMD's ML/AI support is nowhere near Nvidia's, and while I only need it a few times a year, it would suck not to have it.
 
I agree with the comment that said the 4060 Ti seems at best about 15% faster than the 3060 Ti. This card should have come out with at least 10-12GB of VRAM. Instead, it has trouble competing against the 6700 XT and 6750 XT at times, and those cards can be had for just over $309. If AMD wanted to really kick Nvidia in the teeth: drop a 7700 XT that's close to the speed of the 6800 XT but with 12GB of VRAM. Then drop the 6700 XT to, say, $250 to clear out the stock. Put the 7600 at $230-250 and slot the 7700 XT in at $325-350. Then bring out a 7800 XT with 16GB of VRAM at the 6900 XT/6950 XT performance level. If they dropped that card at $500-550, people might buy.
 
Lower all the in-game graphics settings except texture quality, then raise the resolution to 1080p, and you can play AAA games perfectly okay at 30-45 fps without going crazy on all that RTX nonsense. GTX 1050, 1060, 1660, 1650.

There, I saved you some dollars; savvy people have been doing this for ages.
 
I'm just waiting for Nvidia to reveal the RTX 4050 with 12GB of EDO RAM 😂 but that would be some feat, to be fair, lol
 
Folks are confusing things: it's not so much the size of the VRAM as the memory interface speed. In this case, that 128-bit bus is an instant DOA. That is the bus of a 30- or 50-class budget part priced at $200; other benefits could justify $300 tops.
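Some back-of-the-envelope math on that bus, using the published memory specs (256-bit/14 Gbps GDDR6 on the 3060 Ti vs. 128-bit/18 Gbps on the 4060 Ti):

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin rate in Gbps.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(f"RTX 3060 Ti (256-bit, 14 Gbps): {bandwidth_gbs(256, 14):.0f} GB/s")  # 448
print(f"RTX 4060 Ti (128-bit, 18 Gbps): {bandwidth_gbs(128, 18):.0f} GB/s")  # 288
# ~36% less raw bandwidth than the card it replaces; Nvidia leans on the
# much larger L2 cache to cover the gap, and at 1440p it often can't.
```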

As for the reason, it has to do with the mining craze and price normalization. During that time, those $200-300 cards sold for $400+, and Nvidia is trying to continue that trend while pocketing the profit for itself.

This card should be a hard pass for everyone with a 20-series or newer; heck, even the 1060 folks should look elsewhere.

3.5/5 is way, way too generous; maybe 2/5 at best. The value just isn't there.
 
Folks are confusing things: it's not so much the size of the VRAM as the memory interface speed. In this case, that 128-bit bus is an instant DOA. That is the bus of a 30- or 50-class budget part priced at $200; other benefits could justify $300 tops.

3.5/5 is way, way too generous; maybe 2/5 at best. The value just isn't there.


👍 👍

View: https://www.youtube.com/watch?v=516OexsGlO0


You reminded me of a video I saw last night. The 1080 Ti in 2023 is a better value!
 
OK, now to discuss memory and why 99% of folks are wrong. Resolution has very, very little to do with direct memory utilization. It's texture size that matters, and that is only loosely correlated with resolution. Modern games are HORRIBLE about this, just shoving 4K textures, along with the generated 2K, 1K, and 512 versions, into memory without checking whether they're actually needed.

Most textures are rendered at 256 or 512 because they are at a distance, and there are far fewer than 256 or 512 pixels of screen space to hold them. 4K textures would only matter if you were face-hugging a wall covered by that single texture. Because of this, there is virtually no difference between 1K and 4K textures at any resolution under 16K. So instead of pulling a Crysis and clicking "ultra max optimized," just drag the texture detail slider down one or two notches.

Updating with more information now that I'm back on the computer and not on a phone. If we stop and think about it, most textures on screen measure in the hundreds of pixels at best. A model of a robot we are shooting at might contain hundreds of textures, each relatively small; the final screen space for a single texture may only be a 20x30 rectangle, or 600 pixels. Storing that texture at 4096x4096 pixels, or 16.7 million pixels, is extremely wasteful. Never mind that we are also storing a 2048x2048 and a 1024x1024 copy alongside it (some engines store only two levels down, others three). Sticking with the 2048x2048 version of that texture would get us the same 20x30 rectangle at a quarter of the memory, and the 1024x1024 version at a sixteenth. What makes this even more wasteful is that game engines don't even bother using the stored 4096x4096 texture anyway; it just gets downsampled to 256x256 or 512x512 before being rendered at 20x30. Modern games are, quite literally, filling GPUs with gigabytes of worthless data they don't even bother using, just to pretend to be the next "Ultra HD" video game.
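If anyone wants to sanity-check those numbers, here's a quick sketch (assuming uncompressed RGBA8 at 4 bytes per texel; block compression shrinks everything by a roughly constant factor but doesn't change the ratios):

```python
# Memory for an uncompressed RGBA8 texture, with and without its mip chain.
BYTES_PER_TEXEL = 4  # RGBA8; BC-compressed formats are ~4-8x smaller

def texture_mb(size: int) -> float:
    return size * size * BYTES_PER_TEXEL / 2**20

def mip_chain_mb(size: int) -> float:
    total = 0.0
    while size >= 1:  # each mip level halves the width/height of the last
        total += texture_mb(size)
        size //= 2
    return total

print(f"4096x4096 base:        {texture_mb(4096):6.1f} MB")    # 64.0 MB
print(f"4096x4096 + mips:      {mip_chain_mb(4096):6.1f} MB")  # ~85.3 MB
print(f"1024x1024 + mips:      {mip_chain_mb(1024):6.1f} MB")  # ~5.3 MB
print(f"20x30 of screen space: {20 * 30 * BYTES_PER_TEXEL / 1024:.1f} KB")
```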

Like, how would everyone react if a game used part of your GPU to calculate prime numbers each frame, then threw the answer away? Yeah, that's basically what they are all doing with your GPU memory.
 
OK, now to discuss memory and why 99% of folks are wrong. Resolution has very, very little to do with direct memory utilization. It's texture size that matters, and that is only loosely correlated with resolution. Modern games are HORRIBLE about this, just shoving 4K textures, along with the generated 2K, 1K, and 512 versions, into memory without checking whether they're actually needed.
I can't contradict what's current/common practice, but it's neither necessary nor always true. Direct3D 11.2 and OpenGL both support "Tiled Resources," which enable sparse textures (among other things).

Also, the cost of a full MIP map is 4/3 of storing the texture at maximum size. In other words, storing all the power-of-2 resolutions of a texture adds just a third on top of the highest-resolution level.
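That 4/3 falls straight out of the geometric series, since each mip level has a quarter of the texels of the one above it; a two-line check:

```python
# 1 + 1/4 + 1/16 + ... = 1 / (1 - 1/4) = 4/3
overhead = sum((1 / 4) ** k for k in range(20))  # 20 terms is plenty
print(f"full mip chain = {overhead:.4f}x the base level")  # ~1.3333
```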

So, I'd say the generated lower-res versions aren't the main problem. The real issue is just not demand-paging a texture that's too big. Perhaps the reason demand-paging isn't more common is due to latency, but hopefully, DirectStorage will fix that.

It does surprise me to hear that games don't automatically exclude higher-res levels of their textures on lower-memory graphics cards and/or for lower-resolution render targets.
 
Beat me to it, Colif!

Regards xD
Dang, we were all on the same page; I came here to post this as well. I feel even more strongly that Tom's needs to do a deep dive on this card. It's not looking good. Too many outlets, minus JayzTwoCents' now-pulled video, are saying the same thing: this card falls behind the 3060 Ti it replaces in WAY too many scenarios. Basically, what I'm hearing is that unless you're running a PCIe 4.0 system or better, this card is a worthless upgrade over the 3060 Ti. And even on a PCIe 4.0 system, if you jump to 1440p, the 4060 Ti frequently wets the bed again.
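For context on the PCIe angle: the 4060 Ti is wired for x8 lanes rather than x16, so dropping it into a PCIe 3.0 board halves its host bandwidth (per-lane figures below are the standard rates after encoding overhead):

```python
# Usable PCIe bandwidth per lane in GB/s (after 128b/130b encoding).
GBS_PER_LANE = {"3.0": 0.985, "4.0": 1.969}

def link_gbs(gen: str, lanes: int) -> float:
    return GBS_PER_LANE[gen] * lanes

print(f"PCIe 4.0 x8: {link_gbs('4.0', 8):.1f} GB/s")  # ~15.8
print(f"PCIe 3.0 x8: {link_gbs('3.0', 8):.1f} GB/s")  # ~7.9
# Half the link bandwidth hurts most when an 8GB card is already spilling
# textures into system RAM: exactly the contested benchmark scenario.
```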

It's with the utmost respect I say this. @Jarred Walton, I love your articles, but your benchmarks and take on the 4060 Ti seem to be the odd man out among reviews (not to say you are wrong; you may well be in the right). I wish you would please address this in an article, not just the forums. I suspect there is a setting other outlets missed, but I honestly have no idea how to account for the discrepancies we're seeing. I would just hate for this to become a black eye for Tom's Hardware because they didn't look into it further.
 
I was wondering if Nvidia originally intended this to be the RTX 4050 Ti, but then changed their mind after seeing how much backlash they were getting over price inflation. Calling it the RTX 4060 Ti means they could charge more for it... assuming it had enough performance to justify the designation. That would appear to be too big a stretch.

If they're counting on DLSS to make up the difference, it would be good to know. That's why I'm saying I'd like to see benchmarks between 3060 Ti and 4060 Ti with DLSS (but not frame generation) enabled.
 
They're using DLSS as a crutch to make the new 60-class card look better than the old one. It's also about the only reason they beat AMD's cards in some tests. HUB suggested they could be using a tick-tock approach, and perhaps the 50 series will offer actual performance gains at all levels.

Nvidia doesn't have its eye on the gaming GPU market; they're playing another game. AI is where they're making money, so they don't need to worry about making gamers happy. How long they can do that depends on how long the bubble lasts. They are losing market share in gaming, but when you're worth almost $1 trillion, who cares about peasants? That, and so many people out there only think of Nvidia when they want a new GPU that the company ignoring them doesn't really matter.
 
Too many outlets, minus JayzTwoCents' now-pulled video, are saying the same thing: this card falls behind the 3060 Ti it replaces in WAY too many scenarios. Basically, what I'm hearing is that unless you're running a PCIe 4.0 system or better, this card is a worthless upgrade over the 3060 Ti.
Clicking through the 1080p results, I'm seeing only 2 or 3 cases where they draw equal or worse. I was expecting to see a few outliers really skewing the mean, but I don't. Did those other reviewers all use such old systems they don't even have PCIe 4.0? That's like Zen+ or Comet Lake.

It's with the utmost respect I say this. @Jarred Walton, I love your articles, but your benchmarks and take on the 4060 Ti seem to be the odd man out among reviews. I wish you would please address this in an article, not just the forums. I would hate for this to become a black eye for Tom's Hardware.
I think it's worth trying to understand how they arrived at different conclusions. Until we know that, I wouldn't necessarily put Jarred/Tom's in the wrong. That exercise could also help improve future benchmarks (i.e., maybe game selection and settings need an update?). Could it have anything to do with host selection or configuration?
 
That, and so many people out there only think of Nvidia when they want a new GPU that the company ignoring them doesn't really matter.

If a company has a product I want... I'll buy it. It's that simple. Has nothing to do with loyalty. If it did I would have stayed with Intel for the new build instead of going AMD for the first time since 2001.

I've never owned an AMD GPU... ever. Every time I have been in the market Nvidia had the better card so I haven't had any desire for AMD's cards.

I had also never owned a Ford vehicle until I bought my Bronco a few months back. The reason behind the purchase: every other SUV has the same cookie-cutter look... then you have the Bronco, on a level all its own. 🤣

With all the garbage non-4090 cards being released by Nvidia right now, AMD should be seizing this opportunity... but it's my understanding they are content not to.
 
Clicking through the 1080p results, I'm seeing only 2 or 3 cases where they draw equal or worse. I was expecting to see a few outliers really skewing the mean, but I don't. Did those other reviewers all use such old systems they don't even have PCIe 4.0? That's like Zen+ or Comet Lake.


I think it's worth trying to understand how they arrived at different conclusions. Until we know that, I wouldn't necessarily put Jarred/Tom's in the wrong. That exercise could also help improve future benchmarks (i.e., maybe game selection and settings need an update?). Could it have anything to do with host selection or configuration?
I didn't mean to state Tom's or Jarred is in the wrong. Quite the opposite, actually. I have ZERO idea who is in the wrong (I haven't done any benchies on the 4060 Ti to know better). BUT I do feel there is enough going on here to warrant a deeper dive into the 4060 Ti at this point. I suspect there is a setting or something being overlooked; it wouldn't be the first time something of that nature skewed benchmark results for sites.

All the points you bring up are completely valid; however, there is clearly something up with the 4060 Ti benchmarks. Again, I am seeing more reviews saying the 4060 Ti is a problem and showing benchmarks to back it up (Gamers Nexus, Hardware Unboxed, Daniel Owen, and the list goes on), yet Tom's shows it in a far better light. Something is not adding up. Are the other reviewers just doing clickbait titles with cherry-picked examples to 'back them up'? Or is it something more innocent, like the issue we had a few years back with Resizable BAR that messed up reviews across the board and forced many outlets to re-review the product (edit: it was the RX 6000 series GPUs, and I know Intel GPUs also had an issue with Resizable BAR needing to be on). Point being, I really don't know who is in the wrong, which is why I am asking for a deep dive on this issue. If they don't, I worry this will become a black eye on par with the RTX 2000 series "just buy it" debacle years ago. That was all I meant by my post. I love Tom's and only want this addressed for the good of the community and the site itself.
 
Are the other reviewers just doing clickbait titles with cherry-picked examples to 'back them up'?

I don't think it's 100% of the reason, but it definitely has something to do with it. I subscribed to Daniel Owen for like 4 days until his annoying voice, paired with his video titles, had me clicking unsubscribe.

For all you YouTube content creators out there... making a video and then titling it with a screenshot of you wearing a stupid look on your face DOES NOT make me want to watch your video.
 
I don't think it's 100% of the reason, but it definitely has something to do with it. I subscribed to Daniel Owen for like 4 days until his annoying voice, paired with his video titles, had me clicking unsubscribe.

For all you YouTube content creators out there... making a video and then titling it with a screenshot of you wearing a stupid look on your face DOES NOT make me want to watch your video.
I do as well, and thus my request. Likely there is some middle ground somewhere in all of this that a deep dive could find. I think the truth is important here, which is all the more reason Tom's should address this head-on with an article. Tom's has never shied away from topics like this in the past, and they shouldn't start now, IMO. It's what brings me and many others to Tom's to read reviews: we know they value the tech community and will do their best to give us honest reviews and answers about the products being released.
 