Nvidia GeForce RTX 4060 Review: Truly Mainstream at $299

I noticed that in the gaming benchmarks, the 3060 is shown as getting 55.2 fps in the 1080p game average. Yet when it was tested in October last year, it was getting 98.5 fps in the same 13-game average test. Please explain?
Different games, and the games weren't as demanding when it was tested. (It wasn't actually tested in October; that's just when the original article was converted from a single long page into five individual pages. The testing was actually done in February 2021.) I didn't use as many ray tracing games in that 2021 test suite either. The list of tested games back then was:

Assassin's Creed Valhalla
Borderlands 3 (I now test at the "Badass" preset, where before it was at "Ultra")
Dirt 5 (DXR, but this was a really weak implementation of RT shadows)
The Division 2
Far Cry 5
Final Fantasy XIV
Forza Horizon 4
Horizon Zero Dawn
Metro Exodus (no RT enabled IIRC)
Red Dead Redemption 2 (I now test at higher "maxed out" settings)
Shadow of the Tomb Raider (again, no RT)
Strange Brigade (a super lightweight game, which skews the average fps up)
Watch Dogs Legion (the only "real" DXR game IMO, because the reflections actually did matter a bit)

So if you look at that list, four of the games are still in my current test suite, but the settings used on three of those are different. Strange Brigade, Shadow of the Tomb Raider, Forza Horizon 4, Final Fantasy XIV, and Far Cry 5 are all quite a bit less demanding than most of the games in the current test suite. Plus I use six "heavy" DXR games now in the 15-game suite, and that pulls the average way down. If you only look at the rasterization suite, the 3060 currently gets 69.9 fps — still lower, but even the current rasterization games are in most cases more demanding than the above 2021 list.
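To put rough numbers on that point, here's a minimal sketch of the arithmetic, using made-up per-game fps values rather than real benchmark results, showing how dropping a lightweight title and adding heavy DXR titles drags the suite-wide average down for the exact same card:

```python
# Illustrative only: per-game fps values are placeholders, not measured results.

def suite_average(results):
    """Arithmetic mean of per-game average fps, i.e. the suite-wide number."""
    return sum(results.values()) / len(results)

# Hypothetical older, lighter suite for one card.
old_suite = {
    "lightweight title A": 150,
    "lightweight title B": 120,
    "mid-weight title C": 90,
    "mid-weight title D": 80,
}

# Hypothetical newer suite: the lightest title is dropped and heavy
# rasterization / DXR titles are added. Same card, same drivers.
new_suite = {
    "mid-weight title C": 90,
    "mid-weight title D": 80,
    "heavy raster title E": 60,
    "heavy DXR title F": 35,
    "heavy DXR title G": 30,
}

print(f"old suite average: {suite_average(old_suite):.1f} fps")  # 110.0
print(f"new suite average: {suite_average(new_suite):.1f} fps")  # 59.0
```

Nothing about the card changed between the two runs; only the composition of the suite did, which is why the two averages aren't comparable.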
 
Yes, and I already noted that things like standing close enough to an object that it more than fills the viewport will potentially use higher resolution textures. At the same time, there will be lots of stuff (think of all the little objects like grass and detail textures) that basically won't render at more than a few hundred pixels. Most likely the artists / developers don't even package 1K textures for things like that.

I have done an article on this subject. I've also included screenshots at various settings in this article, though I didn't take the time to only change texture quality. I've done that in individual game benchmarking articles though (Dead Island 2, Star Wars Jedi: Survivor, Lord of the Rings: Gollum, and Redfall most recently).

I haven't tried to suss out whether those games are using 4K textures or 2K textures, but even at 4K resolution, turning down texture settings from max to min tends to only affect closer objects, as you'd expect from mipmapping. And let's not even get into the bit about TAA blurring things to try to eliminate jaggies, so that all the benefits from those ultra high resolution textures are wiped out.
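To illustrate the mipmapping point, here's a simplified sketch of which mip level ends up being sampled for a given on-screen footprint. It ignores anisotropic filtering and the per-pixel UV derivatives real hardware uses, and the function name and numbers are just illustrative assumptions:

```python
import math

def approx_mip_level(texture_size_px, screen_coverage_px):
    """Rough mip level chosen when a square texture of texture_size_px texels
    covers about screen_coverage_px pixels on screen (isotropic, screen-aligned
    case; real GPUs compute this from per-pixel UV derivatives)."""
    return max(0.0, math.log2(texture_size_px / screen_coverage_px))

# A 4K (4096x4096) texture on a small prop covering ~256 pixels across:
print(approx_mip_level(4096, 256))   # 4.0 -> samples mip 4, a 256x256 level

# The same prop with only a 2K (2048x2048) texture:
print(approx_mip_level(2048, 256))   # 3.0 -> mip 3 of a 2K texture is also 256x256

# Only when the object nearly fills a 4K-wide screen does the full-res mip matter:
print(approx_mip_level(4096, 3840))  # ~0.1 -> essentially mip 0
```

In other words, for anything small or distant, the 4K and 2K versions of a texture feed the shader the same mip data, which is why dropping texture quality mostly shows up on nearby objects.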

I agree that it really depends on the game, and in the past devs were far more reasonable about managing system resources. It just seems the past year or two has been a race over who can put out the bigger, flashier game at "Ultra Melt my PC" settings. Diablo 4 is a very good example of this: the game is normally a ~45GB install, but the "HD texture" option is another 40GB, and they are indeed 4K textures (see the rough arithmetic below). All the reviewers selected that option, immediately selected "Ultra 600W" settings, tested it on a 3060 or, more recently, a 4060 Ti, and loudly declared "8GB isn't enough for modern gaming." Thus every few posts there's someone lamenting that a card as weak as the 4060 / 4060 Ti doesn't have more than 8GB of memory, as if adding more would get it anywhere close to playability at "Ultra Neighborhood Blackout" settings.

I think the game developers are smart and do resource management at the High and below settings; at Ultra they throw that out the window because "you chose Ultra, you get Ultra."
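On the size of that HD pack, here's a back-of-the-envelope sketch of why moving an asset library from 2K to 4K textures balloons both install size and VRAM use. It assumes a BC7-style block-compressed format at 1 byte per texel; the formats and texture counts Diablo 4 actually uses aren't confirmed here, so treat the numbers as rough illustrations:

```python
MB = 1024 * 1024

def texture_bytes(size_px, bytes_per_texel=1.0, with_mips=True):
    """Approximate footprint of one square block-compressed texture.
    BC7 is 1 byte per texel; a full mip chain adds roughly 1/3 on top."""
    base = size_px * size_px * bytes_per_texel
    return base * (4 / 3) if with_mips else base

print(f"2K texture: {texture_bytes(2048) / MB:5.1f} MB")  # ~ 5.3 MB
print(f"4K texture: {texture_bytes(4096) / MB:5.1f} MB")  # ~21.3 MB

# Going from 2K to 4K roughly quadruples each texture, which is how an optional
# texture pack can end up nearly as large as the ~45GB base install.
```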
 
It's funny you mention Diablo IV, because I did a bunch of overnight testing on the day of launch, and I absolutely didn't go off about how you needed more than 8GB for 4K ultra, even with the HD pack. That was especially true with DLSS/FSR2/XeSS upscaling, where even the Arc A750 and above averaged more than 60 fps.

Without upscaling? Yes, the RTX 3060 fell below 60 fps, but how many people really have an RTX 3060 paired with a 4K display where they expect native performance to be great at max settings? If people think that way, they really have no clue about PC hardware.
 

You at Tom's didn't outright say it, but lots of folks on YT did, and reading through the comments on every one of these threads it's the same message: "8GB isn't enough" and so forth. And you are 100% correct that nobody with a 4060 (Ti) should be trying to play Diablo 4 at Ultra settings in the first place, but that's the first type of benchmark graph that gets shown.

I don't count "upscaling"; that's just a cheap filter used to render at one resolution and then stretch it to fit a higher one. A couple of decades ago ATI got caught doing something like that, and it turned into a bunch of negative press. It's a nice feature that should be mentioned and demonstrated, but not used as a benchmark for the performance of the hardware.
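For reference on what the upscalers actually render internally, here's a small sketch using AMD's published FSR 2 preset scale factors (DLSS 2 presets use similar nominal ratios); the helper function is just an illustration:

```python
# Per-axis downscale factors from AMD's FSR 2 documentation; DLSS 2 presets are
# nominally similar. The GPU shades the internal resolution, then the temporal
# upscaler reconstructs the output resolution from it plus motion/history data.
FSR2_PRESETS = {
    "Quality":           1.5,
    "Balanced":          1.7,
    "Performance":       2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w, out_h, preset):
    factor = FSR2_PRESETS[preset]
    return round(out_w / factor), round(out_h / factor)

for preset in FSR2_PRESETS:
    w, h = internal_resolution(3840, 2160, preset)
    shaded = 100 * (w * h) / (3840 * 2160)
    print(f"4K {preset:<17}: {w}x{h} internally (~{shaded:.0f}% of output pixels)")
```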
 
View: https://www.youtube.com/watch?v=_NaIQr_0IZg


Two noteworthy points:
1. The 4060 is the only "60" card that loses to its previous-generation counterpart in some games.
2. The 1060 can still get playable FPS in some titles thanks to FSR2, and NOT DLSS2, because Nvidia just threw all of its Pascal-and-earlier users under the bus.

The more information I find about the 4060, the more evident it becomes that it's a full-on embarrassment.

Regards.
 
>I am REALLY curious to see what Nvidia actually does. I think a desktop RTX 4050 might arrive, and I wouldn't be shocked if it uses a 96-bit interface and 6GB. We'll have to see.

I doubt a 4050 will happen, for many reasons. a) If it cuts down the hardware as you said above, there'll be another negative PR wave from peeps decrying a DOWNGRADE from the 3050, just as the 4060 faced. If performance is reduced substantially, Nvidia will be lambasted for selling wares with 2015-level performance in 2023. It's bad for the Nvidia brand no matter how you slice it.

b) It's also bad from a market-positioning view. The 3050 was an attempt to bring an affordable product to market during the crypto craze, at a time when the 3060 was selling for $400-500, out of reach for the mainstream. That price gap no longer exists; there are plenty of cheap cards from older generations. A 4050 can't match the bang/buck of those cards and will see poor sales.

c) As said earlier, the dollar's value has dropped substantially within a single generation. The $299 4060 now is worth about $260 in last-generation dollars (a quick sketch of that arithmetic is at the end of this post), which means the 4060 has essentially taken over the ~$250 slot a 4050 would have filled as the entry-level product. As well, the floor of PC gaming requirements has risen since last gen, and dropping performance further would only damage Nvidia's standing as the better-performing brand (relative to Radeon). Nvidia already owns the lion's share of the gaming market, and AMD isn't contesting it aggressively. Nvidia doesn't need more low-end entries, which would only dilute its brand strength.

d) Allocating resources to more low-end products like a 4050 does not make economic sense. We all agree that AI has a much higher priority for both Nvidia and AMD. Nvidia cannot meet demand for the $40K H100 and $10K A100 as it is, and margins on those are in the thousands or tens of thousands of dollars. Why would it devote any production capacity to a $250 part whose margin would be in the tens of dollars?

That's what stands out to me the most in all this "4060 is overpriced" ballyhoo. If the 4060 had been priced at, say, $269, peeps would have been overjoyed; Nvidia is being blasted over what is basically a $30 difference. The consumer market is very price-sensitive at the low end. Why would you (Nvidia) chase low-end dollars when you already own most of the market and there are much higher margins to be made elsewhere? The bang/buck for Nvidia simply isn't in the low-end segment. If AMD wants to chase it, let them.
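On the inflation point in (c), here's the quick sketch of the adjustment being described. The ~15% cumulative inflation figure is an assumption chosen to match the post's $299 to roughly $260 claim, not an official statistic:

```python
def deflate(price_now, cumulative_inflation):
    """Express a current price in earlier-generation dollars."""
    return price_now / (1 + cumulative_inflation)

# Assumed ~15% cumulative inflation between the two launches (illustrative only).
print(f"${deflate(299, 0.15):.0f}")  # ~$260: roughly the old xx50-tier price point
```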
 
I won't be surprised by an RTX 4050. Something will be done with defective AD107 dies; I'd really bet on an MX650-like chip for laptops. But I think we will see a 4050 launched with no fanfare. It won't be sent to reviewers (for its price we will see a lot of them anyway), and the focus will be on selling it in China. Here in my country I'm sure that lots of people would buy it, at least from AliExpress, but in our stores, where today a 3060 8GB still costs more than a month's wage for many people, we will see cheaper cards like the 3050 or a 7500. You'd be surprised how well the 3050 sold here for US$300 during the pandemic, or for $250 last year. Last Black Friday the RX 6500 sold a lot at $220.
 
I am from that "better" part of the world, an EU country. The average wage is €1300 (heavily influenced by the capital city; other regions are more like €1000) and the minimum is €700, and that's before taxes and mandatory insurance; net, those €1300 come to about €950-1000. In my experience people just buy consoles, and it's mostly because of GPU prices (just my personal experience from talking to dozens of gamers around me).

This is not just the game developers' doing, so let's NOT present constraining VRAM to 8GB as some kind of generous gesture from the Nvidia/AMD duopoly in hopes of forcing developers to optimize more... no, no, no, this is just an upselling strategy. Slapping an extra 4GB of RAM on the card adds very little cost; they just market it as some kind of luxury feature of the product, but it isn't. They are risking slowing down the whole PC hardware and gaming market, but it seems they don't care.
Not so easy. For example, the guy who installed my internet (I live in a small town in a state in the middle of a rain forest, so infrastructure here is bad; still, I pay less than US$20/month for 300Mbps) earns minimum wage (currently US$270). If he wanted an Xbox Series S, it would cost $280, a Series X close to $760, and any game $62. With $400 you can buy a computer that plays games and also does other things. Any $20 increase in GPU price costs some $32 here, which is way too much for some people to pay. That's why so many people went for the 3050 even with the 3060 available for less than $50 more. Nowadays the 3060 12GB is less than $20 more expensive than the 3050, but there are still people who buy the terrible 3050 (note that the 6600 is available for less).
 
If they can make a GTX 1630, they will certainly dump excess chips into at least the Asian market with a 4050. But I agree that will only come after demand for low-end mobile GPUs is completely met.

The top MX570 is an Ampere chip at the moment, basically a crippled RTX 3050 with slow memory.

Shockingly, this is the first time I have seen the GTX 1630 actually available for less than the cheapest GTX 1650, and still only a $15 difference.
 