Nvidia GeForce RTX 4060 Review: Truly Mainstream at $299

Mipmap selection happens on a per-pixel basis, often along with interpolating between the two mip resolutions on either side (trilinear filtering), so you don't get a hard edge at the transition from one texture resolution to another.

No, textures are often far larger than individual polygons. Even then, if I use a texture map for a wall, and the wall is 20 feet high, then when I'm standing next to it what's filling my viewport is actually just a rather small subset of the entire texture.

That's obviously a simplistic example, but it illustrates the point that 4K textures aren't necessarily a waste, even at 1080p. It all depends on how the texture is used.
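
To make the wall example concrete, here's a minimal Python sketch of how a GPU-style mip level could be chosen per pixel. The texel-to-pixel footprint formula, the "1/8 of the wall fills a 1920-pixel viewport" figure, and the function names are illustrative assumptions, not any particular engine's implementation. The point it shows: when fewer texels than pixels cover the view (standing close), the full-resolution mip is sampled, which is why a 4K texture isn't automatically wasted at 1080p.

```python
import math

def mip_lod(texels_per_pixel: float) -> float:
    """GPU-style level of detail: 0 means the full-resolution mip.

    texels_per_pixel is how many texels of the base texture land on one
    screen pixel along the denser axis; roughly 1:1 maps to LOD 0.
    """
    return max(0.0, math.log2(max(texels_per_pixel, 1e-6)))

def trilinear_levels(lod: float):
    """The two mip levels and blend weight used for a fractional LOD."""
    lower = math.floor(lod)
    return lower, lower + 1, lod - lower

# A 4096-texel-wide texture wrapped around a 20-foot wall.
# Standing next to it, roughly 1/8 of the wall spans a 1920-pixel viewport:
print(mip_lod((4096 / 8) / 1920))             # ~0 -> the full-res mip is sampled
# The whole wall seen from a distance, covering ~200 pixels on screen:
print(trilinear_levels(mip_lod(4096 / 200)))  # blends mips 4 and 5
```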

It would probably make a good article to compare screenshots of a few games at different texture resolutions, to test how noticeable the difference actually is.
Yes, and I already noted that things like standing close enough to an object that it more than fills the viewport will potentially use higher resolution textures. At the same time, there will be lots of stuff (think of all the little objects like grass and detail textures) that basically won't render at more than a few hundred pixels. Most likely the artists / developers don't even package 1K textures for things like that.

I have done an article on this subject. I've also included screenshots at various settings in this article, though I didn't take the time to only change texture quality. I've done that in individual game benchmarking articles though (Diablo IV, Dead Island 2, Star Wars Jedi: Survivor, Lord of the Rings: Gollum, and Redfall most recently).

I haven't tried to suss out whether those games are using 4K textures or 2K textures, but even at 4K resolution, turning down texture settings from max to min tends to only affect closer objects, as you'd expect from mipmapping. And let's not even get into the bit about TAA blurring things to try to eliminate jaggies, so that all the benefits from those ultra high resolution textures are wiped out.

There are always edge scenarios where you can make a game in such a way that using higher-resolution textures makes sense. In fact, one option is to use a single 8K texture that holds lots of sub-textures: it could contain 64 1K textures, or 256 512x512 textures. Or it could be for a sky or space map, where only a small section of that large texture would normally be visible on screen at any time. But that's not what we're talking about.

I'm talking about a typical game like The Last of Us, Part 1, where there are lots of small surfaces in any given frame, and very few of those will use more than a 512x512 texture size with a standard mipmapping implementation.

[And I also need to mention here that modern games don't use a single texture for a single polygon. You'll often have a high resolution texture that wraps around hundreds of polygons. But the point still stands that if the wrapped object only covers say a 500x1000 area of the monitor at 1080p — like a standing character at relatively close distance — wrapping that object with a 4096x4096 versus a 2048x2048 versus a 1024x1024 texture will typically result in nearly the same final rendered output.]
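
As a rough sanity check on that bracketed point, here's a small Python sketch (the formula and the numbers are simplifying assumptions, not pulled from any real engine or game) comparing which mip level ends up being sampled when an object covering about 500x1000 screen pixels is wrapped with a 4096, 2048, or 1024 texture. The selected mip converges to roughly the same resolution in all three cases.

```python
import math

def sampled_mip_size(base_texture: int, pixels_covered: int) -> int:
    """Approximate size of the mip level actually sampled for an object.

    Assumes the texture wraps the object roughly once along this axis,
    so LOD is about log2(texels per pixel) and each mip halves the size.
    """
    lod = max(0, math.floor(math.log2(base_texture / pixels_covered)))
    return base_texture >> lod

# A character covering roughly 500x1000 pixels at 1080p; the tall axis:
for tex in (4096, 2048, 1024):
    print(f"{tex} source texture -> samples roughly a {sampled_mip_size(tex, 1000)} mip")
# 4096 -> 1024, 2048 -> 1024, 1024 -> 1024: nearly the same delivered detail.
```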
 
One thing I think about this card having 8GB is that it will make devs work a little harder to keep memory usage under 8GB. There are just too many users with 8GB or 6GB cards. Look at the Steam Survey... I know it is crap, but it gives devs an idea of what their potential market is.

One more thing. Outside the USA and EU there are millions of people with less money. Here the minimum wage is about $300, and most people earn under $500. So I see many buying used Xeon processors from AliExpress. I'm a teacher, and some of my high school students that have gamer PCs use these old Xeons, or an i3 or R3 with an RX 550 or GTX 950. That's all they can afford. One friend of mine who really likes spending on his gaming PC is keeping his R5 2600 with a 1060. Another will buy a new PC, and the best thing he can find is an R5 4500 with an RTX 3060 for about $1,000, just to play Starfield. He uses an old laptop with a 1650 to play Dota and CoD. I don't think he can afford more than that. And I think this is the reality in countries across Latin America, Asia, and other poorer regions. If devs raise the bar too much, they will lose too many gamers. Bethesda wants to sell quite a few copies in Latin America, and it won't if a person needs a 3070, a card that didn't sell 100,000 units in Brazil or Mexico.

I have a student who was saying he had a GX940 (not sure, but something low end from that generation) and just upgraded to an RX 5500 8GB (from AliExpress; god only knows what its past life was). It happily runs anything he tries and only cost $100. He luckily didn't pay the customs fee (60% here, which is why a $200 card costs $320 here). I won't be surprised if the vanilla 5060 comes with 8GB in a few years, but I hope it comes with more.
It's true that there are many users who have graphics cards with 8GB or less VRAM, and game developers do consider the hardware specifications of their target market. They strive to optimize their games to ensure they run smoothly on a wide range of systems, including those with lower-end graphics cards. :rolleyes:
 
I noticed that in the gaming benchmarks, the 3060 is shown as getting 55.2 fps in the 1080p game average. Yet when it was tested in October last year, it was getting 98.5 fps in the same 13 game average test? Ploise explain?
Different games, and a different test date. It wasn't actually tested in October; that's just when the original article was converted from a single long page into five individual pages. It was actually tested in February 2021, when games were not as demanding. I also didn't use as many ray tracing games in that 2021 test suite. The list of tested games back then was:

Assassin's Creed Valhalla
Borderlands 3 (I now test at the "Badass" preset, where before it was at "Ultra")
Dirt 5 (DXR, but this was a really weak implementation of RT shadows)
The Division 2
Far Cry 5
Final Fantasy XIV
Forza Horizon 4
Horizon Zero Dawn
Metro Exodus (no RT enabled IIRC)
Red Dead Redemption 2 (I now test at higher "maxed out" settings)
Shadow of the Tomb Raider (again, no RT)
Strange Brigade (a super lightweight game, which skews the average fps up)
Watch Dogs Legion (the only "real" DXR game IMO, because the reflections actually did matter a bit)

So if you look at that list, four of the games are still in my current test suite, but the settings used on three of those are different. Strange Brigade, Shadow of the Tomb Raider, Forza Horizon 4, Final Fantasy XIV, and Far Cry 5 are all quite a bit less demanding than most of the games in the current test suite. Plus I use six "heavy" DXR games now in the 15-game suite, and that pulls the average way down. If you only look at the rasterization suite, the 3060 currently gets 69.9 fps — still lower, but even the current rasterization games are in most cases more demanding than the above 2021 list.
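
To illustrate the averaging effect, here's a toy Python example with made-up fps values (purely hypothetical, not the review's actual data): swapping a few lightweight games for heavy DXR titles drags the suite average down even if nothing about the card itself changes.

```python
# Toy illustration with made-up fps values (not the actual review data):
# replacing lightweight rasterization titles with demanding DXR titles
# pulls the suite average down even though the card hasn't changed.
light_suite = [144, 120, 110, 95, 90, 85, 80, 78, 75, 70, 68, 65, 60]       # 13 games
heavy_suite = [90, 85, 80, 78, 75, 70, 68, 65, 60, 42, 38, 35, 33, 30, 28]  # 15 games, 6 heavy DXR

def average(fps_list):
    return sum(fps_list) / len(fps_list)

print(f"old-style suite average: {average(light_suite):.1f} fps")
print(f"DXR-heavy suite average: {average(heavy_suite):.1f} fps")
```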
 
  • Like
Reactions: AndrewJacksonZA

I agree that it really depends on the game, and in the past devs were far more reasonable about managing system resources. It just seems the past year or two has been a race over who can put out the bigger, flashier game at "Ultra Melt My PC" settings. Diablo 4 is a very good example of this: the game is normally a ~45GB install, but the "HD texture" option is another 40GB, and those are indeed 4K textures. All the reviewers selected that option, immediately picked "Ultra 600W" settings, tested it on a 3060 or, recently, a 4060 Ti, and loudly declared "8GB isn't enough for modern gaming." Thus every few posts there's someone lamenting that a card as weak as the 4060 / 4060 Ti doesn't have more than 8GB of memory, as if adding more would let it get anywhere close to playability at "Ultra Neighborhood Blackout" settings.

I think the game developers are smart and do resource management at High and below settings; on Ultra they throw that out the window because "you chose Ultra, you get Ultra."
 
It's funny you mention Diablo IV, because I did a bunch of overnight testing on the day of launch, and I absolutely didn't go off about how you needed more than 8GB for 4K ultra, even with the HD pack. That was especially true with DLSS/FSR2/XeSS upscaling, where even the Arc A750 and above averaged more than 60 fps.

Without upscaling? Yes, the RTX 3060 fell below 60 fps, but how many people really have an RTX 3060 paired with a 4K display where they expect native performance to be great at max settings? If people think that way, they really have no clue about PC hardware.
 
  • Like
Reactions: adbatista

You at Tom's didn't outright say it, but lots of folks on YT did, and reading through the comments on every one of these threads it's the same message: "8GB isn't enough" and so forth. And you are 100% correct that nobody with a 4060 (Ti) should be trying to play Diablo 4 at Ultra settings in the first place, but that's the first type of benchmark graph that gets shown.

I don't count "upscaling"; that is just a cheap filter used to render at one resolution and then stretch it to fit a higher resolution. A couple of decades ago ATI got caught doing something like that, and it turned into a bunch of negative press. It's a nice feature that should be mentioned and demonstrated, but not used as a benchmark for the performance of the hardware.
 
View: https://www.youtube.com/watch?v=_NaIQr_0IZg


Two noteworthy points:
1.- The 4060 is the only "60" card that loses to the previous generation card in some games.
2.- The 1060 can still get playable FPS in some titles thanks to FSR2 and NOT DLSS2, because Nvidia just threw all their users from Pascal and before under the bus.

The more information I find about the 4060, the more evident it becomes that it's a full-on embarrassment.

Regards.
 

baboma

Notable
Nov 3, 2022
>I am REALLY curious to see what Nvidia actually does. I think a desktop RTX 4050 might arrive, and I wouldn't be shocked if it uses a 96-bit interface and 6GB. We'll have to see.

I doubt a 4050 will happen, for many reasons. a) If it cuts down the hardware as you said above, there'll be another negative PR wave from peeps decrying a DOWNGRADE from the 3050, just as the 4060 faced. If performance is reduced substantially, Nvidia will be lambasted for selling wares with 2015 perf in 2023. It's bad for the Nvidia brand no matter how you slice it.

b) It's also bad from a market-positioning view. The 3050 was an attempt to bring an affordable product to market during the crypto craze, at a time when the 3060 was selling for $400-500, out of reach for the mainstream. That price gap no longer exists. There are plenty of cheap cards from older generations. A 4050 can't match the bang/buck of those cards, and it will have poor sales.

c) As said earlier, the dollar's value has been reduced substantially within a single generation. The $299 4060 now is worth about $260 then, which means the 4060 has essentially replaced the $250 4050 as the entry-level product. As well, the floor of PC gaming requirements has risen since the last gen, and dropping performance further would only damage Nvidia's standing as the better-performing brand (relative to Radeon). Nvidia already owns the lion's share of the gaming market, and AMD isn't contesting it aggressively. Nvidia doesn't need more low-end entries, which would only dilute its brand strength.

d) Allocating resources to more low-end products like the 4050 does not make economic sense. We all agree that AI has a much higher priority for both Nvidia and AMD. Nvidia cannot meet demand for the $40K H100 and $10K A100 as it is, and margins for those would be in the thousands or tens of thousands. Why would it devote any production capacity toward a $250 part for which the margin would be in the tens of dollars?

That's what stands out to me the most in all this "4060 is overpriced" ballyhoo. If the 4060 had been priced at, say, $269, peeps would've been overjoyed. Nvidia is being blasted for what is basically a $30 difference. The consumer market is very price-sensitive at the low end. Why would you (Nvidia) want to chase after low-end dollars when you already own most of the market and there are much higher margins to be made elsewhere? The bang/buck for Nvidia is simply not in the low-end segment. If AMD wants to chase after that, let them.
 

adunlucas

Prominent
Nov 5, 2022
I won't be surprised by an RTX 4050. Something will be done with defective AD107 dies; I really bet on an MX650-like chip for laptops. But I think we will see a 4050 launched with no fanfare. It won't be sent to reviewers (for its price we will see a lot of them anyway), and the focus will be on selling in China. Here in my country I'm sure that, at least from AliExpress, lots of people would buy it, but in our stores, where today a 3060 8GB still costs more than a month's wage for many people, we will see cheaper cards like the 3050 or 7500. You'd be surprised how well the 3050 sold here for $300 during the pandemic, or for $250 last year. Last Black Friday the RX 6500 sold a lot for $220.
 
  • Like
Reactions: adbatista

adunlucas

Prominent
Nov 5, 2022
I am from that "better" part of the world, an EU country. The average wage is €1300 (heavily influenced by the capital city; other regions are more like €1000), the minimum is €700, and that's before taxes and mandatory insurance; net, those €1300 are about €950-1000. From my experience people just buy consoles, and it's mostly because of GPU prices (just my personal experience from talking to dozens of gamers around me). This is not just the game developers' doing. Let's NOT present it as some kind of generous gesture from the Nvidia/AMD duopoly to constrain VRAM to 8GB in hopes of forcing developers to optimise more... no, no, no, this is just an upselling strategy. Slapping an extra 4GB of RAM onto the card adds very little extra cost; they just make it some kind of luxurious aspect of a product, but it isn't. They are taking the risk of slowing down the whole PC hardware and gaming market, but it seems they don't care.
Not so easy. For example, the guy who installed my internet (I live in a small town in a state in the middle of a rain forest, so infrastructure here is bad, yet I still pay less than $20/mo for 300Mbps) earns minimum wage (today that is about $270). If he wanted to buy an Xbox Series S, it would cost $280; a Series X is close to $760; any game is $62. With $400 you can buy a computer that plays games and also does other things. Any $20 increase in GPU price costs some $32 here. That's way too much for some people to pay, and it's why so many people went for the 3050 when the 3060 was available for less than $50 more. Nowadays the 3060 12GB is less than $20 more expensive than the 3050, but there are still people who buy the terrible 3050 (note that the 6600 is available for less).
 
  • Like
Reactions: adbatista

Eximo

Titan
Ambassador
If they can make a GTX 1630, they will certainly dump excess chips into at least the Asian market with a 4050. But I agree that will only come after demand for low-end mobile GPUs is completely met.

The top MX570 is an Ampere chip at the moment, basically a crippled RTX 3050 with slow memory.

Shockingly, this is the first time I have seen the GTX 1630 actually available for less than the cheapest GTX 1650. Still only a $15 difference.
 
  • Like
Reactions: bit_user