Nvidia GeForce RTX 4060 Review: Truly Mainstream at $299

I'm not sure why you speak so definitively about what "all modern games" are doing. That is, ultimately, up to the game developers and artists. That was my whole point. Maybe a lot of games do have 4K textures now; I don't know for certain. I'd have to look at the game files, unpack them, and determine whether there are 4K, 2K, etc. textures. That's more than I care to do.

4K textures are basically stupid, though. The mipmapping algorithm looks at a polygon and checks the dimensions. If they're bigger than 2048, in either dimension, then a 4K (4096x4096) texture gets selected. Obviously, that would basically never happen if you're running at 1920x1080, unless the viewport is so close to an object that half of a polygon covers the entire screen. Then it goes down to the next size (1024 check) and so on.

But you can use texture upscaling within an engine to go up to ~2x the base size with minimal loss in quality. This is why 4K is truly overkill. It won't matter unless you're using an 8K display. 2K textures are sufficient (and then some) for 4K displays. 1K textures are almost always sufficient for 1440p and 1080p displays.
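To put rough numbers on that, here's a quick back-of-the-envelope sketch (my own simplification, not any engine's actual heuristic) of the largest texture size that still gives at least one texel per screen pixel for a surface of a given on-screen width:

```python
# Rough illustration only: find the largest power-of-two texture edge that
# still gives at least one texel per screen pixel, assuming the texture is
# applied roughly once across the surface's widest on-screen dimension.

def useful_texture_size(screen_px_covered, native_size=4096):
    size = 1
    while size * 2 <= native_size and size < screen_px_covered:
        size *= 2
    return size

print(useful_texture_size(960))    # surface spanning half a 1080p screen -> 1024
print(useful_texture_size(1920))   # the same surface on a 4K display -> 2048
print(useful_texture_size(4200))   # only 8K-class coverage needs the full 4096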

My assumption, based on experience as a computer programmer, is that developers aren't complete idiots, and thus games that advertise HD texture packs for 4K monitors aren't using 4K textures; they're using 2K textures that would potentially be helpful on 4K displays.

Or alternatively, some of the textures might by default be much lower than even 1K. Diablo IV knows that certain objects are going to only cover say 500 pixels at most on the screen. So maybe they have 256x256 textures for those objects. An HD pack might include 512x512 textures to provide an upgrade. Again: It's up to the developers. There's no absolute "this is what all games everywhere are doing" answer.

You are 100% dead on that using 4096x4096 textures is incredibly dumb unless you're on a massive screen, but for some reason many games recently seem to have that turned on for "Ultra" settings. Dial it down a notch and it goes back to the more common 2K as the upper limit. Look back at all the articles Tom's did and see the games that "struggled" with 8GB VRAM, which caused all the heartache recently. You are also correct that in the past devs were more conservative about using resources like disk space and not storing stuff at unreasonably high sizes, yet nowadays they don't seem to care about disk space, so yeah, they do store them at 4096x4096 native with 2~3 levels of mipmapping prerendered. It's almost like being inefficient is a point of bragging rights or something.

Having said that, 8K (4320p) monitors do exist, and while they're extremely expensive, so were 4K monitors once. From the dev's point of view, the "Ultra" settings might just assume everyone is running a 4090 or some future 5090 with a 4K or higher display, so they want their product to "age" well. Of course, we can see what happens when a reviewer selects "Ultra" to test a more modest card. The point I've been hammering home is that "Ultra" presets absolutely are not a reasonable default and are set up to push a system unreasonably hard. Gamers do not need 12GB+ of VRAM to play games; it's as simple as going in and lowering a single slider one or two notches, with no effect on quality on current displays.
 
While I don't disagree with what you are saying (let's be real, you don't need to run games with the highest textures and ultra detail for them to look good), the fact is that the 4060 is still not a great product, in my opinion at least.

Just my opinion, but if savings are the measuring stick then the 4060 is DOA, since the 6600 XT, 6650 XT, and 7600 can match or beat it in many games for less money. Realistically, how much ray tracing can be done on a card at this level anyway?

If one is willing to pay 300 bucks, then I think it's worth spending a hair more for a 6700 XT or 6750 XT.

However, if they bring the price of the 4060 closer to $225-250, then it's a more interesting card and could be more of a candidate as a 1060-type replacement. At its current pricing it just seems like there are better cards available.

That said, it does seem like devs could do better with optimizations. But if they're being pushed to constantly ship games with more content, perhaps it makes sense to keep using the same game engines and textures, especially if speed of release is a major factor being pushed on them by management.
 
  • Like
Reactions: Thunder64

Oh, I think the entire 40-series is hot garbage for consumer value. I just want to ensure people understand that they will have an amazing experience even with 8GB of VRAM, since YT channels and review sites have convinced everyone 12GB+ is "needed" by always running "Ultra melt my PC" settings. They select "Ultra" and the game CTDs on the 8GB card, and they then declare "8GB isn't enough for playing this game," even though that setting is inefficient with resources. Do it enough and everyone suddenly thinks they need newer 12GB+ cards. I don't want to get all tinfoil hat about it, but yeah.
 
I'm with most of the rest of the commenters: DOA. If you're going new, the RX 6700 XT is roughly the same price with more VRAM. That's probably why Nvidia's early preview didn't want 1440p/4K gaming mentioned.

Either save $100 with the RX 6600, get the 6700 XT for the same price, or save up for an RX 6800. Nvidia isn't worth it below the 4080, and even then they're overpriced. I say this having bought a 3070 and a 3080 during the pandemic.

The used market is relatively good right now, but buyer beware, as always. If you're in a major city, you should be able to find something like a 2080 Ti or a 30-series Nvidia card locally.

AMD pretty much owns the sub-$600 price segments right now. If your main concern is gaming, AMD is a much better value. Intel's A-series drivers are pretty stable, and the cards are pretty price-competitive as well if you avoid the scalpers.

On scalpers, they're still out there... Research and know the current pricing. Don't buy from market scalpers.
 
  • Like
Reactions: bit_user
Why is it always mutual exclusivity with you types? I have always used both consoles and PCs, dating back to the Atari 2600 and Intel 486 days (to be clear, the PC was my dad's; I'm not that old). Let me tell you a secret: you can be a PC enthusiast and an avid console gamer at the same time.

and why would I want to be an avid console gamer? lol
 
One thing I think about this card having 8GB is that it will make devs work a little harder to keep memory usage under 8GB. There are just too many users with 8GB or 6GB cards. Look at the Steam Survey... I know it is crap, but it gives devs an idea of what their potential market is.

One more thing: outside the USA and EU there are millions of people with less money. Here the minimum wage is about $300, and most people earn under $500. So I see many buying used Xeon processors from AliExpress. I'm a teacher, and some of my high school students who have gamer PCs use these old Xeons, i3s, or R3s with an RX 550 or GTX 950. That's all they can afford. One friend of mine who really likes spending on his gaming PC is keeping his R5 2600 with a 1060. Another will buy a new PC, and the best thing he can find is an R5 4500 with an RTX 3060 for about $1000, just to play Starfield. He uses an old laptop with a 1650 to play Dota and CoD; I don't think he can afford more than that. And I think this is the reality in countries across Latin America, Asia, and other poorer regions. If devs raise the bar too much, they will lose too many gamers. Bethesda wants to sell quite a few copies in Latin America. It won't if a person needs a 3070, a card that didn't sell 100,000 units in Brazil or Mexico.

I have a student who was saying he had a GX940 (not sure, but low-end from that generation) and just upgraded to an RX 5500 8GB (from AliExpress; God only knows what its past life was). It happily runs anything he tries and only cost $100. He luckily didn't pay the customs fee (60% here, which is why a $200 card costs $320 here). I won't be surprised if the vanilla 5060 comes with 8GB in a few years, but I hope it comes with more.
 
Oh, but wait! We have yet to see a 96-bit or 64-bit card!

Since the RTX 3050 has 8 GB, maybe the RTX 4050 has 8 GB, but using a 64-bit bus? Either that, or they're going to skip the 50-tier entirely. Then we might get an RTX 4030 @ 96-bit with 6 GB.
I am REALLY curious to see what Nvidia actually does. I think a desktop RTX 4050 might arrive, and I wouldn't be shocked if it uses a 96-bit interface and 6GB. We'll have to see. Maybe Nvidia will throw us a bone and at least stick with 128-bit and 8GB like the previous generation, because the AD107 supports that, but I don't know how far down they'd have to cut GPU cores to warrant the 4050 name.
 
  • Like
Reactions: palladin9479
Damn, and I thought you were being obtuse on purpose, but it seems like I need to spell it out for you.
The answer is that you can enjoy both consoles and PC without having to choose between them.

yeah, but that is for you.

All the games I want and enjoy playing are already on my PC. So why do I need to buy a console? It will just gather dust here.
 
  • Like
Reactions: bit_user
The mipmapping algorithm looks at a polygon and checks the dimensions. If they're bigger than 2048, in either dimension, then a 4K (4096x4096) texture gets selected.
This happens on a per-pixel basis, often along with interpolating between the two resolutions on either side, so you don't get a hard edge at the transition from one texture resolution to another.
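To make "per-pixel" concrete, here's a rough sketch (a simplification of my own, not any particular GPU's actual code) of how the LOD comes from the texture-coordinate derivatives at each pixel, with trilinear filtering blending the two nearest mip levels:

```python
import math

# Simplified per-pixel mip selection: the LOD comes from how many texels the
# pixel's footprint spans, and trilinear filtering blends the two nearest mips.

def mip_lod(duv_dx, duv_dy, tex_size):
    fx = math.hypot(*duv_dx) * tex_size  # texel footprint along screen x
    fy = math.hypot(*duv_dy) * tex_size  # texel footprint along screen y
    return max(0.0, math.log2(max(fx, fy)))

def trilinear_mips(lod):
    lower = int(lod)
    return lower, lower + 1, lod - lower  # mip A, mip B, blend factor

# A pixel whose footprint spans ~2.5 texels of a 4096x4096 texture:
lod = mip_lod((2.5 / 4096, 0.0), (0.0, 2.5 / 4096), 4096)
print(trilinear_mips(lod))  # -> (1, 2, ~0.32): mostly mip 1, partly mip 2
```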

Obviously, that would basically never happen if you're running at 1920x1080, unless the viewport is so close to an object that half of a polygon covers the entire screen.
No, textures are often far larger than individual polygons. Even then, if I use a texture map for a wall, and the wall is 20 feet high, then when I'm standing next to it what's filling my viewport is actually just a rather small subset of the entire texture.

That's obviously a simplistic example, but it illustrates the point that 4k textures aren't necessarily a waste, even at 1080p. It all depends on how the texture is used.

This is why 4K is truly overkill. It won't matter unless you're using an 8K display. 2K textures are sufficient (and then some) for 4K displays. 1K textures are almost always sufficient for 1440p and 1080p displays.
It would probably make a good article to compare screenshots of a few games vs. different texture resolutions, to actually test how noticeable the difference is.
 
Damn, and I thought you were being obtuse on purpose, but it seems like I need to spell it out for you.
The answer is that you can enjoy both consoles and PC without having to choose between them.
If someone is trying to be frugal, I can see the point of sticking with one or the other. And if you already have a decent PC for other reasons/uses, then a dGPU is likely the only thing you really need to enable PC gaming.

I'm with you, though. I don't game on my PCs. If I wanted to do more gaming, I'd probably pick up a PS5. Not because of horsepower, but because I know all the games for it are optimized to run well on it. More importantly, I prefer living room gaming, and that isn't where my big workstation machine is located.
 
  • Like
Reactions: Elusive Ruse
I am REALLY curious to see what Nvidia actually does. I think a desktop RTX 4050 might arrive, and I wouldn't be shocked if it uses a 96-bit interface and 6GB.
Even after the RTX 3050 shipped with 8 GB, though? That's why I dismissed that option, outright.

We'll have to see. Maybe Nvidia will throw us a bone and at least stick with 128-bit and 8GB like the previous generation, because the AD107 supports that, but I don't know how far down they'd have to cut GPU cores to warrant the 4050 name.
The tricky thing is going to be how they get costs down enough to make the RTX 4050 meaningfully cheaper than the RTX 4060.
 
No, textures are often far larger than individual polygons. Even then, if I use a texture map for a wall, and the wall is 20 feet high, then when I'm standing next to it what's filling my viewport is actually just a rather small subset of the entire texture.

That's obviously a simplistic example, but it illustrates the point that 4k textures aren't necessarily a waste, even at 1080p. It all depends on how the texture is used.

Maybe that was true back in the early to mid 2000s; nowadays walls are multi-textured, with 2~4 or more textures rendered next to each other instead of one massive texture. It's how game devs can add flavor and atmosphere to walls, mixing and matching different textures that are similar but not entirely the same. We have to get extremely asinine to construct a scenario where 4096x4096 textures are actually useful on anything short of a 4320p display.
 
Maybe that was true back in the early to mid 2000s; nowadays walls are multi-textured, with 2~4 or more textures rendered next to each other instead of one massive texture.
It was a simplistic example to illustrate the point that you can't assume hi-res textures add nothing to visual fidelity. Another such example could be floor textures, where 4k might not be a waste if the floor is large enough.

The only way we're going to get good answers is via 1:1 identical screen shots (except for texture detail) that you can A/B flip between. I honestly don't care how any non-game designers think games use textures. Show me the pixels!
 
I noticed that in the gaming benchmarks, the 3060 is shown as getting 55.2 fps in the 1080p game average. Yet when it was tested in October last year, it was getting 98.5 fps in the same 13 game average test? Ploise explain?
 
  • Like
Reactions: bit_user
I am from that "better" part of the world, an EU country. The average wage is 1300€ (heavily influenced by the capital city; other regions are more like 1000€), the minimum is 700€, and that's before taxes and mandatory insurance; net, those 1300€ come out to roughly 950-1000€. From my experience people just buy consoles, and it's mostly because of GPU prices (just my personal experience from talking to dozens of gamers around me). This is not just the game developers' doing, and let's NOT present it as some kind of generous gesture from the Nvidia/AMD duopoly to constrain VRAM to 8GB in hopes of forcing developers to optimise more... no, no, no, this is just an upselling strategy. Slapping an extra 4GB of RAM onto the card adds very little extra cost; they just make it out to be some kind of luxury aspect of the product, but it isn't. They are taking the risk of slowing down the whole PC hardware and gaming market, but it seems they don't care.
 
LOL. Consoles don't do native 4k rendering. They use tricks like checkerboard rendering and upscaling akin to FSR.

So, for a fair comparison, you should be using DLSS on this card.
Yes, Digital Foundry made a video about it, comparing a handful of selected games that do support that technology, and it just proved my point anyway. That "OMFG lol +50% compute power go brrrr" translates into... basically a similar result in those handpicked games, UNLESS you run out of VRAM (which conveniently never happens in these tests, only when Hardware Unboxed does the testing... strange). Much wow, very impressed, after 3 years and at a 360-400€ price tag, and not even close to 550€ for a whole gaming system (where is it? Can you show me such a build? Still waiting). 250€, ideally 200€, sure would be a great price for this card, which has no future.
 
Slapping an extra 4GB of RAM onto the card adds very little extra cost
It's not like they can just connect up some DRAM chips to some of the others and call it a day. There are a limited number of ways to add memory capacity. If you're not doubling the capacity, then you're adding memory channels, and the costs of doing that do indeed add up.

not even close to 550€ for a whole gaming system (where is it? Can you show me such a build? Still waiting).
I already said PC gaming doesn't offer the best price/performance. I'll even quote myself, since you already seem to have forgotten:

PC gaming isn't about saving money vs. consoles. That's generally a losing proposition.
 
Not that simple. SDRAM, which both DDR and GDDR evolved from, is configured in 32-bit memory channels, and unless you want extremely erratic performance, each channel needs an identical memory configuration. These ... cards... having a 128-bit memory interface means four 32-bit channels, four paths to plug memory chips into. Each path has 2GB worth of GDDR6 memory, and thus the entire card has 8GB total. If they added another 32-bit memory channel it would be 160-bit and 10GB; add yet another and we get 192-bit and 12GB. They could instead choose to use multiple memory chips per channel, but that limits speed since the signal now has to reach twice as far. That's the whole reason we always tell people to use matched sticks of system memory.
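To spell the arithmetic out (a quick sketch assuming the usual single 2GB GDDR6 chip per 32-bit channel, which is what these cards use):

```python
# Assumes one 16Gb (2GB) GDDR6 chip per 32-bit channel, as described above.
GB_PER_CHANNEL = 2

for bus_width in (128, 160, 192, 256):
    channels = bus_width // 32
    print(f"{bus_width}-bit bus = {channels} channels = {channels * GB_PER_CHANNEL} GB")

# 128-bit = 4 channels = 8 GB, 160-bit = 5 channels = 10 GB,
# 192-bit = 6 channels = 12 GB, 256-bit = 8 channels = 16 GB
```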

This is why 30/40/50-tier cards had 128-bit memory interfaces; it was the cheapest and easiest way to equip them with high-speed memory, and it's why the 60-series cards having 128-bit is a travesty. In fact, the 70 is just as bad for having a 192-bit bus instead of 256-bit.
 
Yes, and I already noted that things like standing close enough to an object that it more than fills the viewport will potentially use higher resolution textures. At the same time, there will be lots of stuff (think of all the little objects like grass and detail textures) that basically won't render at more than a few hundred pixels. Most likely the artists / developers don't even package 1K textures for things like that.

I have done an article on this subject. I've also included screenshots at various settings in this article, though I didn't take the time to only change texture quality. I've done that in individual game benchmarking articles though (Diablo IV, Dead Island 2, Star Wars Jedi: Survivor, Lord of the Rings: Gollum, and Redfall most recently).

I haven't tried to suss out whether those games are using 4K textures or 2K textures, but even at 4K resolution, turning down texture settings from max to min tends to only affect closer objects, as you'd expect from mipmapping. And let's not even get into the bit about TAA blurring things to try to eliminate jaggies, so that all the benefits from those ultra high resolution textures are wiped out.

There are always edge scenarios where you can make a game in such a way that using higher resolution textures makes sense. In fact, one option is to use a single 8K texture that has lots of sub-textures. It could contain 64 1K textures, or 256 512x512 textures. Or it could be for a sky or space map, where only a small section of that large texture would normally be visible on screen at any time. But that's not what we're talking about.

I'm talking about a typical game like The Last of Us, Part 1, where there are lots of small surfaces in any given frame, and very few of those will use more than a 512x512 texture size with a standard mipmapping implementation.

[And I also need to mention here that modern games don't use a single texture for a single polygon. You'll often have a high resolution texture that wraps around hundreds of polygons. But the point still stands that if the wrapped object only covers say a 500x1000 area of the monitor at 1080p — like a standing character at relatively close distance — wrapping that object with a 4096x4096 versus a 2048x2048 versus a 1024x1024 texture will typically result in nearly the same final rendered output.]
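As a made-up illustration of that bracketed point (hypothetical numbers, not measurements from any game): for an object covering roughly 1000 pixels along the axis the texture wraps, mipmapping ends up sampling about the same detail level no matter how large the shipped texture is.

```python
import math

# Hypothetical numbers only: an object covering ~1000 screen pixels along the
# axis the texture wraps once. Which mip level actually gets sampled?

def sampled_mip_resolution(texture_size, screen_px_covered):
    texels_per_pixel = texture_size / screen_px_covered
    lod = max(0, math.floor(math.log2(texels_per_pixel)))  # dominant mip level
    return texture_size >> lod

for tex in (4096, 2048, 1024):
    print(tex, "->", sampled_mip_resolution(tex, 1000))
# 4096 -> 1024, 2048 -> 1024, 1024 -> 1024: very different download and VRAM
# footprints, but the renderer samples roughly the same level of detail.
```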
 
It's true that there are many users who have graphics cards with 8GB or less VRAM, and game developers do consider the hardware specifications of their target market. They strive to optimize their games to ensure they run smoothly on a wide range of systems, including those with lower-end graphics cards.🙄
 