Nvidia GeForce RTX 4060 Review: Truly Mainstream at $299


atomicWAR

Glorious
Ambassador
No, it won't be. A console doesn't have any of the garbage that PCs do. The OS eats hundreds of MBs, not GBs. It doesn't duplicate data like PCs do (loading the same data into system and graphics memory).
Not true. The Xbox Series OS reserves about 2GB of RAM, not mere megabytes as you claim.


Sony, unlike MS, has not officially said how much memory its OS uses, but according to devs it's unofficially 2 to 2.5GB.


https://twitter.com/GigaBoots/status/1298299408363278337


So you're very much incorrect, sorry to report.
 
Last edited:
  • Like
Reactions: bit_user
All the modern games have 4096x4096 textures loaded along with two lower-res versions; smaller versions are then created on the fly with downsampling. You are correct that those 4K versions are rarely used, but they are still being loaded into GPU memory on "Ultra" settings. When you load a texture you don't just load one version into memory; it automatically loads the other versions stored with it. Textures are stored at native resolution along with 2~3 downsampled copies. Graphics engines are pretty smart about figuring out which one to rasterize, but they'll still load into memory whatever they're told to load.

The culprit in all this wastefulness is the default "Ultra" settings everyone likes to use, combined with the "can it run Crysis" mentality. Game creators know you reviewers are going to instantly go to "Ultra" to show how "demanding" their game is, so it's set up to be extremely wasteful of resources. Screen resolution doesn't really matter anymore; textures are the single biggest source of memory utilization, and they are the same size regardless of screen settings.

I've actually gone into games and proven this: go into an area on "Ultra kill my computer" and observe memory utilization. Exit, slide textures down one notch, go back into the same area, and notice that GPU memory utilization plummets. You need to pay attention to whether the game engine has three or four texture settings: if three, then it's usually (1K/2K/4K) as the default; if four, the last two might both be 4K but with or without some sort of filtering technique applied. I could get Diablo 4 playing smoothly on 8GB of VRAM on Ultra by simply lowering the texture slider. This is for games that are relatively new, you know, the ones everyone is screaming "you need 12GB or higher" about. If a game is smart enough to manage its own memory footprint, then it's likely not going to sabotage itself in the memory department.
I'm not sure why you speak so definitively about what "all modern games" are doing. That is, ultimately, up to the game developers and artists. That was my whole point. Maybe a lot of games do have 4K textures now; I don't know for certain. I'd have to try and look at the files for the games, unpack files, and determine if there are 4K, 2K, etc. textures. That's more than I care to do.

4K textures are basically stupid, though. The mipmapping algorithm looks at a polygon and checks the dimensions. If they're bigger than 2048, in either dimension, then a 4K (4096x4096) texture gets selected. Obviously, that would basically never happen if you're running at 1920x1080, unless the viewport is so close to an object that half of a polygon covers the entire screen. Then it goes down to the next size (1024 check) and so on.

But you can use texture upscaling within an engine to go up to ~2x the base size with minimal loss in quality. This is why 4K is truly overkill. It won't matter unless you're using an 8K display. 2K textures are sufficient (and then some) for 4K displays. 1K textures are almost always sufficient for 1440p and 1080p displays.
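
To put rough numbers on that, here's a back-of-envelope sketch (plain Python, assuming uncompressed RGBA8 textures; real games use block compression, which shrinks everything by roughly 4-8x but keeps the ratios intact):

```python
# Back-of-envelope VRAM cost of a single texture plus its full mip chain.
# Assumes uncompressed RGBA8 (4 bytes per texel); block-compressed formats
# (BCn/ASTC) are several times smaller but scale the same way.

def texture_bytes(size, bytes_per_texel=4, with_mips=True):
    total, s = 0, size
    while s >= 1:
        total += s * s * bytes_per_texel  # one mip level
        if not with_mips:
            break
        s //= 2                           # each mip halves the width/height
    return total

for size in (4096, 2048, 1024):
    print(f"{size}x{size} + mips ≈ {texture_bytes(size) / 2**20:.0f} MB")

# Prints roughly 85 MB, 21 MB, and 5 MB: dropping the texture slider one
# notch cuts the per-texture footprint by about 4x, which is why VRAM use
# falls so sharply between "Ultra" and the next setting down.
```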

My assumption, based on experience as a computer programmer, is that developers aren't complete idiots and thus games that advertise HD texture packs for 4K monitors aren't using 4K textures, they're using 2K textures that would potentially be helpful on 4K displays.

Or alternatively, some of the textures might by default be much lower than even 1K. Diablo IV knows that certain objects are going to only cover say 500 pixels at most on the screen. So maybe they have 256x256 textures for those objects. An HD pack might include 512x512 textures to provide an upgrade. Again: It's up to the developers. There's no absolute "this is what all games everywhere are doing" answer.
 
I'm not sure why you speak so definitively about what "all modern games" are doing. That is, ultimately, up to the game developers and artists. That was my whole point. Maybe a lot of games do have 4K textures now; I don't know for certain. I'd have to try and look at the files for the games, unpack files, and determine if there are 4K, 2K, etc. textures. That's more than I care to do.

4K textures are basically stupid, though. The mipmapping algorithm looks at a polygon and checks the dimensions. If they're bigger than 2048, in either dimension, then a 4K (4096x4096) texture gets selected. Obviously, that would basically never happen if you're running at 1920x1080, unless the viewport is so close to an object that half of a polygon covers the entire screen. Then it goes down to the next size (1024 check) and so on.

But you can use texture upscaling within an engine to go up to ~2x the base size with minimal loss in quality. This is why 4K is truly overkill. It won't matter unless you're using an 8K display. 2K textures are sufficient (and then some) for 4K displays. 1K textures are almost always sufficient for 1440p and 1080p displays.

My assumption, based on experience as a computer programmer, is that developers aren't complete idiots and thus games that advertise HD texture packs for 4K monitors aren't using 4K textures, they're using 2K textures that would potentially be helpful on 4K displays.

Or alternatively, some of the textures might by default be much lower than even 1K. Diablo IV knows that certain objects are going to only cover say 500 pixels at most on the screen. So maybe they have 256x256 textures for those objects. An HD pack might include 512x512 textures to provide an upgrade. Again: It's up to the developers. There's no absolute "this is what all games everywhere are doing" answer.

You are 100% dead on that using 4096x4096 textures is incredibly dumb unless you're on a massive screen, but for some reason many games recently seem to have that turned on for "Ultra" settings. Dial it down a notch and it goes back to the more common 2K as the upper limit. Look back at all the articles Tom's did and see the games that "struggled" at 8GB VRAM, which caused all the heartache recently. You are also correct that in the past devs were more conservative about using resources like disk space and not storing stuff at unreasonably high sizes, yet nowadays they don't seem to care about disk space, so yeah, they do store them at 4096x4096 native with 2~3 levels of mipmapping prerendered. It's almost like being inefficient is bragging rights or something.

Having said that, 8K (4320p) monitors do exist, and while they're extremely expensive, so were 4K monitors once. From the dev's point of view, the "Ultra" settings might just assume everyone is running a 4090 or some future 5090 with a 4K or higher display, so they want their product to "age" well. Of course, we can see what happens when a reviewer selects "Ultra" to test a more modest card. The point I've been hammering home is that "Ultra" presets absolutely are not a reasonable default and are set up to push a system unreasonably hard. Gamers do not need 12GB+ of VRAM to play games; it's as simple as going in and lowering a single slider one or two notches, with no effect on quality on current displays.
 
Last edited:
You are 100% dead on that using 4096x4096 textures is incredibly dumb unless you're on a massive screen, but for some reason many games recently seem to have that turned on for "Ultra" settings. Dial it down a notch and it goes back to the more common 2K as the upper limit. Look back at all the articles Tom's did and see the games that "struggled" at 8GB VRAM, which caused all the heartache recently. You are also correct that in the past devs were more conservative about using resources like disk space and not storing stuff at unreasonably high sizes, yet nowadays they don't seem to care about disk space, so yeah, they do store them at 4096x4096 native with 2~3 levels of mipmapping prerendered. It's almost like being inefficient is bragging rights or something.

Having said that, 8K (4320p) monitors do exist, and while they're extremely expensive, so were 4K monitors once. From the dev's point of view, the "Ultra" settings might just assume everyone is running a 4090 or some future 5090 with a 4K or higher display, so they want their product to "age" well. Of course, we can see what happens when a reviewer selects "Ultra" to test a more modest card. The point I've been hammering home is that "Ultra" presets absolutely are not a reasonable default and are set up to push a system unreasonably hard. Gamers do not need 12GB+ of VRAM to play games; it's as simple as going in and lowering a single slider one or two notches.
While I don't disagree with what you are saying (let's be real, you don't need to run games on the highest textures and ultra detail for them to look good), the fact is that the 4060 is still not a great product, in my opinion at least.

Just my opinion, but if savings are the measuring stick, then the 4060 is DOA, since the 6600 XT, 6650 XT, and 7600 can match or beat it in many games for less money. Realistically, how much ray tracing can be done on this level of card anyway?

If one is willing to pay 300 bucks, then I think it's worth spending a hair more for a 6700 XT or 6750 XT.

However, if they bring the price of the 4060 closer to 225-250, then it's a more interesting card and could be more of a candidate as a 1060-type replacement. At its current pricing it just seems like there are better cards available.

That said, it does seem like devs could do better with optimizations. But if they are being pushed to constantly get games out the door with more content, perhaps it makes sense to keep reusing the same game engines and textures, if speed of release is a major factor being pushed on them by management.
 
  • Like
Reactions: Thunder64
While I don't disagree with what you are saying (let's be real, you don't need to run games on the highest textures and ultra detail for them to look good), the fact is that the 4060 is still not a great product, in my opinion at least.

Just my opinion, but if savings are the measuring stick, then the 4060 is DOA, since the 6600 XT, 6650 XT, and 7600 can match or beat it in many games for less money. Realistically, how much ray tracing can be done on this level of card anyway?

If one is willing to pay 300 bucks, then I think it's worth spending a hair more for a 6700 XT or 6750 XT.

However, if they bring the price of the 4060 closer to 225-250, then it's a more interesting card and could be more of a candidate as a 1060-type replacement. At its current pricing it just seems like there are better cards available.

That said, it does seem like devs could do better with optimizations. But if they are being pushed to constantly get games out the door with more content, perhaps it makes sense to keep reusing the same game engines and textures, if speed of release is a major factor being pushed on them by management.

Oh, I think the entire 40-series is hot garbage for consumer value. I just want to ensure people understand that they will have an amazing experience even with 8GB of VRAM, since YT channels and review sites have convinced everyone 12GB+ is "needed" by always running "Ultra melt my PC." They select "Ultra" and the game CTDs on the 8GB card, then they declare "8GB isn't enough for playing this game," even though that setting is inefficient with resources. Do it enough and everyone suddenly thinks they need newer 12GB+ cards. I don't want to get tin-foil-hat-like, but yeah.
 

tracker1

Distinguished
Jan 15, 2010
42
25
18,535
tracker1.dev
I'm with most of the rest of the commenters: DOA. If you're going new, the RX 6700 XT is roughly the same price with more VRAM. That's probably why Nvidia's early preview didn't want 1440p-4K gaming mentioned.

Either save $100 by getting the RX 6600, get the 6700 XT for the same price, or save up for the RX 6800. Nvidia isn't worth it below the 4080, and even then they're overpriced. I say this having bought a 3070 and a 3080 during the pandemic.

The used market is relatively good right now, but buyer beware, as always. If you're in a major city, you should be able to find a 2080 Ti or 3000-series Nvidia card locally.

AMD pretty much owns the sub-$600 price segments right now. If your main concern is gaming, AMD is a much better value. Intel's Arc A-series drivers are pretty stable now, and the cards are pretty price-competitive as well if you avoid the scalpers.

On scalpers, they're still out there... Research and know the current pricing. Don't buy from market scalpers.
 
  • Like
Reactions: bit_user

Tac 25

Estimable
Jul 25, 2021
1,391
421
3,890
Why is it always mutual exclusivity with your types? I have always used both consoles and PCs, dating back to the Atari 2600 and Intel 486 days (to be clear, the PC was my dad's; I'm not that old). Let me tell you a secret: you can be a PC enthusiast and an avid console gamer at the same time.

and why would I want to be an avid console gamer? lol
 
Last edited:

adunlucas

Prominent
Nov 5, 2022
8
13
515
One thing I think about this card having 8GB is that it will make devs work a little more to keep memory usage under 8GB. There are just too many users with cards with 8 or 6GB. Look at the Steam Survey... I know it is crap, but it gives devs an idea of what their potential market is.

One more thing. Outside the USA and EU there are millions of people with less money. Here the minimum wage is about $300, and most people earn under $500. So I see many buying used Xeon processors from AliExpress. I'm a teacher, and some of my high school students that have gamer PCs use these old Xeons, or an i3 or R3 with an RX 550 or GTX 950. That's all they can afford. One friend of mine who really likes spending on his gaming PC is keeping his R5 2600 with a 1060. Another will buy a new PC, and the best thing he can find is an R5 4500 with an RTX 3060 for about $1,000, just to play Starfield. He uses an old laptop with a 1650 to play Dota and CoD. I don't think he can afford more than that. And I think this is the reality in countries across Latin America, Asia, and other poorer regions. If devs raise the bar too much, they will lose too many gamers. Bethesda wants to sell quite a few copies in Latin America. It won't if a person needs a 3070, which didn't sell 100,000 units in Brazil or Mexico.

I have a student who was saying he had a GX940 (not sure, but low-end from that generation) and just upgraded to an RX 5500 8GB (from AliExpress, God only knows what its past life was). It happily runs anything he tries and only cost $100. He luckily didn't pay the customs fee (60% here; that's why a $200 card costs $320 here). I won't be surprised if the vanilla 5060 comes with 8GB in a few years. But I hope it comes with more.
 
Oh, but wait! We have yet to see a 96-bit or 64-bit card!

Since the RTX 3050 has 8 GB, maybe the RTX 4050 has 8 GB, but using a 64-bit bus? Either that, or they're going to entirely skip the 50 tier. Then we might get an RTX 4030 @ 96-bit with 6 GB.
I am REALLY curious to see what Nvidia actually does. I think a desktop RTX 4050 might arrive, and I wouldn't be shocked if it uses a 96-bit interface and 6GB. We'll have to see. Maybe Nvidia will throw us a bone and at least stick with 128-bit and 8GB like the previous generation, because the AD107 supports that, but I don't know how far down they'd have to cut GPU cores to warrant the 4050 name.
 
  • Like
Reactions: palladin9479

Tac 25

Estimable
Jul 25, 2021
1,391
421
3,890
Damn, and I thought you were being obtuse on purpose, but it seems like I need to spell it out for you.
The answer is that you can enjoy both consoles and PC without having to choose between them.

yeah, but that is for you.

All the games I want and enjoy playing are already on my PC. So why do I need to buy a console? It would just gather dust here.
 
  • Like
Reactions: bit_user

bit_user

Titan
Ambassador
The mipmapping algorithm looks at a polygon and checks the dimensions. If they're bigger than 2048, in either dimension, then a 4K (4096x4096) texture gets selected.
This happens on a per-pixel basis, often along with interpolating between the two resolutions on either side, so you don't get a hard edge at the transition from one texture resolution to another.
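
For anyone curious, here's a rough sketch of that per-pixel LOD selection (illustrative Python, not any particular engine's or GPU's actual code; the sample derivative values are made up):

```python
import math

def mip_lod(duv_dx, duv_dy, tex_size):
    """Approximate the per-pixel mip LOD a GPU would pick.

    duv_dx / duv_dy are the screen-space derivatives of the texture
    coordinates (how far UV moves per one-pixel step right / down).
    """
    # Convert UV derivatives (0..1 space) into texels per screen pixel.
    px = math.hypot(duv_dx[0], duv_dx[1]) * tex_size
    py = math.hypot(duv_dy[0], duv_dy[1]) * tex_size
    rho = max(px, py)                   # texels covered by one pixel
    lam = max(0.0, math.log2(rho))      # LOD 0 = the full-resolution mip

    lo, hi = math.floor(lam), math.ceil(lam)
    return lo, hi, lam - lo             # trilinear filtering blends lo and hi

# Standing close to a 4096x4096 wall texture, one texel may span ~4 pixels:
# rho < 1, so the LOD clamps to 0 and the full-res mip really is sampled.
print(mip_lod((0.00006, 0.0), (0.0, 0.00006), 4096))
```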

Obviously, that would basically never happen if you're running at 1920x1080, unless the viewport is so close to a game that half of a polygon covers the entire screen.
No, textures are often far larger than individual polygons. Even then, if I use a texture map for a wall, and the wall is 20 feet high, then when I'm standing next to it what's filling my viewport is actually just a rather small subset of the entire texture.

That's obviously a simplistic example, but it illustrates the point that 4k textures aren't necessarily a waste, even at 1080p. It all depends on how the texture is used.

This is why 4K is truly overkill. It won't matter unless you're using an 8K display. 2K textures are sufficient (and then some) for 4K displays. 1K textures are almost always sufficient for 1440p and 1080p displays.
It would probably make a good article to compare screenshots of a few games vs. different texture resolutions, to actually test how noticeable the difference is.
 

bit_user

Titan
Ambassador
Damn, and I thought you were being obtuse on purpose, but it seems like I need to spell it out for you.
The answer is that you can enjoy both consoles and PC without having to choose between them.
If someone is trying to be frugal, I can see the point of sticking with one or the other. And if you already have a decent PC for other reasons/uses, then a dGPU is likely the only thing you really need to enable PC gaming.

I'm with you, though. I don't game on my PCs. If I wanted to do more gaming, I'd probably pick up a PS5. Not because of horsepower, but because I know all the games for it are optimized to run well on it. More importantly, I prefer living room gaming, which isn't where my big workstation machine is located.
 
  • Like
Reactions: Elusive Ruse

bit_user

Titan
Ambassador
I am REALLY curious to see what Nvidia actually does. I think a desktop RTX 4050 might arrive, and I wouldn't be shocked if it uses a 96-bit interface and 6GB.
Even after the RTX 3050 shipped with 8 GB, though? That's why I dismissed that option, outright.

We'll have to see. Maybe Nvidia will throw us a bone and at least stick with 128-bit and 8GB like the previous generation, because the AD107 supports that, but I don't know how far down they'd have to cut GPU cores to warrant the 4050 name.
The tricky thing is going to be how they get costs down enough to make the RTX 4050 meaningfully cheaper than the RTX 4060.
 
No, textures are often far larger than individual polygons. Even then, if I use a texture map for a wall, and the wall is 20 feet high, then when I'm standing next to it what's filling my viewport is actually just a rather small subset of the entire texture.

That's obviously a simplistic example, but it illustrates the point that 4k textures aren't necessarily a waste, even at 1080p. It all depends on how the texture is used.

Maybe back in the early to mid 2000s. Nowadays walls are multi-textured, with 2~4 or more textures being rendered next to each other instead of one massive texture. It's how game devs can add flavor and atmosphere to walls, mixing and matching different textures that are similar but not entirely the same. We have to get extremely asinine to create a scenario where 4096x4096 textures are actually useful on anything short of a 4320p display.
 

bit_user

Titan
Ambassador
Maybe back in the early to mid 2000s. Nowadays walls are multi-textured, with 2~4 or more textures being rendered next to each other instead of one massive texture.
It was a simplistic example to illustrate the point that you can't assume hi-res textures add nothing to visual fidelity. Another such example could be floor textures, where 4k might not be a waste if the floor is large enough.

The only way we're going to get good answers is via 1:1 identical screenshots (except for texture detail) that you can A/B flip between. I honestly don't care how any non-game designers think games use textures. Show me the pixels!
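
In that spirit, a minimal sketch of how one could do the comparison (Python with Pillow; the screenshot file names are placeholders, and it assumes the two captures come from an identical camera position):

```python
# Load two screenshots that differ only in the texture-quality setting and
# write out an amplified per-pixel difference map for A/B inspection.
from PIL import Image, ImageChops  # pip install Pillow

ultra = Image.open("screenshot_ultra_textures.png").convert("RGB")
high  = Image.open("screenshot_high_textures.png").convert("RGB")

diff = ImageChops.difference(ultra, high)
print("differing region:", diff.getbbox())  # None means pixel-identical

# Multiply the differences so subtle ones become visible to the eye.
diff.point(lambda v: min(255, v * 8)).save("texture_diff_x8.png")
```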
 

MisterZ

Distinguished
Nov 25, 2012
74
5
18,635
I noticed that in the gaming benchmarks, the 3060 is shown as getting 55.2 fps in the 1080p game average. Yet when it was tested in October last year, it was getting 98.5 fps in the same 13 game average test? Ploise explain?
 
  • Like
Reactions: bit_user

sherhi

Distinguished
Apr 17, 2015
80
52
18,610
One thing I think about this card having 8GB is that it will make devs work a little more to keep memory usage under 8GB. There are just too many users with cards with 8 or 6GB. Look at the Steam Survey... I know it is crap, but it gives devs an idea of what their potential market is.

One more thing. Outside the USA and EU there are millions of people with less money. Here the minimum wage is about $300, and most people earn under $500. So I see many buying used Xeon processors from AliExpress. I'm a teacher, and some of my high school students that have gamer PCs use these old Xeons, or an i3 or R3 with an RX 550 or GTX 950. That's all they can afford. One friend of mine who really likes spending on his gaming PC is keeping his R5 2600 with a 1060. Another will buy a new PC, and the best thing he can find is an R5 4500 with an RTX 3060 for about $1,000, just to play Starfield. He uses an old laptop with a 1650 to play Dota and CoD. I don't think he can afford more than that. And I think this is the reality in countries across Latin America, Asia, and other poorer regions. If devs raise the bar too much, they will lose too many gamers. Bethesda wants to sell quite a few copies in Latin America. It won't if a person needs a 3070, which didn't sell 100,000 units in Brazil or Mexico.

I have a student who was saying he had a GX940 (not sure, but low-end from that generation) and just upgraded to an RX 5500 8GB (from AliExpress, God only knows what its past life was). It happily runs anything he tries and only cost $100. He luckily didn't pay the customs fee (60% here; that's why a $200 card costs $320 here). I won't be surprised if the vanilla 5060 comes with 8GB in a few years. But I hope it comes with more.
I am from that "better" part of the world, an EU country. The average wage is €1,300 (heavily influenced by the capital city; other regions are more like €1,000), the minimum is €700, and that's before taxes and mandatory insurance; net, those €1,300 are about €950-1,000ish. From my experience, people just buy consoles, and it's mostly because of GPU prices (just my personal experience from talking to dozens of gamers around me). This is not just the game developers' doing, and let's NOT present it as some kind of generous gesture from the Nvidia/AMD duopoly to constrain VRAM to 8GB in hopes of forcing developers to optimise more... no no no, this is just an upselling strategy. Slapping an extra 4GB of RAM onto the card adds very little extra cost; they just make it out to be some kind of luxurious aspect of the product, but it isn't. They are taking the risk of slowing down the whole PC hardware and gaming market, but it seems they don't care.
 

sherhi

Distinguished
Apr 17, 2015
80
52
18,610
LOL. Consoles don't do native 4k rendering. They use tricks like checkerboard rendering and upscaling akin to FSR.

So, for a fair comparison, you should be using DLSS on this card.
Yes, Digital Foundry made a video about it, comparing a handful of selected games that support that technology, and it just proved my point anyway. That "OMFG lol +50% compute power go brrrr" translates into... basically similar results in those handpicked games, UNLESS you run out of VRAM (which conveniently never happens in these tests, only when HW Unboxed does the testing... strange). Much wow, very impressed, after 3 years, for a 360-400€ price tag, and not even close to 550€ for a whole gaming system (where is it? Can you show me such a build? Still waiting). 250€, ideally 200€, sure would be a great price for this card, which has no future.
 

bit_user

Titan
Ambassador
Slapping an extra 4GB of RAM onto the card adds very little extra cost...
It's not like they can just wire up a few more DRAM chips next to the others and call it a day. There are a limited number of ways to add memory capacity: if you're not doubling the capacity, then you're adding memory channels, and the costs of doing that do indeed add up.

not even close to 550€ for whole gaming system (where is it? Can you show me such build? Still waiting).
I already said PC gaming doesn't offer the best price/performance. I'll even quote myself, since you already seem to have forgotten:

PC gaming isn't about saving money vs. consoles. That's generally a losing proposition.
 
I am from that "better" part of the world, an EU country. The average wage is €1,300 (heavily influenced by the capital city; other regions are more like €1,000), the minimum is €700, and that's before taxes and mandatory insurance; net, those €1,300 are about €950-1,000ish. From my experience, people just buy consoles, and it's mostly because of GPU prices (just my personal experience from talking to dozens of gamers around me). This is not just the game developers' doing, and let's NOT present it as some kind of generous gesture from the Nvidia/AMD duopoly to constrain VRAM to 8GB in hopes of forcing developers to optimise more... no no no, this is just an upselling strategy. Slapping an extra 4GB of RAM onto the card adds very little extra cost; they just make it out to be some kind of luxurious aspect of the product, but it isn't. They are taking the risk of slowing down the whole PC hardware and gaming market, but it seems they don't care.

Not that simple. SDRAM, which both DDR and GDDR evolved from, is configured in 32-bit memory channels, and unless you want extremely erratic performance, each channel needs an identical memory configuration. These ... cards ... having a 128-bit memory interface means four 32-bit channels: four paths to plug memory modules into. Each path has 2GB worth of GDDR6 memory, and thus the entire card has 8GB total. If they added another 32-bit memory channel it would be 160-bit and 10GB; add yet another and we get 192-bit and 12GB. They could also choose to use multiple memory chips per channel, but that limits speed since the signal now has to reach twice as far; that's the whole reason we always tell people to use matched sticks of system memory.

This is why 30/40/50-tier cards had 128-bit memory interfaces; it was the cheapest and easiest way to equip them with high-speed memory, and it's why the 60-series cards having 128-bit is a travesty. In fact, the 70 is just as bad for having a 192-bit bus instead of 256-bit.
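
For anyone who wants to play with the numbers, here's a tiny sketch of that arithmetic (Python, assuming the 2GB-per-chip GDDR6 layout described above; "clamshell" refers to the two-chips-per-channel option):

```python
# Capacity follows directly from bus width once you fix the chip density.
def vram_gb(bus_width_bits, gb_per_chip=2, chips_per_channel=1):
    channels = bus_width_bits // 32           # one 32-bit channel per chip path
    return channels * chips_per_channel * gb_per_chip

for bus in (128, 160, 192, 256):
    normal = vram_gb(bus)
    clam = vram_gb(bus, chips_per_channel=2)  # doubled-up "clamshell" layout
    print(f"{bus:>3}-bit bus -> {normal} GB (or {clam} GB clamshell)")

# 128-bit -> 8 GB, 160-bit -> 10 GB, 192-bit -> 12 GB, 256-bit -> 16 GB,
# which matches the channel math in the post above.
```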