I'm not sure why you speak so definitively about what "all modern games" are doing. That is, ultimately, up to the game developers and artists. That was my whole point. Maybe a lot of games do have 4K textures now; I don't know for certain. I'd have to unpack each game's files and check whether they actually contain 4K, 2K, etc. textures, and that's more than I care to do.
4K textures are basically stupid, though. The mipmapping algorithm looks at how big a polygon is on screen. If it's bigger than 2048 pixels in either dimension, the 4K (4096x4096) mip level gets selected. Obviously, that would basically never happen at 1920x1080, unless the camera is so close to something that half of a polygon covers the entire screen. If it fails that check, it drops down to the next size (the 1024 check) and so on.
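To put that concretely, here's a rough sketch of the size check I'm describing. It's a simplification (real GPUs work from texture-coordinate derivatives per pixel, not whole polygons), and the function and parameter names are just mine for illustration:

```c
/* Rough sketch of mip-level selection, not any specific engine's code.
   The idea: estimate how many texels land on each screen pixel; if a
   4096x4096 texture's polygon only covers ~1000 pixels across, the
   renderer drops to the 1024 mip anyway. */
#include <math.h>

/* texture_size: width of the base mip in texels (assumed square).
   screen_coverage_px: how many pixels the polygon spans on screen. */
int pick_mip_level(float texture_size, float screen_coverage_px)
{
    float texels_per_pixel = texture_size / screen_coverage_px;
    /* mip 0 = full resolution; each level halves the texture */
    int level = (int)floorf(log2f(texels_per_pixel));
    return level < 0 ? 0 : level;
}

/* e.g. pick_mip_level(4096.0f, 1000.0f) returns 2, i.e. the 1024x1024
   mip -- the full 4096 level never gets sampled at 1080p unless the
   polygon nearly fills the screen. */
```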
But you can use texture upscaling within an engine to go up to roughly 2x the base size with minimal loss in quality, which is why 4K textures are truly overkill. They won't matter unless you're using an 8K display. 2K textures are sufficient (and then some) for 4K displays, and 1K textures are almost always sufficient for 1440p and 1080p displays.
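For what it's worth, the "upscaling" I mean is just ordinary magnification filtering. A bare-bones bilinear sample looks something like this (generic single-channel illustration, not any engine's actual code):

```c
/* Minimal bilinear magnification sketch: blend the four nearest texels.
   At ~2x magnification each screen pixel still sits close to real texel
   data, so the result stays sharp; push much past 2x and you're mostly
   interpolating, which is where the blur comes from. */
float bilinear_sample(const float *tex, int w, int h, float u, float v)
{
    /* u, v in [0,1): map to texel space */
    float x = u * (w - 1), y = v * (h - 1);
    int x0 = (int)x, y0 = (int)y;
    int x1 = (x0 + 1 < w) ? x0 + 1 : x0;
    int y1 = (y0 + 1 < h) ? y0 + 1 : y0;
    float fx = x - x0, fy = y - y0;

    float top = tex[y0 * w + x0] * (1 - fx) + tex[y0 * w + x1] * fx;
    float bot = tex[y1 * w + x0] * (1 - fx) + tex[y1 * w + x1] * fx;
    return top * (1 - fy) + bot * fy;
}
```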
My assumption, based on experience as a computer programmer, is that developers aren't complete idiots, and thus games that advertise HD texture packs for 4K monitors aren't actually using 4K textures; they're using 2K textures that would potentially be helpful on 4K displays.
Alternatively, some of the textures might by default be much lower than even 1K. Diablo IV's developers know that certain objects are only ever going to cover, say, 500 pixels on screen, so maybe they ship 256x256 textures for those objects, and an HD pack includes 512x512 versions as an upgrade. Again: it's up to the developers. There's no absolute "this is what all games everywhere are doing" answer.
You are 100% dead on that using 4096x4096 textures is incredibly dumb unless you're on a massive screen, but for some reason many recent games seem to have that turned on for "Ultra" settings. Dial it down a notch and the upper limit goes back to the more common 2K. Look back at all the articles Tom's did and you'll see it's the games that "struggled" with 8GB of VRAM that caused all the recent heartache. You're also correct that in the past devs were more conservative about resources like disk space and didn't store assets at unreasonably high sizes. Nowadays they don't seem to care about disk space, so yes, they do store textures at 4096x4096 native with 2~3 levels of mipmapping prerendered. It's almost like being inefficient is bragging rights or something.
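A quick back-of-the-envelope on why that hurts (the format and bytes-per-texel figures are my assumptions, not numbers pulled from any specific game):

```c
/* Rough VRAM/disk cost of one 4096x4096 texture, to show why a handful
   of 4K maps per material adds up fast on an 8GB card. */
#include <stdio.h>

int main(void)
{
    const double MiB = 1024.0 * 1024.0;
    double base_texels = 4096.0 * 4096.0;   /* texels at mip 0 */

    /* Uncompressed RGBA8: 4 bytes/texel. BC7/DXT-style block
       compression: roughly 1 byte/texel (assumed formats). */
    double uncompressed = base_texels * 4.0;
    double compressed   = base_texels * 1.0;

    /* A full mip chain adds about 1/3 on top of the base level
       (1/4 + 1/16 + ... converges to 1/3). */
    printf("RGBA8 + mips: %.0f MiB\n", uncompressed * (4.0 / 3.0) / MiB);
    printf("BC7   + mips: %.0f MiB\n", compressed   * (4.0 / 3.0) / MiB);
    /* ~85 MiB vs ~21 MiB per texture -- and a single material usually
       has albedo, normal, roughness, etc., so multiply accordingly. */
    return 0;
}
```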
Having said that, 8K (4320p) monitors do exist, and while they're extremely expensive, so were 4K monitors once. From the devs' point of view, the "Ultra" settings might just assume everyone is running a 4090 or some future 5090 with a 4K or higher display, because they want their product to "age" well. Of course, we can see what happens when a reviewer selects "Ultra" to test a more modest card. The point I've been hammering home is that "Ultra" presets absolutely are not a reasonable default; they're set up to push a system unreasonably hard. Gamers do not need 12GB+ of VRAM to play games. It's as simple as going in and lowering a single slider one or two notches, with no effect on quality on current displays.