Generally, in games, very large surfaces (huge rooms, walls, etc.) are covered with a repeating mosaic of large textures, and for the ones close to you, you will almost certainly benefit from high-res textures. Artists don't intentionally make the tiling so fine that the extra resolution is only usable on 4K+ monitors; that would require exponentially more art work, so it's easier to repeat the same textures as much as they can.

I'm not saying a 32x32 pixel image won't resolve at full resolution on an 8K monitor. I'm saying the opposite: a 1920x1080 monitor is physically incapable of resolving a 4096x4096 texture at 1:1.
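As a rough back-of-envelope sketch of that claim (my own illustrative arithmetic, not figures from this thread): even if a single 4096x4096 texture filled the entire screen, a 1080p display simply doesn't have enough pixels to show it 1:1.

```python
# Sketch: screen pixels vs. texels in one 4K texture (illustrative numbers only).
screen_px = 1920 * 1080          # ~2.07 million pixels on a 1080p display
texture_texels = 4096 * 4096     # ~16.8 million texels in one 4096x4096 texture

print(f"screen pixels:  {screen_px:,}")
print(f"texture texels: {texture_texels:,}")
print(f"best-case fraction of the texture visible at 1:1: {screen_px / texture_texels:.1%}")  # ~12%
```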
Sure, any texture can look pixelated if it scales infinitely large as you get infinitely close to it. If game designers are letting the camera get infinitely close to their textures, then that is an entirely different problem with how they are using their engine.
They could also just be taking very big textures and scaling them up to cover a very large area, like the ground for an entire region. But I think that fell out of style with a lot of developers sometime around the id Tech 5/RAGE era.
Game developers do often take too-small textures and stretch them out; those surfaces would have looked better with an appropriately large texture, but that is more of an issue of bad art.
I'm making a specific (and admittedly overly general) point about what games are actually doing with textures at their highest settings. They're putting giant textures on everything, even tiny objects, regardless of how close they are to the camera. That 200-pixel-wide gun? A 4K texture. Each of those 8-pixel-wide bullet casings in a pile on the ground 25 meters away? 4K textures. They get downsampled by the time they reach your monitor, but that doesn't change the fact that these ridiculously massive textures are loaded into memory, wasting a huge amount of space and bandwidth, and they ultimately do not and cannot increase visual quality at the resolutions we play games at. That is why (per the chart posted in the discussion above) RE4 still uses an absurd 12.49 GB at 1080p vs 13.85 GB at 4K: both settings load in the same biggest-possible textures.
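A quick sketch of why that gap between 1080p and 4K is so small (illustrative assumptions about compression and formats, not measurements from RE4): the textures cost the same either way, and the render targets that actually scale with output resolution are a comparatively tiny slice of VRAM.

```python
# Illustrative VRAM arithmetic; the byte-per-texel figure assumes BC7-class compression.

def texture_bytes(size, bytes_per_texel=1, mips=True):
    """Approximate VRAM for a square texture; a full mip chain adds roughly 1/3."""
    base = size * size * bytes_per_texel
    return base * 4 / 3 if mips else base

one_4k_texture = texture_bytes(4096)
print(f"one 4096x4096 texture: ~{one_4k_texture / 2**20:.0f} MiB")  # ~21 MiB

# The part that does depend on output resolution (one RGBA8 render target):
target_1080p = 1920 * 1080 * 4
target_4k = 3840 * 2160 * 4
print(f"1080p target: ~{target_1080p / 2**20:.0f} MiB, 4K target: ~{target_4k / 2**20:.0f} MiB")
```

So changing the output resolution moves a few tens of megabytes of render targets, while the texture pool, which is identical at both settings, is measured in gigabytes.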
If a card doesn't have the memory capacity/speed to handle 4k ultra textures, then it doesn't have the memory to handle 1080p ultra textures. Because they are the same textures.
It's not even a failure of the dev to optimize. It's gamers deliberately choosing bad settings and then complaining when the game doesn't run well. They're kneecapping their "HD"-targeted cards into a UHD bottleneck without actually gaining anything.
Your point about what games are doing with textures is semi-correct, but it's worse than you think. Most games use LoD (level of detail), which loads a different texture once an object falls within a certain distance range. So in any given scene you don't just have one texture loaded for a material, you have several. And not just the image texture; you have layers like specular, reflection, metallic, etc. These all need to be loaded and ready to swap the moment the object's distance crosses an LoD threshold. You're absolutely right that a lot of textures are being loaded at all times that are not always being used fully.
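To put rough numbers on how those layers and LoD copies stack up (the layer list, LoD resolutions, and compression ratio here are assumptions for illustration, not taken from any particular game):

```python
# Hypothetical material: several texture layers, each kept at several LoD resolutions.
LAYERS = ["albedo", "specular", "reflection", "metallic", "normal"]  # assumed layer set
LOD_SIZES = [4096, 2048, 1024]                                       # assumed per-LoD resolutions

def material_bytes(bytes_per_texel=1):  # ~1 byte/texel for BC7-class compression
    total = 0
    for size in LOD_SIZES:
        for _layer in LAYERS:
            total += size * size * bytes_per_texel * 4 / 3  # plus mip chain overhead
    return total

print(f"one material, all layers and LoD levels: ~{material_bytes() / 2**20:.0f} MiB")  # ~140 MiB
```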
However, you're not correct in thinking, as per your original comment (irrespective of whether a card can handle it), that massive textures don't benefit lower-resolution screens more often than not in an explorable 3D scene.
The real victim here is the average gamer, who these days doesn't really know these things, wonders why a game isn't running well on a card they thought was great, and has had years of being fed slogans like "HD" and "ultra" as shorthand for something better.