All the modern games have 4096x4096 textures loaded along with two lower-res versions, and smaller versions are then created on the fly with downsampling. You are correct that those 4K versions are rarely used, but they are still being loaded into GPU memory on "Ultra" settings. When you load a texture you don't just load one version into memory; it automatically loads the other versions stored with it. Textures are stored at native resolution along with 2~3 downsampled copies. Graphics engines are pretty smart about figuring out which one to rasterize, but they'll still load into memory whatever they are told to load. The culprit behind all this wastefulness is the default "Ultra" setting everyone likes to use, combined with the "can it run Crysis" mentality. Game creators know you reviewers are going to go straight to "Ultra" to show how "demanding" their game is, so it's set up to be extremely wasteful of resources. Screen resolution doesn't really matter anymore; textures are the single biggest source of memory utilization, and they are the same size regardless of screen settings.
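To put rough numbers on that, here's a back-of-the-envelope sketch. The byte-per-texel figure is an assumption (block-compressed color data in the BC7/DXT5 class), not something pulled from any specific game:

```python
# Back-of-the-envelope VRAM cost of one texture plus its full mip chain.
# Assumption (not from any specific engine): block-compressed color data at
# roughly 1 byte per texel (BC7/DXT5-class); uncompressed RGBA8 would be ~4x this.

def mip_chain_bytes(base_size, bytes_per_texel=1.0):
    """Total bytes for a square texture and every mip level down to 1x1."""
    total, size = 0, base_size
    while size >= 1:
        total += size * size * bytes_per_texel
        size //= 2
    return total

for base in (1024, 2048, 4096):
    print(f"{base}x{base} + mips: ~{mip_chain_bytes(base) / 2**20:.1f} MB")

# 1024x1024 + mips: ~1.3 MB
# 2048x2048 + mips: ~5.3 MB
# 4096x4096 + mips: ~21.3 MB  (the full chain only adds ~1/3 on top of the top level)
```

The point of the numbers: the mip chain itself is cheap, but each step up in base resolution quadruples the cost, so whether the 4K level is resident or not dominates the footprint.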
I've actually gone into games and proven this: go into an area on "Ultra, kill my computer" and observe memory utilization. Exit, slide textures down one notch, go back into the same area, and notice that GPU memory utilization has plummeted. You need to pay attention to whether the game engine has three or four settings: if three, it's usually 1K/2K/4K by default; if four, the last two might both be 4K but with or without some sort of filtering technique applied. I could get Diablo 4 playing smoothly on 8GB of VRAM on Ultra by simply lowering the texture slider. This is for games that are relatively new, you know, the ones everyone is screaming "you need 12GB or higher" about. If a game is smart enough to manage its own memory footprint, then it's likely not going to sabotage itself in the memory department.
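Here's a rough sketch of why one notch makes such a dramatic difference, assuming the quality setting simply picks which mip level becomes the resident top level (a common approach, though every engine streams differently). The material count is made up purely for illustration:

```python
# Sketch: if "texture quality" just chooses which mip is kept as the top
# level, each notch down roughly quarters the per-texture cost.

def pool_size_gb(texture_count, top_mip_size, bytes_per_texel=1.0):
    """Approximate VRAM for a pool of square textures with full mip chains."""
    total, size = 0, top_mip_size
    while size >= 1:
        total += size * size * bytes_per_texel
        size //= 2
    return texture_count * total / (1024 ** 3)

# Hypothetical scene keeping ~400 unique materials resident:
for label, top in (("Ultra (4K tops)", 4096),
                   ("High (2K tops)", 2048),
                   ("Medium (1K tops)", 1024)):
    print(f"{label}: ~{pool_size_gb(400, top):.1f} GB")

# Ultra (4K tops): ~8.3 GB
# High (2K tops): ~2.1 GB
# Medium (1K tops): ~0.5 GB -- one notch down and the texture pool
# no longer swamps an 8GB card.
```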
I'm not sure why you speak so definitively about what "all modern games" are doing. That is, ultimately, up to the game developers and artists, which was my whole point. Maybe a lot of games do have 4K textures now; I don't know for certain. I'd have to look at each game's files, unpack them, and determine whether there are 4K, 2K, etc. textures in there. That's more than I care to do.
4K textures are basically stupid, though. The mipmapping algorithm looks at a polygon and checks its on-screen dimensions. If they're bigger than 2048 in either dimension, then the 4K (4096x4096) mip gets selected. Obviously, that would basically never happen if you're running at 1920x1080, unless the viewport is so close to an object that half of a polygon covers the entire screen. Failing that, it drops down to the next size (the 1024 check) and so on.
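Here's a simplified sketch of that selection logic. Real GPUs derive the mip level per pixel from texture-coordinate derivatives rather than whole-polygon dimensions, but the ratio-based idea is the same:

```python
import math

# Simplified mip selection: pick the mip whose resolution roughly matches the
# polygon's on-screen footprint. Hardware does this per-pixel from UV
# derivatives; this whole-surface version is just to show the arithmetic.

def selected_mip(texture_size, screen_pixels_covered):
    """Return (mip_level, mip_resolution) for a square texture."""
    # texels-per-pixel ratio > 1 means the texture is oversized for the area
    ratio = texture_size / max(screen_pixels_covered, 1)
    level = max(0, math.floor(math.log2(ratio)))
    return level, texture_size >> level

# A 4096 texture on a surface spanning ~1000 pixels of a 1080p frame:
print(selected_mip(4096, 1000))   # -> (2, 1024): the 4K and 2K mips go unused
# The same surface filling the entire width of a 3840-pixel 4K frame:
print(selected_mip(4096, 3840))   # -> (0, 4096): only then does the top mip matter
```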
But you can use texture upscaling within an engine to go up to ~2x the base size with minimal loss in quality. This is why 4K is truly overkill. It won't matter unless you're using an 8K display. 2K textures are sufficient (and then some) for 4K displays. 1K textures are almost always sufficient for 1440p and 1080p displays.
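To make those sufficiency claims concrete, here's a quick texels-per-pixel check for the worst case of a surface spanning the full screen width. The 0.5 threshold stands in for the ~2x upscaling headroom mentioned above; it's illustrative, not a hard rule:

```python
# Worst-case texel density: a square texture stretched across the full
# screen width. Ratios >= 0.5 are labeled "ok" to reflect the ~2x
# magnification headroom; the cutoff is illustrative only.

def texels_per_pixel(texture_size, screen_width):
    return texture_size / screen_width

for screen in (1920, 2560, 3840, 7680):
    for tex in (1024, 2048, 4096):
        ratio = texels_per_pixel(tex, screen)
        verdict = "ok" if ratio >= 0.5 else "soft"
        print(f"{tex} texture across {screen}px: {ratio:.2f} texels/pixel ({verdict})")

# 2K already lands at ~0.53 texels/pixel on a 3840-wide display; 4K textures
# only pull ahead once you reach 8K screens or extreme close-ups.
```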
My assumption, based on my experience as a computer programmer, is that developers aren't complete idiots, and thus games that advertise HD texture packs for 4K monitors aren't using 4K textures; they're using 2K textures that would potentially be helpful on 4K displays.
Alternatively, some of the textures might, by default, be much lower resolution than even 1K.
Diablo IV's developers know that certain objects are only ever going to cover, say, 500 pixels at most on the screen. So maybe they ship 256x256 textures for those objects, and an HD pack might include 512x512 versions as an upgrade (the arithmetic is sketched below). Again: it's up to the developers. There's no absolute "this is what all games everywhere are doing" answer.
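A quick sketch of that 500-pixel arithmetic, using a hypothetical helper that isn't from any real engine: find the smallest power-of-two texture that gives at least the desired number of texels per pixel the object covers.

```python
import math

# Hypothetical helper: smallest power-of-two texture size that provides
# at least `texel_density` texels per screen pixel covered by the object.

def min_texture_size(pixels_covered, texel_density=1.0):
    needed = pixels_covered * texel_density
    return 1 << math.ceil(math.log2(max(needed, 1)))

print(min_texture_size(500))        # 512  -- matches the 256/512 range above
print(min_texture_size(500, 0.5))   # 256  -- half density is often fine for small props
print(min_texture_size(3840))       # 4096 -- only a surface spanning a full 4K frame needs this
```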