I'm not sure why you speak so definitively about what "all modern games" are doing. That is, ultimately, up to the game developers and artists. That was my whole point. Maybe a lot of games do have 4K textures now; I don't know for certain. I'd have to unpack each game's files and check whether they actually contain 4K, 2K, etc. textures, and that's more than I care to do.
4K textures are basically stupid, though. Mipmapping looks at how many screen pixels a polygon covers. If that coverage is bigger than 2048 in either dimension, the full 4K (4096x4096) mip gets sampled. Obviously, that would basically never happen if you're running at 1920x1080, unless the camera is so close to a surface that half of a polygon covers the entire screen. Otherwise selection drops down to the next size (the 1024 check, which picks the 2048 mip) and so on.
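Here's a toy sketch of what I mean. The real selection happens per pixel from screen-space derivatives with trilinear blending between levels, so treat this as ballpark math, not any engine's actual code:

```python
import math

# Rough mip selection: pick the mip whose resolution matches the number of
# screen pixels the surface covers (my simplification, not real GPU code).
def mip_level(texture_size: int, screen_pixels_covered: float) -> int:
    ratio = texture_size / max(screen_pixels_covered, 1.0)  # texels per pixel
    return max(0, math.floor(math.log2(ratio)))

# A 4096x4096 texture on a surface spanning ~1000 pixels of a 1080p screen:
print(mip_level(4096, 1000))   # -> 2, i.e. the GPU samples the 1024x1024 mip
# The full 4096 mip only gets sampled once coverage passes ~2048 pixels:
print(mip_level(4096, 2500))   # -> 0
```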
But you can upscale a texture within an engine to ~2x its base size with minimal loss in quality. That's why 4K textures are truly overkill: they won't matter unless you're using an 8K display. 2K textures are sufficient (and then some) for 4K displays. 1K textures are almost always sufficient for 1440p and 1080p displays.
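Quick back-of-the-envelope numbers to show what I mean, assuming a surface that fills half the display's width (my own illustrative scenario, not measurements from any real game):

```python
# Texels per screen pixel when a surface spans half the display's width.
# Ratios around 1.0 or higher mean the texture has detail to spare.
for display_w, tex in [(1920, 1024), (2560, 1024), (3840, 2048), (7680, 4096)]:
    coverage = display_w / 2   # screen pixels the surface spans horizontally
    print(f"{tex}px texture on a {display_w}-wide display: "
          f"{tex / coverage:.2f} texels per pixel")
```

The 1440p case lands a bit under 1.0, which is exactly where that ~2x upscaling headroom covers the gap; the 4K texture only starts earning its keep around 8K widths.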
My assumption, based on my experience as a computer programmer, is that developers aren't complete idiots, and thus games that advertise HD texture packs for 4K monitors aren't using 4K textures; they're using 2K textures that would potentially be helpful on 4K displays.
Alternatively, some textures might default to much lower than even 1K. The Diablo IV developers know that certain objects are only ever going to cover, say, 500 pixels on screen, so maybe they ship 256x256 textures for those objects, and an HD pack might bump them to 512x512 as an upgrade (see the rough numbers below). Again: it's up to the developers. There's no absolute "this is what all games everywhere are doing" answer.
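To put rough numbers on that hypothetical (the 500-pixel object and the 256/512 sizes are just my made-up example above, nothing pulled from Diablo IV's actual files):

```python
import math

# For an object that never covers more than ~500 screen pixels, which mip of
# each candidate texture size actually gets sampled? (Coarse estimate.)
for tex in (256, 512, 1024):
    level = max(0, math.floor(math.log2(tex / 500)))
    print(f"{tex}px texture -> mip {level} sampled ({tex >> level}px effective)")
# 256px  -> mip 0 (magnified: each texel stretches across ~2 screen pixels)
# 512px  -> mip 0 (about 1:1 texels to pixels; the sweet spot)
# 1024px -> mip 1 (the GPU downsamples it back to ~512px anyway)
```

In other words, the 512 texture is the most that object can ever use; anything bigger just gets mipped back down, which is exactly why this is the developers' call to make.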