In that sense, yes, it does. So do DSR resolutions that are rendered high and then downscaled to fit a smaller display.
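To put some rough numbers on the resolution side, here's a back-of-the-envelope sketch. The single 4-byte RGBA render target is my simplifying assumption; real engines keep several targets (depth, G-buffer, and so on), so the real cost is a multiple of this:

```python
# Rough framebuffer cost of DSR: render at a higher internal
# resolution, then downsample to the display resolution.
# Assumes one 32-bit (4-byte) RGBA target per frame.

def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 1024**2

for label, w, h in [("1080p native", 1920, 1080),
                    ("4x DSR of 1080p (4K internal)", 3840, 2160)]:
    print(f"{label}: {framebuffer_mb(w, h):.1f} MB per render target")
# 1080p native: ~7.9 MB; 4K internal: ~31.6 MB per target
```

So even quadrupling the pixel count only adds tens of megabytes per render target.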
But that difference pales in comparison to image detail. A plain-Jane field of wheat blowing around in the breeze contains a massive amount of information: every foreground stalk, every shadow, every shade, every motion variable, every dimension, per frame. Whether it's 1080p or 4K, the VRAM usage is still massive. Compare that to Minecraft's blocky, solid-color, unshaded look with basic shadows and an almost 2D appearance: even at 4K it doesn't use as much VRAM as that 1080p field of wheat.
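Here's why the detail side dominates, again as a rough sketch rather than engine-accurate numbers. The four-maps-per-material count and the BC7 compression ratio are my assumptions for illustration:

```python
# Why scene detail dwarfs output resolution in VRAM: a single
# detailed material can cost more than the whole 4K framebuffer.

MIP_OVERHEAD = 4 / 3  # a full mip chain adds roughly 1/3 on top of the base level

def texture_mb(size, bytes_per_texel):
    return size * size * bytes_per_texel * MIP_OVERHEAD / 1024**2

# Assume a detailed material (say, the wheat-field ground) needs
# four maps: albedo, normal, roughness, ambient occlusion.
maps_per_material = 4
uncompressed = texture_mb(4096, 4)   # RGBA8: ~85 MB per 4K map
bc7 = texture_mb(4096, 1)            # BC7-compressed: ~21 MB per 4K map
print(f"one 4K material, uncompressed: {maps_per_material * uncompressed:.0f} MB")
print(f"one 4K material, BC7:          {maps_per_material * bc7:.0f} MB")
# ~341 MB uncompressed, ~85 MB compressed, for ONE material,
# vs. the ~24 MB a render target gains going from 1080p to 4K.
```

And a real scene holds dozens of materials, plus geometry and shadow maps, which is where the gigabytes actually go.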
It's why Cyberpunk 2077 is so brutal. The sheer amount of photorealism, shadows, lighting effects, reflections, and objects kills VRAM. The AC series, coupled with Ubisoft's standard Uplay-era lack of decent optimization, isn't any better. I played AC I, II, and III on a 3770K at 4.9 GHz with a 24% overclock on a GTX 970, and while I enjoyed the storylines, the actual gameplay was far from fluid. Anything above Low just felt clunky, at a realistic 3.5 GB of VRAM.