I'm half kidding with this comment, but with a resolution like that, you might be able to do away with anti-aliasing entirely. Haha! There's some truth to it, though, I think. For now, the gargantuan cost of the screen itself is enough to make that hard to justify.
Not to mention the performance-to-quality ratio compared to just turning on MSAA x4. I'm not sure which would look better at a given pixel density, though: MSAA x4 or 1080p times four (i.e. 4K). I'm leaning toward the latter, but how about SSAA x4 (a big performance-eater) or MSAA x8? I wonder.
Higher resolutions use up more VRAM due to a larger frame buffer, and also higher-resolution texture files, and maybe even more polygons for models, I believe. Well, unless the game scales up its field of view with resolution, I think, but even then you could end up loading more textures and models for a given frame.
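Just to put rough numbers on the frame buffer part, here's a little back-of-the-envelope sketch. The framebuffer_mb helper is just something I made up for this comment, and it assumes one 32-bit color buffer plus one 32-bit depth/stencil buffer, ignoring extra render targets, compression, and double/triple buffering, so treat the numbers as lower bounds:

```python
# Rough frame buffer math: bytes = width * height * buffers * bytes_per_pixel.
# Assumes one 32-bit (4-byte) color buffer and one 32-bit depth/stencil buffer;
# real games keep more buffers around, so these are lower bounds.
def framebuffer_mb(width, height, buffers=2, bytes_per_pixel=4):
    return width * height * buffers * bytes_per_pixel / (1024 ** 2)

print(f"1080p: {framebuffer_mb(1920, 1080):.1f} MB")  # ~15.8 MB
print(f"4K:    {framebuffer_mb(3840, 2160):.1f} MB")  # ~63.3 MB (4x the pixels)
```

So the buffers alone quadruple going from 1080p to 4K, before you even touch textures.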
I remember learning that SSAA and MSAA (not sure about other AA technologies) eat up a lot of VRAM because extra samples have to be rendered into a buffer. SSAA basically renders the whole frame at a higher resolution (like 4K for a 1080p frame at 4x), while MSAA keeps several color/depth samples per pixel but only runs the pixel shader once per pixel per triangle; either way, those extra samples are what get averaged together to "blur" the edges.
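To make that "average the samples down" step concrete, here's a toy sketch of a 4x supersampling resolve using a plain box filter. This is just my own illustrative Python (the resolve_ssaa_4x function is made up for this comment); real GPUs do the resolve in hardware, and an MSAA resolve works on per-pixel sample lists rather than a full high-res image:

```python
# Toy 4x supersampling resolve: average each 2x2 block of a high-res render
# down to one output pixel (a simple box filter). Standalone illustration only.
def resolve_ssaa_4x(hires, width, height):
    """hires: row-major list of (r, g, b) tuples at 2*width x 2*height."""
    out = []
    for y in range(height):
        for x in range(width):
            samples = [
                hires[(2 * y + dy) * (2 * width) + (2 * x + dx)]
                for dy in (0, 1) for dx in (0, 1)
            ]
            out.append(tuple(sum(c) // 4 for c in zip(*samples)))
    return out

# A 2x2 high-res patch (4 samples) resolving to a single 1x1 pixel:
print(resolve_ssaa_4x([(255, 0, 0), (255, 0, 0), (0, 0, 0), (0, 0, 0)], 1, 1))
# -> [(127, 0, 0)]
```

That (127, 0, 0) result is the whole trick: a pixel that's half covered by a red edge ends up halfway between red and black, which is what makes the edge look smooth.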
Graphics cards have been getting more and more VRAM as time passes, it seems, but as for memory bandwidth, we may have to wait for the DDR4-based versions of GDDR (GDDR6 or 7, maybe) or perhaps Rambus' XDR2 (I just remembered the rumors from last year that the HD 7970, I think, might've used XDR2).
I'm not sure how the performance of different kinds and levels of AA at 1080p compares to just rendering at 4K, but from what I've been reading in the comments, it seems like a lot of people think the latter is way more performance-intensive, and I bet there are benchmarks out there to prove it.
Sorry, just wanted to share some thoughts I had. Don't you guys think it's interesting? Hehe... I may be wrong about some things, especially how those AA modes work, but you're free to look into it more if you're interested. 🙂