srmojuze:
VRAM, especially with HBM2, means 8K and 16K textures are not too far away.
Why are you concerned about huge textures? GPUs have all that compute horsepower in order to procedurally generate textures as needed. The resolution of procedural textures is limited only by the precision of the datatypes used to specify the texture coordinates.
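To make that concrete, here's a minimal sketch in C of what "procedural" buys you: the texture is a function evaluated at the (u, v) coordinates rather than a stored bitmap, so there's no texel grid to run out of. The checker function and the cell counts are mine, purely for illustration, not any engine's API:

    /* Minimal sketch of a procedural texture: a checker pattern evaluated
     * directly from (u, v). There is no stored bitmap, so the only
     * resolution limit is the precision of the floats carrying the
     * coordinates. */
    #include <math.h>
    #include <stdio.h>

    /* Returns 0.0 or 1.0 depending on which cell (u, v) falls in. */
    static float checker(float u, float v, float cells)
    {
        int cu = (int)floorf(u * cells);
        int cv = (int)floorf(v * cells);
        return (float)((cu + cv) & 1);
    }

    int main(void)
    {
        /* Sample the same procedure at wildly different "zoom" levels;
         * a stored bitmap would have run out of texels long before this. */
        printf("%f\n", checker(0.123456f, 0.654321f, 8.0f));
        printf("%f\n", checker(0.123456f, 0.654321f, 8192.0f));
        return 0;
    }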
Sure, not everything lends itself to coding as a procedure, but most things can at least be decomposed into a macro-scale texture map plus smaller, repeating detail textures. I just don't see any need for 8K or 16K textures. 4K makes sense for mapping screens, windows, and video images onto things, but that's about it.
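Rough sketch of that decomposition, again in C. sample_macro and sample_detail here are stand-ins I made up (in a real renderer they'd be texture fetches into a unique low-res map and a small tiling map), and the 64x tile factor is an arbitrary choice:

    #include <math.h>
    #include <stdio.h>

    /* Stand-ins for texture fetches; in a real renderer these would be
     * bilinear lookups into a unique macro map and a small tiling detail map. */
    static float sample_macro(float u, float v)
    {
        return 0.5f + 0.5f * sinf(u * 3.14159f) * sinf(v * 3.14159f);
    }

    static float sample_detail(float u, float v)
    {
        return 0.5f + 0.5f * sinf(u * 40.0f + v * 37.0f); /* repeating grain */
    }

    /* The macro map gives large-scale variation; the detail map, tiled 64x,
     * restores high-frequency content without one enormous unique texture. */
    static float shade(float u, float v)
    {
        float tile = 64.0f;
        float detail = sample_detail(fmodf(u * tile, 1.0f),
                                     fmodf(v * tile, 1.0f));
        /* Detail centered on 1.0 so it modulates rather than darkens. */
        return sample_macro(u, v) * (0.5f + detail);
    }

    int main(void)
    {
        printf("%f\n", shade(0.3f, 0.7f));
        return 0;
    }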
srmojuze:
So yeah, in terms of AR you could be walking down the street, the AR device computes the surroundings, including the lighting, then renders, say, a photoreal person lit with real-time global illumination and textured, along with suitable post-processing based on the physical environment you're in (say, a dusty grey day vs. bright blue sunny skies). At that point you can legitimately say it is "mixed reality", since one would not be able to tell the difference between a real person standing there and a rendered artificial character.
I don't think that's what anyone means by "mixed reality".
Anyway, there's a lot you're oversimplifying. Light source estimation will never be perfect in unconstrained AR applications. In most cases it can be "good enough" that rendered objects don't seem jarringly out of place.
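For a sense of why "good enough" is reachable but "perfect" isn't: about the crudest usable estimator is a luminance-weighted average of the directions the camera has actually seen. Everything in this sketch is hypothetical (the Sample struct, estimate_light, the synthetic data); it's only meant to show the shape of the problem, namely that the camera never observes the full sphere, so the estimate is always an approximation:

    #include <math.h>
    #include <stdio.h>

    /* One sample from an environment capture: a view direction and the
     * luminance the camera saw along it. The capture itself is assumed. */
    typedef struct { float x, y, z, lum; } Sample;

    /* Crude dominant-light estimate: luminance-weighted mean of sample
     * directions. Real systems are fancier, but share the failure mode:
     * directions the camera never covered contribute nothing. */
    static void estimate_light(const Sample *s, int n, float out[3])
    {
        float x = 0, y = 0, z = 0;
        for (int i = 0; i < n; i++) {
            x += s[i].x * s[i].lum;
            y += s[i].y * s[i].lum;
            z += s[i].z * s[i].lum;
        }
        float len = sqrtf(x * x + y * y + z * z);
        if (len > 0) { x /= len; y /= len; z /= len; }
        out[0] = x; out[1] = y; out[2] = z;
    }

    int main(void)
    {
        /* Synthetic capture: a bright patch up and to the right. */
        Sample s[] = {
            { 0.0f, 1.0f, 0.0f, 5.0f },
            { 0.7f, 0.7f, 0.0f, 8.0f },
            { 0.0f, 0.0f, 1.0f, 1.0f },
        };
        float L[3];
        estimate_light(s, 3, L);
        printf("dominant light ~ (%f, %f, %f)\n", L[0], L[1], L[2]);
        return 0;
    }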
But you're also glossing over the whole display issue. HoloLens doesn't block the light arriving through the visor, so it can only add light to the scene; it can't darken real objects or render convincing black. So you'd be talking about something like Intel's Project Alloy, which is a VR-type HMD + cameras.
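A toy compositing model makes the difference obvious (my framing, not anything from either vendor's SDK):

    #include <stdio.h>

    /* Additive see-through: the eye sees real light plus the display's
     * light, so rendered black contributes nothing. Values are 0..1. */
    static float additive(float real, float rendered)
    {
        float v = real + rendered;
        return v > 1.0f ? 1.0f : v;
    }

    /* Video passthrough: the renderer blends over the camera feed and
     * can replace it entirely (alpha = 1). */
    static float passthrough(float camera, float rendered, float alpha)
    {
        return camera * (1.0f - alpha) + rendered * alpha;
    }

    int main(void)
    {
        /* Try to draw a black pixel over a bright wall: */
        printf("additive:    %f\n", additive(0.9f, 0.0f));          /* still 0.9 */
        printf("passthrough: %f\n", passthrough(0.9f, 0.0f, 1.0f)); /* 0.0 */
        return 0;
    }

Try to draw black over a bright wall and the additive visor still shows the wall, while the passthrough HMD shows black. That's why "can't tell it's rendered" mixing points at Alloy-style hardware rather than HoloLens-style optics.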
Safe to say, it'll be a while before we need to worry about a "Matrix" scenario. I don't even see it happening with conventional silicon semiconductors. Maybe with carbon nanotube-based computers, or something else beyond lithography.