The Meta 2 Augmented Reality Dev Kit, Hands On

Status
Not open for further replies.
This is why I'm not jumping on the VR train just yet.

lol, ok... don't you think it might be a little hardcore to wait for refined AR and the software to match?

I've been around the industry a LONG time. VR is not ready for the mainstream and, in my opinion, is being pushed out too early. It is far from a mature technology, and I doubt it will last long in its current form before it gets replaced by something that smart developers will focus their time on. AR just makes more sense in the real world. HUD tech has been paving the way toward AR for many years, and has allowed people like myself to develop applications for such technology already. So no, I don't have to wait.
 
The only way AR will make VR redundant is if it can give a fully immersive world experience as good as or better than VR, including a much higher field of view with no leakage of the real-world environment, such as light or even sound. That is the whole point of immersion in a VR environment. Is that going to happen anytime soon?
 
I'm really looking forward to AR stuff. I've seen it done on the 3DS with those card things, so the technology is mostly there (I really liked that you could create a hole in something via one of those QR-code lookalikes that I can't remember the name of). Also, I think this will require a lot less power than VR does, because it doesn't need to render 1080p for each eye at 90 FPS to prevent motion sickness.
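To put some rough numbers on that "less power" intuition, here is a back-of-the-envelope pixel-throughput comparison. The VR figures match the 1080p-per-eye @ 90 FPS target mentioned above; the AR figures are purely my own illustrative assumption of a lighter overlay target, not a spec of any real headset.

```python
# Rough pixel throughput: VR render target vs. a hypothetical AR overlay.
# The AR numbers (720p per eye @ 60 FPS) are an assumed example, not a spec.

def pixels_per_second(width, height, eyes, fps):
    """Total pixels the GPU must fill each second for a stereo display."""
    return width * height * eyes * fps

vr = pixels_per_second(1920, 1080, eyes=2, fps=90)  # 1080p per eye @ 90 FPS
ar = pixels_per_second(1280, 720, eyes=2, fps=60)   # assumed lighter AR target

print(vr, ar, round(vr / ar, 2))  # VR needs roughly 3.4x the fill rate here
```

Under these assumed numbers the VR target needs about 3.4 times the fill rate, and an AR overlay can often leave much of the frame untouched on top of that.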

What I think it would work best for is some kind of virtual sandbox game, not a sandbox game in the usual sense, but a literal sandbox with sand that you just AR onto a desk. Or maybe something like Warhammer: it would be so awesome to set some stuff down on a table and AR in a game of Warhammer around your office furniture or something.
 


I don't think many people will question the statement "VR is being pushed out a bit early" or that "it is far from a mature technology". If you know about affordable AR hardware with some fun applications, I hope you will share it with us. I am not being sarcastic, please share.

 
What I really want to know is whether the synthetic content shown by the Meta 2 HMD has the same depth of field as objects in real space at the same distance.

Maybe an easier question to answer, in retrospect, would be the range of depths that the synthetic content shown in the Meta 2 HMD spanned. Was it a fairly narrow range, or did they demo things ranging from a couple of feet all the way out to 100 feet?

If the device has a fixed depth of field, as I'm guessing, then they'll probably restrict the depth of synthetic content to a fairly narrow range. This will limit the sort of applications it can support.
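A quick sketch of why a fixed focal plane would force that restriction: the accommodation mismatch between content depth and the display's focal plane is usually measured in dioptres (1/metres). The focal-plane distance and the ~0.3 D comfort tolerance below are illustrative assumptions, not Meta 2 specs.

```python
# Sketch: comfortable content-depth band for a display with a fixed
# focal plane. Tolerance of ~0.3 dioptres is an assumed example value.

def defocus_diopters(content_m, focal_plane_m):
    """Accommodation mismatch between content depth and the focal plane."""
    return abs(1.0 / content_m - 1.0 / focal_plane_m)

def comfortable_range_m(focal_plane_m, tolerance_d=0.3):
    """Depth band (metres) whose mismatch stays within tolerance_d dioptres."""
    centre = 1.0 / focal_plane_m
    near = 1.0 / (centre + tolerance_d)
    far_power = centre - tolerance_d
    far = float('inf') if far_power <= 0 else 1.0 / far_power
    return near, far

# Hypothetical focal plane at 2 m: comfortable band is about 1.25 m to 5 m,
# while content at 0.5 m would be ~1.5 D out of focus.
print(comfortable_range_m(2.0), defocus_diopters(0.5, 2.0))
```

The dioptric error grows very fast at close range, so a fixed-focus HMD would plausibly keep synthetic content inside a band like the one above rather than spanning a couple of feet out to 100 feet.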
 