News Xreal's $700 Air 2 Ultra AR glasses put Apple Vision Pro and Meta Quest 3 in its crosshairs

Status
Not open for further replies.
Any comparisons between this and Apple's Vision Pro are superficial, at best.

What truly sets apart Vision Pro is the sophistication & refinement of its algorithms. It can place virtual objects somewhat convincingly in the real world, which is extremely hard to do well. It not only requires accurate tracking and depth extraction, but also light source estimation. Furthermore, that requires a lot of compute power, which Vision Pro gets from its M2 SoC + R1 ASIC.
 
This article is just a mess.

These are not competing with the Quest 3; the Quest 3 is a VR headset.

These are more AR.

And compared to the Apple headset, they're likely not even in the same ballpark for AR.
The article just states what the company is aiming at.

And the Meta Quest 3 does have AR as well, so someone looking for AR only, or primarily, may be interested. E.g., to me personally, such glasses would be more appealing for use while traveling (on a train), i.e. to watch a movie or browse (if the text is clear enough).

And the Vision Pro may be better, but it's also a lot more expensive, and depending on what one wants to use it for, it may be way over the top. Again personally: I got myself a Pico 4 last week for VR. The resolution is just about the same as the Meta Quest 3's, and I drive it from my PC for stuff beyond e.g. Ultimechs and Moon Rider XYZ. So if I also go AR, I wouldn't need the glasses to run everything on their own, and other factors would be relevant, such as usability while traveling (and with the price difference I could easily get myself a top GPU, and haptic gloves too).
 
I just want an AR monitor that floats above my laptop screen. From all accounts, a static AR display will give you motion sickness, but is 6DoF necessary? Can it work with 3? Is the FoV on this enough? Anyone know?

Either way, 1080p per eye probably isn't sufficient with 6DoF.
 
Never mind the AR sensors or the processing power or anything else on the Xreal glasses. The first thing they need in order to be competitive in ANY way is a wider FoV. I have a pair of the first-gen Nreal Air, and the fact that it maxes out at the equivalent of a 27-inch monitor 2 ft from your face in terms of FoV makes it impossible to put things in the extreme periphery of vision. The Q3 has a 100-degree FoV; the Vision Pro is estimated to be about the same, IIRC. These are increased to 52 degrees, which I assume also increases the vertical FoV if they keep the aspect ratio of the screens the same. That will still require elements to be closer to center than on either of the headsets they supposedly target.

I tried using their sidescreen mirroring and it was just too close to center. This is of course personal preference but I just don't see it being a good experience until they can widen the fov so things don't interrupt the front and center view.
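As a side note, you can convert any "N-inch monitor at distance D" analogy into degrees of FoV with basic trigonometry. A quick sketch, assuming a flat 16:9 panel viewed head-on (the function name and the head-on simplification are mine):

```python
import math

def horizontal_fov_deg(diagonal_in, aspect_w, aspect_h, distance_in):
    """Horizontal field of view (degrees) of a flat screen viewed head-on."""
    diag_units = math.hypot(aspect_w, aspect_h)          # e.g. sqrt(16^2 + 9^2)
    width_in = diagonal_in * aspect_w / diag_units        # physical screen width
    return math.degrees(2 * math.atan((width_in / 2) / distance_in))

# A 27" 16:9 monitor seen from 2 ft (24") away:
print(round(horizontal_fov_deg(27, 16, 9, 24), 1))  # ~52.2 degrees
```

That lands close to the quoted 52-degree spec, but marketing FoV figures are often diagonal rather than horizontal, so the comparison against the ~100-degree VR headsets stays rough either way.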
 

Warning: if you own a PC, an Android phone, or a non-Asian male head, you may not want to buy anything from Xreal!

With that out of the way, let's go deeper.

I got curious enough from an earlier article like this to buy an Air² Pro, and I was seriously disappointed. I really should have returned them, and didn't only because I run a kind of curiosity cabinet as part of my job, which includes near misses, ...some of which aren't even that close.

And Brandon, "crosshairs" implies at least a remote possibility of a hit, and that is clearly not the case here: your headlines for Xreal are misleading.

So here are the details:

Software

PC software drivers are part of the product, but not included.

Windows drivers have been promised, promised again "very soon now", and published as BETAs, but have failed to arrive or to work. I've tried a vast variety of hosts, from laptops with Alt-DP/USB3/4 ports to an RTX 2080 Ti, one of the very few cards with an accelerated Alt-DP+USB port (see below for the longer story), and the Air² just fails at the setup phase.

The only operational mode that works with PCs is using the glasses as a monitor in one of two modes:
1. FHD (1080p) at up to 120Hz, with the displays for both eyes getting a mirror image
2. 32:9 at up to 60Hz with the display split in the middle so each eye gets half
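Incidentally, both modes push the same pixel rate over the single cable, which is presumably why the wide split mode halves the refresh. Simple arithmetic (the shared-stream assumption is mine, not from Xreal's documentation):

```python
# Pixel rate of each mode over the single USB-C/DisplayPort link.
# Assumption (mine): both eyes are fed from one stream, so the link
# budget is shared between them.
mirrored = 1920 * 1080 * 120  # one 1080p image duplicated to both eyes
split = 3840 * 1080 * 60      # one 32:9 frame, each eye gets one half

print(mirrored)  # 248832000 pixels/s
print(split)     # 248832000 pixels/s, i.e. the same link budget
```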

That doesn't require any software drivers to work, but is a far cry from anything "augmented reality".

That can be a useful thing to own and operate e.g. in a train, but that's not what Xreal is advertising.

What exactly they are advertising is actually a little hard to fathom, because the material is mostly extreme-detail renderings and beautified smiles, with very few concrete features or explanations.

But it seems to entail or at least include the ability to create a wall of displays in a virtual space around you with the ability to move between them by turning your head.

That facility does not exist on a PC for lack of working software, even though it is advertised.

That is supposed to change "real soon now" but never has, judging from the reports on the Xreal forum, which I can only recommend you dive into before buying: not doing that beforehand was my main mistake.

Hardware

The first issue is IPD: the design point seems to be 60mm eye-to-eye, and mine are 68mm apart, so the outer edges are fuzzy for both eyes. That may be OK for a movie, but it precludes desktop work, reading, writing, coding, browsing etc. because of the eye strain and the need to basically switch eyes on every line.

IPD cannot be adjusted on either side, and there is only one version of the glasses which, according to the data I was able to gather, would work well for Asian males and Western females: Asian females and Western males would require either two additional sets of glasses or adjustability, for ergonomic desktop-augmentation/replacement use.

What is adjustable is the riding height on your nose, but that may also not be enough for your head: that part worked for me, once I discovered how to adjust the angle of the temples.

The display within the glasses covers the upper central portion of your vision, and that area is far from transparent when the displays are off: before reality is augmented in any way, quite a lot of it is subtracted first. Demanding 100% transparency in the active display area may be impossible at these devices' budget, but the level of blocking is too high to walk around safely with the glasses on.

Unfortunately that continues with the lower part of the glasses, which in theory should allow all of reality to pass through, unless you decide to dim it, in three levels, with the "Pro" variant. Simply having nothing at all there, or glass that is really transparent, would save some use cases.

As things are, reality has little chance to pass through at the zero-blockage setting, nor does it get entirely blocked at the 100% setting: Xreal delivers an extra cover for a reason, and, unlike your typical shades, it isn't something you'd mount or dismount while wearing the glasses.

That pretty much eliminates one of the prime use cases I had in mind when I bought the glasses: using them as a secondary (and private) display during laptop work, or in fact while dealing with paper on a desk: far too little gets through and keeping the glasses clean would be a challenge with extended use.

PC platform issues

The Air requires a DisplayPort feed for its displays and a USB port to send sensor (gyro/compass) data back to the host, both in a single cable; in active mode, the host is responsible for computing the spatial projections.

Turns out there are practically no qualifying PCs that offer such a port and the graphics power.

Only one of my many GPUs, an RTX 2080ti, offers a combined USB-(Alt)DP port, which should work with Xreal's beta drivers (it doesn't because the latest BETA drivers from June 2023 don't support the Air² "yet").

None of the older and none of the newer dGPUs offer USB-C display outputs (which include the USB inputs), so Xreal doesn't get the sensor data it needs.

The laptops that do offer Alt-DP and USB on their USB-C ports (which are often also Thunderbolt ports) provide the required interfaces, but then fail to deliver the graphics power needed to create the virtual screen projections at the required performance.

It's not that giant performance seems to be required, because a current smartphone should be good enough. But from what I could gather from the sparse comments the (single?) Xreal developer posted in their forum, the typical Intel HD iGPUs do not qualify; so far they recommend an Nvidia GPU... which tend not to have USB outputs.

Android platform issues

Google!


Google's drive to turn the open source Android ecosystem into a locked down Apple clone is closing the doors on things like functioning display port outputs on Android phones.

When Xreal started with Android 11, many modern Snapdragon based phones had both sufficient power and a working DP+USB3 port.

Since then new Android releases and newer generation hardware have closed doors that Xreal depends on.

There are no signs of that trend reversing, and Android is dysfunctional per [Google] design, another detail that Xreal fails to mention.

That situation would require regulatory pressure and custom ROMs to improve.

My personal judgement

Xreal's Air² Pro, just like all prior generations, is not a product today. And pushing out "new" generations of something sold as a product, when they can at best be judged extra iterations of a beta, doesn't bode well for any of those beta iterations ever becoming useful at a product level.

Xreal are heroes for trying to push the envelope, who have become the innocent victims of the giants pursuing an ever more narrow and closed future for "their" platforms. But when they continue to push hardware that simply cannot perform at the level of their dreams onto consumers, they turn from victim to villain simply by overselling.

The usefulness of Xreal is extremely limited today. If that niche is big enough for you, go ahead, but make sure you have a return option to protect you.

Otherwise I can only recommend you dive deep into the forum or stay away until they can deliver what you think they advertise.
 
I just want an AR monitor that floats above my laptop screen. From all accounts, a static AR display will give you motion sickness, but is 6DoF necessary? Can it work with 3? Is the FoV on this enough? Anyone know?

Either way, 1080p per eye probably isn't sufficient with 6DoF.
I have the Air² Pro glasses (and quite a few VR headsets).

Motion sickness AFAIK is caused by your in-ear balance sensors and your visuals not being aligned sufficiently well.

In passive mode, where the Air simply acts as an extra display, that's not an issue as your brain pretty much treats it like dirt on your glasses that follows your head movement and is static relative to eye movement.

In active mode, where the projected perspective should change with your head movement, the fact that you can still see the rest of your environment in the areas the glasses do not cover should keep your brain happy: your laptop screen would ground you.

But even with the Air fully blocking outside inputs (with the extra cover) I've not experienced any issue there, because the sensors are good enough not to require inside-out cameras just for orientation.

I did not try to walk around in full block mode...

The main issue with the Air for the use case you describe is the fact that they are not transparent enough in the lower part for a laptop display to get through: perhaps if your screen can throw 1000 nits it might work, but with all of my laptops (which can otherwise be far too bright at max settings), the same max setting that is uncomfortable without the Air is too dark with it.

And that is for black-on-white word processing or browsing.

And then there is the issue of software: the floating displays require a GPU to perform the projection into the virtual space. From what little Xreal posts about why the Windows drivers are delayed, laptop iGPUs are typically not powerful enough to do that job at sufficient performance.
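To illustrate what that projection involves, here is a minimal sketch of the kind of per-frame math the host has to do: take the head orientation from the glasses' sensors and re-project a world-anchored virtual screen into display coordinates. The function name, the yaw-only (single-axis) simplification, and the pinhole model are all mine, purely for illustration:

```python
import math

def project_virtual_screen(head_yaw_deg, screen_yaw_deg, hfov_deg=52, width_px=1920):
    """Pixel x-coordinate at which a world-anchored virtual screen's center
    appears on the glasses' display, for a yaw-only (single-axis) head pose."""
    rel = math.radians(screen_yaw_deg - head_yaw_deg)  # angle relative to gaze
    # Focal length in pixels from the display's horizontal FoV:
    focal_px = (width_px / 2) / math.tan(math.radians(hfov_deg) / 2)
    return width_px / 2 + focal_px * math.tan(rel)

# A screen anchored straight ahead stays centered...
print(round(project_virtual_screen(head_yaw_deg=0, screen_yaw_deg=0)))   # 960
# ...and turning your head 10° to the right shifts it left of center:
print(project_virtual_screen(head_yaw_deg=10, screen_yaw_deg=0) < 960)  # True
```

Doing this (plus actually rendering each screen's contents) for every frame, with low enough latency that the screens feel anchored, is presumably where weak iGPUs fall over.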

If you use the extra Beam box you'll only get a single display, and at a 70Hz refresh, but it shows that the quality of the sensors in the Air is good enough to provide perfect orientation without any perceivable drift: if the hardware driving the glasses is fast enough, motion sickness shouldn't be a concern.

I've walked around with the Beam connected to my phone to feed video and spatial orientation of the screen was extremely stable even with (relatively) rapid head movements.

The main issue is that the glasses are simply subtracting too much reality before they do any augmentation: they are too opaque where they should be fully transparent.
 
Xreal glasses are purely passive displays and sensors without local processing or electrical power.

They present either two side-by-side screens at FHD@60Hz, or one mirrored FHD screen at up to 120Hz, driven from a DisplayPort input on a USB-C connector.

They also include gyro/magnetic-field/acceleration sensors for spatial orientation (Air/Air²), and the Ultras and the Nreal variant additionally include cameras: all of these require a USB3-capable back-channel on the same USB-C port on which they receive the display output.

The host is responsible for everything, from electrical power to all the magic you see on-screen. So potentially the abilities are quite huge and capable of growing over time, ...as you replace the host.

For all the other problems and limitations please see my previous post, which is about the Air² which lacks the camera inputs.

I won't believe Xreal's gesture recognition claim until I've seen it work and measured just how much host power it consumes.

I own a Magic Leap, which uses a discrete set of infra-red time-of-flight sensors to generate a 3D digital twin of your hands for the ultimate real-time 3D interactivity.

It's somewhat similar to LIDAR on cars: it creates a 3D point-cloud input which, at a much higher level of abstraction, allows gesture data to be created using a skeletal machine-learning model underneath, far more easily than when you try the same from flat 2D inputs, even with stereo vision. And yet it took quite a bit of host processing power to make that work, to what was still a rather limited degree.

Magic Leap needs a good view of your fingers to create a matching digital twin at good quality. So when they offered a special holder to combine them with my Oculus DK2 headset I was very enthusiastic... until I realized that the top-down view from the headset onto my hands mostly obscured the fingers: there aren't terribly many jobs you do with your fingers facing you...

For a long time the industry believed that you'd need both 3D and 2D cameras plus sensor fusion to generate quality digital twins of your environment for orientation, gesture recognition etc., and I don't know just how many sensors and cameras of each type Apple wound up putting into their "glasses".

But you can be sure that Xreal requires replicating at least that computing power, perhaps even more if they need to infer, via ML, sensors they don't physically have. And that requires a huge, well-integrated software stack on top of hardware that most laptops, the only PCs with the proper ports an Air requires, don't have.

So if working gesture recognition at a certain level of detail and precision is your expected use case, make sure you wait until they can demo it working before you buy.
 
I can only see Asus selling an Air clone behind that link... Did you have another one in mind?

Of course the clone is interesting on its own, especially with the "M1" moniker, which hints at the only platform that seems to work well with the Xreal glasses, too.

But I'm not an iSlave...
Lenovo has an accessory for their Go, the Spacetop "laptop", and others.
 
I just wanted to say: thanks for posting your detailed assessment.
You're welcome, and actually that's a good reminder that I need to point out some updates, to remain fair.

First of all, I've splattered the Xreal community forum with plenty of observations, comments and--yes--accusations, some of which turned out to be wrong or exaggerated.

They can all be found under the very same nom de guerre as on this forum.

But so far the basic fact remains that they vastly oversold a product they delivered piecemeal and not yet complete, while they refuse the 30-day return, because they count it only from the delivery of the hardware, not of the full package.

I am currently executing a VISA refund request and we'll see how it goes. No idea why I didn't go with PayPal, because that typically works faster in such cases.

So what's changed?
  1. They delivered another BETA for Windows, which is supposed to work with the Air².
  2. I got a TB4 add-in card for €80 and a cable for €40 (discovered that option later), either of which allows using all the GPUs that don't have a USB-C port directly wired to the GPU's DisplayPort output.
  3. I found one (out of my three) OnePlus Android mobile phones, that works with the glasses, if you sideload the application. The OnePlus 7 Pro is judged as 'incompatible' by the Play Store, while the OnePlus 8 doesn't work and the OnePlus 11 only has USB 2[!#@%!~=8-O)]
But
  1. That BETA driver delivers horrible performance even on my RTX 4090 with a 16-core Ryzen, far, far worse than the Beam. And it's pretty much the same ~10Hz effective refresh on an i7-12700H or Ryzen 7 5800U laptop, an Arc A770M NUC12, and various RTX GPUs in between. Lack of GPU power can't be the issue.
  2. That adapter allowed me to extend the range of GPUs across a wide field as described above
  3. That proved that "Nebula" on Android and "Nebula" on PCs, or even the Beam, are so nebulous that they have zero common ground in terms of functionality. Nebula for Android only runs some apps; nobody will design for yet another walled garden, and none of your existing apps work, very much unlike Google's DreamCast. The only remotely sane use is passive mode, with the glasses as a pure display. That's the best way so far to watch 3D movies, but not enough to spend €600 on, in my book.
Digging through the forum also clears up some other confusions: their engineers didn't design these glasses for "augmented reality". They were originally honest enough to understand that these were at best "mixed reality" glasses.

And they renamed that to "augmented" much later, either because somebody in marketing thought it was way cooler, or because Microsoft had just let the world know that they had decided to sunset mixed reality with Windows 12, something that made me love that company even more.

But that didn't turn them into augmented reality devices, because they won't let enough reality pass through in any operating mode, making them largely useless except as a gimmick.

Since they are overselling to the point of dishonesty, they need to get some proper feedback.
 