News: Apple's Vision Pro XR Headset Uses 90Hz Micro-OLED Displays

$3500 and it doesn't even do 120 hz? Seriously?
I guess that's about par for an Apple product.
Weeeell, in absolute fairness, 90Hz for an HMD is OK. What matters, more than "max refresh/FPS", is not having tearing and stutters.

IIRC, Mr Carmack kind of demonstrated/argued 90Hz is the "practical minimum" for an HMD to not cause nausea in full immersion.

I have the Index set to 120Hz, but I can live with ~40 FPS as long as they're consistent (ish) and there's no massive stutter under that value. But that's me. Mileage definitely varies in that dept.

Regards.
 

lmcnabney

Prominent
Aug 5, 2022
192
190
760
The reference to 'up to 96Hz' for displaying 24Hz content is troubling. That means either frame interpolation (creating the soap-opera effect and multiplying moving objects), as used on consumer UHD TVs, or AI-generated fake frames like those produced by DLSS 3. Both add processing latency, which is BAD for AR/VR and a recipe for nausea.
This may be why the apps don't appear to encourage user movement and instead focus on 'sit and consume' usage.
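To put a rough number on that latency cost (my own back-of-the-envelope figures, not from the article): interpolating between two source frames means holding frame N until frame N+1 has arrived, so at a 24Hz source that is at least one source-frame interval of extra delay before anything hits the panel.

# Toy arithmetic, assumed values only; the article doesn't say how 24Hz content is handled.
source_fps = 24                 # film/video source frame rate
display_hz = 96                 # panel refresh while playing that content

source_interval_ms = 1000 / source_fps    # ~41.7 ms between source frames
display_interval_ms = 1000 / display_hz   # ~10.4 ms between refreshes

# Motion interpolation between frames N and N+1 can't start until N+1 exists,
# so the floor on added delay is one full source interval.
print(f"interpolation buffering delay >= {source_interval_ms:.1f} ms")
print(f"display refresh interval        = {display_interval_ms:.1f} ms")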
 

Giroro

Splendid
This thing could have a billion pixels per eye at 1000 Hz. It won't matter.
Nobody is going to buy one until Apple figures out what these things are supposed to actually do.
 

JamesJones44

Reputable
Jan 22, 2021
787
722
5,760
Interesting that it's using a micro-OLED display rather than a standard OLED panel. That's likely a good part of the reason for the cost... that, and the fact that it's Apple, of course.
 

RichardtST

Respectable
May 17, 2022
239
267
1,960
Weeeell, in absolute fairness, 90Hz for an HMD is OK. What matters, more than "max refresh/FPS", is not having tearing and stutters.

IIRC, Mr Carmack kind of demonstrated/argued 90Hz is the "practical minimum" for an HMD to not cause nausea in full immersion.

I have the Index set to 120Hz, but I can live with ~40 FPS as long as they're consistent (ish) and there's no massive stutter under that value. But that's me. Mileage definitely varies in that dept.

Regards.

We must play vastly different games. 40Hz would be horrifying in Beat Saber or some of the other fast-action games I tend to play. Huge differences between 40 and 60 and 90 and 120. I'll take 120 any day. I can see the stepping and individual frames. I may be old as dirt, but I'm not slow yet! :)

And why on earth would you need 96hz to watch a 24hz video? Are they scamming people again?
Of course they are...
 
  • Like
Reactions: helper800

edzieba

Distinguished
Jul 13, 2016
559
558
19,760
The reference to 'up to 96Hz' for displaying 24Hz content is troubling. That means either frame interpolation (creating the soap-opera effect and multiplying moving objects), as used on consumer UHD TVs, or AI-generated fake frames like those produced by DLSS 3. Both add processing latency, which is BAD for AR/VR and a recipe for nausea.
This may be why the apps don't appear to encourage user movement and instead focus on 'sit and consume' usage.
Not familiar with standard best-practices for VR displays, I take it?

Every frame undergoes processing prior to scanout. First, the frame is offset (rotational timewarp) based on the most recent fused head pose, because the pose you sampled when rendering started is several ms stale by now, and several ms of latency is unacceptably slow for VR. Next, the frame is warped to account for the position change of the head (spatial timewarp or 'spacewarp'), which requires pixel synthesis to account for the differential movement of foreground and background objects (parallax). In practice, these two steps occur simultaneously as a 6-axis transform, and they also take into account the optical flow field of the frame in order to correctly displace objects in motion. Finally, the frame is warped to account for the non-rectilinear optics used to view it, which means applying an inverse-pincushion warp that is wavelength-dependent (you warp the 3 colour channels separately to counter chromatic aberration in the optics).
Now, because every frame undergoes synthesis, there is very little practical difference between a frame that started rendering 10ms ago and a frame that started rendering 20ms ago in terms of the display timeline. That means that, since your warps are asynchronous and render-time independent, you can use them for complete frame synthesis in case of a render miss - i.e. if the display is due for a scanout but a new frame has not completed rendering, you apply the up-to-date transform to the old frame and still generate a valid frame for display.
On top of that, there is no reason you cannot apply that same technique to every other frame, or every third frame, in order to double or triple the effective render rate to produce a given display update rate.

All these techniques have been implemented years ago and are in active use even on mobile devices (e.g. the Quest series).
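If it helps, here's a toy simulation of the idea in Python (my own illustration, not Apple's or Meta's actual compositor; the rates and the pretend head_yaw motion are made up): the app only renders at half the display rate, but every refresh still scans out a frame, because the newest render gets re-warped with the current head pose.

# Toy async-reprojection timeline (illustrative only; made-up numbers).
DISPLAY_HZ = 90          # panel refresh rate
RENDER_HZ = 45           # what the app actually manages to render

display_dt = 1.0 / DISPLAY_HZ
render_dt = 1.0 / RENDER_HZ

def head_yaw(t):
    # Pretend head pose: a steady 30 deg/s turn.
    return 30.0 * t

for i in range(6):                       # six consecutive display refreshes
    now = i * display_dt

    # Newest frame the renderer has finished by 'now'
    # (a new one only lands every render_dt seconds).
    frame_time = int(now / render_dt) * render_dt

    # Rotational timewarp: correct the stale frame by however far the head
    # has turned since it was rendered, so the displayed image matches the
    # current pose even when no new render arrived this refresh.
    correction = head_yaw(now) - head_yaw(frame_time)

    print(f"refresh {i}: using frame from {frame_time * 1000:5.1f} ms, "
          f"warp correction {correction:4.2f} deg")

The render-miss case is just the refreshes where frame_time doesn't advance: the old frame gets re-warped instead of the display waiting for a new one.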
 

newtechldtech

Respectable
Sep 21, 2022
355
127
1,860
$3500 and it doesn't even do 120 hz? Seriously?
I guess that's about par for an Apple product.

LOL... you know this device runs on an M2 SoC, right? It will not run games at 120fps, or even 90fps, with that SoC driving dual 4K screens (4K per eye). Even an RTX 4090 can't run two 4K screens at 60fps...
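For scale, a rough pixel-rate comparison (my own round numbers, assuming roughly 4K-class panels per eye; the article doesn't give exact per-eye resolutions):

# Back-of-envelope only; assumed resolutions, not official specs.
per_eye_pixels = 3840 * 2160                 # call it "4K-class" per eye
headset_rate = 2 * per_eye_pixels * 90       # two eyes at 90Hz
monitor_rate = 3840 * 2160 * 60              # one 4K monitor at 60Hz

print(f"headset:        {headset_rate / 1e9:.2f} Gpixels/s")    # ~1.49
print(f"4K@60 monitor:  {monitor_rate / 1e9:.2f} Gpixels/s")    # ~0.50
print(f"ratio:          {headset_rate / monitor_rate:.1f}x")    # 3.0x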

Anti-Apple people have lost their minds completely...
 

RichardtST

Respectable
May 17, 2022
239
267
1,960
LOL... you know this device runs on an M2 SoC, right? It will not run games at 120fps, or even 90fps, with that SoC driving dual 4K screens (4K per eye). Even an RTX 4090 can't run two 4K screens at 60fps...

Anti-Apple people have lost their minds completely...
"Can't run two 4k screens at 60hz". We have a winner for "noob of the year"....
It all depends on the content that is being displayed. My 1080ti can drive 4k60.
I've done it. It works. Most games get scaled down until they CAN do 60 hz because
anything less is painful.
 
  • Like
Reactions: bit_user

RichardtST

Respectable
May 17, 2022
239
267
1,960
Not familiar with standard best-practices for VR displays, I take it?

Every frame undergoes processing prior to scanout. First, the frame is offset (rotational timewarp) based on the most recent fused head pose, because the pose you sampled when rendering started is several ms stale by now, and several ms of latency is unacceptably slow for VR. Next, the frame is warped to account for the position change of the head (spatial timewarp or 'spacewarp'), which requires pixel synthesis to account for the differential movement of foreground and background objects (parallax). In practice, these two steps occur simultaneously as a 6-axis transform, and they also take into account the optical flow field of the frame in order to correctly displace objects in motion. Finally, the frame is warped to account for the non-rectilinear optics used to view it, which means applying an inverse-pincushion warp that is wavelength-dependent (you warp the 3 colour channels separately to counter chromatic aberration in the optics).
Now, because every frame undergoes synthesis, there is very little practical difference between a frame that started rendering 10ms ago and a frame that started rendering 20ms ago in terms of the display timeline. That means that, since your warps are asynchronous and render-time independent, you can use them for complete frame synthesis in case of a render miss - i.e. if the display is due for a scanout but a new frame has not completed rendering, you apply the up-to-date transform to the old frame and still generate a valid frame for display.
On top of that, there is no reason you cannot apply that same technique to every other frame, or every third frame, in order to double or triple the effective render rate to produce a given display update rate.

All these techniques have been implemented years ago and are in active use even on mobile devices (e.g. the Quest series).
It's easier than that. You simply request all your positional data N milliseconds ahead. Since 60Hz is roughly 16ms per frame, and because frame generation and lag take up another frame or so, asking for positional data roughly 30ms in the future works out pretty nicely (for a 60Hz game loop). The controller automagically uses a couple of nifty little algorithms (speed, acceleration, angular momentum, etc.) to make its best guess where everything will be at the requested time in the future. Pleasantly enough, it is pretty darn good at it, and the result is that the screen you see is nearly exactly where your body feels it. You don't feel lag because it's all matched up in time.
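Something like this, in toy Python (my own illustration with made-up numbers, not any runtime's actual code; predict_yaw is a name I invented): take the tracked angle, velocity and acceleration, and extrapolate the pose to the moment the frame will actually hit the screen.

# Toy constant-acceleration extrapolation of head yaw (illustrative only).
def predict_yaw(angle_deg, velocity_dps, accel_dps2, lead_s):
    # Where will the head be 'lead_s' seconds from now?
    return angle_deg + velocity_dps * lead_s + 0.5 * accel_dps2 * lead_s ** 2

# Tracker says: at 10 deg, turning at 90 deg/s, speeding up at 20 deg/s^2.
# Ask for the pose ~30 ms ahead (about one 60Hz frame of render time plus
# another frame or so of pipeline lag).
lead = 0.030
print(predict_yaw(10.0, 90.0, 20.0, lead))   # ~12.71 deg: render the scene for this pose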

If your content only updates at 24Hz, then displaying it at 96Hz will just get you artificial fake frames like DLSS, or, if fake-frame generation is turned off, you'll just get new screen data at 24Hz. Whichever way you slice it, it is still 24Hz data, not 96Hz. A 24Hz video is a 24Hz video.
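And the arithmetic on that last point (simple maths, not from the article): 96 is an exact multiple of 24, so a 24Hz video maps onto a 96Hz refresh by just showing each source frame four times, with no 3:2-pulldown-style judder and no interpolation strictly required.

display_hz, source_fps = 96, 24
print(display_hz % source_fps == 0)   # True: refresh is an exact multiple of the source
print(display_hz // source_fps)       # 4: each film frame held for exactly 4 refreshes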
 
  • Like
Reactions: helper800
"Can't run two 4k screens at 60hz". We have a winner for "noob of the year"....
It all depends on the content that is being displayed. My 1080ti can drive 4k60.
I've done it. It works. Most games get scaled down until they CAN do 60 hz because
anything less is painful.
A Raspberry Pi can display a 4K60 desktop, or does that not count? Either way, it's exactly as you say, it's what is being displayed that counts. Gaming at 4K per eye? Most likely not a great idea. Anything less graphically demanding should be perfectly fine.
 
We must play vastly different games. 40Hz would be horrifying in Beat Saber or some of the other fast-action games I tend to play. Huge differences between 40 and 60 and 90 and 120. I'll take 120 any day. I can see the stepping and individual frames. I may be old as dirt, but I'm not slow yet! :)

And why on earth would you need 96hz to watch a 24hz video? Are they scamming people again?
Of course they are...
I never said I play (or tolerate) Beat Saber, which is a fast-paced VR game, at 40FPS, or any specific game for that matter. I'm talking from the perspective of VRChat mostly, where you don't move too much, so a lower framerate is not too terrible (30FPS is my limit) and consistency starts to matter more. That is where the Vision Pro may sit at the beginning, I think? Somewhat slower-paced experiences, and, if you look at the advertising material, that's more or less what they portray. I don't remember seeing anything where the person was moving around much while using the HMD.

Regards.
 

edzieba

Distinguished
Jul 13, 2016
559
558
19,760
It's easier than that. You simply request all your positional data N milliseconds ahead. Since 60Hz is roughly 16ms per frame, and because frame generation and lag take up another frame or so, asking for positional data roughly 30ms in the future works out pretty nicely (for a 60Hz game loop). The controller automagically uses a couple of nifty little algorithms (speed, acceleration, angular momentum, etc.) to make its best guess where everything will be at the requested time in the future. Pleasantly enough, it is pretty darn good at it, and the result is that the screen you see is nearly exactly where your body feels it. You don't feel lag because it's all matched up in time.

If your content only updates at 24Hz, then displaying it at 96Hz will just get you artificial fake frames like DLSS, or, if fake-frame generation is turned off, you'll just get new screen data at 24Hz. Whichever way you slice it, it is still 24Hz data, not 96Hz. A 24Hz video is a 24Hz video.
Yeah, forward prediction of head and hand pose is done by everyone as standard, but it is not good enough on its own, which is why everyone uses post-warp in addition to forward prediction. Guesses that are 16ms/30ms old are simply too stale to be acceptable unmodified.

By your definition, every single frame you see in any VR HMD is a 'fake frame'. If it were not, the experience would be crap compared to HMDs which implement post-warp correctly.
 

newtechldtech

Respectable
Sep 21, 2022
355
127
1,860
"Can't run two 4k screens at 60hz". We have a winner for "noob of the year"....
It all depends on the content that is being displayed. My 1080ti can drive 4k60.
I've done it. It works. Most games get scaled down until they CAN do 60 hz because
anything less is painful.

..total rubbish